DepthAnything

class grid.model.perception.depth.depth_anything.DepthAnything(*args, **kwargs)

DepthAnything: Depth Estimation Model

This class implements a wrapper for the DepthAnything model, which estimates depth maps from RGB images. Model configurations are selected by encoder type (see __init__).

Credits:

https://github.com/LiheYoung/Depth-Anything

License:

This code is licensed under the Apache 2.0 License.

__init__()

Initialize the DepthAnything model with the specified encoder configuration.

The encoder can be one of 'vitb' or 'vitl'. The model is loaded onto the GPU if available, otherwise it defaults to the CPU.

Return type:

None
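The device selection is not exposed as a parameter. As a minimal sketch (assuming the standard PyTorch availability check, which is not confirmed by this documentation), the behaviour corresponds to:

>>> import torch
>>> device = "cuda" if torch.cuda.is_available() else "cpu"  # the wrapper performs the equivalent check internally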

run(rgbimage)

Runs the DepthAnything depth estimation model on the given RGB image.

Parameters:

rgbimage (np.ndarray) -- The input RGB image, an array of shape (H, W, 3).

Returns:

The depth map generated by the DepthAnything model, with the same spatial resolution (H, W) as the input.

Return type:

np.ndarray

Example

>>> import numpy as np
>>> img = np.random.randint(0, 256, (256, 256, 3), dtype=np.uint8)
>>> dam = DepthAnything()
>>> depth = dam.run(img)
>>> print(depth.shape)
(256, 256)
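
A common follow-up step (not part of the wrapper itself) is to normalize the returned depth map for display. Continuing the example above, and assuming a floating-point output:

>>> depth_vis = ((depth - depth.min()) / (depth.max() - depth.min() + 1e-8) * 255).astype(np.uint8)
>>> depth_vis.shape
(256, 256)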