<p align="center"> <!-- Light mode --> <!-- Dark mode --> </p> <p align="center">RealSense SDK 2.0 is a cross-platform library for RealSense depth cameras. The SDK allows depth and color streaming, and provides intrinsic and extrinsic calibration information.</p> <p align="center"> <a href="https://www.apache.org/licenses/LICENSE-2.0"></a> <a href="https://github.com/realsenseai/librealsense/releases/latest"></a> <a href="https://github.com/realsenseai/librealsense/compare/master...development"></a> <a href="https://github.com/realsenseai/librealsense/issues"></a> <a href="https://github.com/realsenseai/librealsense/actions/workflows/buildsCI.yaml?query=branch%3Adevelopment"></a> <a href="https://github.com/realsenseai/librealsense/network/members"></a> </p>

Important Notice

We are happy to announce that the RealSense GitHub repositories have been successfully migrated to the RealSenseAI organization. Please make sure to update your links to the new RealSenseAI organization for both cloning the repositories and accessing specific files within them.

https://github.com/IntelRealSense/librealsense --> https://github.com/realsenseai/librealsense

Note: A redirection from the previous name IntelRealSense is currently in place, but we cannot guarantee how long it will remain active. We recommend that all users update their references to point to the new GitHub location.

Branch Policy

We have updated our branch policy: From now on, we will also push beta releases to the master branch, so users can access up-to-date code and features. In the near future, beta binaries will also be pushed to public distribution servers (e.g., APT). The last validated official release can be found on our Releases page on GitHub.

Use Cases

Below are some of the many real-world applications powered by RealSense technology:

Robotics | Depth Sensing | 3D Scanning
<a href="https://realsenseai.com/case-studies/?capability_application=autonomous-mobile-robots"></a><a href="https://realsenseai.com/case-studies/?q=/case-studies&"></a><a href="https://realsenseai.com/case-studies/?capability_application=autonomous-mobile-robots"></a>
Drones | Skeletal and People Tracking | Facial Authentication
<a href="https://realsenseai.com/case-studies/?q=/case-studies&"></a><a href="https://realsenseai.com/case-studies/?capability_application=monitoring-and-tracking"></a><a href="https://realsenseai.com/case-studies/?capability_application=biometrics"></a>

Why RealSense?

  • High-resolution color and depth at close and long ranges
  • Open source SDK with rich examples and wrappers (Python, ROS, C#, Unity and more...)
  • Active developer community and the de facto standard 3D stereo camera for robotics
  • Cross-platform support: Windows, Linux, macOS, Android, and Docker

Product Line

RealSense stereo depth products use stereo vision to calculate depth, providing high-quality performance in various lighting and environmental conditions.
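Stereo depth cameras recover distance by triangulation: a point's depth z follows from the disparity d between its projections in the left and right imagers, via z = f·B/d (focal length in pixels times baseline, divided by disparity). A tiny self-contained illustration of the principle — not SDK code, and the numbers are made up, loosely inspired by a 95 mm-baseline device:

```python
def stereo_depth_m(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Classic pinhole stereo relation: z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Example: 95 mm baseline, f = 640 px, 8 px disparity -> 7.6 m
print(round(stereo_depth_m(640, 0.095, 8), 2))
```

Note how depth resolution degrades with range: at large distances the disparity shrinks toward zero, so a one-pixel matching error translates into a larger depth error.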

Here are some examples of the supported models:

| Product | Description |
| --- | --- |
| D555 PoE | The RealSense™ Depth Camera D555 introduces a Power over Ethernet (PoE) interface on chip, expanding our portfolio of USB and GMSL/FAKRA products. |
| D457 GMSL/FAKRA | The RealSense™ Depth Camera D457 is our first GMSL/FAKRA high-bandwidth stereo camera. It has an IP65-grade enclosure protecting it from dust ingress and projected water. |
| D455 | The RealSense D455 is a long-range stereo depth camera with a 95 mm baseline, global-shutter depth sensors, an RGB sensor, and a built-in IMU, delivering accurate depth at distances up to 10 m. |
| D435if | The D435if is part of the RealSense™ Depth Camera IR-pass-filter family, expanding our portfolio for the growing robotics market. The D400f family uses an IR pass filter to enhance depth quality and performance range in many robotic environments. |
| D405 | The RealSense™ Depth Camera D405 is a short-range stereo camera providing sub-millimeter accuracy for close-range computer vision needs. |

🛍️ Explore more stereo products

Getting Started

Start developing with RealSense in minutes using either method below.

1. Precompiled SDK

This is the best option if you want to plug in your camera and get started right away.

  1. Download the latest SDK bundle from the Releases page.
  2. Connect your RealSense camera.
  3. Run the included tools, such as the RealSense Viewer (realsense-viewer) and the Depth Quality Tool (rs-depth-quality).

Setup Guides - precompiled SDK

<a href="./doc/distribution_linux.md"></a> <a href="./doc/distribution_windows.md"></a>

Note: For minor releases, we publish Debian packages as release artifacts that you can download and install directly.

2. Build from Source

For a more custom installation, follow these steps to build the SDK from source.

  1. Clone the repository and create a build directory:

     ```bash
     git clone https://github.com/realsenseai/librealsense.git
     cd librealsense
     mkdir build && cd build
     ```

  2. Run CMake to configure the build:

     ```bash
     cmake ..
     ```

  3. Build the project:

     ```bash
     cmake --build .
     ```
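The default configuration works for most users. If you need a custom build, librealsense exposes CMake options; the sketch below shows a few commonly used flags — verify the names against the installation guides for your SDK version, as the full list may change between releases:

```shell
# BUILD_EXAMPLES: build the command-line examples.
# BUILD_GRAPHICAL_EXAMPLES: build realsense-viewer and other GUI tools.
# BUILD_PYTHON_BINDINGS: also build the pyrealsense2 Python module.
cmake .. -DCMAKE_BUILD_TYPE=Release \
         -DBUILD_EXAMPLES=true \
         -DBUILD_GRAPHICAL_EXAMPLES=true \
         -DBUILD_PYTHON_BINDINGS=true
```

Re-running CMake with different `-D` flags in the same build directory updates the cached configuration before the next `cmake --build .`.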

Setup Guides - build from source

<a href="./doc/installation.md"></a> <a href="./doc/installation_jetson.md"></a> <a href="./doc/installation_windows.md"></a> <a href="./doc/installation_osx.md"></a>

Python Packages

Which should I use?

  • Stable: pyrealsense2 — validated releases aligned with SDK tags (Recommended for most users).
  • Beta: pyrealsense2-beta — fresher builds for early access and testing. Expect faster updates.

Install

```bash
pip install pyrealsense2       # Stable
pip install pyrealsense2-beta  # Beta
```

Both packages import as pyrealsense2. Install only one at a time.
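Because both distributions install the same `pyrealsense2` import name, having both present can shadow one with the other. A small sanity check — not part of the SDK, standard library only — to see which distribution is installed in your environment:

```python
from importlib import metadata

def installed_realsense_packages() -> list:
    """Return which of the two pip distributions are present (ideally at most one)."""
    found = []
    for name in ("pyrealsense2", "pyrealsense2-beta"):
        try:
            metadata.version(name)  # raises if the distribution is absent
            found.append(name)
        except metadata.PackageNotFoundError:
            pass
    return found

pkgs = installed_realsense_packages()
if len(pkgs) > 1:
    print("Both packages are installed; uninstall one of:", pkgs)
```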

Ready to Hack!

Our library offers a high level API for using RealSense depth cameras (in addition to lower level ones). The following snippets show how to start streaming frames and extracting the depth value of a pixel:

C++

```cpp
#include <librealsense2/rs.hpp>
#include <iostream>

int main() {
    rs2::pipeline p;                 // Top-level API for streaming & processing frames
    p.start();                       // Configure and start the pipeline

    while (true) {
        rs2::frameset frames = p.wait_for_frames();        // Block until frames arrive
        rs2::depth_frame depth = frames.get_depth_frame(); // Get depth frame
        if (!depth) continue;

        int w = depth.get_width(), h = depth.get_height();
        float dist = depth.get_distance(w/2, h/2);         // Distance to center pixel
        std::cout << "The camera is facing an object " << dist << " meters away\r";
    }
}
```
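Assuming the SDK headers and library are installed system-wide, the snippet above can typically be compiled and run as follows (the file name `rs-hello.cpp` is just a placeholder; add `-I`/`-L` paths if you installed to a custom prefix):

```shell
# Link against librealsense2
g++ -std=c++11 rs-hello.cpp -lrealsense2 -o rs-hello
./rs-hello   # requires a connected RealSense camera
```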

Python

```python
import pyrealsense2 as rs

pipeline = rs.pipeline() # Create a pipeline
pipeline.start() # Start streaming

try:
    while True:
        frames = pipeline.wait_for_frames()
        depth_frame = frames.get_depth_frame()
        if not depth_frame:
            continue

        width, height = depth_frame.get_width(), depth_frame.get_height()
        dist = depth_frame.get_distance(width // 2, height // 2)
        print(f"The camera is facing an object {dist:.3f} meters away", end="\r")

finally:
    pipeline.stop() # Stop streaming
```

For more information on the library, explore our examples and tools, and read the documentation.

Supported Platforms

Operating Systems and Platforms

Ubuntu | Windows | macOS High Sierra | Jetson | Raspberry Pi
<div align="center"><a href="https://dev.realsenseai.com/docs/compiling-librealsense-for-linux-ubuntu-guide"></a></div><div align="center"><a href="https://dev.realsenseai.com/docs/compiling-librealsense-for-windows-guide"></a></div><div align="center"><a href="https://dev.realsenseai.com/docs/macos-installation-for-intel-realsense-sdk"><picture><source media="(prefers-color-scheme: dark)" srcset="https://librealsense.realsenseai.com/readme-media/apple-dark.png"></picture></a></div><div align="center"><a href="https://dev.realsenseai.com/docs/nvidia-jetson-tx2-installation"></a></div><div align="center"><a href="https://dev.realsenseai.com/docs/using-depth-camera-with-raspberry-pi-3"></a></div>

Programming Languages and Wrappers

C++ | C | C# | Python | ROS 2 | REST API
<div align="center"><a href="https://dev.realsenseai.com/docs/code-samples"></a></div><div align="center"><a href="https://dev.realsenseai.com/docs/code-samples"></a></div><div align="center"><a href="https://dev.realsenseai.com/docs/csharp-wrapper"></a></div><div align="center"><a href="https://dev.realsenseai.com/docs/python2"></a></div><div align="center"><a href="https://dev.realsenseai.com/docs/ros2-wrapper"><picture><source media="(prefers-color-scheme: dark)" srcset="https://librealsense.realsenseai.com/readme-media/ros2-dark.png"></picture></a></div><div align="center"><a href="https://github.com/realsenseai/librealsense/blob/development/wrappers/rest-api/README.md"></a></div>

For more platforms and wrappers, look over here.

Full feature support varies by platform – refer to the release notes for details.

Community & Support

🔎 Looking for legacy devices (F200, R200, LR200, ZR300)? Visit the legacy release.


<p align="center"> You can find us at </p> <p align="center"> <a href="https://github.com/realsenseai" target="_blank" aria-label="GitHub"><picture><source media="(prefers-color-scheme: dark)" srcset="https://librealsense.realsenseai.com/readme-media/github_light.PNG"></picture></a> &nbsp;&nbsp;&nbsp; <a href="https://x.com/RealSenseai" target="_blank" aria-label="X (Twitter)"></a> &nbsp;&nbsp;&nbsp; <a href="https://www.youtube.com/@RealSenseai" target="_blank" aria-label="YouTube"></a> &nbsp;&nbsp;&nbsp; <a href="https://www.linkedin.com/company/realsenseai?trk=similar-pages" target="_blank" aria-label="LinkedIn"></a> &nbsp;&nbsp;&nbsp; <a href="https://realsenseai.com/" target="_blank" aria-label="Community"></a> </p>