Visual navigation system from Slightech

MYNT EYE captures depth, location, and motion data from its surroundings, building a large 3D map that tracks how objects move and interact in space.
Slightech announced MYNT EYE, a stereo vision camera designed for autonomous navigation, object tracking, and depth sensing. The ready-to-use visual solution is aimed at robots and unmanned aerial vehicles (UAVs).
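As a rough illustration of how a stereo camera like this recovers depth, the sketch below matches a rectified left/right image pair with OpenCV's semi-global block matcher and converts the resulting disparity to metric depth. The file names, focal length, and baseline are placeholder values; this is generic stereo processing, not the MYNT EYE SDK or calibration data.

```python
import cv2
import numpy as np

# Load a rectified stereo pair (placeholder file names).
left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Semi-global block matching produces a disparity map from the two views.
matcher = cv2.StereoSGBM_create(
    minDisparity=0,
    numDisparities=64,  # must be a multiple of 16
    blockSize=9,
)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0  # SGBM returns fixed-point disparities

# Convert disparity to metric depth: depth = focal_length * baseline / disparity.
# Illustrative intrinsics, not the camera's real calibration.
focal_length_px = 700.0
baseline_m = 0.12
valid = disparity > 0
depth_m = np.zeros_like(disparity)
depth_m[valid] = focal_length_px * baseline_m / disparity[valid]
```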
Powered by an Nvidia Jetson, the solution combines a high frame rate, high resolution, a global shutter, and low latency. At 60 frames per second (FPS), the global shutter and high resolution minimize image distortion while the camera is moving, improving the accuracy and responsiveness of drone and robot navigation. Because the global shutter exposes the entire frame simultaneously, the system can produce a correctly scaled map with accurate positioning, avoiding the skewed images a rolling-shutter camera produces when capturing while in motion.
With low latency and an onboard inertial measurement unit (IMU), the solution continuously tracks orientation, location, and motion, improving the accuracy of the visual simultaneous localization and mapping (vSLAM) algorithm.
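To give a sense of why IMU data helps a vision-based tracker, the sketch below blends a high-rate gyroscope prediction with a slower visual orientation estimate using a simple complementary filter. The function name, weighting, and measurements are hypothetical and are not drawn from Slightech's vSLAM implementation.

```python
import numpy as np

def fuse_yaw(visual_yaw, gyro_rate_z, prev_yaw, dt, alpha=0.98):
    """Blend a gyro-integrated yaw estimate with a visual yaw estimate.

    visual_yaw:  yaw (rad) from the vision pipeline (e.g. a vSLAM pose)
    gyro_rate_z: angular rate (rad/s) from the IMU z axis
    prev_yaw:    previous fused yaw (rad)
    dt:          time step (s)
    alpha:       weight on the smooth, high-rate gyro prediction
    """
    gyro_yaw = prev_yaw + gyro_rate_z * dt                 # high rate, but drifts over time
    return alpha * gyro_yaw + (1.0 - alpha) * visual_yaw   # vision slowly corrects the drift

# Toy usage with made-up measurements at 100 Hz.
yaw = 0.0
for visual_yaw, gyro_rate in [(0.01, 0.5), (0.02, 0.5), (0.03, 0.5)]:
    yaw = fuse_yaw(visual_yaw, gyro_rate, yaw, dt=0.01)
print(f"fused yaw estimate: {yaw:.4f} rad")
```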
For more information, visit https://slightech.com.