πŸ‘οΈ Vision SystemΒΆ

The tatbot system combines Amcrest IP cameras for wide-angle scene coverage with Intel RealSense depth cameras mounted on the robot arms for precise 3D sensing. AprilTags are used for object tracking and calibration. The system is enhanced with VGGT (Visual Geometry Grounded Transformer) for dense 3D reconstruction and markerless pose estimation.

CamerasΒΆ

  • src/conf/cams/default.yaml: The main configuration file for all cameras.

  • src/tatbot/data/cams.py: The Pydantic data model for camera configurations.

  • src/tatbot/cam/: The source module for all camera-related code, including calibration and capture.

CalibrationΒΆ

Camera calibration is a critical step. While intrinsic parameters (focal length, sensor size) are relatively static, extrinsic parameters (the 3D pose of each camera in the world) must be calculated for the specific setup.

The src/conf/cams/default.yaml file contains placeholder extrinsics. To get real values, you must run the calibration script:

uv run python -m tatbot.cam.extrinsics

This script uses AprilTags to find the precise location of each camera relative to the world origin.
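The underlying math is simple to state: a detection gives the tag's pose in the camera frame, and since the tag's pose in the world frame is known, the camera's world pose follows by composing transforms. A minimal numpy sketch, where the 4x4 pose helpers are illustrative and not the actual tatbot API:

```python
import numpy as np

def make_pose(R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def camera_extrinsics(T_world_tag: np.ndarray, T_cam_tag: np.ndarray) -> np.ndarray:
    """Camera pose in the world frame: T_world_cam = T_world_tag @ inv(T_cam_tag)."""
    return T_world_tag @ np.linalg.inv(T_cam_tag)

# Example: the tag sits at the world origin; the camera sees it 1 m straight ahead.
T_world_tag = np.eye(4)
T_cam_tag = make_pose(np.eye(3), np.array([0.0, 0.0, 1.0]))
T_world_cam = camera_extrinsics(T_world_tag, T_cam_tag)
print(T_world_cam[:3, 3])  # camera position in world coordinates
```

The same composition generalizes to any number of cameras, as long as each one sees at least one tag with a known world pose.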

IP PoE CamerasΒΆ

Amcrest 5MP Turret PoE Camera (IP5M-T1179EW-AI-V3): UltraHD outdoor IP camera with PoE, mic/audio, 98 ft night vision, 132° FOV, and MicroSD support (256 GB).

The cameras are currently configured with:

  • resolution: 1920x1080

  • fps: 5

  • bitrate: CBR, max 2048

  • frameinterval: 10

  • no substream, all watermarks off

RealSense Depth CamerasΒΆ

tatbot uses two Intel RealSense D405 depth cameras.

  • Interfaced via the pyrealsense2 Python library.

  • Both RealSense cameras are connected to the hog node via USB 3 ports.

  • Follow the calibration guide.

  • Use the rs-enumerate-devices command to verify that both RealSense cameras are connected. If this doesn’t work, unplug and replug the cameras.

  • The cameras are calibrated out of the box, but can be recalibrated.

  • The FOV differs between the depth and RGB sensors.

  • TODO: these will somewhat randomly fail; robust exception handling is needed.
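Until robust handling lands in the capture code, transient failures can be smoothed over with a small retry wrapper. A hedged sketch in plain Python (the flaky function below stands in for whatever pyrealsense2 startup call tatbot uses; pyrealsense2 surfaces device errors as RuntimeError):

```python
import time
from typing import Callable, TypeVar

T = TypeVar("T")

def with_retries(fn: Callable[[], T], attempts: int = 3, delay_s: float = 1.0) -> T:
    """Call fn, retrying on RuntimeError (what pyrealsense2 raises on device hiccups)."""
    last_exc: Exception | None = None
    for _ in range(attempts):
        try:
            return fn()
        except RuntimeError as exc:
            last_exc = exc
            time.sleep(delay_s)
    raise RuntimeError(f"failed after {attempts} attempts") from last_exc

# Demo with a flaky stand-in for a RealSense pipeline start:
calls = {"n": 0}
def flaky_start() -> str:
    calls["n"] += 1
    if calls["n"] < 2:
        raise RuntimeError("device busy")
    return "pipeline started"

print(with_retries(flaky_start, attempts=3, delay_s=0.0))
```

A wrapper like this only masks transient enumeration glitches; persistent failures still need the unplug/replug treatment described above.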

AprilTagsΒΆ

AprilTags are used to track objects in the scene (e.g., the ink palette) and for camera calibration.

See:

  • src/conf/tags/default.yaml

  • src/tatbot/data/tags.py

  • src/tatbot/cam/tracker.py

VGGT IntegrationΒΆ

VGGT (Visual Geometry Grounded Transformer) enhances the vision system with dense 3D reconstruction and markerless camera pose estimation from RGB images. See the VGGT documentation for details on integration, usage, and architecture.

Key capabilities:

  • Dense 3D Reconstruction: High-quality point clouds from RGB-only images (10x+ denser than RealSense)

  • Markerless Pose Estimation: Camera poses without requiring AprilTags

  • Scale Alignment: Automatic metric alignment using AprilTag references

  • Cross-Node Processing: GPU processing on dedicated nodes while cameras remain on sensor nodes
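The scale-alignment step can be illustrated with a short numpy sketch: a reconstruction from RGB images alone is only defined up to scale, so the known physical size of an AprilTag (or the known distance between two tag corners) yields a metric scale factor for the whole point cloud. The function name and indices below are illustrative, not the actual tatbot API:

```python
import numpy as np

def metric_scale(points: np.ndarray, idx_a: int, idx_b: int, true_dist_m: float) -> np.ndarray:
    """Rescale an up-to-scale point cloud so the distance between two
    reference points (e.g. two AprilTag corners) matches its known metric value."""
    measured = np.linalg.norm(points[idx_a] - points[idx_b])
    return points * (true_dist_m / measured)

# Toy cloud in which two tag corners come out 0.5 units apart,
# while the physical tag edge is known to be 0.1 m.
cloud = np.array([[0.0, 0.0, 0.0], [0.5, 0.0, 0.0], [1.0, 2.0, 3.0]])
scaled = metric_scale(cloud, 0, 1, true_dist_m=0.1)
print(np.linalg.norm(scaled[0] - scaled[1]))  # now 0.1 m
```

In practice one would average the factor over several tag edges to reduce noise, but the principle is the same single multiplicative correction.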

2D to 3D MappingΒΆ

To tattoo on a non-flat surface like a practice arm, the 2D artwork must be accurately β€œwrapped” onto the 3D surface. This is a complex geometric problem solved using a technique called geodesic tracing. VGGT dense reconstructions can provide enhanced 3D surface data for improved mapping accuracy.

ImplementationΒΆ

The core logic for this process is in src/tatbot/gen/map.py. The pipeline is as follows:

  1. 3D Surface Reconstruction: A 3D mesh of the target surface is first created from RealSense point clouds using Open3D’s Poisson surface reconstruction (src/tatbot/utils/plymesh.py).

  2. Projection: The flat 2D points of a stroke (from G-code) are transformed into 3D space based on the desired position and orientation of the design. These 3D points are then projected to find the closest vertices on the target mesh.

  3. Geodesic Tracing: Instead of simply connecting the projected points with straight lines (which would go through the surface), we trace the shortest path along the surface between each point. This is known as a geodesic path. We use the potpourri3d library for its efficient GeodesicTracer.

  4. Resampling & Normals: The resulting 3D path is resampled to have a uniform density of points. At each point, the surface normal of the mesh is calculated. This normal is crucial for orienting the tattoo needle to be perpendicular to the skin.

  5. Debugging: The src/tatbot/viz/map.py tool provides an interactive visualization for debugging this entire process.
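The resampling in step 4 can be sketched in a few lines of numpy: the path is parameterized by cumulative arc length and re-interpolated at evenly spaced stations. This is a simplified stand-in for the actual implementation in src/tatbot/gen/map.py, which additionally computes surface normals at each sample:

```python
import numpy as np

def resample_path(points: np.ndarray, n: int) -> np.ndarray:
    """Resample a 3D polyline to n points uniformly spaced by arc length."""
    seg = np.linalg.norm(np.diff(points, axis=0), axis=1)
    s = np.concatenate([[0.0], np.cumsum(seg)])   # cumulative arc length
    stations = np.linspace(0.0, s[-1], n)         # evenly spaced sample positions
    return np.column_stack([np.interp(stations, s, points[:, k]) for k in range(3)])

# A bent path with uneven point spacing, resampled to 5 uniform points.
path = np.array([[0, 0, 0], [0.1, 0, 0], [1.0, 0, 0], [1.0, 1.0, 0]], dtype=float)
out = resample_path(path, 5)
print(out)
```

Uniform spacing matters here because each resampled point becomes a needle waypoint; clustered points would make the machine dwell, and sparse points would skip surface detail.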