https://github.com/Tbarkin121/ROS_SLAM

SLAM (Simultaneous Localization and Mapping)

  • Localization

    • Localization refers to the process by which a robot or an autonomous vehicle determines its position within an environment. Using various sensors and previously mapped data, the system identifies its location relative to its surroundings.

  • Mapping

    • Mapping is the process of creating a map of an unknown environment while the robot or vehicle moves through it. The system uses sensors to detect the environment's features, such as walls, obstacles, and landmarks, and constructs a spatial representation of these features, often in the form of a 2D or 3D model.

  • Navigation

    • Navigation involves planning and moving along a path from one location to another within the environment. It uses the data from localization and mapping to safely and efficiently determine a route that avoids obstacles, minimizes travel time, or meets other criteria specified by the application.

  • Feature / Landmark SLAM

    • Feature or Landmark SLAM focuses on using distinct environmental features or landmarks to aid in the localization and mapping process. These features are used as reference points that help in accurately estimating the robot's position and orientation, as well as in extending and refining the map. Examples of features include corners, edges, or specific objects that are easy to recognize and track over different viewpoints.

  • Grid SLAM

    • Grid SLAM divides the environment into a grid of cells and maintains a map in the form of probabilities that each cell is occupied, free, or unknown. This method is often used when dealing with large and complex environments. It uses techniques like particle filters to estimate the state of each grid cell based on sensor readings and the robot's movements, updating the probabilities as new data comes in.
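
The occupancy-update idea above can be sketched in a few lines: each cell keeps a log-odds value that is nudged toward "occupied" on a sensor hit and toward "free" on a miss. This is only an illustrative sketch — the 0.7/0.3 inverse sensor model below is an arbitrary choice, not a value from any particular SLAM package:

```python
import math

# Log-odds occupancy update for a single grid cell.
# A real grid SLAM system applies this per cell along each sensor ray.
L_OCC = math.log(0.7 / 0.3)   # log-odds added when the sensor reports "occupied"
L_FREE = math.log(0.3 / 0.7)  # log-odds added when the sensor reports "free"

def update_cell(logodds, hit):
    """Shift a cell's log-odds toward occupied (hit=True) or free (hit=False)."""
    return logodds + (L_OCC if hit else L_FREE)

def probability(logodds):
    """Convert log-odds back to an occupancy probability."""
    return 1.0 - 1.0 / (1.0 + math.exp(logodds))

cell = 0.0  # log-odds 0 means P(occupied) = 0.5, i.e. "unknown"
for observation in (True, True, False):  # two hits, one miss
    cell = update_cell(cell, observation)
print(round(probability(cell), 3))  # two hits and one miss cancel to a single hit: 0.7
```

The log-odds form is what makes the incremental update cheap — each new reading is just an addition, and the probability is only recovered when the map is rendered.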

So I tried everything I could to get ROS and RealSense working on the MacBook M2. The closest I got was using a VM, which allowed the RealSense camera to work correctly. ROS installed, but there are Gazebo packages that aren't compiled for ARM, and from my day and a half of looking it doesn't seem like anyone has gotten it working successfully. It's fine; after thinking about it I want to stick close to Isaac, and the Isaac ROS packages are Linux only. But I needed to update from Ubuntu 20.04 to 22.04… I then built ROS, but Anaconda was causing problems; the best solution was removing the line export PATH="/home/tyler/anaconda3/bin:$PATH" from .bashrc. This way I can keep Anaconda and have ROS work without them stepping on each other's toes. I got the RealSense source code built and running, but it isn't a ROS package, so I think I need the RealSense ROS wrapper. The easy way out might be just installing the prebuilts with: sudo apt-get install ros-<distro>-realsense2-camera. Oh look, some help: https://dev.intelrealsense.com/docs/ros2-wrapper… Building from scratch because I like things difficult… well, I like editable code, mostly…

Links to interesting ideas

  • https://arxiv.org/pdf/2304.06194.pdf

    • SiLK is a self-supervised framework for learning keypoints. It focuses on simplicity and flexibility while providing state-of-the-art, competitive results on existing benchmarks.

  • https://www.emerald.com/insight/content/doi/10.1108/IR-11-2023-0309/full/html

    • Paywalls….

  • https://www.youtube.com/watch?v=ZaiA3hWaRzE

    • ROS2 Nav Stack with Lidar

  • Debugging Annoyances

    • Madgwick filter not getting IMU data from camera

      • RealSense Topics published at correct rates

      • The CLI settings passed when starting the IMU filter weren't being interpreted correctly.

        • https://github.com/introlab/rtabmap_ros/issues/848
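
A launch file sidesteps the CLI parsing issue by passing the parameters programmatically. This is only a sketch, assuming imu_tools' imu_filter_madgwick_node and the default RealSense IMU topic /camera/imu — the topic names and parameter values may differ on your setup:

```python
# Hedged sketch: launch imu_filter_madgwick with explicit parameters
# instead of CLI arguments. Topic names below are assumptions.
from launch import LaunchDescription
from launch_ros.actions import Node

def generate_launch_description():
    return LaunchDescription([
        Node(
            package='imu_filter_madgwick',
            executable='imu_filter_madgwick_node',
            parameters=[{'use_mag': False,      # RealSense IMUs have no magnetometer
                         'publish_tf': False,
                         'world_frame': 'enu'}],
            remappings=[('imu/data_raw', '/camera/imu'),   # filter input from the camera
                        ('imu/data', '/rtabmap/imu')],     # filter output, as visualized in rviz
        ),
    ])
```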

      • Visualizing Filter Output

        • https://wiki.ros.org/rviz_imu_plugin

        • After starting the camera and imu filter, I built imu_tools from source

        • Then execute: ros2 run rviz2 rviz2

        • From here we can add the rviz_imu_plugin Imu display

        • Update the topic to /rtabmap/imu

        • We can now see the orientation of the camera in rviz

    • The rtabmap node isn't receiving the published topics correctly

      • https://github.com/IntelRealSense/realsense-ros/issues/2564

      • Some Example RTAB Launch Files

        • https://github.com/introlab/rtabmap_ros/tree/ros2/rtabmap_examples/launch

      • ros2 run tf2_tools view_frames

        • Lists all the valid frames; my frame_id for the rtabmap launch was incorrect
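
Once view_frames shows the real frame names, the fix is pointing rtabmap at one of them. A hedged launch-file sketch, assuming the rtabmap_slam package and typical RealSense topic names — camera_link and the /camera/... topics below are assumptions, so substitute whatever your tf tree and topic list actually report:

```python
# Hedged sketch of an RTAB-Map launch; see the rtabmap_examples launch
# files linked above for the maintained versions.
from launch import LaunchDescription
from launch_ros.actions import Node

def generate_launch_description():
    return LaunchDescription([
        Node(
            package='rtabmap_slam',
            executable='rtabmap',
            parameters=[{'frame_id': 'camera_link',  # must match a frame from view_frames
                         'subscribe_depth': True,
                         'wait_imu_to_init': True}],
            remappings=[('rgb/image', '/camera/color/image_raw'),
                        ('depth/image', '/camera/aligned_depth_to_color/image_raw'),
                        ('rgb/camera_info', '/camera/color/camera_info'),
                        ('imu', '/rtabmap/imu')],  # filtered IMU from the Madgwick node
        ),
    ])
```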