UC Berkeley Capstone Project
Search & Rescue Robot
ROSMaster aims to create an autonomous robot capable of navigating disaster-like environments, locating survivors, and retrieving them without putting human responders at risk. The project uses a small mobile rover as a platform to explore how robotics can support faster, safer, and more reliable search-and-rescue missions in real-world scenarios.
Solution
Mapping
The LiDAR sensor lets the system perceive its surroundings. Using Cartographer SLAM, the robot builds a map of the room's layout while simultaneously localizing itself within that space.
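The core idea, turning raw LiDAR ranges into occupied cells of a map, can be sketched in plain Python (this is an illustration of the concept, not the actual Cartographer pipeline; the resolution and grid size here are arbitrary example values):

```python
import math

def scan_to_points(ranges, angle_min, angle_increment, max_range=8.0):
    """Convert a LiDAR scan (polar ranges) into 2D Cartesian points
    in the robot's frame. Invalid or out-of-range beams are skipped."""
    points = []
    for i, r in enumerate(ranges):
        if not (0.0 < r < max_range):
            continue
        theta = angle_min + i * angle_increment
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points

def mark_occupied(points, resolution=0.05, size=200):
    """Mark LiDAR hits in a size x size occupancy grid centered on the
    robot. Returns the set of occupied (row, col) cells."""
    occupied = set()
    half = size // 2
    for x, y in points:
        col = half + int(round(x / resolution))
        row = half + int(round(y / resolution))
        if 0 <= row < size and 0 <= col < size:
            occupied.add((row, col))
    return occupied
```

Cartographer does far more (scan matching, loop closure, pose-graph optimization), but every cycle ultimately reduces to this polar-to-grid projection.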
Navigation
With the map, the robot can patrol the entire area, navigating through the space while continuously updating its model of the surroundings. In disaster scenarios, it's crucial that the robot can avoid obstacles as it moves through the environment.
Image Detection
The front-facing camera is used to detect survivors, providing information about who they are and where they’re located. In our prototype, we use identifiable tags to simulate survivor detection and simplify image recognition.
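Once OpenCV reports a tag's pixel position, a pinhole-camera model gives a rough bearing from the robot to the survivor. A minimal sketch, assuming a known horizontal field of view (the 90-degree default below is illustrative, not the prototype camera's measured value):

```python
import math

def tag_bearing(cx_pixels, image_width, horizontal_fov_deg=90.0):
    """Estimate the bearing (degrees) of a detected tag relative to the
    camera's optical axis, given the tag center's horizontal pixel
    coordinate. Focal length is derived from the field of view."""
    f = (image_width / 2) / math.tan(math.radians(horizontal_fov_deg) / 2)
    offset = cx_pixels - image_width / 2   # pixels right of center
    return math.degrees(math.atan2(offset, f))
```

A tag centered in the frame yields a bearing of 0 degrees; this angle can then be combined with the robot's pose on the map to record the survivor's location.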
Rescue Tool
Using our custom electromagnetic arm, the robot can grab survivors and transport them to safety. For this proof of concept, we use magnetic dolls, which keeps the retrieval process simple and lets us focus on validating the rescue workflow.
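Driving the electromagnet comes down to toggling one GPIO output. A sketch with the GPIO interface injected, so the class works with any RPi.GPIO-style library on the RDK X3 (the method names, pin number, and logic levels below are illustrative assumptions, not the project's actual wiring):

```python
class Electromagnet:
    """Controls the arm's electromagnet through a single GPIO output.
    `gpio` is any object exposing setup(pin, mode) and
    output(pin, level), in the style of RPi.GPIO-like libraries."""

    def __init__(self, gpio, pin, out_mode="OUT", high=1, low=0):
        self.gpio, self.pin, self.high, self.low = gpio, pin, high, low
        gpio.setup(pin, out_mode)   # configure the pin as an output
        self.engaged = False

    def grab(self):
        """Energize the magnet to pick up a magnetic survivor doll."""
        self.gpio.output(self.pin, self.high)
        self.engaged = True

    def release(self):
        """De-energize the magnet to set the survivor down safely."""
        self.gpio.output(self.pin, self.low)
        self.engaged = False
```

Injecting the GPIO object also makes the rescue logic testable off-robot with a fake GPIO that just records calls.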
Full Robotic Software Architecture
ROS2 Foxy in Ubuntu Linux 20.04
Software Stack
ROS2 Foxy
Ubuntu Linux 20.04
Navigation2
Cartographer SLAM
OpenCV
Python3
RDK X3 - GPIO
A* Pathfinding