Vision-Based Autonomous Rover is a ROS 2 + Gazebo-based mobile robot that demonstrates autonomous navigation using LiDAR-based obstacle avoidance and a Finite State Machine (FSM). The rover operates in simulation and is designed with a modular architecture suitable for extending to real hardware.

ROS2 Autonomous Rover is a simulation-first autonomous mobile robot built using ROS 2 (Jazzy) and Gazebo (gz-sim).
The project focuses on core autonomy concepts such as perception, decision-making, and control, implemented through a clean ROS2 architecture and a finite state machine (FSM).


🌟 Why This Project?

This project was built to:

  • Understand end-to-end autonomous robot architecture
  • Practice ROS2 node-based design
  • Implement FSM-driven obstacle avoidance
  • Bridge the gap between simulation and real-world robotics

The rover is designed to move autonomously in a simulated environment while reacting to obstacles using LiDAR-based perception.


🚗 Rover Capabilities

  • 🧭 Autonomous Forward Navigation
  • 🚧 LiDAR-Based Obstacle Detection
  • 🔄 FSM-Controlled Obstacle Avoidance
    • Forward → Stop → Reverse → Scan → Turn → Forward
  • 🎥 Camera Integration (for perception & visualization)
  • 🕹️ Manual Teleoperation Support
  • 🔌 ROS2 Topic-Based Control (cmd_vel)
  • 🧪 Fully Simulated in Gazebo (gz-sim)

🧠 System Architecture

The rover follows a modular ROS2 design:

  • Gazebo Simulation
    • Robot model (URDF/Xacro)
    • Sensors (LiDAR, Camera)
  • ROS2 Nodes
    • Sensor processing
    • FSM-based control logic
    • Velocity command publisher
  • Visualization
    • RViz2
    • rqt_graph

This separation ensures clarity, scalability, and easy transition to real hardware.
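As a concrete illustration of the sensor-processing layer, comparing left vs. right clearance over a LaserScan-style ranges array might look like the following. This is a minimal, stdlib-only sketch: the sector boundaries and helper names are assumptions for illustration, not taken from the repository.

```python
import math

def sector_clearance(ranges, start_frac, end_frac):
    """Minimum valid range within a fractional sector of the scan.

    `ranges` is a flat list of distances (metres), ordered right-to-left
    as in a sensor_msgs/LaserScan message; inf/NaN readings are skipped.
    """
    n = len(ranges)
    sector = ranges[int(n * start_frac):int(n * end_frac)]
    valid = [r for r in sector if math.isfinite(r)]
    return min(valid) if valid else float("inf")

def pick_turn_direction(ranges):
    """Compare clearance in the left and right thirds of the scan."""
    right = sector_clearance(ranges, 0.0, 1 / 3)
    left = sector_clearance(ranges, 2 / 3, 1.0)
    return "TURN_LEFT" if left > right else "TURN_RIGHT"
```

In a real node this function would run inside the LaserScan subscription callback, feeding its result into the FSM's SCAN state.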


🔄 Finite State Machine (FSM)

The obstacle avoidance behavior is driven by a simple but effective FSM:

  1. FORWARD – Move straight
  2. STOP – Brief halt on obstacle detection
  3. REVERSE – Move backward to create space
  4. SCAN – Compare left vs right clearance
  5. TURN_LEFT / TURN_RIGHT – Rotate toward safer direction
  6. FORWARD – Resume motion

This approach avoids black-box planners and emphasizes decision-making transparency.
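The state cycle above can be sketched as a plain-Python state machine. This is a hedged illustration only: the state names follow the list above, but the distance threshold, tick counts, and class name are assumptions, not the repository's actual implementation.

```python
# FSM states, matching the cycle described above.
FORWARD, STOP, REVERSE, SCAN, TURN_LEFT, TURN_RIGHT = range(6)

class ObstacleAvoidanceFSM:
    """Decide the next state from the current state and LiDAR readings."""

    OBSTACLE_DIST = 0.6   # metres; assumed detection threshold
    REVERSE_TICKS = 5     # assumed number of control ticks to back up
    TURN_TICKS = 8        # assumed number of control ticks to rotate

    def __init__(self):
        self.state = FORWARD
        self.ticks = 0

    def step(self, front, left, right):
        """Advance one control tick given min distances (metres) per sector."""
        self.ticks += 1
        if self.state == FORWARD:
            if front < self.OBSTACLE_DIST:
                self.state, self.ticks = STOP, 0
        elif self.state == STOP:
            self.state, self.ticks = REVERSE, 0
        elif self.state == REVERSE:
            if self.ticks >= self.REVERSE_TICKS:
                self.state, self.ticks = SCAN, 0
        elif self.state == SCAN:
            self.state = TURN_LEFT if left > right else TURN_RIGHT
            self.ticks = 0
        elif self.state in (TURN_LEFT, TURN_RIGHT):
            if self.ticks >= self.TURN_TICKS:
                self.state, self.ticks = FORWARD, 0
        return self.state
```

In a real rclpy node, each `step()` call would map to one cmd_vel publication: positive linear velocity in FORWARD, zero in STOP, negative in REVERSE, and angular velocity in the turn states.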


🛠️ Tech Stack

  • ROS 2 Jazzy
  • Gazebo (gz-sim)
  • Python (rclpy)
  • URDF / Xacro
  • RViz2
  • rqt_graph

▶️ How to Run

```bash
# Build the workspace
colcon build
source install/setup.bash

# Launch the simulation
ros2 launch rover_bringup rover.launch.xml

# Run the obstacle avoidance node
ros2 run rover_control obstacle_avoidance.py
```

📜 License

This project is licensed under the MIT License.
See LICENSE.md for details.
