# swarm_robot
The Swarm Robot Project aims to develop a coordinated group of autonomous robots that can navigate through complex environments, avoid obstacles, and reach designated targets.

About this project
-----------------------------------------------------------------------------------------------------------------------------------------------------------------------

The project leverages advanced algorithms, such as the A* pathfinding algorithm, to ensure efficient and collision-free navigation. The robots are equipped with LiDAR sensors to detect obstacles and communicate with each other to maintain a cohesive formation.

### Objectives
- **Develop Autonomous Navigation**: Implement algorithms that enable robots to navigate autonomously through a predefined environment while avoiding obstacles.
- **Implement Swarm Coordination**: Ensure that robots can communicate and coordinate their movements to achieve a common goal without colliding with each other.
- **Enhance Obstacle Detection**: Utilize LiDAR sensors to detect and avoid obstacles in real time.
- **Simulate and Test**: Create a simulation environment to test the algorithms and behaviors of the swarm robots before deploying them in real-world scenarios.

### Key Features
- **Autonomous Navigation**: Each robot can independently navigate towards a target while avoiding obstacles using the A* pathfinding algorithm.
- **Swarm Coordination**: Robots communicate with each other to maintain a cohesive formation and avoid collisions.
- **LiDAR Sensing**: Robots are equipped with LiDAR sensors to detect obstacles and adjust their paths accordingly (a simulated scan is sketched after this list).
- **Simulation Environment**: A simulated environment is created using Pygame to test and visualize the behavior of the swarm robots.
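
The LiDAR sensing above can be approximated in the Pygame world by casting rays outward from a robot up to its sight range. The helper below is only an illustrative sketch: the function name `simulated_lidar_scan` and the one-pixel ray-marching step are assumptions, while the obstacles are the same `pygame.Rect` objects used elsewhere in the simulation.

```python
import numpy as np

def simulated_lidar_scan(robot_x, robot_y, sight_range, obstacles, num_beams=36):
    """Approximate a 2D LiDAR scan: march each beam outward until it hits an
    obstacle rectangle or reaches the robot's sight range (illustrative sketch)."""
    ranges = []
    for angle in np.linspace(0, 2 * np.pi, num_beams, endpoint=False):
        distance = sight_range
        for r in range(1, int(sight_range) + 1):
            px = robot_x + r * np.cos(angle)
            py = robot_y + r * np.sin(angle)
            if any(ob.collidepoint(px, py) for ob in obstacles):
                distance = r  # first hit along this beam
                break
        ranges.append((angle, distance))
    return ranges
```

On real hardware the same role is played by the physical LiDAR, with the scan typically arriving as a ROS `sensor_msgs/LaserScan` message instead of being ray-cast in software.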

### Technical Details
- **Programming Language**: Python
- **Libraries and Tools**: Pygame for simulation, ROS (Robot Operating System) for real-world deployment, and Docker for containerized development.
- **Algorithms**: A* pathfinding algorithm for navigation, collision avoidance algorithms for swarm coordination (see the sketch after this list).
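
For reference, here is a minimal, self-contained sketch of A* on a 4-connected grid with a Manhattan-distance heuristic. The names (`a_star_grid`, `blocked`) and the integer-grid representation are assumptions for illustration; the project's own `a_star` works directly on window coordinates and the `pygame.Rect` obstacles.

```python
import heapq
import itertools

def a_star_grid(start, goal, blocked, width, height):
    """Illustrative A* over a 4-connected grid of (x, y) integer cells.
    `blocked` is a set of cells that cannot be entered."""
    def h(p):  # Manhattan-distance heuristic (admissible on a 4-connected grid)
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    tie = itertools.count()  # tie-breaker so the heap never compares cells directly
    open_set = [(h(start), next(tie), 0, start, None)]
    parents = {}             # cell -> predecessor on the best known path
    best_g = {start: 0}

    while open_set:
        _, _, g, current, parent = heapq.heappop(open_set)
        if current in parents:
            continue         # already expanded with an equal or lower cost
        parents[current] = parent
        if current == goal:  # reconstruct the path back to the start
            path = []
            while current is not None:
                path.append(current)
                current = parents[current]
            return path[::-1]
        x, y = current
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nxt[0] < width and 0 <= nxt[1] < height and nxt not in blocked:
                if g + 1 < best_g.get(nxt, float('inf')):
                    best_g[nxt] = g + 1
                    heapq.heappush(open_set, (g + 1 + h(nxt), next(tie), g + 1, nxt, current))
    return []                # no path exists
```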

### Implementation Steps
1. **Set Up Development Environment**: Install necessary tools and libraries, including Pygame, ROS, and Docker.
2. **Develop Simulation**: Create a Pygame-based simulation environment to test the navigation and coordination algorithms.
3. **Implement Navigation Algorithms**: Develop and test the A* pathfinding algorithm for autonomous navigation.
4. **Integrate LiDAR Sensing**: Implement LiDAR sensors for obstacle detection and avoidance.
5. **Test Swarm Coordination**: Ensure that robots can communicate and coordinate their movements to avoid collisions (a minimal version of this rule is sketched after this list).
6. **Deploy to Real Robots**: Use ROS to deploy the algorithms to real robots and test in a controlled environment.
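
In the current code, swarm coordination reduces to a simple separation rule: a robot only moves to its next waypoint if no other robot is already within two radii of that point. A stand-alone sketch of that check might look like the following (the helper name `safe_to_advance` is an assumption; the robots only need `x`, `y`, and `radius` attributes).

```python
import numpy as np

def safe_to_advance(next_pos, this_robot, robots, separation_factor=2):
    """Return True if no other robot is closer than `separation_factor` radii
    to the waypoint this robot wants to move to (sketch of the coordination rule)."""
    for other in robots:
        if other is this_robot:
            continue
        gap = np.linalg.norm(np.array(next_pos, dtype=float) - np.array([other.x, other.y], dtype=float))
        if gap < this_robot.radius * separation_factor:
            return False  # waypoint is too close to another robot; wait this frame
    return True
```

The same condition appears inline inside `move_towards` in the ROS-enabled robot class shown later in this README.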

### Expected Outcomes
- A fully functional swarm of robots capable of navigating through complex environments and reaching designated targets without collisions.
- A robust simulation environment for testing and visualizing the behavior of the swarm robots.
- Successful deployment of the swarm robots in real-world scenarios using ROS.

### Future Work
- **Enhance Communication**: Improve the communication protocols between robots for better coordination.
- **Expand Capabilities**: Add more advanced sensors and algorithms to enhance the capabilities of the swarm robots.
- **Real-World Applications**: Explore real-world applications such as search and rescue, environmental monitoring, and industrial automation.

This project aims to push the boundaries of autonomous robotics and demonstrate the potential of swarm intelligence in solving complex navigation and coordination problems.

How to Start
-----------------------------------------------------------------------------------------------------------------------------------------------------------------------

Implementing the code on real robots involves several key steps, including selecting the appropriate hardware, configuring the software environment, and ensuring the robots can execute the commands generated by the code. Here’s a detailed guide to help you transition from simulation to real-world deployment:

### 1. Choose Your Hardware
Select suitable robots for your project. Here are a few popular options:
- **TurtleBot3**: An affordable, customizable, and widely used research robot with ROS support.
- **Raspberry Pi**: Can be used with motor drivers and sensors to create custom robots.
- **Arduino**: Another option for custom robot builds with various motor and sensor shields.

### 2. Install and Set Up ROS
If you choose a robot compatible with ROS (Robot Operating System), you'll need to set up ROS on your computer and the robot. Here are the general steps:

#### On Your Computer:
1. **Install ROS**: Follow the [ROS installation guide](http://wiki.ros.org/ROS/Installation) for your operating system.
2. **Set Up a Catkin Workspace**:
```bash
mkdir -p ~/catkin_ws/src
cd ~/catkin_ws/
catkin_make
source devel/setup.bash
```

#### On the Robot:
1. **Install ROS**: If using a Raspberry Pi, follow the [ROS installation guide for Raspberry Pi](http://wiki.ros.org/ROS/Installation/UbuntuARM).
2. **Network Setup**: Ensure your computer and the robot are on the same network, and set `ROS_MASTER_URI` (plus `ROS_IP` or `ROS_HOSTNAME`) on both machines so the robot's nodes can reach the ROS master.

### 3. Develop and Deploy Code
Modify the code to publish ROS messages for robot control. Here’s an example using the TurtleBot3:

#### Install ROS Packages:
```bash
sudo apt-get update
sudo apt-get install ros-noetic-turtlebot3 ros-noetic-turtlebot3-simulations
```

#### Update the Code:
Modify the robot class to publish velocity commands to the `/cmd_vel` topic:

```python
import rospy
import numpy as np
from geometry_msgs.msg import Twist

class Robot:
    def __init__(self, x, y, radius=10, sight_range=100, name='robot'):
        self.x = x
        self.y = y
        self.radius = radius
        self.sight_range = sight_range
        self.color = BLUE          # colour constant defined in the existing simulation code
        self.speed = 2
        self.reached_goal = False
        self.name = name

        # ROS publisher for this robot's velocity commands
        self.cmd_vel_pub = rospy.Publisher(f'/{self.name}/cmd_vel', Twist, queue_size=10)

    def send_cmd_vel(self, linear_x, angular_z):
        cmd = Twist()
        cmd.linear.x = linear_x
        cmd.angular.z = angular_z
        self.cmd_vel_pub.publish(cmd)

    def move_towards(self, path, robots):
        if path and not self.reached_goal:
            next_pos = path[0]
            # Only advance if no other robot is within two radii of the next waypoint
            if not any(np.linalg.norm(np.array([next_pos[0], next_pos[1]]) - np.array([robot.x, robot.y])) < self.radius * 2
                       for robot in robots if robot != self):
                self.x, self.y = path.pop(0)
                linear_x = self.speed
                angular_z = 0  # placeholder; adjust as needed
                self.send_cmd_vel(linear_x, angular_z)
            if len(path) == 0:
                self.reached_goal = True
```
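
Note that this keeps the waypoint-following logic from the simulation and simply mirrors each step as a `Twist` message. On a real robot such as the TurtleBot3, `angular_z` should not stay at zero: it would need to be computed from the robot's current heading toward the next waypoint (for example from odometry), otherwise the robot will only drive straight.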

#### Main Function:
```python
# Assumes the helpers and constants from the existing simulation code are in scope:
# create_robots, create_obstacles, a_star, screen, WIDTH, HEIGHT, and the colour constants.
def main():
    rospy.init_node('swarm_robot_navigation')
    num_robots = 3
    sight_range = 100
    robots = create_robots(num_robots, sight_range)
    target = (WIDTH // 2, HEIGHT // 2)
    obstacles = create_obstacles(5)
    paths = [[] for _ in range(num_robots)]
    running = True
    status = "Waiting for target selection"

    while running:
        for event in pygame.event.get():
            if event.type == pygame.QUIT:
                running = False
            if event.type == pygame.MOUSEBUTTONDOWN:
                # Clicking sets a new target and replans a path for every robot
                target = event.pos
                status = "Path calculating"
                paths = [a_star((robot.x, robot.y), target, obstacles) for robot in robots]
                status = "Calibration"
            if event.type == pygame.KEYDOWN:
                if event.key == pygame.K_o:
                    # 'o' drops a random rectangular obstacle into the world
                    obstacles.append(pygame.Rect(np.random.randint(50, WIDTH - 50),
                                                 np.random.randint(50, HEIGHT - 50),
                                                 np.random.randint(20, 50),
                                                 np.random.randint(20, 50)))
                if event.key == pygame.K_UP:
                    num_robots += 1
                    robots = create_robots(num_robots, sight_range)
                    paths = [[] for _ in range(num_robots)]
                if event.key == pygame.K_DOWN:
                    num_robots = max(1, num_robots - 1)
                    robots = create_robots(num_robots, sight_range)
                    paths = [[] for _ in range(num_robots)]

        screen.fill(WHITE)

        for obstacle in obstacles:
            pygame.draw.rect(screen, GRAY, obstacle)

        all_reached_goal = True
        for i, robot in enumerate(robots):
            if not robot.reached_goal:
                all_reached_goal = False
                robot.move_towards(paths[i], robots)
            robot.draw(screen, obstacles)

        if all_reached_goal:
            status = "Goal reached successfully"

        pygame.draw.circle(screen, RED, target, 5)

        font = pygame.font.Font(None, 36)
        text_surface = font.render(status, True, BLACK)
        screen.blit(text_surface, (10, 10))

        pygame.display.flip()
        pygame.time.delay(30)

    pygame.quit()


if __name__ == "__main__":
    main()
```

### 4. Run Your Code
- **Start ROS Master**:
```bash
roscore
```

- **Run the Script**:
```bash
python your_script.py
```

### 5. Test and Debug
- **Monitor Robot Behavior**: Use ROS tools like `rqt_graph` and `rviz` to visualize and debug the robots' behavior (a small monitoring node is sketched below).
- **Adjust Parameters**: Fine-tune parameters such as speed and LiDAR range based on the robots' performance.
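
Alongside `rqt_graph` and `rviz`, a tiny subscriber node can confirm that velocity commands are actually being published. The snippet below is a sketch that assumes the default robot name `robot` from the class above; change the topic for each robot you want to watch.

```python
import rospy
from geometry_msgs.msg import Twist

def log_cmd(msg):
    # Log every command so you can verify the swarm code is publishing what you expect
    rospy.loginfo("cmd_vel: linear.x=%.2f angular.z=%.2f", msg.linear.x, msg.angular.z)

if __name__ == "__main__":
    rospy.init_node('cmd_vel_monitor')
    # Topic name assumes the default name='robot' used by the Robot class above
    rospy.Subscriber('/robot/cmd_vel', Twist, log_cmd)
    rospy.spin()
```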

This setup involves using ROS to send commands to real robots, leveraging the existing simulation code. Transitioning from simulation to real-world deployment requires careful testing and validation, considering the physical robot's capabilities and limitations. If you need further assistance, feel free to ask!

Licence (MIT)
-----------------------------------------------------------------------------------------------------------------------------------------------------------------------

Contact: [email protected]
-