This workshop is designed for engineers, computer scientists, and passionate hobbyists in Lahore who have a foundational understanding of Python programming and basic robotics concepts (e.g., controlling a simple robot with Arduino). We will move beyond basic movement and obstacle avoidance to tackle more sophisticated robotic behaviors and interactions, leveraging Python's powerful ecosystem.
Prerequisites:
Strong Python fundamentals: Variables, data types, control flow (if/else, loops), functions, classes, and object-oriented programming (OOP) concepts.
Basic robotics understanding: Familiarity with microcontrollers (like Arduino), DC motors, motor drivers, and basic sensors (ultrasonic, IR).
Linux familiarity (Ubuntu strongly recommended): Comfort with the command line, file system navigation, and package management.
Laptop with Ubuntu (20.04 or 22.04 recommended): At least 8 GB RAM and 25 GB of free disk space. Dual-booting or a virtual machine is acceptable.
By the end of this intensive workshop, participants will be able to:
Work proficiently with the Robot Operating System (ROS) using Python (ROS 1 or ROS 2).
Integrate various sensors (LiDAR, Camera) for advanced perception.
Implement fundamental algorithms for robot localization and mapping (SLAM).
Develop intelligent navigation strategies using ROS Navigation Stack.
Control robotic manipulators (arms) for pick-and-place tasks.
Utilize Python libraries for computer vision and basic AI in robotics.
Debug complex robotic systems effectively using ROS tools.
While many concepts can be practiced in simulation, hands-on experience with physical hardware is invaluable.
Mobile Robot Platform:
TurtleBot3 (Burger/Waffle Pi): Highly recommended due to its direct ROS integration, well-documented APIs, and active community. (Available through local suppliers or online).
Custom Mobile Robot: A 2WD/4WD chassis with encoders, IMU, and a Raspberry Pi (or similar SBC) as the main controller.
Sensors:
LiDAR: RPLIDAR A1/A2 or similar for 2D laser scans.
USB Camera: Standard webcam or Raspberry Pi Camera Module.
IMU: MPU6050 or similar for orientation data (if building a custom robot).
Optional:
Small Robotic Arm: (e.g., myCobot, Robotic Arm Kit with Servo Motors, or even a simulated arm in Gazebo/RViz).
Raspberry Pi 4 / NVIDIA Jetson Nano: For on-board processing if not using a TurtleBot3.
This workshop is structured around core robotics challenges, using Python and ROS as the primary tools.
Module 1: Deep Dive into ROS with Python (ROS 1 & Introduction to ROS 2)
Review of ROS Fundamentals (Nodes, Topics, Messages, Services, Parameters): How Python interacts with these concepts.
Writing Advanced Python Nodes:
Implementing custom message types (.msg files) and service types (.srv files) in Python.
Using rospy.Subscriber and rospy.Publisher effectively, including message filters.
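A minimal sketch of time-synchronized subscriptions with the message_filters package mentioned above (the topic names are assumptions; adjust for your robot):
#!/usr/bin/env python3
import rospy
import message_filters
from sensor_msgs.msg import Image, LaserScan

def synced_callback(image_msg, scan_msg):
    # Called only when an Image and a LaserScan arrive close together in time
    rospy.loginfo(f"Synced pair: image stamp {image_msg.header.stamp.to_sec():.3f}, "
                  f"scan stamp {scan_msg.header.stamp.to_sec():.3f}")

if __name__ == '__main__':
    rospy.init_node('synced_listener')
    image_sub = message_filters.Subscriber('/camera/image_raw', Image)
    scan_sub = message_filters.Subscriber('/scan', LaserScan)
    # Pair messages whose timestamps differ by at most 0.1 s
    sync = message_filters.ApproximateTimeSynchronizer([image_sub, scan_sub],
                                                       queue_size=10, slop=0.1)
    sync.registerCallback(synced_callback)
    rospy.spin()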
Practical Example: Building a "Teleoperation Node" that uses keyboard input to publish geometry_msgs/Twist commands to a simulated or physical robot, and a "Robot Status Monitor" that subscribes to /odom (odometry) and /cmd_vel topics to display robot state in the terminal.
Code Snippet (Teleop Node - teleop_key.py):
#!/usr/bin/env python3
import rospy
from geometry_msgs.msg import Twist
import sys, select, tty, termios

def get_key(settings):
    """Read a single keypress, non-blocking with a 0.1 s timeout."""
    tty.setraw(sys.stdin.fileno())
    rlist, _, _ = select.select([sys.stdin], [], [], 0.1)
    key = sys.stdin.read(1) if rlist else ''
    # Restore the terminal to its original (cooked) mode after each read
    termios.tcsetattr(sys.stdin, termios.TCSADRAIN, settings)
    return key

def teleop_robot(settings):
    rospy.init_node('teleop_robot')
    pub = rospy.Publisher('/cmd_vel', Twist, queue_size=10)
    twist = Twist()
    linear_speed = 0.5   # meters/second
    angular_speed = 1.0  # radians/second
    print("Use WASD keys to move the robot:")
    print(" W: Forward")
    print(" S: Backward")
    print(" A: Turn Left")
    print(" D: Turn Right")
    print(" Space: Stop")
    print(" Ctrl+C to quit")
    while not rospy.is_shutdown():
        key = get_key(settings)
        if key == 'w':
            twist.linear.x = linear_speed
            twist.angular.z = 0.0
        elif key == 's':
            twist.linear.x = -linear_speed
            twist.angular.z = 0.0
        elif key == 'a':
            twist.linear.x = 0.0
            twist.angular.z = angular_speed
        elif key == 'd':
            twist.linear.x = 0.0
            twist.angular.z = -angular_speed
        elif key == ' ':        # Spacebar to stop
            twist.linear.x = 0.0
            twist.angular.z = 0.0
        elif key == '\x03':     # Ctrl+C (delivered as a character in raw mode)
            break
        else:                   # No input: command zero velocity
            twist.linear.x = 0.0
            twist.angular.z = 0.0
        pub.publish(twist)
        rospy.sleep(0.01)       # Small delay for smoother operation
    # Publish a zero Twist on the existing (already connected) publisher
    # so the robot stops when the loop exits
    pub.publish(Twist())

if __name__ == '__main__':
    settings = termios.tcgetattr(sys.stdin)
    try:
        teleop_robot(settings)
    except rospy.ROSInterruptException:
        pass
    finally:
        termios.tcsetattr(sys.stdin, termios.TCSADRAIN, settings)
Usage:
roscore in Terminal 1.
Launch your simulated robot (e.g., TurtleBot3 in Gazebo) or connect to your physical robot.
rosrun your_package_name teleop_key.py in Terminal 2.
Use WASD keys to control the robot.
ROS Launch Files: Organizing multiple nodes and parameters (.launch).
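A minimal .launch file sketch (the package name is a placeholder and the second node is hypothetical; the parameter is shown only to illustrate the syntax):
<launch>
  <!-- Start the teleop node from the snippet above -->
  <node pkg="your_package_name" type="teleop_key.py" name="teleop_robot" output="screen"/>
  <!-- A hypothetical status-monitor node, with a private parameter for illustration -->
  <node pkg="your_package_name" type="robot_status_monitor.py" name="robot_status_monitor" output="screen">
    <param name="update_rate" value="10"/>
  </node>
</launch>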
rosbag: Recording and replaying sensor data for debugging and algorithm testing.
TF (Transformations): Understanding coordinate frames and using the tf library in Python for robust robot kinematics.
Practical Example: Publishing a custom static transform for a sensor mounted on the robot or querying transforms between existing frames.
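A minimal sketch of querying a transform with a tf listener (the frame names base_link and base_scan are assumptions based on TurtleBot3; adjust for your robot):
#!/usr/bin/env python3
import rospy
import tf

if __name__ == '__main__':
    rospy.init_node('tf_query_example')
    listener = tf.TransformListener()
    rate = rospy.Rate(1.0)
    while not rospy.is_shutdown():
        try:
            # Pose of the laser frame expressed in the base_link frame
            (trans, rot) = listener.lookupTransform('base_link', 'base_scan', rospy.Time(0))
            rospy.loginfo(f"Laser offset: x={trans[0]:.3f}, y={trans[1]:.3f}, z={trans[2]:.3f}")
        except (tf.LookupException, tf.ConnectivityException, tf.ExtrapolationException):
            rospy.logwarn("Transform not yet available")
        rate.sleep()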
Introduction to ROS 2 (DDS, rclpy): Highlighting differences and advantages for future development.
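For contrast, a minimal ROS 2 publisher written with rclpy (a generic sketch, not tied to any particular robot):
#!/usr/bin/env python3
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Twist

class CmdVelPublisher(Node):
    def __init__(self):
        super().__init__('cmd_vel_publisher')
        self.pub = self.create_publisher(Twist, '/cmd_vel', 10)
        # A timer callback replaces rospy's manual while-loop + sleep pattern
        self.timer = self.create_timer(0.1, self.tick)

    def tick(self):
        msg = Twist()
        msg.linear.x = 0.2  # Creep forward slowly (illustrative value)
        self.pub.publish(msg)

def main():
    rclpy.init()
    node = CmdVelPublisher()
    try:
        rclpy.spin(node)
    finally:
        node.destroy_node()
        rclpy.shutdown()

if __name__ == '__main__':
    main()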
Module 2: Robot Perception - Seeing the World with Python
LiDAR Data Processing (sensor_msgs/LaserScan):
Filtering noisy data.
Detecting obstacles and free space.
Practical Example: A Python node that subscribes to the /scan topic, processes the LiDAR data to find the nearest obstacle, and prints its distance and bearing.
Code Snippet (LiDAR Nearest Obstacle Detector - laser_detector.py):
#!/usr/bin/env python3
import rospy
from sensor_msgs.msg import LaserScan
import numpy as np

def laser_callback(data):
    # data.ranges is a list of distances (in meters);
    # data.angle_min, data.angle_max, data.angle_increment define the scan angles
    # Convert ranges to a NumPy array for easier processing
    ranges = np.array(data.ranges)
    # Replace infinite (no detection) and NaN readings with +inf instead of
    # removing them, so indices still line up with the original scan angles
    ranges[~np.isfinite(ranges)] = np.inf
    if np.isfinite(ranges).any():
        min_idx = int(np.argmin(ranges))
        min_distance = ranges[min_idx]
        # Calculate the angle of the nearest obstacle
        angle_at_min = data.angle_min + min_idx * data.angle_increment
        # Convert to degrees for readability
        angle_at_min_deg = np.degrees(angle_at_min)
        rospy.loginfo(f"Nearest obstacle at: {min_distance:.2f} m, Angle: {angle_at_min_deg:.2f} degrees")
    else:
        rospy.loginfo("No valid laser readings.")

def laser_listener():
    rospy.init_node('laser_detector', anonymous=True)
    rospy.Subscriber('/scan', LaserScan, laser_callback)  # Adjust topic name if different
    rospy.spin()  # Keep the node running

if __name__ == '__main__':
    try:
        laser_listener()
    except rospy.ROSInterruptException:
        pass
Usage:
roscore
Launch robot simulation with LiDAR (e.g., roslaunch turtlebot3_gazebo turtlebot3_world.launch)
rosrun your_package_name laser_detector.py
Camera Integration with OpenCV and cv_bridge:
Converting ROS image messages to OpenCV images and vice-versa.
Basic image processing: color detection, edge detection.
Practical Example: A Python node that subscribes to a camera topic, converts the image, detects a specific color (e.g., green), and draws a bounding box around it.
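A minimal sketch of such a node (the HSV bounds for "green" are rough assumptions to be tuned per camera, and the topic name may differ on your setup):
#!/usr/bin/env python3
import rospy
import cv2
import numpy as np
from sensor_msgs.msg import Image
from cv_bridge import CvBridge, CvBridgeError

bridge = CvBridge()

def image_callback(msg):
    try:
        frame = bridge.imgmsg_to_cv2(msg, desired_encoding='bgr8')
    except CvBridgeError as e:
        rospy.logerr(e)
        return
    # Threshold "green" in HSV space (bounds are rough assumptions)
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array([40, 70, 70]), np.array([80, 255, 255]))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        # Draw a bounding box around the largest green blob
        x, y, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow('green_detector', frame)
    cv2.waitKey(1)

if __name__ == '__main__':
    rospy.init_node('green_detector')
    rospy.Subscriber('/camera/image_raw', Image, image_callback)  # Adjust topic name
    rospy.spin()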
Module 3: Robot Localization & Mapping (SLAM) with Python
Odometry (from wheel encoders and/or an IMU): Understanding and processing nav_msgs/Odometry messages; a minimal subscriber sketch follows.
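A minimal sketch of such a subscriber, extracting the planar pose (x, y, yaw) from /odom:
#!/usr/bin/env python3
import rospy
from nav_msgs.msg import Odometry
from tf.transformations import euler_from_quaternion

def odom_callback(msg):
    p = msg.pose.pose.position
    q = msg.pose.pose.orientation
    # Orientation arrives as a quaternion; extract yaw for a planar robot
    _, _, yaw = euler_from_quaternion([q.x, q.y, q.z, q.w])
    rospy.loginfo(f"x={p.x:.2f} m, y={p.y:.2f} m, yaw={yaw:.2f} rad")

if __name__ == '__main__':
    rospy.init_node('odom_listener')
    rospy.Subscriber('/odom', Odometry, odom_callback)
    rospy.spin()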
Introduction to SLAM: Simultaneous Localization and Mapping concepts.
Using ROS SLAM Packages (e.g., GMapping, Cartographer):
Configuring and running SLAM algorithms.
Visualizing maps in RViz.
Practical Example: Running GMapping on a simulated TurtleBot3 and then saving the generated map using rosrun map_server map_saver. (Primarily configuration-based, but understanding the Python-ROS interfaces is key).
Particle Filter/Kalman Filter (Conceptual, Python implementation focus): Brief overview of underlying principles and how they are used in localization.
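To make the filtering intuition concrete, a minimal 1-D Kalman filter sketch in plain Python (not a ROS node; the noise constants are illustrative):
import numpy as np

def kalman_1d(z_measurements, process_var=1e-3, meas_var=0.1):
    """Minimal 1-D Kalman filter: smooth a noisy scalar reading."""
    x, p = 0.0, 1.0               # State estimate and its variance
    estimates = []
    for z in z_measurements:
        p = p + process_var       # Predict: uncertainty grows over time
        k = p / (p + meas_var)    # Kalman gain: weight measurement vs. model
        x = x + k * (z - x)       # Update estimate toward the measurement
        p = (1 - k) * p           # Updated (reduced) uncertainty
        estimates.append(x)
    return estimates

# Noisy readings around a true distance of 1.0 m
readings = 1.0 + 0.1 * np.random.randn(50)
print(kalman_1d(readings)[-1])  # Converges near 1.0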
Module 4: Autonomous Navigation with ROS Navigation Stack
Overview of the Navigation Stack: Components like move_base, amcl, global/local planners.
Setting up move_base for your robot.
Sending Navigation Goals: Using geometry_msgs/PoseStamped messages.
Practical Example: Writing a Python node that sends a sequence of navigation goals to a robot in a mapped environment.
Code Snippet (Goal Sender - simple_navigator.py):
#!/usr/bin/env python3
import math
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal
from actionlib_msgs.msg import GoalStatus
from tf.transformations import quaternion_from_euler  # For converting yaw to quaternion

def simple_navigation_client():
    rospy.init_node('simple_navigation_client')
    # Create an action client for the "move_base" action server
    client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
    # Wait for the action server to come up
    rospy.loginfo("Waiting for move_base action server...")
    client.wait_for_server()
    rospy.loginfo("move_base action server connected!")
    # Define goals (x, y, yaw in degrees)
    goals = [
        (1.0, 0.0, 0.0),    # Go to (1,0) facing the +X axis
        (1.0, 1.0, 90.0),   # Go to (1,1) facing the +Y axis
        (0.0, 1.0, 180.0),  # Go to (0,1) facing the -X axis
    ]
    for i, (x, y, yaw_deg) in enumerate(goals):
        goal = MoveBaseGoal()
        goal.target_pose.header.frame_id = "map"  # Or "odom" if mapping is not used
        goal.target_pose.header.stamp = rospy.Time.now()
        # Set position
        goal.target_pose.pose.position.x = x
        goal.target_pose.pose.position.y = y
        goal.target_pose.pose.position.z = 0.0
        # Set orientation (convert yaw degrees to a quaternion)
        quat = quaternion_from_euler(0, 0, math.radians(yaw_deg))
        goal.target_pose.pose.orientation.x = quat[0]
        goal.target_pose.pose.orientation.y = quat[1]
        goal.target_pose.pose.orientation.z = quat[2]
        goal.target_pose.pose.orientation.w = quat[3]
        rospy.loginfo(f"Sending Goal {i+1}: ({x}, {y}, {yaw_deg} deg)")
        client.send_goal(goal)
        # Wait for the result
        if not client.wait_for_result():
            rospy.logerr("Action server not available or goal was preempted.")
            rospy.signal_shutdown("Action server not available or goal was preempted.")
            return
        if client.get_state() == GoalStatus.SUCCEEDED:
            rospy.loginfo(f"Goal {i+1} reached successfully!")
        else:
            rospy.logwarn(f"Goal {i+1} failed: {client.get_goal_status_text()}")
        rospy.sleep(1)  # Pause between goals
    rospy.loginfo("All goals completed!")

if __name__ == '__main__':
    try:
        simple_navigation_client()
    except rospy.ROSInterruptException:
        pass
Usage:
roscore
Launch Gazebo with your robot in a known map (e.g., roslaunch turtlebot3_navigation turtlebot3_navigation.launch map_file:=$(rospack find turtlebot3_navigation)/maps/turtlebot3_world.yaml)
In RViz, set the initial pose of the robot accurately using the "2D Pose Estimate" tool.
rosrun your_package_name simple_navigator.py
Exploration strategies: Basic frontier exploration or random walks.
Module 5: Robotic Manipulation (Optional / Advanced)
Introduction to Robotic Arms: Kinematics (Forward/Inverse), joint states.
ROS joint_state_publisher and robot_state_publisher.
Using MoveIt! for Motion Planning:
Python interface for MoveIt!.
Planning paths for robotic arms.
Practical Example: A Python script that uses MoveIt! to plan and execute a simple pick-and-place operation with a simulated robotic arm in Gazebo (e.g., picking up a cube and placing it elsewhere). This involves defining target poses for the end-effector.
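A minimal MoveIt! Python sketch of moving the end-effector to a target pose (the "arm" group name and pose values are assumptions that must match your robot's MoveIt! configuration):
#!/usr/bin/env python3
import sys
import rospy
import moveit_commander
from geometry_msgs.msg import Pose

if __name__ == '__main__':
    moveit_commander.roscpp_initialize(sys.argv)
    rospy.init_node('simple_pick_place')
    # "arm" must match a planning group defined in your MoveIt! config
    arm = moveit_commander.MoveGroupCommander("arm")
    target = Pose()
    target.position.x = 0.2   # Illustrative reach pose; adjust for your arm
    target.position.y = 0.0
    target.position.z = 0.15
    target.orientation.w = 1.0
    arm.set_pose_target(target)
    success = arm.go(wait=True)  # Plan and execute in one call
    arm.stop()                   # Ensure no residual movement
    arm.clear_pose_targets()
    rospy.loginfo(f"Reached target: {success}")
    moveit_commander.roscpp_shutdown()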
Module 6: Advanced Topics & Project Work
Behavior Trees / State Machines: Designing complex robot behaviors.
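A minimal state-machine sketch using the smach package (the states and transitions are hypothetical placeholders for real patrol/avoid logic):
#!/usr/bin/env python3
import rospy
import smach

class Patrol(smach.State):
    def __init__(self):
        smach.State.__init__(self, outcomes=['obstacle_seen', 'done'])
    def execute(self, userdata):
        rospy.loginfo("Patrolling...")
        # Real logic would drive waypoints and watch sensors
        return 'obstacle_seen'

class Avoid(smach.State):
    def __init__(self):
        smach.State.__init__(self, outcomes=['clear'])
    def execute(self, userdata):
        rospy.loginfo("Avoiding obstacle...")
        return 'clear'

if __name__ == '__main__':
    rospy.init_node('patrol_sm')
    sm = smach.StateMachine(outcomes=['finished'])
    with sm:
        smach.StateMachine.add('PATROL', Patrol(),
                               transitions={'obstacle_seen': 'AVOID', 'done': 'finished'})
        smach.StateMachine.add('AVOID', Avoid(),
                               transitions={'clear': 'finished'})
    outcome = sm.execute()
    rospy.loginfo(f"State machine finished with outcome: {outcome}")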
Introduction to AI/ML in Robotics:
Using Python libraries like scikit-learn for simple decision-making or OpenCV for more advanced vision.
Conceptual Example: A node that uses a trained classifier (e.g., SVM from scikit-learn) to identify objects based on simple features extracted from camera data and then triggers a specific robot action.
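A minimal sketch of the classification step with scikit-learn (the feature values, labels, and action mapping are toy data for illustration):
import numpy as np
from sklearn.svm import SVC

# Toy features per detected blob: [area_fraction, aspect_ratio, mean_hue]
# In practice these would come from the OpenCV pipeline in Module 2
X_train = np.array([
    [0.10, 1.0, 60],   # cube-like, green
    [0.12, 1.1, 55],
    [0.05, 3.0, 10],   # bar-like, red
    [0.06, 2.8, 12],
])
y_train = ['cube', 'cube', 'bar', 'bar']

clf = SVC(kernel='rbf', gamma='scale')
clf.fit(X_train, y_train)

# A new detection: classify it, then map the label to a robot action
features = np.array([[0.11, 1.05, 58]])
label = clf.predict(features)[0]
action = {'cube': 'pick_up', 'bar': 'push_aside'}[label]
print(f"Detected {label} -> action: {action}")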
ROS Diagnostics and Debugging Tools: rqt_graph, rqt_plot, rviz, rosconsole, roslint.
Containerization (Docker/Singularity for ROS): For deploying reproducible robot environments.
Final Project Session: Participants work on a mini-project integrating multiple concepts learned, such as:
Autonomous patrolling with obstacle avoidance and map updates.
Simple object tracking and following with a mobile robot.
(If arm available) A basic pick-and-place task with object detection.
While workshops titled exactly "Advanced Robotics with Python" may be hard to find, several institutes in Lahore offer relevant courses that can build up to this level; you would likely combine Python expertise with robotics fundamentals.
How to Find the Right Fit:
Contact Them Directly: Call or visit these institutes and ask specific questions about their advanced robotics curriculum.
Ask for Course Outlines: Request detailed course outlines that specify Python, ROS versions (ROS 1 vs. ROS 2), and the depth of topics like SLAM, Navigation, and Manipulation.
Inquire About Instructors: Ask about the instructors' experience in practical robotics and their proficiency with Python and ROS.
See the Lab/Hardware: If possible, visit their labs to see the hardware (e.g., TurtleBots, robotic arms, LiDARs) they use for hands-on sessions.
Check for Project-Based Learning: For advanced topics, hands-on projects are critical. Ensure the course emphasizes practical application.
Self-Study & Online Resources (Complementary):
Even if you find a good local course, supplementing with online resources is highly beneficial:
The Construct (Robot Ignite Academy): Excellent online courses specifically for ROS with Python.
ROS Wiki & ROS 2 Documentation: The official documentation is vast and invaluable.
YouTube Channels: The Construct, AutomaticAddison, Articulated Robotics, and various university lectures.
OpenCV Documentation & Tutorials: For computer vision aspects.
tf package documentation: For understanding coordinate transformations.
GitHub: Explore open-source ROS projects and Python robotics code.
Embarking on advanced robotics programming with Python is a rewarding journey. Lahore has growing tech and educational sectors, so you should be able to find the resources you need to build truly smart and autonomous robots! Good luck!