SOFTWARE ARCHITECTURE OF A MOBILE ROBOT
Software architecture refers both to the high-level structure of a system and to the process of ensuring that this structure, or design, meets specific needs. For mobile robotics, typical requirements include real-time capabilities, asynchronous data processing, and distributed functionality. While there is a clear conceptual distinction between the design of a software architecture suitable for robotics and a particular reference implementation of that design, in practice, due to the complexity of the task, robotics frameworks often ship with a single reference implementation. Therefore, when comparing and choosing a software architecture, it is prudent to consider not only the design but also the suitability of the implementation.
Architectural Patterns in Robotics
1. Blackboard
The Blackboard architecture is inspired by problem-solving in a shared collaborative space — akin to how multiple specialists might write and read ideas on a physical blackboard to solve a complex problem. In robotics, this metaphor applies to a central shared memory or coordination space where different software modules can post, read, or listen to data entries (tuples).
Tuple spaces are a formal implementation of this architecture and enable asynchronous communication where processes don't need to know about each other directly.
This is useful in heterogeneous or multi-robot systems, where independent modules — such as sensor fusion, SLAM, motion planning — may need to interact or contribute to a global understanding.
Systems like ROS (Robot Operating System) and CARMEN exhibit this behaviour through mechanisms such as the parameter server, which maintains central storage of shared values like robot configurations, sensor calibration data, and system-wide settings (a small example follows below).
This method enhances modularity, flexibility, and reusability but can become a bottleneck or a single point of failure in highly distributed systems if not carefully managed.
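As an illustration, a ROS 2 node can treat the parameter server as a small blackboard. A minimal rclpy sketch, where the node and parameter names are hypothetical:

```python
# Minimal sketch of the parameter-server idea in ROS 2 (rclpy).
# The node name and the 'wheel_radius' parameter are hypothetical.
import rclpy
from rclpy.node import Node

class ConfigReader(Node):
    def __init__(self):
        super().__init__('config_reader')
        # Declare a shared configuration value with a default; other tools
        # (launch files, `ros2 param set`) can override it at runtime.
        self.declare_parameter('wheel_radius', 0.05)
        radius = self.get_parameter('wheel_radius').value
        self.get_logger().info(f'Using wheel radius: {radius} m')

def main():
    rclpy.init()
    rclpy.spin(ConfigReader())
    rclpy.shutdown()

if __name__ == '__main__':
    main()
```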
2. Client-Server
In the Client-Server model, distinct software entities are divided into servers (which provide resources or services) and clients (which request and consume them).
This is highly appropriate for centralized tasks such as:
Fleet coordination
Centralized task assignment
Global data aggregation (e.g., cloud mapping)
Communication is request-response, initiated by the client, and thus inherently synchronous.
ROS services (rosservice) are examples of this, where a node provides a callable service for another node to use.
Limitations:
The model is less suitable for real-time sensor data streaming due to the lack of asynchronous communication.
The Player/Stage framework suffers from these limitations when scaling to multi-robot systems.
Nevertheless, this model is useful for control hierarchies, configuration updates, and coordination algorithms that benefit from centralization.
Client/server communication takes place over TCP socket I/O channels. On the server side, abstract operations are defined, which allows specific implementations for different robotic platforms. The client side establishes the connection with the server; once the connection is up, the application sends high-level messages in human-readable form through the client-side interface. “ROTATE 90” and “MOVE 20”, for instance, could be instructions for making the robot turn 90° to the right and move 20 centimeters straight ahead, respectively.
3. Component-Based Software Engineering
A key concept utilized in most mobile robot software architectures is modularisation: i.e., splitting functionality into separate loosely coupled components. Modules are directly interchangeable, given that they expose the same interfaces and, of course, provide the same functionality. This approach treats robot software as a system of loosely coupled, highly cohesive modules, often referred to as components or nodes.
Each component is interchangeable if it adheres to the same input/output interface. For instance, replacing an A* planner with an RRT planner is straightforward if both accept the same goal, map, and robot constraints.
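A minimal Python sketch of this idea, with two planners behind a common interface (all names hypothetical):

```python
# Hypothetical sketch: two planners behind one interface, so either can be
# swapped in without touching the rest of the system.
from abc import ABC, abstractmethod

class PathPlanner(ABC):
    @abstractmethod
    def plan(self, start, goal, grid_map):
        """Return a list of (x, y) waypoints from start to goal."""

class AStarPlanner(PathPlanner):
    def plan(self, start, goal, grid_map):
        ...  # graph search over the occupancy grid

class RRTPlanner(PathPlanner):
    def plan(self, start, goal, grid_map):
        ...  # sampling-based tree expansion

def navigate(planner: PathPlanner, start, goal, grid_map):
    # The caller depends only on the shared interface, not on the algorithm.
    return planner.plan(start, goal, grid_map)
```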
Framework-specific examples:
ROS: Node
Player: Driver
YARP: Device
Each node is a separate process (or executable) that runs independently (a minimal node sketch follows the table below).
Nodes communicate with each other using ROS 2’s middleware (DDS) via:
Topics (for pub-sub messaging)
Services (for request-response)
Actions (for long-running tasks with feedback)
Parameters (for configuration)
| Node Name | Function | Topic/Service Used |
|---|---|---|
| rplidar_node | Publishes LiDAR scan data | Publishes: /scan |
| ekf_filter_node | Combines sensor data for localization | Subscribes: /wheel_odom, /imu |
| controller_server | Controls motor speed based on path | Subscribes: /path; Publishes: /cmd_vel |
| slam_toolbox_node | Builds map while localizing | Publishes: /map |
| teleop_node | Sends manual control commands | Publishes: /cmd_vel |
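To make this concrete, here is a minimal rclpy sketch of a standalone node that publishes on a topic, in the style of the nodes in the table above; the node and topic names are illustrative:

```python
# Minimal ROS 2 node (rclpy): a standalone process publishing on a topic.
import rclpy
from rclpy.node import Node
from std_msgs.msg import String

class HeartbeatNode(Node):
    def __init__(self):
        super().__init__('heartbeat_node')
        self.pub = self.create_publisher(String, '/heartbeat', 10)
        self.timer = self.create_timer(1.0, self.tick)  # fire once per second

    def tick(self):
        msg = String()
        msg.data = 'alive'
        self.pub.publish(msg)

def main():
    rclpy.init()
    rclpy.spin(HeartbeatNode())
    rclpy.shutdown()

if __name__ == '__main__':
    main()
```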
4. Database-Centric
In this model, the database acts as a central hub for all information, serving as a shared container to store and retrieve data by various components.
Online Data: Acquired during runtime (e.g., LiDAR scans, odometry, IMU readings). These are usually transient and processed in real-time.
Offline Data: Includes maps, calibration files, or lookup tables, typically loaded at startup and referenced throughout the robot’s operation.
While general-purpose databases such as MongoDB support storing and querying both kinds of data, using them for online communication is less efficient:
High latency compared to real-time messaging
Difficult fault tolerance and synchronization
Not ideal for systems requiring real-time responsiveness
Still, databases are useful for logging, data analytics, and retrieving prior mission data or machine learning models.
A database-centric architecture for robotics leverages in-database processing and resource adapters to enable real-time event handling, secure data management, and integration of heterogeneous robotic systems. Resource adapters act as bridges between the database and heterogeneous robotic components (e.g., sensors, actuators, and user interfaces), enabling data streaming and task execution.
5. Event-Driven / Message-Driven Architectures
This approach focuses on reacting to events or messages as the primary driver of computation.
Events can represent changes in sensor readings, task status, or internal conditions.
Enables decoupled communication — components don't need to know who will process the data, only that it exists.
ROS implements this via callbacks in subscriber nodes: when a new message arrives on a topic (e.g., /odom), a registered function is triggered to process it.
This pattern supports asynchronous behavior, crucial for handling real-world dynamics in mobile robotics.
Event-driven architecture enables distributed robotic systems to react to discrete events—such as sensor triggers, actuator status changes, user inputs, environmental changes, and system-level messages—through asynchronous communication. These events drive system behavior and coordination via publish-subscribe messaging, allowing flexible, loosely-coupled integration between high-level controllers and low-level robotic platforms.
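A minimal rclpy sketch of this callback pattern, assuming a standard nav_msgs/Odometry publisher on /odom:

```python
# Event-driven processing in ROS 2: the callback fires only when a new
# message arrives on /odom.
import rclpy
from rclpy.node import Node
from nav_msgs.msg import Odometry

class OdomListener(Node):
    def __init__(self):
        super().__init__('odom_listener')
        # The callback is the "event handler" for this topic.
        self.create_subscription(Odometry, '/odom', self.on_odom, 10)

    def on_odom(self, msg: Odometry):
        p = msg.pose.pose.position
        self.get_logger().info(f'Pose update: x={p.x:.2f}, y={p.y:.2f}')

def main():
    rclpy.init()
    rclpy.spin(OdomListener())
    rclpy.shutdown()

if __name__ == '__main__':
    main()
```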
Types of events include:
Sensor Events
Obstacle detected by LiDAR or ultrasonic sensor
Temperature exceeds a threshold
Low battery warning
Motion detected by camera or IR sensor
Actuator/Device Events
Completion of a robot arm movement
Gripper opened/closed
Wheel encoder reports a position change
System or Network Events
A new command is received over the network
Connection loss with a component
Task execution status update (e.g., success, failure, timeout)
User Interface or Input Events
User starts or stops a robot from a GUI or touchscreen
Manual override triggered
Voice command received
Environmental Events
Room temperature changes significantly
Object appears or disappears in camera feed (via computer vision)
Time-based triggers (e.g., scheduled task execution)
6. Peer-to-Peer
In a peer-to-peer architecture, nodes exchange data directly with one another rather than routing it through a central broker. In ROS 1, ROS_MASTER_URI is the environment variable that specifies the address of the ROS Master; the master is used only for name resolution, and once nodes have found each other, data flows directly between them. ROS 2 removes the master entirely, relying on DDS discovery for a fully peer-to-peer system.
7. Plug-In
This design allows components to be dynamically loaded at runtime without restarting or recompiling the core system.
Plugins adhere to predefined interfaces and are often compiled as shared libraries.
They enable:
Runtime customization
Extensibility (e.g., adding new sensors to a simulator)
User-defined behavior injection (e.g., custom motion control in Gazebo)
Examples:
RViz in ROS loads visualization plugins to render different message types.
Gazebo supports plugins for custom physics, sensors, or actuators.
pluginlib in ROS allows any ROS node to load behavior dynamically.
If we were to use the MoveIt motion planning framework, we would rely on plugins for kinematics, collision detection, octomap updating, planning, and planning request adaptation.
8. Publish-Subscribe
This asynchronous communication model decouples message producers (publishers) from consumers (subscribers).
Components publish messages on named topics.
Any number of subscribers can listen to a topic and act on the data.
A message broker (or master node in ROS 1) helps manage routing and connections.
Key features:
Scalability: Many subscribers can read data without overloading the publisher
Flexibility: Consumers can subscribe/unsubscribe without affecting others
Used extensively in:
ROS (topics): e.g., /cmd_vel, /scan, /odom
YARP (ports)
Unlike client-server, data is broadcast and can be received by multiple nodes in parallel, which is ideal for sensor data dissemination and reactive behavior.
9. Service-Oriented Architecture: SOA
Patterns from service-oriented architecture are adopted to support emergent cooperation between autonomous robots. The robots do not need to know each other in advance. Since each service advertises its provided and consumed interfaces, a very flexible composition of services is possible at runtime. Availability and reliability of the system of systems are increased by decentralized peer-to-peer communication and control.
Detailed Software Architecture for Mobile Robot
1. Hardware Abstraction Layer (HAL) [Arduino Nano]
This runs on the Arduino Nano in C++ and is responsible for directly interfacing with hardware.
Functions:
Read encoder values
Drive motors via BTS7960 (PWM + direction pins)
Monitor voltage (optional)
Receive speed commands from Pi over UART
Send odometry and sensor data to Pi
Interfaces:
Serial communication (USB or UART)
PWM for motor control
GPIO for encoder inputs
AnalogRead for voltage sensing (optional)
2. Firmware Communication Interface (Raspberry Pi → Arduino)
❖ Purpose
This interface enables low-level communication between the high-level control system (ROS 2 running on the Raspberry Pi) and the microcontroller firmware (running on the Arduino). Its primary responsibilities include:
Sending motor commands to the Arduino for actuation via the motor drivers.
Receiving odometry data from the wheel encoders processed on the Arduino.
This interface is essential for closed-loop control, ensuring that motion commands from the navigation stack are properly translated into hardware actions and feedback is relayed for localization.
❖ ROS 2 Node
Name: arduino_bridge_node
Language: Typically written in Python or C++, depending on performance and integration needs.
❖ Functionality
🔹 Serial Communication
Establishes a serial connection via the /dev/ttyUSBx or /dev/ttyACMx device, which corresponds to the USB port where the Arduino is connected. Data is exchanged as byte-level serial traffic over the USB (CDC) link.
Baud rate: Common values are 57600 or 115200, and must match the Arduino's serial configuration.
🔹 Command Sending
The node subscribes to the /cmd_vel topic (which carries geometry_msgs/Twist messages) and forwards the linear and angular velocity commands to the Arduino. These are used to control motor PWM signals or direction pins.
geometry_msgs/Twist: A message type containing linear and angular velocity vectors. Represents desired robot motion.
🔹 Data Parsing
Encoder data sent by the Arduino is received as structured or raw serial messages. The node parses this data to:
Calculate wheel velocities or positions.
Estimate robot odometry using a differential drive kinematics model (see the sketch after this list).
Optionally publish joint states for visualization.
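As an illustration, the differential drive update mentioned above can be sketched in a few lines of Python; the wheel base value is an assumption:

```python
# Differential-drive odometry integration from per-wheel encoder travel.
import math

WHEEL_BASE = 0.30  # distance between wheels in meters (assumed)

def update_pose(x, y, theta, d_left, d_right):
    """Integrate one odometry step from left/right wheel travel (meters)."""
    d_center = (d_left + d_right) / 2.0        # forward travel of robot center
    d_theta = (d_right - d_left) / WHEEL_BASE  # change in heading
    # Midpoint integration: advance along the average heading of the step.
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    theta = (theta + d_theta + math.pi) % (2 * math.pi) - math.pi  # wrap to [-pi, pi)
    return x, y, theta
```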
❖ Publishes
🔹 /odom → nav_msgs/Odometry
Provides a continuous estimate of the robot’s position, orientation, and velocity based on wheel encoder data. This data feeds into localization and navigation modules.
Covariance matrix is often added to indicate measurement uncertainty.
Frame: Typically odom → base_link
🔹 /joint_states → sensor_msgs/JointState (Optional)
Publishes individual joint positions (wheel angles), velocities, and efforts. This is useful for simulation and visualization, especially in RViz or Gazebo.
❖ Subscribes
🔹 /cmd_vel → geometry_msgs/Twist
This topic provides high-level velocity commands generated by:
Teleoperation nodes (e.g., joystick or keyboard control)
Autonomous navigation stack (Nav2)
Custom control logic
These commands are then translated into motor control signals by the Arduino firmware.
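A hedged sketch of what arduino_bridge_node could look like on the /cmd_vel side. The ASCII wire format ("V <lin> <ang>\n") is purely an assumption; a real firmware would define its own protocol (see the COBS/CRC notes later in this document):

```python
# Sketch of an arduino_bridge_node: subscribes to /cmd_vel and forwards a
# simple ASCII command over serial. Wire format is a placeholder assumption.
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Twist
import serial  # pyserial

class ArduinoBridge(Node):
    def __init__(self):
        super().__init__('arduino_bridge_node')
        self.port = serial.Serial('/dev/ttyUSB0', 115200, timeout=0.1)
        self.create_subscription(Twist, '/cmd_vel', self.on_cmd_vel, 10)

    def on_cmd_vel(self, msg: Twist):
        # Forward linear.x (m/s) and angular.z (rad/s) to the firmware.
        line = f'V {msg.linear.x:.3f} {msg.angular.z:.3f}\n'
        self.port.write(line.encode('ascii'))

def main():
    rclpy.init()
    rclpy.spin(ArduinoBridge())
    rclpy.shutdown()

if __name__ == '__main__':
    main()
```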
3. Robot Localization & Odometry
❖ Purpose
Localization estimates the robot's pose (its position and orientation) over time. It fuses multiple sensor inputs (like IMU, wheel odometry, and optionally LiDAR or GPS) to compensate for drift or noise in any single source. Accurate localization is critical for autonomous navigation.
❖ Core Packages
🔹 robot_localization
A package that provides sensor fusion using Kalman filters:
EKF (Extended Kalman Filter) for non-linear systems
UKF (Unscented Kalman Filter) for greater accuracy under nonlinearities
🔹 tf2_ros
Manages transformations between coordinate frames, which is vital for understanding how sensor data and robot pose relate across space (a lookup sketch appears at the end of this section).
❖ Node
ekf_localization_node: Subscribes to multiple sensor sources and fuses them into a consistent and smooth estimate of the robot’s state using Kalman filtering.
❖ Inputs (Subscribed Topics)
/imu → sensor_msgs/Imu: Provides acceleration and angular velocity data.
/wheel_odom → nav_msgs/Odometry: Comes from the Arduino, derived from encoder data.
/scan → sensor_msgs/LaserScan: If used, helps with local SLAM or scan-based pose corrections.
❖ Publishes
/odom → nav_msgs/Odometry: The fused and filtered pose estimate.
/tf → Transform (odom → base_link): Used to relate the fixed odometry frame to the robot’s base (which moves with the robot).
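For illustration, a small tf2_ros lookup of the odom → base_link transform published by the EKF; the node name is hypothetical:

```python
# Querying the odom -> base_link transform with tf2_ros.
import rclpy
from rclpy.node import Node
from rclpy.time import Time
from tf2_ros.buffer import Buffer
from tf2_ros.transform_listener import TransformListener

class PoseQuery(Node):
    def __init__(self):
        super().__init__('pose_query')
        self.tf_buffer = Buffer()
        self.tf_listener = TransformListener(self.tf_buffer, self)
        self.create_timer(1.0, self.lookup)

    def lookup(self):
        try:
            # Time() = latest available transform.
            t = self.tf_buffer.lookup_transform('odom', 'base_link', Time())
            tr = t.transform.translation
            self.get_logger().info(f'base_link in odom: ({tr.x:.2f}, {tr.y:.2f})')
        except Exception as e:  # transform may not be available yet
            self.get_logger().warn(str(e))

def main():
    rclpy.init()
    rclpy.spin(PoseQuery())
    rclpy.shutdown()
```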
4. Mapping (SLAM)
❖ Purpose
Simultaneous Localization and Mapping (SLAM) allows the robot to build a 2D occupancy grid map of its environment while estimating its own pose. This is critical for operation in previously unknown spaces.
❖ Core Package
slam_toolbox: A modern, efficient SLAM package for ROS 2. Supports both online SLAM (real-time mapping) and offline refinement.
❖ Modes
Async Online SLAM: Enables continuous mapping as the robot explores.
Localization Mode: Uses a prebuilt map to localize the robot (an alternative to AMCL).
❖ Subscribed Topics
/scan: Laser scans from the LiDAR sensor.
/odom: Odometry from EKF or raw encoders.
❖ Published Topics
/map → nav_msgs/OccupancyGrid: The generated 2D map showing free, occupied, and unknown areas.
/tf → map → odom: Provides frame transformations to align the robot in the global map.
/slam_toolbox/pose_graph: Internal representation of the SLAM graph.
❖ Launch Files
online_async_launch.py: For live mapping.
localization_launch.py: For using a saved map in localization mode.
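A hypothetical minimal launch file in the spirit of online_async_launch.py; the parameter values shown are assumptions:

```python
# Minimal launch sketch wrapping slam_toolbox's async online mode.
from launch import LaunchDescription
from launch_ros.actions import Node

def generate_launch_description():
    return LaunchDescription([
        Node(
            package='slam_toolbox',
            executable='async_slam_toolbox_node',
            name='slam_toolbox',
            output='screen',
            parameters=[{
                'odom_frame': 'odom',
                'map_frame': 'map',
                'base_frame': 'base_link',
                'resolution': 0.05,  # 5 cm grid cells (assumed)
            }],
        ),
    ])
```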
5. Navigation Stack (Nav2)
❖ Purpose
Provides full-stack autonomous navigation, including:
Global path planning
Local obstacle avoidance
Motion control
Recovery behaviors
❖ Core Packages
nav2_bringup: Launch framework
nav2_planner: Global path planner
nav2_controller: Executes movement toward waypoints
nav2_recoveries: Handles stuck or error states
nav2_behavior_tree: Encodes high-level logic in Behavior Trees (BT)
❖ Subscribed Topics
/scan: Used for local costmap generation (obstacles)
/odom: Tracks current pose
/map: Static or dynamic map for path planning
/goal_pose: Desired destination (PoseStamped)
❖ Published Topics
/cmd_vel: Drives robot toward goals
/path: Shows the robot’s planned trajectory
/plan: Raw list of waypoints
/tf: Frame transformations required by the stack
❖ Behavior Tree (BT XML)
Defines complex decision-making flows such as:
Retrying navigation attempts
Rotating in place to search for exits
Backtracking after failure
Pausing navigation based on triggers
6. Mission Control & Task Planning
❖ Purpose
Implements high-level autonomy, managing what the robot does and when. Converts mission objectives into sequences of goals.
❖ ROS 2 Node
mission_controller_node: Implements either:
Finite State Machine (FSM)
Behavior Tree (BT)
❖ Responsibilities
Monitor task status (e.g., “goal reached”)
Issue new goals to the navigation stack (see the sketch at the end of this section)
Handle event-driven logic (e.g., obstacle, battery low)
Allow manual override
❖ Triggers
Time-based: Patrol schedules
Button-based: Manual missions
Event-based: Respond to sensor or system events
❖ Optional Features
Exploration mode vs. docking mode
Battery-level monitoring and return-to-base
Task queueing and prioritization
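A minimal FSM-flavoured sketch of mission_controller_node sending a single goal to Nav2 via the NavigateToPose action; the goal coordinates are illustrative:

```python
# Sketch: a tiny mission controller issuing one NavigateToPose goal to Nav2.
import rclpy
from rclpy.action import ActionClient
from rclpy.node import Node
from nav2_msgs.action import NavigateToPose

class MissionController(Node):
    def __init__(self):
        super().__init__('mission_controller_node')
        self.state = 'IDLE'
        self.nav_client = ActionClient(self, NavigateToPose, 'navigate_to_pose')

    def send_goal(self, x, y):
        self.state = 'NAVIGATING'
        goal = NavigateToPose.Goal()
        goal.pose.header.frame_id = 'map'
        goal.pose.pose.position.x = x
        goal.pose.pose.position.y = y
        goal.pose.pose.orientation.w = 1.0  # face forward
        self.nav_client.wait_for_server()
        self.nav_client.send_goal_async(goal).add_done_callback(self.on_accepted)

    def on_accepted(self, future):
        handle = future.result()
        handle.get_result_async().add_done_callback(lambda _: self.on_done())

    def on_done(self):
        self.state = 'IDLE'  # a real FSM would pick the next task here
        self.get_logger().info('Goal reached, back to IDLE')

def main():
    rclpy.init()
    node = MissionController()
    node.send_goal(1.0, 0.5)
    rclpy.spin(node)
```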
7. Perception (LiDAR)
❖ Hardware
360° 2D LiDAR: Example: RPLIDAR A1
❖ Packages
rplidar_ros2 or ydlidar_ros2: ROS 2-compatible drivers for different LiDAR brands.
❖ Publishes
/scan → sensor_msgs/LaserScan: Returns a set of range measurements in polar coordinates (angle + distance); see the sketch at the end of this section.
❖ Parameters
Serial Port: e.g., /dev/ttyUSB0
Baud Rate: Typically 115200 or 256000
Frame ID: laser, used for TF tree integration
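As a usage sketch, a node consuming the polar /scan data to report the nearest obstacle:

```python
# Consume sensor_msgs/LaserScan: each range pairs with an angle computed as
# angle_min + i * angle_increment.
import math
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import LaserScan

class NearestObstacle(Node):
    def __init__(self):
        super().__init__('nearest_obstacle')
        self.create_subscription(LaserScan, '/scan', self.on_scan, 10)

    def on_scan(self, msg: LaserScan):
        valid = [(r, msg.angle_min + i * msg.angle_increment)
                 for i, r in enumerate(msg.ranges)
                 if msg.range_min < r < msg.range_max]  # drop invalid returns
        if valid:
            r, a = min(valid)
            self.get_logger().info(
                f'Nearest obstacle: {r:.2f} m at {math.degrees(a):.0f} deg')

def main():
    rclpy.init()
    rclpy.spin(NearestObstacle())
    rclpy.shutdown()
```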
8. Robot Description (URDF/Xacro)
❖ Purpose
Provides a complete digital model of the robot, including physical structure, sensors, and reference frames.
❖ Files
.urdf.xacro: Xacro is an XML macro language that makes URDF modular and reusable.
❖ Contents
Links: Components like chassis, wheels, LiDAR.
Joints: Connections between links (e.g., wheels as revolute joints).
Sensors: Plugins for simulating sensors like LiDAR or IMUs.
Frames: All elements must be connected via TF frames (base_link, wheel_left, laser, etc.)
❖ Nodes
robot_state_publisher: Publishes dynamic and static transforms based on joint states and URDF (see the launch sketch below).
joint_state_publisher: For simulation or visualization purposes, simulates wheel rotations, etc.
❖ Publishes
/tf: Hierarchical frame structure of the robot
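A hypothetical launch snippet that processes the Xacro and hands the result to robot_state_publisher; the file path is illustrative:

```python
# Launch sketch: expand the Xacro into URDF, then feed it to
# robot_state_publisher as the robot_description parameter.
import xacro
from launch import LaunchDescription
from launch_ros.actions import Node

def generate_launch_description():
    robot_description = xacro.process_file('/path/to/robot.urdf.xacro').toxml()
    return LaunchDescription([
        Node(
            package='robot_state_publisher',
            executable='robot_state_publisher',
            parameters=[{'robot_description': robot_description}],
        ),
    ])
```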
9. Simulation (Gazebo)
❖ Purpose
Simulates the robot in a physics-based 3D world with gravity, collisions, and sensor emulation. Allows for safe testing and debugging.
❖ Core Package
gazebo_ros_pkgs: Provides Gazebo plugins and launch tools for ROS 2 integration.
❖ Components/Plugins
Differential Drive Plugin: Simulates wheel motion and odometry
LiDAR Plugin: Simulates laser scans
Joint State Plugin: Broadcasts joint data
Battery Plugin (optional): Models power usage and voltage drop
❖ Integration Steps
Pass URDF as robot_description
Spawn with spawn_entity.py into Gazebo
❖ Applications
Test navigation stack
Visualize sensor output
Simulate corner cases (e.g., recovery behavior)
10. Visualization (RViz2)
❖ Purpose
RViz2 is the main tool for visualizing robot data and interaction in real time.
❖ Visualized Topics
Robot Model: From URDF
Laser Data: /scan
Odometry Path: /odom
Goal Pose: /goal_pose
Map: /map
TF Tree: Visualizes all coordinate frames and relationships
❖ Tools
tf2_tools / rqt_tf_tree: Debug frame transformations
2D Nav Goal panel: Clickable interface to issue goals
1️⃣ Physical Hardware Layer
Sensors & Actuators
| Component | Details |
|---|---|
| 📡 RPLIDAR A1 | 360° LiDAR (12 m range), publishes /scan |
| ⚙️ Arduino Nano 3 | Handles motor control (6 PWM channels), encoder reading (12-bit), watchdog timer |
| 🛞 DC Gear Motors | Gear ratio, 20 CPR encoders, BTS7960 motor driver (43 A peak) |
| 📦 Raspberry Pi 4 | Runs ROS 2 Humble (Ubuntu 22.04), 4 GB RAM |
Key Features
Motor Safety: Overcurrent protection on Arduino.
Compute Power: Pi 4 handles perception, navigation, and decision-making.
2️⃣ Firmware Layer
Arduino Firmware (C++)
Motor Control PID
100Hz velocity control loop
Serial Protocol
COBS-encoded (gives unambiguous packet framing; see the decode sketch below)
CRC-16 checksum (ensures data integrity)
Responsibilities
Converts ROS 2 /cmd_vel → PWM signals.
Reads encoders → computes odometry.
Implements watchdog timer (stops motors if no commands arrive).
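A sketch of the Pi-side decoding for such a protocol. The frame layout (payload followed by a 2-byte CRC) and the CRC-16/CCITT polynomial are assumptions and must match whatever the firmware actually implements:

```python
# Pi-side frame decode: COBS framing with a CRC-16 trailer (layout assumed).

def cobs_decode(data: bytes) -> bytes:
    """Decode one COBS-encoded frame (zero delimiter already stripped)."""
    out = bytearray()
    i = 0
    while i < len(data):
        code = data[i]
        if code == 0:
            raise ValueError('invalid COBS code byte')
        out += data[i + 1:i + code]
        i += code
        if code < 0xFF and i < len(data):
            out.append(0)  # restore the zero byte the encoder removed
    return bytes(out)

def crc16_ccitt(data: bytes) -> int:
    """CRC-16/CCITT-FALSE (poly 0x1021, init 0xFFFF) — polynomial assumed."""
    crc = 0xFFFF
    for byte in data:
        crc ^= byte << 8
        for _ in range(8):
            crc = ((crc << 1) ^ 0x1021) if crc & 0x8000 else (crc << 1)
            crc &= 0xFFFF
    return crc

def parse_frame(frame: bytes) -> bytes:
    """Return the payload if the CRC trailer checks out, else raise."""
    decoded = cobs_decode(frame)
    payload, trailer = decoded[:-2], decoded[-2:]
    if crc16_ccitt(payload) != int.from_bytes(trailer, 'big'):
        raise ValueError('CRC mismatch: frame corrupted')
    return payload
```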
3️⃣ ROS 2 Node Layer (Green)
Perception Stack
| Node | Function | Details |
|---|---|---|
| rplidar_node | LiDAR driver | Publishes /scan |
| camera_node | Optional camera | 640×480 resolution, MJPG codec |
Localization Stack
| Node | Function | Details |
|---|---|---|
| slam_toolbox_node | SLAM | Async mode, 5 cm resolution, outputs /map |
| ekf_filter_node | Sensor fusion | Combines /odom/wheel and IMU data into /odom/filtered |
Navigation Stack
| Node | Function | Details |
|---|---|---|
| controller_server | Motion control | Uses DWB plugin, acceleration limits |
| planner_server | Global path planning | Uses NavFn, inflation radius |
Key Data Flows
LiDAR Data: /scan → SLAM + Controller.
Odometry: Arduino → /odom/wheel → EKF → /odom/filtered.
Commands: Nav2 → /cmd_vel → Arduino.
4️⃣ Failure Recovery
System Monitoring
system_monitor:
Checks CPU temp.
Listens to topics /heartbeat/motors and /heartbeat/lidar.
Recovery Behaviors
recoveries_node:
Spin recovery (180° turn).
Backup (0.3 m retreat).
Costmap clearing (if stuck).
5️⃣ Debugging Tools
| Tool | Purpose |
|---|---|
| rqt_graph | Visualizes active nodes/topics. |
| ros2 bag | Records topics (e.g., /scan, /odom) for later playback. |
6️⃣ Simulation
Gazebo
Physics Engine: ODE.
Plugins:
diff_drive (simulates motors).
gpu_lidar (realistic LiDAR).
Ignition Fortress
Advanced features:
Sensor noise models.
Dynamic lighting.
Recommended Software Architecture Pattern
🔹 1. Component-Based Architecture (CBSE)
“Modularity and reusability”
The system is already built on modular components: ROS 2 nodes like slam_toolbox_node, ekf_filter_node, controller_server, and planner_server are all modular.
Each module can be easily replaced or improved: e.g., swap NavFn with RRT*, or use a different LiDAR, without changing the rest of the system.
It aligns perfectly with the structure of ROS 2 (which treats everything as nodes/modules).
Justifies the use of plug-ins (e.g., for planners, Gazebo, RViz), enhancing flexibility and runtime configuration.
🔹 2. Publish–Subscribe Pattern
“Decoupled, asynchronous data flow”
The robot relies heavily on ROS 2 topics like /scan, /cmd_vel, /odom, etc.
DDS (Data Distribution Service) is a standardized middleware protocol and API for data-centric publish-subscribe communication. ROS 2's DDS-based middleware implements pub-sub with QoS (Quality of Service) policies, which control how messages are sent, received, and stored; this makes it ideal for real-time robotics (a QoS sketch follows this list).
Decouples producers (e.g., sensors) from consumers (e.g., localization or navigation nodes).
Scales naturally to distributed systems — like adding another perception module or remote control interface.
Works extremely well with event-driven components like your recoveries_node and mission_controller_node.
This ensures fast, asynchronous communication while supporting distributed behavior.
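A small rclpy sketch of such QoS tuning; the depth values are illustrative. Sensor streams often use best-effort delivery (drop stale scans rather than block), while command topics stay reliable:

```python
# QoS tuning in rclpy: different policies for sensor vs. command topics.
from rclpy.qos import QoSProfile, ReliabilityPolicy, HistoryPolicy

sensor_qos = QoSProfile(                      # tolerate occasional drops
    reliability=ReliabilityPolicy.BEST_EFFORT,
    history=HistoryPolicy.KEEP_LAST,
    depth=5,
)
command_qos = QoSProfile(                     # commands must not be lost
    reliability=ReliabilityPolicy.RELIABLE,
    history=HistoryPolicy.KEEP_LAST,
    depth=10,
)

# Inside a Node subclass:
#   self.create_subscription(LaserScan, '/scan', self.on_scan, sensor_qos)
#   self.create_publisher(Twist, '/cmd_vel', command_qos)
```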
🚫 Why Not Others as the Primary Pattern?
Blackboard: Great for high-level coordination in heterogeneous teams, but the system doesn't rely on a central shared memory. ROS param server is not heavily used beyond startup.
Client-Server: We use it (e.g., for ROS services), but it's limited to configuration and not the core control loop — too synchronous and unsuitable for streaming.
Database-Centric: Good for logging and analytics, but not for your real-time control loops.
Peer-to-Peer: ROS 2 allows this, but the setup is not swarm-like or decentralized enough to justify this as the main architecture.
SOA: Too heavy for the single robot system; more suited to web-scale robotics or cloud services.