SwarmUS - A heterogeneous robotic platform
The Team
SwarmUS is a team of nine engineering students from the University of Sherbrooke, at the end of their bachelor's degree and scheduled to graduate in December 2021. The team has multidisciplinary engineering expertise, since its members come from computer, electrical and robotics engineering.
The Project
The project aims to establish an open software and hardware platform (respectively the HiveMind and the HiveBoard) allowing the implementation of a robotic swarm performing simultaneous localization and mapping (SLAM)[1] with heterogeneous robots and Android smartphones. The Android phones can be used both as an augmented reality interface to view the shared map and as a control interface for the robotic swarm. This work is being carried out as a major engineering design project and will be presented at the MégaGÉNIALE Expo 2021. The project is academic in nature and will be used by a client at the 3IT[2], in the robotics research group IntRoLab[3].
What is the MégaGÉNIALE Expo?
The MégaGÉNIALE Expo is the largest exhibition of university engineering projects in Canada. It is a unique opportunity to discover the work of graduating students in every field of engineering, namely: building, civil, chemical, biotechnology, electrical, computer, mechanical and robotics engineering. The event attracts approximately 4,000 people annually. For more information on the event and photos from previous years, visit the following page: https://www.MégaGÉNIALEe.usherbrooke.ca/.
[1] https://en.wikipedia.org/wiki/Simultaneous_localization_and_mapping
[2] https://www.usherbrooke.ca/3it/en/
[3] https://introlab.3it.usherbrooke.ca/mediawiki-introlab/index.php/Main_Page (French only)
Robotics section
Pioneer 2DX robots
The SwarmUS team was given the task of updating two old Pioneer 2DX robots that sat unused in its client's lab (IntRoLab). These robots will not only be given back to the client, they will also serve as test benches for our swarm platform. Since the electronics and the software of these robots were obsolete, a full upgrade was needed and only the chassis and the motors were kept.
Power
The robots are powered by a 12 V GOOLOO GP37-Plus LiPo battery. These batteries have internal protections and an integrated charger, which simplifies the design around the robot's power source. However, an additional under-voltage protection was needed to prevent the robot from draining the battery below the voltage limit of the LiPo cells.
Under-voltage lockout board
The team therefore designed a custom PCB containing an under-voltage lockout (UVLO) and a voltage-monitoring circuit. This board emits a high-pitched sound whenever the battery voltage drops below a selected threshold and opens the main power relay below a second, lower threshold.
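To illustrate how the two thresholds relate to the comparator circuit, here is a minimal sketch. The reference voltage, divider ratios and threshold values below are hypothetical placeholders, not the actual values used on our board.

```cpp
// Minimal sketch of the UVLO threshold logic with HYPOTHETICAL values: the
// real board uses analog comparators, and the actual reference voltage,
// divider resistors and thresholds are not reproduced here.
#include <cstdio>

constexpr double V_REF = 2.5;  // hypothetical comparator reference, in volts

// Divider ratio needed so the comparator trips at a given battery voltage.
constexpr double divider_ratio(double v_trip) { return V_REF / v_trip; }

int main() {
    const double v_buzzer = 11.0;  // hypothetical: buzzer sounds below this
    const double v_cutoff = 10.2;  // hypothetical: main relay opens below this

    std::printf("Buzzer divider ratio: %.3f\n", divider_ratio(v_buzzer));
    std::printf("Cutoff divider ratio: %.3f\n", divider_ratio(v_cutoff));
    return 0;
}
```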
Power distribution
The power tree is divided into three main parts: the 12 V main power, the 12 V motor power and the 5 V computer power. A main relay, controlled by a toggle switch and the UVLO board, connects the battery to the 12 V main power, which in turn feeds a 5 V @ 5 A regulator and another relay controlling the motor power. The motor relay is controlled by a red mushroom button so the 12 V motor power can be cut easily in an emergency. The 5 V regulator powers the main computer and all the robot's sensors.
Sensors, computer and motor controller
The robot's brain is a Raspberry Pi 4 (4 GB) running Ubuntu 20.04.1 LTS. To sense its environment, the robot is equipped with an RPLidar A2M8 to detect obstacles all around it and with a RealSense D435i or D455, depending on the robot, to get visual and depth information of the scene in front of it. In addition to the sensors, a RoboClaw 2x60A from BasicMicro is used as the motor controller since it offers built-in protections, PID control and quadrature encoder reading. The motor controller and all the sensors are connected to the Raspberry Pi via USB. Furthermore, the HiveBoard and the BeeBoards are integrated into the robot and connected to the Raspberry Pi through Ethernet, giving the robot the hardware it needs to join a swarm.
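Since the RoboClaw's PID loops take per-wheel speed setpoints, the driver running on the Raspberry Pi has to convert a desired body velocity into wheel speeds expressed in encoder counts per second. The sketch below illustrates that conversion; the wheel radius, track width and encoder resolution are placeholder values, not the actual Pioneer 2DX numbers.

```cpp
// A minimal sketch of the differential-drive math a motor driver node could
// use before handing speed setpoints to the RoboClaw's built-in PID.
// Wheel radius, track width and encoder resolution below are HYPOTHETICAL.
#include <cstdio>

constexpr double WHEEL_RADIUS   = 0.095;  // m (hypothetical)
constexpr double TRACK_WIDTH    = 0.33;   // m, distance between wheels (hypothetical)
constexpr double COUNTS_PER_REV = 19105;  // quadrature counts per wheel turn (hypothetical)
constexpr double PI = 3.14159265358979;

// Convert a body velocity (v in m/s, w in rad/s) into per-wheel setpoints
// expressed in encoder counts per second, the unit the controller's PID expects.
void body_to_wheel_counts(double v, double w, long& left, long& right) {
    const double v_left  = v - w * TRACK_WIDTH / 2.0;  // m/s
    const double v_right = v + w * TRACK_WIDTH / 2.0;  // m/s
    const double counts_per_meter = COUNTS_PER_REV / (2.0 * PI * WHEEL_RADIUS);
    left  = static_cast<long>(v_left  * counts_per_meter);
    right = static_cast<long>(v_right * counts_per_meter);
}

int main() {
    long l = 0, r = 0;
    body_to_wheel_counts(0.3, 0.5, l, r);  // 0.3 m/s forward while turning
    std::printf("left: %ld counts/s, right: %ld counts/s\n", l, r);
}
```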
Mechanical modifications
The electronic components and the computer are held in place by laser-cut wooden plates, made to fit the narrow interior of the robot precisely. The sensor mounts and side panels of the robot were 3D printed to give us more flexibility in the design process.
Robot’s software
The Raspberry Pi runs all the robot's software inside the ROS (Robot Operating System) middleware. With the help of the sensors, a navigation stack and RTAB-Map, the robot can move through its environment while mapping it and localizing itself within it. Additional ROS packages were written by the team to connect the ROS environment to the HiveMind running on the HiveBoard, so that the swarm behaviors can control the robot.
Map produced by one of our robots
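As a hedged illustration of how one of those custom packages can tap into RTAB-Map's output, here is a minimal roscpp sketch that subscribes to the occupancy grid and would hand it to the HiveMind link. The node name is hypothetical and the topic name is the usual rtabmap_ros default; neither is necessarily what our packages use.

```cpp
// A minimal sketch (not our actual package) of a ROS node that picks up the
// occupancy grid produced by RTAB-Map so it can be forwarded to the HiveMind
// over the Ethernet link.
#include <ros/ros.h>
#include <nav_msgs/OccupancyGrid.h>

void onGridMap(const nav_msgs::OccupancyGrid::ConstPtr& map) {
    // Here a real bridge would serialize the map and push it to the
    // HiveBoard; this sketch only reports its size and resolution.
    ROS_INFO("Map %ux%u cells at %.2f m/cell",
             map->info.width, map->info.height, map->info.resolution);
}

int main(int argc, char** argv) {
    ros::init(argc, argv, "hivemind_map_relay");  // hypothetical node name
    ros::NodeHandle nh;
    ros::Subscriber sub = nh.subscribe("/rtabmap/grid_map", 1, onGridMap);
    ros::spin();
    return 0;
}
```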
The Hardware section
The following section presents in detail the three PCBs of the SwarmUS project:
· The HiveSight: test platform for the Decawave DW1000 Ultra-Wideband (UWB) integrated circuit (IC)
· The HiveBoard: the central board of the latest iteration
· The BeeBoard: the UWB "sensor"
The HiveSight
As mentioned, the HiveSight was a test platform for the team. It confirmed that the team could produce a functioning PCB (from design to testing) with the facilities we had access to. It was also a test platform for the UWB technology. Let us go over the board:
- Protection: over-voltage, over-current and reverse-polarity protection
- USB: the USB IC opens a COM port for debugging purposes. It is connected to the u-USB port
- 3V3 & 1V8 Gen: on-board generation of the 3.3 V and 1.8 V power rails. The 1.8 V rail is derived from the 3.3 V rail. All the components on the board use the 3.3 V rail; only the Decawave DW1000 uses the 1.8 V power rail.
- IMU: Inertial measurement unit
- WROOM reset: physical reset button for the ESP32-WROOM (which is on the bottom side of the PCB). The ESP32-WROOM is a Wi-Fi module.
- Flash 32k: 32 Kb of flash memory, used by our custom ESP32 implementation.
- ESP32: custom implementation of the ESP32 (Wi-Fi IC), used to compare our own implementation against the WROOM module.
- CLK Gen: generates the 38.4 MHz clock for the DW1000s and also provides a synchronization signal for both DW1000s.
- DW1000: Ultra-Wideband IC, used to locate other HiveSights in distance and angle. Each DW1000 has four LEDs for debugging purposes.
Our implementation of the DW1000s is based on the application notes published by Decawave (https://www.decawave.com/application-notes/) and is being tested by our team as we write these lines. However, we did catch some errors in our design.
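Before listing those errors, here is a quick illustration of the ranging math itself. The sketch below applies the asymmetric double-sided two-way ranging (DS-TWR) formula described in Decawave's notes to turn two pairs of measured intervals into a time of flight and a distance. The interval values are made up for the example, and the angle measurement (obtained by comparing the signal at the two DW1000s) is not shown.

```cpp
// A minimal sketch of the asymmetric double-sided two-way ranging (DS-TWR)
// formula from Decawave's application notes. Interval values below are
// placeholders; on the real DW1000 they come from the chip's timestamp
// counter (units of roughly 15.65 ps).
#include <cstdio>

constexpr double SPEED_OF_LIGHT = 299702547.0;  // m/s in air

// Each argument is a time interval measured locally by one of the two chips.
double tof_ds_twr(double t_round_a, double t_reply_a,
                  double t_round_b, double t_reply_b) {
    return (t_round_a * t_round_b - t_reply_a * t_reply_b) /
           (t_round_a + t_round_b + t_reply_a + t_reply_b);
}

int main() {
    const double tof_true  = 10.0e-9;                   // about 3 m of flight
    const double t_reply_a = 280.0e-6;                  // initiator's reply delay
    const double t_reply_b = 330.0e-6;                  // responder's reply delay
    const double t_round_a = t_reply_b + 2 * tof_true;  // round trip seen by the initiator
    const double t_round_b = t_reply_a + 2 * tof_true;  // round trip seen by the responder

    const double tof = tof_ds_twr(t_round_a, t_reply_a, t_round_b, t_reply_b);
    std::printf("Estimated distance: %.2f m\n", tof * SPEED_OF_LIGHT);
}
```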
The 5 V power rail can come from two different places:
· From the green connector at the top left
· From the u-USB connector
Furthermore, the 3.3 V source can come from the on-board 3V3 rail or be supplied by the Nucleo. The HiveSight can also supply 5 V or 3.3 V to a Nucleo. However, when testing our design, we noticed that when the Nucleo supplied the 3.3 V to the HiveSight, the Nucleo flashed an over-current warning. We therefore recommend using a separate power supply for the HiveSight (i.e., the green connector or the u-USB connector).
The SPI communication between the HiveSight and the Nucleo is somewhat deficient: we did not route the SPI lines carefully enough (too many layer changes from top to bottom) and the ground return loop is not as small as it could be. We also did not connect the ground of every connector of the HiveSight correctly, which slows the SPI bus. The maximum bus speed we could reach was 1 Mbit/s.
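On the firmware side, the practical consequence is simply to keep the SPI clock low. Below is a hedged STM32 HAL configuration sketch; the SPI2 instance and the 45 MHz APB1 clock are assumptions for a typical Nucleo-F429ZI setup, not necessarily our exact configuration.

```cpp
// A minimal STM32 HAL sketch of how the SPI clock can be kept at or under the
// roughly 1 Mbit/s this wiring tolerates. Check your own clock tree and SPI
// instance before reusing these values.
#include "stm32f4xx_hal.h"

SPI_HandleTypeDef hspi2;

void spi_init_slow(void) {
    hspi2.Instance               = SPI2;
    hspi2.Init.Mode              = SPI_MODE_MASTER;
    hspi2.Init.Direction         = SPI_DIRECTION_2LINES;
    hspi2.Init.DataSize          = SPI_DATASIZE_8BIT;
    hspi2.Init.CLKPolarity       = SPI_POLARITY_LOW;   // DW1000 uses SPI mode 0
    hspi2.Init.CLKPhase          = SPI_PHASE_1EDGE;
    hspi2.Init.NSS               = SPI_NSS_SOFT;       // chip select driven by GPIO
    // With APB1 at 45 MHz, a /64 prescaler gives about 703 kbit/s,
    // safely under the 1 Mbit/s limit observed on this board.
    hspi2.Init.BaudRatePrescaler = SPI_BAUDRATEPRESCALER_64;
    hspi2.Init.FirstBit          = SPI_FIRSTBIT_MSB;
    hspi2.Init.TIMode            = SPI_TIMODE_DISABLE;
    hspi2.Init.CRCCalculation    = SPI_CRCCALCULATION_DISABLE;
    HAL_SPI_Init(&hspi2);
}
```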
The USB lines to the COM port IC have the plus and minus data lines swapped, which breaks the COM port. We recommend either changing the design or restoring the correct order by removing the ESD protection and soldering wires across. That is the approach we used on our board, with hot glue on top to prevent accidental disconnection.
Also, the IMU was never installed on our version because its footprint lacked a paste opening: no solder paste was deposited, so the part was never connected to the board. This simple mistake will be fixed in the next implementation, the HiveBoard.
The HiveSight was designed to be used with the STM32F429ZIT Nucleo (despite the silkscreen mentioning STM32F426, another mistake on our part).
All the files (schematics, BOM, PCB layout, Gerbers) are available through the GitHub link.
The HiveBoard & the BeeBoard
The HiveBoard is the next iteration of the project. It will consist of:
· A central MCU (STM32H7)
· 6 USB-C channels, each carrying:
o An SPI bus
o A clock signal (38.4 MHz)
o A sync signal
o An interrupt request (IRQ) line (receiving)
o A chip select (nCS) line (sending)
o A reset line (sending)
· A 38.4 MHz crystal
· A WROOM ESP32 for Wi-Fi communication
· Redriving ICs for LVDS and LVCMOS signals
· An FT4232 FTDI debug IC
· A u-USB port for debugging purposes
· An Ethernet port to transfer the information to the robot's computer
· An IMU
The HiveBoard is the motherboard: it distributes the clock to the 6 USB-C channels and keeps all the clocks synchronized. It also collects the information sent by the BeeBoards. The BeeBoard is a 5 x 7 cm board hosting a single DW1000, an antenna, redriving circuits and switches used to detect the USB-C polarity. Once connected to a HiveBoard channel, the BeeBoard raises an IRQ to the HiveBoard when it has information to report. The HiveBoard then initiates an SPI transaction to retrieve this information and calculates the distance and the angle of the received signal.
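To make that exchange concrete, here is a hedged sketch of the HiveBoard side of the flow. Every type, helper and stub body in it is an illustrative placeholder rather than our actual firmware API; it only shows the IRQ, chip select, SPI read and distance/angle steps in order.

```cpp
// Hypothetical sketch of the IRQ -> chip select -> SPI read -> range/angle
// flow between the HiveBoard and its six BeeBoard channels.
#include <cstdint>
#include <cstdio>

struct RangingFrame { uint64_t rx_timestamp; uint8_t payload[32]; };
struct Measurement  { float distance_m; float angle_rad; uint16_t agent_id; };

// --- Stubs standing in for the real low-level firmware helpers --------------
bool beeboard_irq_pending(int ch)              { return ch == 2; }  // pretend channel 2 raised its IRQ
void beeboard_select(int /*ch*/, bool /*on*/)  {}                   // would drive that channel's nCS
bool spi_read_frame(int /*ch*/, RangingFrame*) { return true; }     // would run the SPI transaction
Measurement compute_range_angle(const RangingFrame&) {              // would apply the ranging math
    return {3.2f, 0.4f, 7};
}
// -----------------------------------------------------------------------------

void poll_beeboards(void (*publish)(const Measurement&)) {
    for (int channel = 0; channel < 6; ++channel) {
        if (!beeboard_irq_pending(channel))
            continue;                     // nothing to report on this channel
        beeboard_select(channel, true);   // assert nCS on that USB-C channel
        RangingFrame frame{};
        const bool ok = spi_read_frame(channel, &frame);
        beeboard_select(channel, false);
        if (ok)
            publish(compute_range_angle(frame));  // hand the result to the HiveMind
    }
}

int main() {
    poll_beeboards([](const Measurement& m) {
        std::printf("agent %u at %.1f m, %.2f rad\n",
                    (unsigned)m.agent_id, m.distance_m, m.angle_rad);
    });
}
```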
The HiveBoard and the BeeBoard are both in the development stage. We do not have a 3D model yet, but one will be added in the coming weeks. Stay tuned to the GitHub repository for updates.
The software side
As for the software part of our project, development is guided by the three fundamental requirements of swarm robotics, which are the following (a small interface sketch follows the list):
· Agents in the swarm must be able to exchange data with each other, on a one-to-one basis (unicast) as well as one-to-many (broadcast).
· Agents in the swarm must be able to locate neighboring agents relative to themselves.
· Agents in the swarm are controlled by a shared intelligence which makes decisions for the swarm based on the information supplied by the agents constituting it.
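To make those requirements concrete, the sketch below shows the kind of interface they imply. All names are illustrative only; this is not the API actually exposed by our platform.

```cpp
// Illustrative interface matching the three swarm requirements above.
#include <cstdint>
#include <functional>
#include <utility>
#include <vector>

struct RelativePosition { float distance_m; float angle_rad; };

class ISwarmAgent {
public:
    virtual ~ISwarmAgent() = default;

    // Requirement 1: data exchange, one-to-one and one-to-many.
    virtual bool unicast(uint16_t destination_id, const std::vector<uint8_t>& payload) = 0;
    virtual bool broadcast(const std::vector<uint8_t>& payload) = 0;

    // Requirement 2: locating neighboring agents relative to this agent.
    virtual std::vector<std::pair<uint16_t, RelativePosition>> neighbors() const = 0;

    // Requirement 3: hook through which the shared intelligence drives this agent.
    virtual void onSwarmCommand(std::function<void(const std::vector<uint8_t>&)> handler) = 0;
};
```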
The goal of our project is to deliver a platform that supports those features and exposes them to any system. To test this, we will be using our platform alongside an existing robot based on a Raspberry Pi (see the Robotics section). The following figure presents each component of our architecture, which will be detailed in the following sections:
The HiveBoard and its firmware, the HiveMind
The HiveMind is the core of our system and therefore handles the three swarm requirements exposed earlier. It is based on an STM32 processor running firmware that interfaces with the other hardware components (like the BeeBoards) and runs the swarm intelligence. The HiveMind performs all the calculations needed to translate the data received from the BeeBoards into an actual position associated with a robot. As for the swarm intelligence, it is based on Buzz, a programming language dedicated to programming robot swarms (https://the.swarming.buzz/wiki/doku.php?id=start). The Buzz script runs in a virtual machine inside the firmware of every HiveBoard, relying on the data exchanged over the network to make its decisions. Alongside the STM32, an ESP32 connected via SPI runs its own firmware, responsible for handling the Wi-Fi connectivity and data exchange across the network. Furthermore, the board exposes two interfaces to communicate with the robot: Ethernet or USB.
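As a small illustration of the "BeeBoard data to position" step, the sketch below converts a measured distance and angle into a position relative to the robot. The frame convention, names and struct layout are assumptions for illustration, not our firmware's actual types.

```cpp
// Hypothetical sketch: turning a (distance, angle) measurement into a
// position relative to the robot, tagged with the agent it belongs to.
#include <cmath>
#include <cstdint>
#include <cstdio>

struct NeighborPosition { uint16_t agent_id; float x_m; float y_m; };

NeighborPosition to_relative_position(uint16_t agent_id, float distance_m, float angle_rad) {
    // x points forward, y to the left, angle measured from the forward axis.
    return {agent_id, distance_m * std::cos(angle_rad), distance_m * std::sin(angle_rad)};
}

int main() {
    NeighborPosition p = to_relative_position(7, 3.2f, 0.4f);
    std::printf("agent %u at (%.2f, %.2f) m\n", (unsigned)p.agent_id, p.x_m, p.y_m);
}
```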
The Host Robot
The host robot is a robot to which we connect a HiveBoard. The core system component that we develop for it is the bridge. It handles the commands sent by the swarm intelligence to the robot, such as executing a movement or sharing internal data. The rest of its features are implemented on a per-use-case basis. In our case, the goal is to build a map of the environment and share it across the swarm, using a camera, a lidar and RTAB-Map, the mapping software. The goal of our development on the host robot is to show the potential of our platform and of swarm robotics.
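As a hedged example of what the bridge can do with a movement command, the sketch below forwards a goal to the standard ROS move_base action interface. The node name and goal coordinates are placeholders, and our actual bridge may route commands differently.

```cpp
// A minimal sketch (not our actual bridge) showing how a movement command
// from the swarm intelligence could become a navigation goal through the
// standard move_base action interface.
#include <ros/ros.h>
#include <actionlib/client/simple_action_client.h>
#include <move_base_msgs/MoveBaseAction.h>

using MoveBaseClient = actionlib::SimpleActionClient<move_base_msgs::MoveBaseAction>;

int main(int argc, char** argv) {
    ros::init(argc, argv, "swarm_bridge_goal_example");  // hypothetical node name
    MoveBaseClient client("move_base", true);
    client.waitForServer();

    // In a real bridge these values would come from the HiveMind; here they are fixed.
    move_base_msgs::MoveBaseGoal goal;
    goal.target_pose.header.frame_id = "map";
    goal.target_pose.header.stamp = ros::Time::now();
    goal.target_pose.pose.position.x = 1.5;
    goal.target_pose.pose.position.y = 0.5;
    goal.target_pose.pose.orientation.w = 1.0;  // no rotation requested

    client.sendGoal(goal);
    client.waitForResult();
    ROS_INFO("Goal result: %s", client.getState().toString().c_str());
    return 0;
}
```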
The Augmented Reality Android App
Alongside the swarm of robots, we are also developing an Android application to monitor the state of the swarm and command it. To better visualize the state of the swarm and distinguish robots from one another, the app will use augmented reality to add visual cues on robots as well as to take commands, such as setting a waypoint in the environment to reach. To interact with the robots and send commands, the Android device will be connected to a HiveBoard through a USB connection that also powers the board. However, the Android device cannot be located in space by our BeeBoards, since our architecture only supports 2D positioning and the Android device will be held significantly higher off the ground. Instead, it will estimate the relative position of the robots using AprilTags, specialized QR-code-like markers that identify each robot.
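As a small illustration of the AprilTag step, the sketch below turns a detected tag's translation in the camera frame into a distance and bearing to the robot carrying it. The frame convention and values are assumptions for the example, not the app's actual code.

```cpp
// Hypothetical sketch: from an AprilTag's translation in the camera frame
// (x right, y down, z forward), derive a distance and a bearing to the robot.
#include <cmath>
#include <cstdio>

constexpr double PI = 3.14159265358979;

struct TagObservation { double x; double y; double z; };  // metres, camera frame

void report_robot(const TagObservation& t) {
    const double distance = std::sqrt(t.x * t.x + t.y * t.y + t.z * t.z);
    const double bearing  = std::atan2(t.x, t.z);  // angle left/right of the camera axis
    std::printf("Robot at %.2f m, bearing %.1f deg\n", distance, bearing * 180.0 / PI);
}

int main() {
    report_robot({0.4, -0.1, 2.0});  // example: tag 2 m ahead, slightly to the right
}
```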
For more details: https://github.com/SwarmUS
Support from PCBWay would mean a lot to us. It would lighten the enormous financial pressure we are under. With the current shortage of electronic components, the cost of our BOM has gone up, and we are afraid it might affect the functionality of the HiveBoard (i.e., three channels instead of six) and force us to choose components that are not well suited to our application. We would be proud to represent PCBWay as a major partner at the MégaGÉNIALE Expo in December 2021!