DonkieTown is an affordable and scalable platform for research on autonomous vehicles, with an experimental framework developed in ROS. The platform integrates multiple small-scale autonomous vehicles called Asinus Cars, each equipped with at least a camera, an odometer, and an onboard computer. The vehicles are Differential Drive Robots (DDR), constrained by software to behave as car-like vehicles. DonkieTown incorporates a low-cost localization system that provides each vehicle's pose in real time: external cameras detect ArUco markers, and Kalman Filters (KF) are then used to track and estimate the pose of each vehicle. The platform includes a base station computer with a graphical interface for monitoring the system. DonkieTown also provides a set of algorithms to facilitate autonomous driving, covering communication, tracking, object detection, obstacle avoidance, control, trajectory tracking, and more. Moreover, a centralized vehicular network allows communication between the agents and the base station, so the agents can share information about their state, detected obstacles, maneuver intentions, and so on.
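The marker-based tracking described above can be illustrated with a minimal linear Kalman filter sketch (a simplification for illustration only: it assumes a constant-velocity model and 2D position measurements from the marker detector, whereas the actual DonkieTown filters are Extended Kalman Filters with a richer vehicle state):

```python
import numpy as np

# State: [x, y, vx, vy]; constant-velocity motion model.
dt = 0.1
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]], dtype=float)
# The overhead camera measures position only (the marker center).
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)
Q = 1e-3 * np.eye(4)   # process noise (tuning parameter)
R = 1e-2 * np.eye(2)   # measurement noise of the marker detector

def kf_step(x, P, z):
    """One predict + update cycle of a linear Kalman filter."""
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with the marker measurement z = [x, y]
    y = z - H @ x                      # innovation
    S = H @ P @ H.T + R                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x = x + K @ y
    P = (np.eye(4) - K @ H) @ P
    return x, P

# Toy run: a car moving at 0.5 m/s along x, with noisy marker detections.
rng = np.random.default_rng(0)
x_est, P = np.zeros(4), np.eye(4)
for k in range(1, 51):
    true_pos = np.array([0.5 * dt * k, 0.0])
    z = true_pos + rng.normal(0, 0.1, size=2)
    x_est, P = kf_step(x_est, P, z)

print(x_est[:2])  # estimated position, close to the true (2.5, 0.0)
```

The filter smooths out the per-frame detection noise and keeps producing pose estimates between camera frames, which is exactly what the localization system needs when a marker is briefly occluded.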
All developed source code, libraries, and manufacturing files are released as open source, with no license agreements. We expect every DonkieTown user to acknowledge our effort by citing DonkieTown:
@article{Larralde-Ortiz_Luviano-Juárez_Mirelez-Delgado_Mercado-Ravell_2023,
title={DonkieTown: a Low-cost Experimental Testbed for Research on Autonomous Cars},
volume={21},
url={https://latamt.ieeer9.org/index.php/transactions/article/view/7756},
number={6},
journal={IEEE Latin America Transactions},
author={Larralde-Ortiz, Emmanuel and Luviano-Juárez, Alberto and Mirelez-Delgado, Flabio and Mercado-Ravell, Diego},
year={2023},
month={Jun.},
pages={715–722}
}
To contribute, you are free to create, manage, and maintain side branches. At the moment, direct pushes to the main branch and any force pushes are not allowed; however, you may submit GitHub pull requests, and the maintenance team will review them and decide whether or not to merge them.
If you have not installed and set up DonkieTown, visit the installation page for directions; you must do so before using DonkieTown. Once that is done, you will be able to launch Simulator nodes, Navigation nodes, Localization nodes, and Vehicular Communication nodes.
ROS nodes are processes capable of subscribing (listening) and publishing (talking) to ROS topics, where ROS topics are named buses for the broadcast data. roslaunch is a command that launches one or more nodes with both preset configuration and command-line arguments. The following roslaunch commands are sufficient to use all DonkieTown functions.
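To make the publish/subscribe idea concrete, here is a minimal, ROS-free sketch of the pattern in plain Python (the `TopicBus` class is invented purely for illustration; real ROS nodes use `rospy`/`roscpp` and communicate through the ROS master instead):

```python
from collections import defaultdict
from typing import Callable

class TopicBus:
    """Toy stand-in for the ROS topic mechanism: named buses of messages."""
    def __init__(self):
        self._subs = defaultdict(list)  # topic name -> list of callbacks

    def subscribe(self, topic: str, callback: Callable) -> None:
        self._subs[topic].append(callback)

    def publish(self, topic: str, msg) -> None:
        # Deliver the message to every subscriber of this topic.
        for cb in self._subs[topic]:
            cb(msg)

bus = TopicBus()
received = []
# A "node" listening on the /pose topic...
bus.subscribe("/pose", lambda msg: received.append(msg))
# ...and another "node" talking on the same topic.
bus.publish("/pose", {"x": 1.2, "y": 0.4})
print(received)  # [{'x': 1.2, 'y': 0.4}]
```

The key point is the decoupling: the publisher does not know who is listening, so nodes such as the localization system and the navigation controller can be started, stopped, or replaced independently.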
Multiple simulation scenarios are provided in the bring_up package. You can launch one scenario, e.g. in-house navigation, with the following command.
roslaunch bring_up navigation_inhouse.launch
The Core node is the essential node of the Asinus Cars. It exposes ROS topics to drive the motors, publish odometry, and run an Extended Kalman Filter. If you only want to control the Asinus Car, this node is the way to go. In addition, alongside fake_gps node(s), it gives you an estimate of the robot's absolute position on the road.
roslaunch asinus_car core.launch car_id:=<marker_id>
In addition to the Core features, Prime enables DonkieNet (a MobileNet+SSD detection network trained on a hand-labeled dataset of stuffed white donkeys).
roslaunch asinus_car prime.launch car_id:=<marker_id>
The following starts message handling and post-processing of the pedestrian detections shared by all Asinus Cars.
roslaunch vehicular_communication network.launch
It is recommended to run the navigation controller on the base station, since you can start and stop it without logging in to the Asinus Car.
roslaunch fub_navigation navigation.launch car_id:=<marker_id>
For localization, you should calibrate each camera and identify the port it is connected to. To calibrate a camera, we recommend OpenCV's calibration tutorial.
To detect camera ports, install and use v4l-utils:
sudo apt-get install v4l-utils
v4l2-ctl --list-devices
Once you have calibrated a camera and detected its port, launch the node.
roslaunch fake_gps fake_gps.launch upcam_id:=<upcam_id> cam_port:=<cam_port> calib_file:=<path_calibration_file>
One node must be launched per camera; multiple nodes can run at the same time with little latency impact. Simply calibrate every camera, detect its port, assign it an upcam_id, and pass its calibration file.
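For instance, a two-camera setup might look like the following (the ports, IDs, and calibration file paths are illustrative only; substitute the values found on your own machine):

```shell
# Camera found at /dev/video0, assigned upcam_id 0
roslaunch fake_gps fake_gps.launch upcam_id:=0 cam_port:=/dev/video0 calib_file:=calib/upcam0.yaml

# In a second terminal: camera found at /dev/video2, assigned upcam_id 1
roslaunch fake_gps fake_gps.launch upcam_id:=1 cam_port:=/dev/video2 calib_file:=calib/upcam1.yaml
```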
In Q1 2023, the CIMAT Zacatecas community provided three workshops. Undergraduate students, graduate students, teachers, and entrepreneurs from around Mexico have taken the workshop in Zacatecas (city).
Visit this repo occasionally: the documentation is constantly changing and great news will come 😉. Follow Tsanda Labs on YouTube, or at least watch its videos; videos are being produced at this moment. Also, visit this website if you want to enroll in the next workshop.
- Emmanuel Larralde-Ortiz
- Alberto Luviano-Juárez
- Diego Mercado-Ravell
- Flabio Mirelez-Delgado