
ETHx: Self-Driving Cars with Duckietown

Self-Driving Cars with Duckietown is the first robotics and AI MOOC with scale-model self-driving cars. Learn state-of-the-art autonomy hands-on: build your own real robot (Duckiebot) and get it to drive autonomously in your scaled city (Duckietown).

9 weeks
4–10 hours per week
Self-paced
Progress at your own speed
Free
Optional upgrade available

There is one session available:

12,753 already enrolled! After a course session ends, it will be archived.
Starts Nov 21
Ends Dec 22

About this course


Robotics and AI are all around us and promise to revolutionize our daily lives. Autonomous vehicles have huge potential to impact society in the near future, for example by making private vehicle ownership unnecessary!

Have you ever wondered how autonomous cars actually work?

With this course, you will start from a box of parts and finish with a scaled self-driving car that drives autonomously in your living room. In the process, you will use state-of-the-art approaches, the latest software tools, and real hardware in an engaging hands-on learning experience.

Self-Driving Cars with Duckietown is a practical introduction to vehicle autonomy. It explores real-world solutions to the theoretical challenges of autonomy, including their translation into algorithms and their deployment in simulation as well as on hardware.

Using modern software architectures built with Python, Robot Operating System (ROS), and Docker, you will appreciate the complementary strengths of classical architectures and modern machine learning-based approaches. This introductory course takes you from zero to a self-driving car driving safely in a Duckietown.

This course is presented by professors and scientists who are passionate about robotics and accessible education. It uses the Duckietown robotic ecosystem, an open-source platform created at the MIT Computer Science and Artificial Intelligence Laboratory and now used by over 150 universities worldwide.

We support a track for learners to deploy their solutions in a simulation environment, and an additional option for learners who want the challenging but rewarding, tangible, hands-on experience of making the theory come to life in the real world. The hardware track is streamlined through an all-inclusive, low-cost, Jetson Nano-powered Duckiebot kit, which includes the city track, available here.

This course is made possible thanks to the support of the Swiss Federal Institute of Technology in Zurich (ETH Zurich), in collaboration with the University of Montreal (Prof. Liam Paull), the Duckietown Foundation, and the Toyota Technological Institute at Chicago (Prof. Matthew Walter).

Course created with support from

University of Montreal, Duckietown, Toyota Technological Institute at Chicago

At a glance

  • Institution: ETHx
  • Subject: Computer Science
  • Level: Introductory
  • Prerequisites:

    Basic Linux, Python, Git:

    • We are going to use a terminal interface, so basic knowledge of Bash is required (cd, ls, mkdir, ...)

    • We are going to write "autonomy" code in Python

    • We are going to pull, fork, push, and branch Git repositories

    Elements of linear algebra, probability, and calculus:

    • We are going to use matrices to represent coordinate systems

    • We are going to use notions of probability (marginalization, Bayes theorem) to derive perception algorithms for the Duckiebot

    • We are going to write down equations of motion, which involve ODEs (recognizing the acronym is a good start!)

    Computer with native Ubuntu installation

    • We are going to use Ubuntu 22.04 with a native (e.g., dual boot) installation*

    • Minimum requirements: quad-core CPU at 1.8 GHz, 4 GB RAM, 60 GB hard drive, GPU compatible with OpenGL 2.1+

    • Recommended setup: quad-core CPU at 2.1 GHz, 8 GB RAM, 120 GB hard drive, GPU compatible with OpenGL 2.1+

    • A broadband internet connection: we are going to upload and download gigabytes of data (exercises, activities, agent submissions)

  • Language: English
  • Video Transcript: English
  • Associated skills: Docker (Software), Automation, Robot Operating Systems, Artificial Intelligence, Computer Science, Algorithms, Autonomous Vehicles, Robotics, Python (Programming Language), Machine Learning

What you'll learn


After this course, you will be able to program your Duckiebot to navigate (without accidents!) the road lanes of a model city with rubber-duckie pedestrians as obstacles, using predominantly computer-vision-based techniques.

Moreover, you will:

  • recognize essential robot subsystems (sensing, actuation, computation, memory, mechanical) and describe their functions

  • make your Duckiebot drive in user-specified paths

  • understand how to command a robot to reach a goal position

  • have your Duckiebot make driving decisions autonomously according to "traditional approaches", i.e., following the estimation, planning, and control architecture

  • have your Duckiebot make driving decisions autonomously according to "modern approaches" (reinforcement learning)

  • process streams of images

  • set up an efficient software environment for robotics with state-of-the-art tools (Docker, ROS, Python)

  • program your Duckiebot and make it drive safely in empty road lanes

  • program your Duckiebot and make it recognize and avoid rubber duckie obstacles

  • submit your robot agents (a.k.a. "robot minds") to public challenges, and test your skills against your peers

Additional goals (require hardware)

  • independently assemble a Duckiebot and a Duckietown

  • remotely operate your Duckiebot and see with its eye(s)

  • be able to discuss differences between theory, simulation, and real-world implementation for different approaches

  • experience the challenges of deploying complex autonomous robots in the real world, and reap the rewards of getting it to work

Module 0: Welcome to the course

  • Welcome to the course, by Prof. Emilio Frazzoli
  • You will familiarize yourself with the logistics and navigation interface of the course resources

  • You will start a learning journey in the world of robot autonomy with Duckietown

Module 1: Introduction to self-driving cars

  • The potential and the challenges

  • Levels of autonomy

  • The vision for autonomous vehicles (AVs)

  • Activities: You will set up your learning environment and your Duckiebot, and make your first challenge submission

Module 2: Towards autonomy

  • Making a robot

  • Sensorimotor architectures

  • Stateful architectures

  • Logical and physical architectures

  • Application: You will create a reactive "Braitenberg" agent to avoid duckies and see how your agent compares to other submissions
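
For a flavor of what such a reactive agent can look like, here is a minimal, self-contained sketch in Python and NumPy. It is not the course's exercise interface: the BGR image format, the yellow threshold, the nominal speed, and the gain are all illustrative assumptions.

```python
# Minimal Braitenberg-style reactive agent (illustrative sketch only).
import numpy as np

def duckie_mask(bgr_image: np.ndarray) -> np.ndarray:
    """Crude 'duckie-ness' mask: bright yellow-ish pixels (image assumed BGR)."""
    b, g, r = bgr_image[..., 0], bgr_image[..., 1], bgr_image[..., 2]
    return ((r > 150) & (g > 150) & (b < 100)).astype(float)

def braitenberg_step(bgr_image: np.ndarray, gain: float = 0.5):
    """Map duckie activation in each image half directly to wheel commands."""
    mask = duckie_mask(bgr_image)
    half = mask.shape[1] // 2
    left_act = mask[:, :half].mean()
    right_act = mask[:, half:].mean()
    base = 0.3  # nominal forward speed (made up)
    # Ipsilateral (same-side) excitation, as in Braitenberg's "fear" vehicle:
    # activation on one side speeds up that side's wheel, steering the robot
    # away from the stimulus.
    left_wheel = base + gain * left_act
    right_wheel = base + gain * right_act
    return left_wheel, right_wheel

if __name__ == "__main__":
    fake_image = np.zeros((120, 160, 3), dtype=np.uint8)
    fake_image[:, :80] = (0, 200, 200)  # yellow-ish blob on the left half
    print(braitenberg_step(fake_image))  # left wheel faster: turn away (right)
```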

Module 3: Modeling and Control

  • Introduction to control systems

  • Representations and models

  • PID control

  • Application: You will design an odometry function and PID controller to command your Duckiebot's angular velocity
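
As a rough preview of the two ingredients named in the application above, here is a minimal sketch of differential-drive odometry and a textbook PID controller. The wheel baseline, gains, and the toy "plant" in the demo are invented for illustration and are not the course's calibration or exercise API.

```python
# Hedged sketch: differential-drive odometry plus a PID controller on heading.
import math

def odometry_update(x, y, theta, d_left, d_right, baseline=0.1):
    """Integrate one step of differential-drive odometry.

    d_left/d_right: distance travelled by each wheel [m] since the last update.
    baseline: distance between the wheels [m] (made-up value).
    """
    d = (d_left + d_right) / 2.0            # distance travelled by the midpoint
    dtheta = (d_right - d_left) / baseline  # change in heading
    x += d * math.cos(theta + dtheta / 2.0)
    y += d * math.sin(theta + dtheta / 2.0)
    return x, y, theta + dtheta

class PID:
    """Textbook PID on a scalar error signal (here: heading error)."""
    def __init__(self, kp=5.0, ki=0.0, kd=0.1):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def __call__(self, error, dt):
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

if __name__ == "__main__":
    # Drive the heading from 0.3 rad toward 0 rad with a toy integrator plant.
    pid, theta, dt = PID(), 0.3, 0.05
    for _ in range(5):
        omega = pid(0.0 - theta, dt)  # commanded angular velocity
        theta += omega * dt           # toy "plant": integrate the command
        print(round(theta, 4))
```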

Module 4: Robot Vision

  • Introduction to projective geometry

  • Camera modeling and calibration

  • Image processing

  • Application: You will develop the image processing techniques necessary for visual lane servoing, i.e., controlling your Duckiebot to drive within lane markings
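
One small ingredient of visual servoing is turning pixels into a steering error. The sketch below thresholds bright lane-marking pixels and measures their offset from the image center; the threshold and the normalization are assumptions, and the real exercise uses a proper camera model and color segmentation pipeline.

```python
# Rough sketch: from an image to a scalar steering error for lane servoing.
import numpy as np

def lane_steering_error(bgr_image: np.ndarray) -> float:
    """Signed offset of the bright-marking centroid from the image center,
    normalized to [-1, 1] (negative = markings to the left)."""
    mask = (bgr_image > 200).all(axis=-1)   # crude "white pixel" mask
    cols = np.nonzero(mask)[1]
    if cols.size == 0:
        return 0.0                          # no markings seen: no correction
    center = bgr_image.shape[1] / 2.0
    return float((cols.mean() - center) / center)

if __name__ == "__main__":
    img = np.zeros((120, 160, 3), dtype=np.uint8)
    img[:, 120:130] = 255                   # a white stripe on the right side
    err = lane_steering_error(img)
    print(err)  # positive: markings are to the right; feed this to a PID
```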

Module 5: Object Detection

  • Introduction to neural networks

  • Convolutional neural networks

  • One- and two-stage object detection

  • Application: You will train a convolutional neural network (CNN) to detect duckies and integrate your model with ROS to run onboard your Duckiebot and avoid duckies
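
To make the pipeline concrete, here is a tiny convolutional "duckie vs. background" classifier. Using PyTorch, the 64x64 input size, and this two-class architecture are illustrative assumptions; the course provides its own detector heads and training setup.

```python
# Minimal CNN sketch (assumed PyTorch), not the course's actual detector.
import torch
import torch.nn as nn

class TinyDuckieNet(nn.Module):
    def __init__(self, num_classes: int = 2):  # duckie / background
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)  # for 64x64 inputs

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(x.flatten(start_dim=1))

if __name__ == "__main__":
    model = TinyDuckieNet()
    dummy_batch = torch.randn(4, 3, 64, 64)  # 4 fake 64x64 RGB crops
    logits = model(dummy_batch)
    print(logits.shape)                      # torch.Size([4, 2])
```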

Module 6: State Estimation and Localization

  • Bayes filtering framework

  • Parameterized methods (Kalman filter)

  • Sampling-based methods (particle and histogram filter)

  • Application: You will build a state estimation algorithm that combines your Duckiebot's dynamics and sensor data to estimate its pose as it travels through the world
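
The predict/update cycle at the heart of Bayes filtering can be shown on a toy 1-D grid world (a histogram filter). The motion noise and sensor model below are made up for illustration; the course applies the same idea to the Duckiebot's actual dynamics and sensors.

```python
# Toy histogram (Bayes) filter: predict with motion noise, update with Bayes rule.
import numpy as np

def predict(belief: np.ndarray, move: int, p_correct: float = 0.8) -> np.ndarray:
    """Shift the belief by `move` cells, blurring it with motion noise."""
    n = belief.size
    new_belief = np.zeros(n)
    for i in range(n):
        new_belief[(i + move) % n] += p_correct * belief[i]                  # intended move
        new_belief[(i + move + 1) % n] += (1 - p_correct) / 2 * belief[i]    # overshoot
        new_belief[(i + move - 1) % n] += (1 - p_correct) / 2 * belief[i]    # undershoot
    return new_belief

def update(belief: np.ndarray, likelihood: np.ndarray) -> np.ndarray:
    """Multiply by the measurement likelihood and renormalize (Bayes rule)."""
    posterior = belief * likelihood
    return posterior / posterior.sum()

if __name__ == "__main__":
    belief = np.full(5, 1 / 5)                        # uniform prior over 5 cells
    belief = predict(belief, move=1)
    likelihood = np.array([0.1, 0.1, 0.7, 0.1, 0.1])  # sensor: "probably cell 2"
    belief = update(belief, likelihood)
    print(np.round(belief, 3))
```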

Module 7: Planning I

  • Formalization of the planning problem

  • Application: You will create a collision checker to determine whether your Duckiebot would collide with an obstacle
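
A very simple version of this idea approximates the robot footprint and each obstacle as circles and tests for overlap. The radii and positions below are invented; the course exercise uses its own geometry primitives and footprints.

```python
# Hedged sketch of a circle-based collision checker.
import math

def circles_collide(x1, y1, r1, x2, y2, r2) -> bool:
    """Two discs overlap iff the distance between centers is below r1 + r2."""
    return math.hypot(x2 - x1, y2 - y1) < (r1 + r2)

def pose_in_collision(robot_xy, obstacles, robot_radius=0.08):
    """Check a robot pose (x, y) against a list of (x, y, radius) obstacles."""
    rx, ry = robot_xy
    return any(circles_collide(rx, ry, robot_radius, ox, oy, orad)
               for ox, oy, orad in obstacles)

if __name__ == "__main__":
    duckies = [(0.50, 0.20, 0.05), (1.00, 0.00, 0.05)]
    print(pose_in_collision((0.55, 0.20), duckies))  # True: too close to a duckie
    print(pose_in_collision((0.00, 0.00), duckies))  # False: clear
```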

Module 8: Planning II

  • Graphs

  • Graph search algorithms

  • Application: You will tackle a variety of path-planning challenges and leverage all the skills you've built thus far to navigate your Duckiebot in a variety of simulated environments
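
The core tool here is searching a graph of road segments for a shortest route. Below is a minimal uniform-cost (Dijkstra) search over a hand-made toy map; the map and costs are invented for illustration only.

```python
# Sketch of uniform-cost (Dijkstra) search on a tiny road graph.
import heapq

def shortest_path(graph, start, goal):
    """graph: dict node -> list of (neighbor, cost). Returns (cost, path)."""
    frontier = [(0.0, start, [start])]
    visited = set()
    while frontier:
        cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, step_cost in graph.get(node, []):
            if neighbor not in visited:
                heapq.heappush(frontier, (cost + step_cost, neighbor, path + [neighbor]))
    return float("inf"), []

if __name__ == "__main__":
    # Tiny "city": intersections A..D connected by road segments of given length.
    city = {
        "A": [("B", 1.0), ("C", 4.0)],
        "B": [("C", 2.0), ("D", 5.0)],
        "C": [("D", 1.0)],
        "D": [],
    }
    print(shortest_path(city, "A", "D"))  # (4.0, ['A', 'B', 'C', 'D'])
```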

Module 9: Learning by Reinforcement

  • Markov decision processes

  • Value functions

  • Policy gradients

  • Domain randomization

  • Application: You will explore the capabilities and limitations of reinforcement learning models when applied to real-world robotics tasks such as lane following
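
To ground the vocabulary above, here is a toy value-iteration example on a 1-D "lane" with a goal at the right end. The dynamics, rewards, and discount factor are made up; the course works with richer reinforcement learning methods (policy gradients, domain randomization) on the actual driving task.

```python
# Toy value iteration on a tiny deterministic MDP (illustrative only).
import numpy as np

N_STATES, GAMMA = 5, 0.9
ACTIONS = (-1, +1)  # step left / step right along the lane

def step(state, action):
    """Deterministic toy dynamics: move within [0, N_STATES-1]; reward at the end."""
    next_state = min(max(state + action, 0), N_STATES - 1)
    reward = 1.0 if next_state == N_STATES - 1 else 0.0
    return next_state, reward

def value_iteration(tol=1e-6):
    values = np.zeros(N_STATES)
    while True:
        new_values = np.array([
            max(r + GAMMA * values[s2] for s2, r in (step(s, a) for a in ACTIONS))
            for s in range(N_STATES)
        ])
        if np.max(np.abs(new_values - values)) < tol:
            return new_values
        values = new_values

if __name__ == "__main__":
    print(np.round(value_iteration(), 3))  # values grow as states near the goal
```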

Learner testimonials


I learned a lot. The theory was good but I mostly enjoyed gaining practical skills: I had never before used docker, ROS, GitHub, Slack, Stack Overflow, Jupyter notebooks, Colab, to name a few tools, and only very marginally Linux command line and Python. So for me, there was really a lot to take in. It was challenging and therefore all the more rewarding to see progress materialize.

Frequently Asked Questions


Q. Is it mandatory to have a native Ubuntu installation?

A. It is possible to get things working in a virtual machine on a Mac or Windows host, but to minimize entropy and encourage all to walk at the edge of their comfort zones, we will only provide technical support for native Ubuntu installations.

Q. Is the hardware mandatory for the course?

A. No, but without the hardware, you will be missing out on most of the learning experience.

Q. Which Duckiebot version is recommended for this course?

A. The supported Duckiebot version for this course is DB-J4: Duckiebot with Jetson Nano 4GB.

Q. I have a Jetson Nano 2GB and don't want to purchase a new 4GB model. Can I still take the course?

A. Most of the learning activities will work with the Jetson Nano 2GB, but you will experience severe hardware limitations when running some of the most advanced course content. You can still take the course, but the recommended and supported hardware setup is the 4GB version.

Q. I have a DB21-M (a Duckiebot with a Jetson Nano 2GB). Do I need to purchase a new DB-J4 to take the course?

A. No, you can "hot-swap" the Jetson Nano 2GB with a Jetson Nano 4GB (you will need to reflash the SD card according to the assembly instructions).

Q. How much space will I need at home/in the office to deploy the Duckietown city track?

A. The smallest recommended setup is roughly 2x2 meters (80x80 inches).

Q. I already have a Duckiebot but just need the city materials. Can I get them separately?

A. Yes. You can find standalone city packs at this link.

Q. How can I reach out to a human for questions?

A. For questions about the course, send an email to info@duckietown.com. For questions about the hardware, please reach out to hardware@duckietown.com.

Who can take this course?

Unfortunately, learners residing in one or more of the following countries or regions will not be able to register for this course: Iran, Cuba and the Crimea region of Ukraine. While edX has sought licenses from the U.S. Office of Foreign Assets Control (OFAC) to offer our courses to learners in these countries and regions, the licenses we have received are not broad enough to allow us to offer this course in all locations. edX truly regrets that U.S. sanctions prevent us from offering all of our courses to everyone, no matter where they live.
