A Tour through openpilot


This document is mostly written for internal consumption, but I figured, why not make it public? openpilot is our open source ADAS system that anyone can contribute to. We’ll start at the hardware and work our way up.

Note: in the 0.6 series of openpilot, camerad, modeld, and monitoringd are all combined into visiond. In the 0.7 series they will be broken apart.

The Hardware

Three pieces of hardware are needed to use the openpilot system: an EON running NEOS, a panda based on an STM32F4, and a supported car. The panda acts as the safety-enforcing bridge between the EON and the car, using a chip with great support for functional safety, and software that is MISRA compliant (done) and will soon be ISO26262 and SIL2 compliant.

The EON runs a modified version of Android, where all the processes this post is about run. And the car is obviously the car, with 3 CAN buses in the right arrangement for the car harness (it’s amazing how many manufacturers match this spec).

After the car, we hit the panda firmware, maintained by our hardware team. Through that, we get to the EON, and to the start of our software tour. You’ll find these daemons in openpilot/selfdrive.

They share an IPC format as specified by cereal. It’s all single-publisher, multiple-subscriber messaging, abstracted such that multiple backends can be used. Right now, we support ZMQ and our custom msgq.
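
Here’s roughly what that looks like from Python. This is a sketch, not gospel: the helper names and module paths (cereal.messaging, PubMaster, SubMaster) have moved around between releases.

```python
# A minimal sketch of cereal pub/sub from Python.
import cereal.messaging as messaging

pm = messaging.PubMaster(['carState'])
sm = messaging.SubMaster(['carState'])

dat = messaging.new_message('carState')
dat.carState.vEgo = 20.0  # m/s
pm.send('carState', dat)

sm.update(100)  # wait up to 100ms for new messages
print(sm['carState'].vEgo)
```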

The Sensors and Actuators (hardware team)

boardd

This is the receiving side of the panda firmware. It uses libusb to communicate with the panda and parses the raw USB layer communications into CAN packets. On the grey and black pandas, it also broadcasts the GPS packets from the NEO-M8.
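
If you want to poke at this layer yourself, the panda Python library speaks the same USB protocol boardd does. A quick sketch, with the tuple layout as in the panda library at the time of writing:

```python
# Read CAN packets straight off a panda over USB; each packet is
# (address, bus timestamp, data, bus).
from panda import Panda

p = Panda()  # finds a connected panda over USB
for addr, _, dat, bus in p.can_recv():
    print(hex(addr), bus, dat.hex())
```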

camerad

This is the camera stack. It’s, as far as I know, the only public custom Qualcomm camera implementation, and it speaks directly with the kernel. It captures both the road and driver cameras, and handles autofocus and autoexposure.

sensord

The rest of the sensors are handled here: gyro, accelerometer, magnetometer, and light sensor. The GPS and the Qualcomm raw GPS measurements are handled here as well.

NEOS kernel space

This is the Linux kernel and the big mess of Android. You’ll find our kernel here and our Android fork here. The kernel is unified to run on both the OnePlus 3 and the LeEco Le Pro 3.

The Data Processing (research team)

modeld

The main model, in models/driving_model.dlc, takes in a picture from the road camera and answers the question “Where should I drive the car?” It also takes in a desire input, which can command the model to take an action, such as turning or changing lanes. This is where a lot of the comma magic happens: it’s deeply temporal, and it’s trained in ways and with tricks that go beyond the deep learning state of the art.

modeld also runs the posenet, in models/posenet.dlc. It takes in two frames and outputs the 6-DoF transform between them. It is used for calibration and sanity checking, and is not trained in any particularly magical way.
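
To make that concrete, here’s a sketch of what consuming a posenet-style output could look like. The numbers and the 20fps framing assumption are illustrative, not the real model interface:

```python
# Hypothetical consumption of a posenet-style output: a 6-DoF
# transform (translation + rotation) between two consecutive frames.
import numpy as np

dt = 0.05                             # 20fps camera -> 50ms between frames
trans = np.array([1.0, 0.0, 0.01])    # meters traveled between frames
rot = np.array([0.0, 0.001, -0.002])  # roll, pitch, yaw deltas (radians)

speed = np.linalg.norm(trans) / dt    # cross-check against wheel speed
yaw_rate = rot[2] / dt                # cross-check against the gyro
```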

monitoringd (lives in modeld directory)

This is the driver monitoring model runner. It tracks your head pose, eye positions, and eye states using the model in models/monitoring_model.dlc. It runs on the DSP so as not to use CPU or GPU resources needed by the other daemons, giving it tons of room to grow.

locationd/ubloxd (TBD)

So there’s stuff in locationd right now, but it’s not yet the real localizer that’s the end goal. Right now, ubloxd parses the data stream from the u-blox, and locationd combines it with the posenet to get a stable estimate of the yaw.
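
The fusion idea is simple: the gyro yaw rate is fast but drifts, while the posenet yaw rate is unbiased but noisy. A toy version of the combination (not the real locationd code, and the gain is made up):

```python
# A toy yaw rate fuser: use the gyro directly, but slowly learn its
# bias from the posenet measurement.
class YawRateEstimator:
    def __init__(self, bias_lr=0.01):
        self.bias = 0.0
        self.bias_lr = bias_lr

    def update(self, gyro_yaw_rate, posenet_yaw_rate):
        # the long-term disagreement with posenet is the gyro bias
        self.bias += self.bias_lr * ((gyro_yaw_rate - self.bias) - posenet_yaw_rate)
        return gyro_yaw_rate - self.bias
```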

calibrationd

The model takes in calibrated frames, meaning the yaw and pitch are corrected for before the model even looks at the picture. This is important because users mount their EONs in all sorts of ways, and calibrationd outputs the transform to canonicalize the view.
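
In practice, canonicalizing a frame is a homography: build a rotation from the learned pitch and yaw, and conjugate it with the camera intrinsics. A sketch, with illustrative intrinsics values:

```python
# Build the warp that removes a learned pitch/yaw from the raw frame.
# Camera frame convention here: x right, y down, z forward.
import numpy as np

def rot_from_pitch_yaw(pitch, yaw):
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    R_pitch = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])  # about x
    R_yaw = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])    # about y
    return R_yaw @ R_pitch

K = np.array([[910.0,   0.0, 582.0],  # illustrative intrinsics
              [  0.0, 910.0, 437.0],
              [  0.0,   0.0,   1.0]])
H = K @ rot_from_pitch_yaw(0.01, -0.005) @ np.linalg.inv(K)
# H is what you'd feed to a perspective warp (e.g. cv2.warpPerspective)
```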

The Controls (openpilot team)

controlsd

This is the main 100Hz loop driving the car. It gets a plan from plannerd and constructs the CAN packets required to make that plan happen. It also publishes carState, which is our universal car abstraction.
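
Stripped of everything car-specific, the skeleton is a rate-limited loop. A sketch: Ratekeeper lives in common.realtime, but the socket names here are assumed and vary by version.

```python
# The bare shape of a 100Hz control loop, controlsd-style.
from common.realtime import Ratekeeper
import cereal.messaging as messaging

sm = messaging.SubMaster(['plan'])      # socket name assumed
pm = messaging.PubMaster(['carState'])

rk = Ratekeeper(100)  # 100Hz
while True:
    sm.update(0)
    # ... turn the latest plan into actuator commands and CAN packets ...
    rk.keep_time()  # sleep out the rest of the 10ms frame
```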

plannerd (lives in controls directory)

The model output isn’t quite good enough to drive the car. It outputs where the car needs to be, but it doesn’t know how to get the car there. In plannerd, we run 3 ACADO-based MPC control loops: 1 for lateral control and 2 for longitudinal control.
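
We won’t reproduce the ACADO solvers here, but the shape of the longitudinal problem is easy to show with a toy MPC: pick the acceleration sequence over a short horizon that tracks a desired following gap while staying smooth. All the constants below are made up:

```python
# A toy longitudinal MPC: not ACADO, just scipy over a 2 second horizon.
import numpy as np
from scipy.optimize import minimize

DT, N = 0.2, 10                     # 10 steps of 200ms
v0, d0, v_lead = 15.0, 25.0, 12.0   # ego speed, gap (m), lead speed

def cost(accels):
    v, d, c = v0, d0, 0.0
    for a in accels:
        v = max(v + a * DT, 0.0)
        d += (v_lead - v) * DT
        desired_gap = 4.0 + 1.8 * v             # ~1.8s time headway
        c += (d - desired_gap)**2 + 5.0 * a**2  # track the gap, stay smooth
    return c

res = minimize(cost, np.zeros(N), bounds=[(-3.5, 2.0)] * N)
print("next accel command:", res.x[0])
```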

radard (lives in controls directory)

This parses the radar into a RadarState packet. Every car has a different radar, and this canonicalizes them.
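
The canonical shape is a list of points. The field names below echo cereal’s RadarPoint; the values are illustrative:

```python
# One canonical radar point, whatever the underlying radar hardware.
from dataclasses import dataclass

@dataclass
class RadarPoint:
    dRel: float  # longitudinal distance to the object (m)
    yRel: float  # lateral offset (m)
    vRel: float  # relative speed (m/s)

lead = RadarPoint(dRel=32.5, yRel=-0.3, vRel=-1.2)
```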

paramsd (lives in locationd directory)

This is the learner for car-specific parameters, like tire stiffness, steering angle offset, and steer ratio.
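
The simplest of these is the steering angle offset: when the car is driving straight at speed, the average steering angle is the offset. A toy learner, with made-up gains and thresholds:

```python
# Online steering angle offset learner, toy version.
class AngleOffsetLearner:
    def __init__(self, lr=0.005):
        self.offset = 0.0
        self.lr = lr

    def update(self, steering_angle_deg, yaw_rate, speed):
        # only learn while going straight at speed, where angle ~= offset
        if speed > 10.0 and abs(yaw_rate) < 0.01:
            self.offset += self.lr * (steering_angle_deg - self.offset)
        return self.offset
```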

logging/app/UI (cloud team)

loggerd

This daemon subscribes to all the sockets and cameras, and writes them out to the logs.

uploader/deleter

After we’ve logged the data, we have to get it to the cloud. But not all data makes it to the cloud anymore; we delete old data to make sure there’s always free space, like a real dashcam.
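
The deleter’s logic fits in a few lines: when free space runs low, drop the oldest log segments first. The path and threshold here are assumptions:

```python
# Keep a minimum amount of free space by deleting the oldest segments.
import os, shutil

LOG_ROOT = "/data/media/0/realdata"  # typical EON log location

def free_up_space(min_free_bytes=5 * 2**30):
    segments = sorted((os.path.join(LOG_ROOT, d) for d in os.listdir(LOG_ROOT)),
                      key=os.path.getmtime)
    while segments and shutil.disk_usage(LOG_ROOT).free < min_free_bytes:
        shutil.rmtree(segments.pop(0))  # oldest first, like a dashcam
```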

ui

This is the main driving UI. It’s a 2300-line mess and needs a refactor, but it does work.

apk.frame

This is the outer border. Soon, this will be merged into the C++ UI. The source for this lives here.

apk.offroad

This is the settings menu, the onboarding tutorial, the miles display, and the ad for comma prime. It’s written in React Native and lives here.

athenad

This service allows real time communication with your parked car. Check out the API.

System Support (openpilot team)

manager/thermald

This starts and stops the constellation of processes that make openpilot work. thermald watches the device’s thermal state alongside it.
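
A toy version of the manager’s job: spawn each daemon and restart anything that dies. The daemon list here is illustrative:

```python
# The core of a process manager: start everything, restart crashes.
import subprocess, time

DAEMONS = {"boardd": ["./boardd"],
           "controlsd": ["python", "controlsd.py"]}
procs = {}

while True:
    for name, cmd in DAEMONS.items():
        p = procs.get(name)
        if p is None or p.poll() is not None:  # never started, or died
            procs[name] = subprocess.Popen(cmd)
    time.sleep(1)
```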

updated

This daemon manages openpilot updates.

logmessaged/tombstoned/logcatd/proclogd

These are helpers to log data in the event of a process or the system misbehaving.

NEOS user space

This is the termux-based userspace on the EON. It provides a Linux-like environment on Android.

Call to Action

If you are interested in working on this open source project, comma.ai is hiring an openpilot engineer. Apply today!

Also follow us on Twitter.
