I am a part of the engineering team of a startup company called Zeius.
We are currently developing a drone with gaming capabilities. As we’ve made progress on our internal developer tools, we realized the drone community could benefit from them as well. We are developing a completely open-source, super-compact single-board computer designed specifically for drones (think of a Raspberry Pi made specifically for drones).
In short, we’re looking to reach out to the drone community to see if the tools we are working with would interest anyone else.
The key functionalities that we are proud to support are:
Single circuit board for entire functionality
Completely open-source software based on Debian Linux
Fully compatible with ROS and other software platforms for custom flight controllers, computer vision, mapping, etc.
Modular design for adding extra peripherals via UART, I2C, and USB
Your project sounds very interesting! I would love to see more details on the hardware choices for your flight controller (sensors, I/O, connectors, etc.).
We found the interval timing (near-real-time capability) of Linux is often not great for real-time control applications. Airplanes and drones have a lot of inertia, so dropping out/pausing for 1/10 or 1/4 of a second from time to time is almost imperceptible until you look at the data log. So if you are doing data collection or engineering research and care about precise interval timing, it’s something to think about up front. It’s also something worth measuring on your own system rather than assuming it just does what you want. The Linux kernel is great at being fair, but it’s not real time (not even close). Compiling the RT kernel doesn’t buy you much unless you dive super deep to leverage the new capabilities (and even understand how to do that). We ended up putting all our sensor I/O on a simple Arduino-class board (a Teensy) and doing all the hard real-time tasks there using a simple ISR model. Then we put all the higher-level stuff on our Linux board.
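On the “measure it yourself” point, a quick sanity check is to time a naive sleep-based loop and compare the requested interval against what you actually get. A minimal sketch (the 100 Hz rate and loop count are arbitrary choices for illustration):

```python
import time

# Rough sketch: time a naive "100 Hz" sleep loop on a stock Linux
# kernel and look at average vs. worst-case interval.
PERIOD = 0.01   # 10 ms requested interval (100 Hz)
N = 200

timestamps = [time.monotonic()]
for _ in range(N):
    time.sleep(PERIOD)          # naive: ignores time spent doing work
    timestamps.append(time.monotonic())

intervals = [b - a for a, b in zip(timestamps, timestamps[1:])]
avg = sum(intervals) / len(intervals)
worst = max(intervals)
print(f"requested {PERIOD*1000:.1f} ms, "
      f"average {avg*1000:.3f} ms, worst {worst*1000:.3f} ms")
```

On a loaded system the worst-case number is the one that will surprise you, even when the average looks fine.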
Fun fact: when we measured the interval timing on the PX4 a couple of years ago, we observed dropouts as long as 0.4 seconds even though the average interval time was close to the requested value.
Fun fact: increasing the main loop frequency can make your worst-case timing misses smaller, but it doesn’t fundamentally fix the issue. Also, if the dropout is due to the kernel deciding it’s time to write data to the SD card, there may not be much you can do about it in code.
You should probably pay attention to things like: if your IMU is connected via I2C, how fast and reliably can you actually sample it from Linux? There are nice demos out there for doing these things, but they don’t consider sampling the sensor at a rock-solid 100 Hz (for example); they are more intended for querying a temperature sensor now and then.
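To make the fixed-rate point concrete, here’s a sketch of a sampling loop that schedules against absolute deadlines (so late wakeups don’t accumulate into drift) and counts the misses. `read_imu()` is a hypothetical placeholder, not any particular driver — in real life it would be an I2C transaction:

```python
import time

def read_imu():
    # Hypothetical placeholder for a real I2C read (e.g. via smbus2);
    # substitute your actual sensor driver here.
    return (0.0, 0.0, 9.81)

PERIOD = 0.01            # 100 Hz target sample rate
misses = 0
samples = []
next_deadline = time.monotonic()
for _ in range(200):
    next_deadline += PERIOD      # absolute deadline, not relative sleep
    samples.append(read_imu())
    slack = next_deadline - time.monotonic()
    if slack > 0:
        time.sleep(slack)
    else:
        misses += 1              # the kernel didn't get us back in time
print(f"{misses} missed deadlines out of {len(samples)} samples")
```

The miss counter is the interesting part: it tells you how often the kernel failed to hand control back before the next sample was due.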
You may be way ahead of me on all of these things and already have these considered and addressed, so no worries if that is the case. I’m just trying to think of things that might trip you up down the road if you don’t see them coming.
Please keep us updated here! It’s not often I run across people interested in talking about flight controllers and drones at this level!
Here’s one of my projects (which actually works); I’ve been wrenching on it since the mid-2000s. Currently I work for the U of MN UAV lab as a full-time research/support engineer.
Side note: a company (the good guys) accidentally took my project name as their own, so I’m slowly rebranding my project as I get around to it. I’m a little sad about it, but I’ll let them have the Aura name.
It’s great to hear from a fellow researcher! My background is in mapping and localization technology for mid-sized quadrotors where precision timing and accurate measurements are paramount. Reading about your implementation of a multi-state EKF on your GitHub brings back many memories!
For Zeius, we aren’t necessarily putting a lot of focus on research capabilities, unless that’s of interest to the community. Our biggest focus was being able to run image processing on the drone itself. Real-time operation of the entire image processing pipeline, from communication with the camera hardware to the computer vision algorithms, was the primary concern. We also wanted to be certain we weren’t relying on any unobtainable hardware to accomplish this, and to keep the price reasonable.
We are currently working with a quad-core Cortex-A7 ARM CPU running at about 1.2 GHz with 512 MB of RAM. For development we have a UART port, a CSI connector, 4 GPIO pins, a Wi-Fi antenna port, an onboard IMU via I2C, and an SD card slot. We could expand that to multiple UARTs, I2C buses, and/or USB ports. Including USB would be great considering all the peripherals it would allow, such as some sophisticated lidar sensors.
The flight controller is completely homebrew in Python, allowing for a lot of flexibility if future developers want to add advanced control techniques like LQR or backstepping, autonomous functionality, etc. Even adding hover stabilization could be done with ease, since we like to keep the flight control exposed so people can play with it without needing to deal with compilers or anything. Python is great, but it’s not the best solution when it comes to real time, so we keep our control loops relatively slow.
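For a sense of what an exposed, pure-Python control loop like that might look like, here’s a generic PID sketch — the class, gains, and rates are illustrative, not the actual Zeius controller:

```python
# Generic PID sketch of a hackable, pure-Python control loop.
# Gains and the 50 Hz rate below are made-up illustrative values.
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)

# e.g. a roll-rate loop running at a modest 50 Hz (dt = 0.02 s)
roll_pid = PID(kp=0.8, ki=0.1, kd=0.05, dt=0.02)
# measured 0.1 rad/s of roll with a 0 setpoint -> negative correction
cmd = roll_pid.update(setpoint=0.0, measurement=0.1)
```

The nice part of keeping this exposed is exactly what’s described above: swapping in an LQR or backstepping controller is just editing a Python class, no toolchain required.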
What we have now fits in roughly 57 mm × 34 mm, although we would have to expand the dimensions to accommodate the extra interfaces. We are hoping our board will be a perfect fit for drones under 250 g, or larger ones if desired.
Hopefully this sheds a little bit of light on what we’re doing.
Yeah, that sounds really cool and aligns a lot with my interests! I definitely feel like more Python is better (but you have to maintain performance standards, so there needs to be some balance). I came up with a way to mix a hybrid of C++ and Python code in the same app, where each side shares a common data structure tree (aura-props in my GitHub repository). My other current big project is mapping … right now I’m trying to extract a super-detailed map from a hand-flown survey captured on video at about 3-4 m above ground. Every new map or dataset seems to stretch my code in new ways, so it’s taking some time to pull this together, but I’m getting there … more than 3000 images, so it’s kind of a big dataset to wrestle with. https://github.com/UASLab/ImageAnalysis
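For anyone curious what a shared property tree buys you, here’s a toy Python sketch of the general idea — path-addressed nodes that both sides of a hybrid app can read and write. To be clear, this is my own illustration, not the actual aura-props API:

```python
# Toy sketch of a path-addressed property tree (NOT the aura-props API):
# any module, C++ or Python, can get/set values by hierarchical path.
class PropertyNode:
    def __init__(self):
        self.children = {}
        self.value = None

    def get_node(self, path, create=False):
        """Walk a /-separated path; optionally create missing nodes."""
        node = self
        for part in path.strip('/').split('/'):
            if part not in node.children:
                if not create:
                    return None
                node.children[part] = PropertyNode()
            node = node.children[part]
        return node

root = PropertyNode()
root.get_node('/sensors/imu/az', create=True).value = 9.81
print(root.get_node('/sensors/imu/az').value)
```

The win is decoupling: the sensor driver and the controller never import each other; they just agree on path names.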
Definitely looking forward to seeing what you guys come up with. My controller/code is more focused on fixed-wing flying, but it’s MIT licensed, so if you find anything there you want to borrow, the barrier should be pretty low.
Edit: I’m sure I’ve shared this video before, but here’s our system in action (landing) with some augmented reality added in post-processing (based on our EKF, so it’s sort of a visualization of what the autopilot was thinking, drawn on top of the actual camera video).
Edit #2: a detail I think is worth noticing: I use the EKF solution to estimate where the moon should be in the camera view and draw it there (where it would be if the EKF were perfect), and then you can see the actual moon pretty close to that most of the time. That, plus the position of the drawn horizon line versus the actual horizon, gives an indication of how well (or poorly) the EKF is tracking truth.
Well, to be honest with you, we want our design to be so simple and cheap that if you like crashing into stuff and fixing things, it should be quick and easy: just swap the SD card into another board, slap it on a frame, plug the motors in, and you’re good to go. The way we have it set up now, there would be no soldering or anything tedious required.
Actually, one of the most fun parts of building this system has been making custom 3D-printed frames. Experimenting with different designs and sizes whenever one breaks is pretty neat. You could certainly use a prebuilt frame, though.
Also, I should mention there’s no reason this board couldn’t control other vehicles too. As long as the motors or actuators take PWM, you could control a fixed-wing plane, an RC car, a watercraft, etc. There’s potential to make breakout boards for higher-power applications too, but that begins to reduce the simplicity.
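For reference, driving any of those vehicles comes down to the same mapping from a normalized command to a PWM pulse width. A small sketch — the 1000–2000 µs range and 1500 µs center are the common hobby-RC convention, not a Zeius spec:

```python
# Map a normalized actuator command in [-1, 1] to a hobby-RC PWM
# pulse width. 1000-2000 us with a 1500 us center is the usual
# convention for servos and ESCs (an assumption, not a Zeius spec).
def command_to_pulse_us(cmd, min_us=1000, max_us=2000):
    cmd = max(-1.0, min(1.0, cmd))       # clamp to the valid range
    center = (min_us + max_us) / 2
    half_span = (max_us - min_us) / 2
    return center + cmd * half_span

print(command_to_pulse_us(0.0))    # neutral stick
print(command_to_pulse_us(1.0))    # full deflection
```

Whether the output drives a quad ESC, a rudder servo, or an RC car’s steering, only this last mapping stage cares.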
If we get enough positive feedback, we will shoot for either a December 2020 or early 2021 launch for the development kit. After that, the product will be available on our website, www.zeius.io, (hopefully) Amazon, and other DIY drone distributors.
I will stay in touch via email in the coming months to make sure you’re aware of the launch of our first batch.