We had so much fun with our first family project that we decided to do one more before the end of the summer. The kids -- if I may use that term to describe two awesome young men in high school -- decided it would be fun to do a software project as a family; a game, to be exact.
It's been a long time since mom and dad have worked on a game, so it sounded like fun to us. We did add one stipulation, however: it would have to be done in two weeks and at the end of those two weeks we had to have a complete, finished game that could be accepted into the various app stores.
The kids' response? Challenge accepted.
Particle Strain title screen.
Two weeks to make a technology decision, design and program a game (along with all of the various art and sound assets) made for an interesting constraint. We were also constrained by the fact that we're programmers first and foremost, with not a lot of artistic or musical ability among us. This definitely shaped the type of game we could make.
Me not doing so well at the whole "make patterns" thing.
We eventually settled on an action/puzzle game in which the player flies through wormholes collecting particles to make patterns on the screen. Pretty much everything in the game, such as levels, textures, and effects, is procedurally generated. We ultimately decided on Unity3D as a technology framework, mainly for its built-in asset pipeline and its ability to target a variety of mobile platforms. This also gave us the chance to shake the rust off of our C# skills.
And don't forget about all those little things like app stores, code signing, and even a little website to go along with it. All in all we had a really great time doing this project. We even met our programming deadline with 20 minutes to spare, getting the last bug fix in at 11:40 PM on the last day.
As of yesterday, Apple finally approved Particle Strain for the iTunes store. It is now available as a free download in all of the usual places: iTunes, Google Play, and Amazon. There is a big difference between writing code and developing a complete, finished product. I'm glad we were able to help the kids experience it, even if in a small way.
If you end up trying out Particle Strain, we hope you enjoy our little game. I, for one, can't wait to figure out what our next project will be!
Note: This is a series of articles in which we document our attempt to build an autonomous drone for the Sparkfun AVC. The previous post is here.
This is a followup to my previous article describing how to compile a hardware accelerated version of OpenCV. You can find that article here, but to briefly recap: part of the Sparkfun AVC is finding and popping red balloons. We wanted to do this using on-board vision processing. For performance reasons we need to hardware accelerate the vision system. We are using the Jetson TK1 board from NVIDIA with the OpenCV library.
The Open Source Computer Vision library (OpenCV) is easy to use and lets you quickly program solutions to a wide variety of computer vision problems. Normally, a simple computer vision program using OpenCV is structured as follows (a minimal sketch of this loop appears after the list):
read in a frame from a video camera or a video file
use image transformations to maximize the visibility of the target
extract target geometry
filter and output information about the target.
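Here is roughly what that loop looks like in C++ with OpenCV 2.4. The grayscale conversion and fixed threshold are only placeholders for whatever transformations your particular target actually needs:

#include <iostream>
#include <vector>
#include <opencv2/opencv.hpp>

int main()
{
    cv::VideoCapture cap(0);                       // 1. read from the first video camera
    if (!cap.isOpened()) return 1;

    cv::Mat frame, gray, thresh;
    while (cap.read(frame))                        //    grab the next frame
    {
        // 2. transform the image to make the target stand out
        cv::cvtColor(frame, gray, CV_BGR2GRAY);
        cv::threshold(gray, thresh, 128, 255, cv::THRESH_BINARY);

        // 3. extract the target geometry
        std::vector<std::vector<cv::Point> > contours;
        cv::findContours(thresh, contours, CV_RETR_EXTERNAL, CV_CHAIN_APPROX_SIMPLE);

        // 4. filter and output information about the target
        std::cout << "found " << contours.size() << " candidate regions" << std::endl;
    }
    return 0;
}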
For hardware accelerated applications, OpenCV provides GPU replacements for most of its regular functions. These functions live in the gpu namespace, which is documented at http://docs.opencv.org/modules/gpu/doc/gpu.html.
The Algorithm
When trying to identify our balloon targets, the first thing I tried was converting the video stream to Grayscale because it is fast and cheap. However, this did not give sufficient distinction between the sky and balloons. I then tried converting the stream to HSV (Hue Saturation Value) because it is good for identifying objects of a particular color and relatively simple to do. The balloons are quite distinct in both the hue and saturation channels, but neither alone is sufficient to clearly distinguish the balloons against both the sky and the trees. To resolve this, I multiplied the two channels together, which yielded good contrast with the background.
Here is a sketch of the code for that section of the algorithm; the weighting divisors shown are illustrative rather than the exact values we tuned.
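// assumes: using namespace cv; and that hsv is a gpu::GpuMat produced by
// gpu::cvtColor(frame_gpu, hsv, CV_BGR2HSV)
std::vector<gpu::GpuMat> channels;
gpu::split(hsv, channels);                           // channels[0] = hue, channels[1] = saturation

gpu::GpuMat huered, scalehuered, scalesat, balloonyness, thresh;
gpu::absdiff(channels[0], Scalar(90), huered);       // 0..90: how close the hue is to red
gpu::divide(huered, Scalar(6), scalehuered);         // the divisors 6 and 16 are illustrative
gpu::divide(channels[1], Scalar(16), scalesat);      //   weightings, not the tuned values
gpu::multiply(scalehuered, scalesat, balloonyness);  // how balloon-colored each pixel is
gpu::threshold(balloonyness, thresh, 200, 255, THRESH_BINARY);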
Hue is normally defined with a range of 0..360 (corresponding to a color wheel), but to fit into eight bits it is rescaled to a range of 0..180. Red corresponds to both ends of the range, so taking the absolute value of the difference between the hue and 90 gives how close the hue is to red (huered), with a range of 0..90. The redness and saturation are each divided by constants chosen to give them appropriate weightings and to keep their product within the range of the destination. That result, which I call balloonyness (i.e., how much any given pixel looks like the color of the target balloon), is then run through a binary threshold such that any pixel value above 200 is mapped to 255 and anything else is mapped to zero, storing the resulting binary image in the thresh variable. The threshold function is documented at http://docs.opencv.org/modules/imgproc/doc/miscellaneous_transformations.html#threshold; the GPU version is equivalent.
Once the image has been thresholded, I extract external contours. The external contours correspond to the outlines of the balloons. The balloons are nearly circular, so I find the minimal enclosing circle around the contours. Then, to deal with noise (red things that aren't shaped like balloons), I compare the area of the circle with the area of the contour it encloses to see how circular the contour is. The image below illustrates this process.
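OpenCV 2.4 has no GPU version of the contour functions, so this part runs on the CPU after downloading the thresholded image. A rough sketch, with the circularity cutoff of 0.7 chosen only for illustration:

cv::Mat thresh_host;
thresh.download(thresh_host);                      // bring the binary image back to the CPU

std::vector<std::vector<cv::Point> > contours;
cv::findContours(thresh_host, contours, CV_RETR_EXTERNAL, CV_CHAIN_APPROX_SIMPLE);

for (size_t i = 0; i < contours.size(); ++i)
{
    cv::Point2f center;
    float radius = 0.0f;
    cv::minEnclosingCircle(contours[i], center, radius);

    double circle_area  = CV_PI * radius * radius;
    double contour_area = cv::contourArea(contours[i]);

    // keep only contours that mostly fill their enclosing circle
    if (circle_area > 0 && contour_area / circle_area > 0.7)
    {
        // (center, radius) is a balloon candidate
    }
}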
I could also have used an edge detector and then a Hough circle transform (http://en.wikipedia.org/wiki/Hough_transform) to detect the balloons, but I decided not to because that method would not be able to detect balloons reliably at long range.
The Joys of Hardware Acceleration
Before hardware acceleration, this algorithm ran at two to three frames per second on the Jetson board. With hardware acceleration it now runs at over ten frames per second and is limited by how quickly frames from the camera can be captured and decoded. That equates to roughly a five-times speedup overall, and even more if you only consider the image processing phase. Writing computer vision systems seems very intimidating at first, but the library support is good enough that many types of problems can be solved with only a little research and experimentation.
Note: This is a series of articles in which we document our attempt to build an autonomous drone for the Sparkfun AVC. The previous post is here.
Part of the Sparkfun AVC is finding and popping red balloons. We wanted to do this using on-board vision processing. For performance reasons we need to hardware accelerate the vision system. We are using the Jetson TK1 board from NVIDIA with the OpenCV library. Here are instructions for configuring OpenCV with CUDA hardware acceleration on the Jetson TK1.
Imaging the Jetson Board
The first step is to image the Jetson TK1 with the latest version of Linux 4 Tegra (L4T); at the time of writing this is Rel-19 which is available at https://developer.nvidia.com/linux-tegra-rel-19. There are good instructions for this available from NVIDIA in the quick start guide available on the L4T page.
I did "sudo ./flash.sh -S 8GiB jetson-tk1 mmcblk0p1"
Installing Cuda
You must be part of the CUDA/GPU Computing Registered Developer program to get CUDA. Signing up is free on the NVIDIA developer website; you will have to log in or create an account.
Download the CUDA Toolkit; at the time of writing this is the CUDA 6.0 Toolkit for L4T Rel-19.2, which is available from developer.nvidia.com/jetson-tk1-support. That page also contains a Getting Started with Linux guide with more detailed instructions; I only give the minimum required for this specific case, so refer to that guide for troubleshooting.
Install the CUDA Toolkit:
cd Downloads/
sudo dpkg -i cuda-repo-l4t-r19.2_6.0-42_armhf.deb
sudo apt-get update
sudo apt-get install cuda-toolkit-6-0
Create symbolic links to the CUDA libraries for compatibility. I put them in /usr/local/lib:
sudo ln -s /usr/local/cuda-6.0/lib/*.so /usr/local/lib
Ensure that your PATH contains the CUDA executables: export PATH=/usr/local/cuda-6.0/bin:$PATH
Ensure that your LD_LIBRARY_PATH includes the CUDA libraries and custom built libraries: export LD_LIBRARY_PATH=/usr/local/lib:/usr/local/cuda-6.0/lib:$LD_LIBRARY_PATH
Compiling OpenCV
Download and extract the OpenCV code. I am using version 2.4.9 because, at the time of writing, that is the latest version.
Create a build directory in the same directory as the source directory. I put the source code in my Downloads directory, which currently contains "cuda-repo-l4t-r19.2_6.0-42_armhf.deb", "opencv-2.4.9.zip", "opencv-2.4.9", "opencv-2.4.9-build".
The GUI functions in OpenCV depend on GTK, so if you plan to use them, install libgtk2.0-dev: sudo apt-get install libgtk2.0-dev
You need CMake to configure OpenCV so install that: sudo apt-get install cmake
Change into the build directory for OpenCV (in my case, opencv-2.4.9-build).
Configure OpenCV with the appropriate CUDA_ARCH_BIN for your GPU's Compute Capability, which can be determined with the deviceQuery CUDA sample. For the Jetson TK1, this is 32, so I ran: cmake -DCUDA_ARCH_BIN=32 ../opencv-2.4.9
Run the compile and install using make; the command I used was: sudo make -j4 install
The -j4 flag tells make to run four jobs simultaneously, which gives a considerable speed-up on the Jetson, which has four large ARM cores, but allows some of the output to appear out of order or interleaved.
Using GPU accelerated OpenCV
The GPU module for OpenCV is quite simple to use. The functions in the gpu namespace generally have identical semantics to the cpu variants, with the only difference being that they take cv::gpu::GpuMat arguments instead of cv::Mat arguments. Data must be uploaded and downloaded between the CPU and GPU in order to be used. GpuMat provides upload and download functions that take a single Mat as an argument to transfer the data between the CPU and GPU.
For example, the code below shows the difference between using the CPU and the GPU for the threshold function.
using namespace cv;

Mat src_host, dest_host;

// on the CPU
threshold(src_host, dest_host, THRESH_VAL, 255, THRESH_BINARY);
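And the equivalent using the gpu module, where the data has to be explicitly moved to and from the device (a minimal sketch):

gpu::GpuMat src, dest;
src.upload(src_host);

// on the GPU
gpu::threshold(src, dest, THRESH_VAL, 255, THRESH_BINARY);

dest.download(dest_host);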
GroovyFX makes writing JavaFX code fast and easy. The latest version is available from Maven Central using the coordinates
org.codehaus.groovyfx:groovyfx:0.4.0
This new version includes support for Groovy 2.3.x as well as Java 8 and JavaFX 8. Please try it out and let us know if you have any problems by sending email to the mailing lists.
If you are a current user of GroovyFX and have any thoughts or questions on future directions, please send those to the mailing lists as well!
Note: This is a series of articles in which we document our attempt to build an autonomous drone for the Sparkfun AVC. The previous post is here.
Day four was our final day of prototyping so we really wanted to get the ball drop working. We felt that being able to launch, fly a significant distance, drop the ball on a target, and return to a safe landing would give us a minimally viable entry for the Sparkfun AVC.
Having had no success with the passive drop on day three, today was all about programming the Pixhawk to activate a servo motor to initiate the drop. We constructed a simple prototype using some of the parts we had previously printed and then figured out how to wire it into the Pixhawk. It took some time to learn the idiosyncrasies of Pixhawk programming and get it to send the correct servo command at the right time. Perseverance was the word of the day, and it paid off in the end.
You can see the culmination of our four days of prototyping in the summary video, and then read on for all of the (hopefully) interesting details.
You Got Servo-ed
After constructing a quick and simple prototype of the drop mechanism, we attached it to the quad-copter using duct tape, that time-honored prototyping tool.
One of the major pain points of the day was finding good documentation for our specific situation. It's very possible that we just never happened to find the correct, up-to-date docs, but this was a major source of confusion.
The Pixhawk has 8 ports labeled MAIN OUT and 6 additional ports labeled AUX OUT. Based on the best information we could find, our initial guess was that the AUX ports should be used for controlling servos, and that we should pass a servo number of 1 to the DO_SET_SERVO command if we wanted to use AUX port 1. The command's position parameter was another uncertainty. We read that typical position parameter values ranged from 1000 to 2000, so we began with values in that range hoping to just see any movement that would confirm that we were on the right track. When we tried to run the program, the servo just sat there silently mocking us. At this point, the list of potential problems was long:
Was the servo connected properly?
Which port should we use?
Did we use the right servo number in the command?
Were we sending it valid position values?
Is our DO_SET_SERVO command even executing?
Did we need to initialize the port or set other initialization parameters?
What were we doing wrong? Well, as it turns out, all of the above. Let's take them one at a time and we'll share our discoveries and solutions along the way.
Was the servo connected properly?
It was time to go down to the hardware level and eliminate some of our unknowns. We started probing the Pixhawk's output ports with a volt meter and an oscilloscope.
We discovered some things that surprised us. The Pixhawk does not supply power to any of the power pins on its bank of PWM output ports. The only port that has a working 5V power pin is the RC IN port, which is used to talk to the receiver. We couldn't see an obvious place to get 5V power from the Pixhawk itself, so as a work-around we used one of the PWM outputs of the receiver module, which was powered by the Pixhawk's RC IN port. This is a bit of a hack, but it worked.
Which port should we use? Did we use the right servo number in the command?
We discovered that there is a servo test interface in the Mission Planner app. The docs say that it doesn't actually work, but some reports from the internet indicate that it does. When we initially used this interface to test our servo, it would never move, but we couldn't be sure if it was a working test.
After figuring out the servo connection as described above, we were able to confirm that the servo test interface in Mission Planner does indeed work! We were able to see output on MAIN port 8 when toggling the output of servo 8 on the UI. Some quick probing verified that servo numbers 5 - 8 correspond to ports 5 - 8 on the Pixhawk's MAIN OUT ports. The AUX OUT ports 1 - 3 map to servo numbers 9 - 11 on the test interface. Presumably, AUX OUT 4 - 6 map to servo numbers 12 - 14, but the UI didn't allow us to test that. We chose to use MAIN OUT port 8 for our servo, so as to avoid ports 1 through 6 since they will be used by the motors on our hex-copter.
Were we sending the servo valid position values?
Using the servo test UI, we could now determine that the proper position values for our mechanism were 100 and 1900 for the open and closed positions. This was an exciting moment - the servo was finally moving. It was a big step for us even though it could only be controlled through the test UI. We were finally confident that our servo should work if we could figure out how to execute the commands properly.
Is our DO_SET_SERVO command even executing?
There are some gotchas in getting the DO_SET_SERVO command to execute properly. First, you can't just run a mission that only moves a servo: the servo command needs to be sandwiched between two navigation commands. But not just any two navigation commands - they need to take long enough to execute that the servo has time to complete its movement. This means the waypoints must be far enough apart that the next one doesn't trigger before the servo has had a chance to open all the way. We ran into a few more gotchas before Matt figured out a workable solution: a nav command, followed by a single servo command, followed by another nav command (not too close to the first one), followed by another single servo command, and one final nav command. Trying a servo command followed by a condition delay didn't work.
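For reference, the mission shape that finally worked looked roughly like this (command names as they appear in Mission Planner; the waypoint locations and the open/close PWM values are whatever suits your mechanism):

1. NAV_WAYPOINT   - first waypoint
2. DO_SET_SERVO   - servo 8, PWM value for "open"
3. NAV_WAYPOINT   - far enough from the first to give the servo time to finish
4. DO_SET_SERVO   - servo 8, PWM value for "close"
5. NAV_WAYPOINT   - final waypoint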
We had hoped to test the servo with a tiny mission that we could execute indoors with the propellers removed from the quad-copter, but were unable to get that to work. We are still looking for a good way to debug mission programming, so let us know if you know how to get feedback from Pixhawk that would let us know when our commands are being executed.
The end result was a mission that could reliably and accurately drop a tennis ball on target, but it was a bit of a struggle to get there.
Other Progress and the Future
Aside from helping us debug servos, Alex was also able to make some good progress on the vision front. He installed a patch that allowed him to build a version of OpenCV that was able to properly enumerate menu items when using V4L2 (video for Linux 2). The bottom line is that we are finally able to capture frames from the USB camera that is connected to the Jetson board. That was a big step forward for our vision effort. Now that we have frames to process, we can start writing code to identify our balloon targets.
This brings us to the end of an extremely fun week. Matt and Alex are off to Las Vegas to compete with the Colorado ARML team (editor's note: Good luck, boys!) and so we are concluding this series of daily updates. This week has given us a great start at building an entry for this year's AVC, but we certainly have a lot more to do. We cannot yet find and pop balloons, but hopefully we'll be able to solve that problem in time for the competition on June 21st. We will post future updates and videos as we go, albeit less frequently. Thanks for reading!
Note: This is a series of articles in which we document our attempt to build an autonomous drone for the Sparkfun AVC. The previous post is here.
Any project in which a lot of learning takes place will inevitably experience setbacks. But those kinds of projects are where the real fun happens, so days like this are a great experience for the aspiring engineers on the team. If failures can teach you more than successes, then we surely learned a lot during day three. After all the highs of day two, day three was packed with... challenges.
We experienced all kinds of problems from the small to the large, from 3D printer filament tangles to a major crash that has put our main copter out of commission. Be sure to check out the video below, but I think what this means is that the easy part is over and now we're on to the real engineering portion of this project!
Ball Drop Brainstorming
We started the day by brainstorming ideas for dropping the tennis ball. The simplest thing we thought might work was a passive drop, where we put the ball in a container and attempt to tip or flip the copter to make it fall out. For testing this "passive" drop, we chose a tall cup to ensure the ball misses the propellers on the way out. Unfortunately, we couldn't tip the copter enough to make the ball fall out, and flipping the copter didn't work either, since the copter accelerates through the whole flip and the ball stays pressed to the bottom of the cup.
On the bright side, we did get some practice at acrobatic flying, so that was fun. We tried both manual and autonomous flips, which gave us the opportunity to learn how to trigger an autonomous flip command from our transmitter.
The other idea we pursued was an active drop using a servo motor to release the ball. We modeled the parts in Creo for printing on the 3D printer and we created a quick Arduino-controlled test circuit to test the mechanism. Thanks to filament tangles and other challenges, this part of the project got a little bogged down today. The final parts didn't finish printing until about 2 AM, so we'll have to test the active drop tomorrow.
A Few Wins and a Big Loss
In a day filled with nagging little problems, we did manage to get a few other things accomplished. Sondra and Matt did some more research on getting the autopilot to talk to the vision system. Alex successfully re-flashed the Jetson board with the latest version of the OS. The re-flashing process was giving us problems, so it was nice to finally get that to work.
The day's largest setback occurred during a nighttime test flight of yet another passive drop design. The copter was handling a little strangely but we decided to go forward with the test anyway. We lost control of the copter during a flip and it crashed from a height of about 30 feet straight into concrete. The DJI air frame is tough, but it couldn't withstand that impact and we broke one of the motor arms and both the upper and lower boards.
Fortunately, we had initially ordered an F450 quadcopter for this project. When we decided to try on-board vision processing with the Jetson board, the extra weight led to the switch to the larger F550 air frame. Alex and Matt were able to quickly transfer all the components to the smaller quadcopter, and we managed to get it flying before we called it a day.
I guess the good news is that we'll be able to use the quadcopter to continue prototyping our active ball drop mechanism while we wait for replacements for our damaged hexcopter parts.
Note: This is a series of articles in which we document our attempt to build an autonomous drone for the Sparkfun AVC. The previous post is here.
Day two was an insanely fun day. We all agree that this project is even more fun than we thought it would be. We logged a lot of time flying and crashing the hexcopter today. Side note: I am, hands down, the worst pilot in the family. You won't see me at the controls on competition day.
I will say that the DJI F550 is one tough little copter. We've abused it pretty thoroughly and the only damage it's taken is some scrapes and chips on the propellers. There are a couple of new propeller sets on order.
Spinning Out of Control
As we left off from the previous night, we had a copter that would just spin around every time we took off. Our first guess was that it was a transmitter calibration problem, but that checked out fine. Was there a trim setting in the autopilot that was off? Our research turned up nothing. Alex finally figured out that our motors were rotating backwards from what the autopilot expected. The diagram below shows the correct configuration; we had fixed the motor numbers but had not noticed that each pair was rotating exactly backwards from what the Pixhawk expected. Another silly mistake, but we're learning!
After rewiring our motors - yet again! - controlled flight was finally achieved. This was an exciting milestone and we all took turns flying our new creation.
After a few very small, very controlled flights around our living room, we took it out to the back yard so we could get some elevation. This allowed the younger, more dextrous members of the team to get some real flight time in. I guess we can no longer claim that all that video game playing is a complete waste of time. If you've never flown a quadcopter, I highly, highly recommend it. So much fun.
We then spent some time testing out the stabilize, loiter, and land modes that we bound to a switch on our transmitter. Stabilize mode attempts to keep the copter level but otherwise gives you complete control. Loiter mode attempts to maintain the copter's current altitude and position. It turns out that you can still control the copter and move it around in this mode, but it's quite a bit less sensitive to control input from the joysticks. This mode was the easiest for me to fly. Land mode, as you'd expect, just executes a soft landing and it does a remarkably good job. The altitude sensor seems very accurate, which will make maneuvering around the obstacles on the AVC course a bit easier.
On to Autonomous
The next step was to add our GPS unit to the copter and connect it to the autopilot. We really need to mount it on a platform above the electronics, but for this prototype we just zip-tied it to the quad base. This may have hampered our accuracy somewhat and probably led to a crash that you'll see in the video below.
We elected to recalibrate our compass since the GPS receiver has its own, although this may not have been necessary. Matt then initialized the flight planning software by setting the home location, adjusting the altitude down to a safe testing height (no higher than our backyard fence), and creating a flight plan. He also replaced loiter mode with autonomous mode on our transmitter switch so we could take the copter in and out of autonomous mode with the flip of a switch in case of emergency.
Matt's first flight plan was to take off manually, engage auto mode, fly to a waypoint and loiter, and then re-engage manual flight and land. As mentioned previously, it turned out that our GPS receiver really isn't accurate enough to reliably navigate a space as small as our back yard. It may be a problem with our compass, the magnetic declination settings, or maybe we're getting some interference with our GPS receiver. Tracking down this problem is on our task list for future investigation.
Eventually we worked around the accuracy issue and progressed to a flight plan with full autonomous take off, proceeding through two waypoints in our backyard, and a fully autonomous landing. Check out this video for the day's highlights!
Tomorrow's objective will be to fly to a waypoint and drop a tennis ball from the hexcopter and return to the start point for landing - our first real mission simulation for the AVC! Check back tomorrow to see how we do.
Note: This is a series of articles in which we document our attempt to build an autonomous drone for the Sparkfun AVC. The previous post is here.
We owe a proof of concept video to Sparkfun by the end of May. Since Alex and Matt are leaving town to compete in a regional mathematics competition on Friday morning, that gives us about four days to get a prototype put together and flying. It's an aggressive schedule, but we like a challenge!
We knew this first day was going to be a rough one. There's just so much we don't know and all we're starting with is a pile of parts and a lot of enthusiasm. By the end of the day, we're hoping to see a copter get off the ground under manual control. So today's agenda was:
Build a hex copter
Get the autopilot module installed and calibrated
Figure out how to get it talking to a transmitter
Learn to fly it
Oh So Many Pieces
My first order of business was to build the air frame. Step number one is always to inventory the parts and see what we're up against.
Power distribution board, arms, motors, ESCs (electronic speed control - see I'm learning already!), props, and miscellaneous screws and straps. Check! Here is a close up of the parts.
That is a lot of parts, but thanks to some great assembly video tutorials, putting it all together was pretty easy. The eagle-eyed reader might notice that those ESCs are marked as 30A. That's A as in amps, meaning that each one can provide a continuous current of 30 amps to its motor. Multiply that by 6 motor/ESC pairs and you have a system that can draw 180 amps continuous. If that sounds like a lot of current to you, you're right! This might be a good time to talk about batteries.
Where do we get a battery that's light enough and small enough for a hex copter but can supply a lot of current? My solution is a 5000 mAh, 3-cell 25C LiPo battery as shown below with its charger.
A 5000 mAh (milliamp-hour) battery can supply 5000 milliamps, or 5 amps, of current for one hour. Or it can supply 30 amps for 10 minutes. Since I expect each motor to draw about 5 amps continuous during flight, I would guess that we'll get about 10 minutes of flight time, which should be enough to accomplish the mission. If you're wondering, 25C is the battery's discharge rating, which means it can safely provide 25 * 5000 mA of current - or roughly 125 amps. Of course, it could only supply that much current for about 2.5 minutes before draining the battery!
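Put as back-of-the-envelope arithmetic (the 5 amps per motor is my estimate, as above):

6 motors x 5 A   = 30 A continuous draw
5 Ah / 30 A      = 1/6 hour, or about 10 minutes of flight
25C x 5 Ah       = 125 A maximum safe discharge
5 Ah / 125 A     = 0.04 hour, or about 2.4 minutes at full draw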
While I was busy assembling the F550 base, Sondra and Matt started trying to figure out how the Pixhawk, the GPS module, the mission planner software, the transmitter, and the receiver all work together. As with everything, there are a lot of great docs and video tutorials on getting our transmitter to talk to our autopilot. After trying both versions of the mission planning software, they decided to go with Mission Planner rather than APM planner, at least for now. Mission Planner is older, but it seems a little more feature complete.
It can take a long time to calibrate accelerometers, compass sensors, GPS receivers, and gyros, but it's not at all optional. Battling impatience was a theme today. Below is the assembled F550 with the Pixhawk autopilot temporarily mounted. It's a match made in heaven, we hope.
The next step was to calibrate the sensors on the Pixhawk and try to get it talking to the transmitter. In addition to helping with everything else that was going on, Alex managed to start some OpenCV development on the Jetson board to try to get it talking to the USB camera.
The Jetson TK1 development board is a really incredible piece of technology. If you haven't heard about it, it's a board that features Nvidia's latest TK1 SoC. This chip features a 4+1 ARM core configuration, similar in spirit to ARM's big.LITTLE arrangement: 4 big, powerful cores for heavy number crunching and 1 smaller, more power-efficient core that handles all of the less time-critical, more mundane work. In addition, this SoC has a 192-core Kepler GPU, so it packs a very serious graphics punch. We're hoping to get OpenCV running in a hardware accelerated mode on this baby in order to give us all the performance we need to find and identify our balloon victims.
In addition, the board runs a version of Ubuntu which means it's really easy to develop right on the board itself. This makes hardcore Linux development geeks like Alex very happy.
Oh So Many Mistakes
As always it's the things you don't know that you don't know that get you. Here is a fair sample of our lessons learned for today.
Did you know that if you don't bind the transmitter to the receiver, they won't talk to each other? (that sound you hear is the experienced R/C guys laughing at us... :-)
It pays to make sure that the axes on your transmitter match your autopilot's expectation. It's really tough to fly when an axis is reversed or what you think is the pitch control is actually yaw.
Your autopilot will probably want to be mounted on the center axis of your copter. There is a reason for this (think gyro sensor).
Those loud, annoying beeping sounds that never quit are your ESCs complaining because they have no signal input from your autopilot. The beeping will stop once you have things talking properly, I promise. In the meantime, covering your ears and humming loudly to yourself may help you, but it will also annoy your teammates/family members even more. Not recommended.
If you don't pay close attention and number your motors the way your autopilot expects, your copter will invariably flip over on its back as soon as you try to lift off. If you don't actually know how to fly a hex copter, you'll probably think "boy, I really suck at this." And you'll probably be right, but you should also check your motor numbering.
So did we manage to get a copter built and flying after all of that? Well, yes and no: we did manage to achieve level flight for a few seconds, but we definitely have a spinning problem. Even so, we had a great first day and made a bunch of progress! Below is a short video of some of our flight tests.
Hopefully tomorrow we'll achieve some real flight!
We have just completed another long but satisfying FIRST Robotics season. Since only my oldest son and I participate in FRC, as a family we tend to spend a lot of time apart during the season. Toward the end of the season, our family started looking for a fun summer project we could do together that would satisfy our inner geeks. The solution was obvious...
Build an Autonomous Weaponized Drone
And a big welcome to our government friends who have just joined us. Before I get myself into too much trouble, I'll point out that this drone's meager weaponry won't be dangerous to anyone who is not a large red balloon. You see, we have decided to enter Sparkfun's Autonomous Vehicle Competition. We'll be competing in the aerial vehicle class, and the tasks our autonomous flying machine will have to complete are: maneuver around obstacles, drop a tennis ball onto a target, and, if possible, find and pop the three large red balloons that will be randomly placed around the course.
This will not only give our family the chance to geek out together over the summer, but having never done any kind of R/C or flying robot project, we will be learning a bunch of new things. And that is the key, right? Never stop learning.
The Team and the Parts
The team name we picked is "Bitwise, Byte Foolish" and our drone will be officially named Nibble, although we've nicknamed it Splashy (the competition takes place over water!). We will all be doing a bit of everything, but there will be some areas of main responsibility among the team (er... family). I will be handling the hardware and electronics. My wife Sondra and our younger son Matt will be responsible for programming the autonomous navigation and path planning. Our older son Alex will be handling the computer vision programming that we plan to use to find and pop the balloons.
We have spent the last couple of weeks discussing and ordering most of the main components that we plan to use. Now that school is over, we plan to start the project in earnest this week. Sondra and I have taken the week off from work in order to focus on this project. I cannot wait!
Here is a preview of some of the parts we've chosen:
The flight base will be a DJI F550 hex copter, pictured above. This air frame is supposed to be sturdy and it fits our budget.
We will be using a Pixhawk autopilot (right) and GPS receiver from 3D Robotics.
We have selected a Taranis X9D transmitter and X8R receiver from FrSky, just in case we need to take manual control in an emergency.
And finally, the heavy lifting (from a computer vision standpoint) will be done by an Nvidia Jetson TK1 board, pictured below.
This is going to be fun. We will be keeping everyone updated on our quest to actually get this thing airborne this week, so check back here to see how we're doing on this adventure. I'm hoping that I can even talk Matt and Alex into writing an entry or two describing their parts in the project. Fingers crossed.
I hope we'll end up with a drone we can be proud of when the AVC competition rolls around on June 21st. I don't know how competitive we'll be, but if we can send a drone out to run the course and bring it back safely, I'll count that as a win! Just please don't let it crash into the lake. Did you hear that Splashy?