Parrot AR.Drone Documentation

This project follows the development of a natural user interface for controlling a Parrot AR.Drone 2.0 using body gestures with a Microsoft Kinect (v1). Throughout this document I will share the different techniques used in the gesture recognition task and its implementation. This project was made some time ago, but I finally took the time to publish it here. I started the project as a hobby; later, new team members joined, and I thank all of them for their help with tests and documentation.

Requirements:
*Kinect SDK (/en-us/download/details.aspx?id=40278)
*Toolkits (credits to their respective developers):

The main challenge found when trying to make a body-gesture interaction interface is selecting which movements or patterns you are going to use. First, they have to be easy and natural actions for the body; moreover, you need to be able to detect them across all kinds of body complexions. By this I refer to a problem commonly found when people smaller or taller than average (e.g. kids) tried to use the system: it was hard for them to execute good control.

The gestures selected for the control are the following:
*Takeoff: both hands in front of the chest, fully extended.
*Emergency stop: both hands at the head.
*Landing: take the drone to a low position and stay still.
*Flight control: made to resemble the control used in the Parrot app, with two joysticks, in which the right hand controls pitch and roll, and the left hand controls yaw and altitude.

Fig. 1. Working area for gesture detection. (Note the colors are inverted to match the left and right hand of the user with the left and right joystick in the app.)

Reference for roll, pitch and yaw movements.

In order to avoid problems when short or tall people used the control system, a coordinate normalization was proposed: the distance from the head to the hip of the user is used as a reference, and all further measurements are divided by this distance, so they become proportions of the person's height rather than absolute coordinates. To detect the desired gesture, a simple subtraction is made between the position of the user's hands and his/her shoulders.
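As a minimal sketch of this normalization and gesture check (illustrative JavaScript; the project itself was built in LabVIEW, and the joint names and threshold value here are placeholder assumptions, not the measured ones):

    // Kinect skeleton joints are assumed as {x, y, z} positions in meters.
    function normalizedHands(joints) {
      // Reference length: head-to-hip distance, so every measurement
      // becomes a proportion of the user's size, not an absolute value.
      const ref = Math.hypot(
        joints.head.x - joints.hip.x,
        joints.head.y - joints.hip.y,
        joints.head.z - joints.hip.z
      );
      // Hand position relative to the same-side shoulder, normalized.
      const rel = (hand, shoulder) => ({
        x: (hand.x - shoulder.x) / ref,
        y: (hand.y - shoulder.y) / ref,
        z: (hand.z - shoulder.z) / ref,
      });
      return {
        right: rel(joints.handRight, joints.shoulderRight),
        left: rel(joints.handLeft, joints.shoulderLeft),
      };
    }

    // Takeoff: both hands fully extended in front of the chest.
    // EXTENDED is a placeholder; the real threshold came from measurements.
    const EXTENDED = 0.8;
    function isTakeoff(n) {
      // Kinect z grows away from the sensor, so "in front of the chest"
      // means the hand is closer to the sensor than the shoulder.
      return n.right.z < -EXTENDED && n.left.z < -EXTENDED;
    }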

After taking some measurements of the minimum and maximum distances reached by a natural extension of the arms, their mean values were found. With these, an active region for gesture detection was defined, shown as the blue surface in the next figure. Be aware that outside of that region no signal is sent to the drone, for safety reasons. This also applies to the Z axis, so you need to have your hands extended to the front in order to take control.

The first approach to controlling the drone was to send velocities proportional to how far your hand was extended. This is still described in the attached documentation, but it was later found to be better to use a simple ON-OFF control with a limited maximum velocity. An explanation of how the program works can also be found there. (By the way, the document is still in Spanish, since I'm from México; I'll try to upload an English version later.)

As mentioned before, you'll need all the previous requirements. There's also a document attached, "Labview changes", which explains almost step by step how to install and run the program.
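A sketch of that ON-OFF mapping, reusing the normalized hand coordinates from the previous snippet (the dead-zone and speed values are placeholders, not the measured ones):

    // ON-OFF control: instead of velocities proportional to hand extension,
    // each axis is driven at a fixed, capped speed once the hand leaves a
    // dead zone inside the active region.
    const DEAD_ZONE = 0.15; // normalized units; no command inside this band
    const MAX_SPEED = 0.3;  // capped command sent to the drone (0..1)

    function onOff(offset) {
      if (offset > DEAD_ZONE) return MAX_SPEED;
      if (offset < -DEAD_ZONE) return -MAX_SPEED;
      return 0; // inside the dead zone: no signal, for safety
    }

    // Right hand -> pitch and roll, left hand -> yaw and altitude,
    // mirroring the two joysticks of the Parrot app.
    function gestureToCommand(n) {
      return {
        pitch: onOff(-n.right.z), // push forward to pitch forward
        roll: onOff(n.right.x),
        yaw: onOff(n.left.x),
        altitude: onOff(n.left.y),
      };
    }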

If you're new to this, I would recommend first running the examples of each toolkit separately to ensure they work correctly... and if you are not that new, well, it's also good to try them first. Actually, a good hint: when working with the drone, I found it easier to manually configure the network settings to establish the Wi-Fi connection.
TCP/IP settings for AR Drone connection.
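For reference, typical static values look like this (the drone's access point assigns itself 192.168.1.1; the exact values for your setup may differ):

    IP address:      192.168.1.2     (any free 192.168.1.x except .1)
    Subnet mask:     255.255.255.0
    Default gateway: 192.168.1.1     (the drone itself)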
Well, this is actually previous work: on the same drone, an EEG mind-control system was implemented using an Emotiv EPOC+ headset in LabVIEW.
Some usable results were found, controlling up to four movements of the drone, but there are still many things left to work on.

Any help you need or problems you find, don't hesitate to post here or mail me at a00811157@itesm.mx.

Autonomy

An autonomous flight library for the AR.Drone 2.0, built on top of node-ar-drone. Instead of directly controlling the drone speed, you can use Autonomy to plan and execute missions by describing the path, altitude and orientation the drone must follow. Autonomous means that this library will move your drone automatically to reach a given target. Experiment with this library in a closed/controlled environment before going into the wild!

Features:
*Extended Kalman Filter leveraging the onboard tag detection as the observation source. This provides a much more stable and usable state estimate.
*Camera projection and back-projection to estimate the position of an object detected by the camera. Currently used to estimate a tag position in the drone coordinate system based on its detection by the bottom camera (see the sketch below).
*PID Controller to autonomously control the drone position.
*Mission planner to prepare a flight/task plan and then execute it.

Coming next:
*VSLAM to improve the drone localization estimates.
*Object tracking to detect and track objects in the video stream.
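As a rough illustration of the back-projection idea, here is a minimal pinhole-model sketch in JavaScript; the intrinsics (fx, fy, cx, cy) and the function name are placeholders, not the library's actual calibration or API:

    // Minimal sketch: back-project a bottom-camera pixel detection into
    // drone coordinates with a pinhole model, scaled by the altitude.
    // fx, fy, cx, cy are placeholder intrinsics, not a real calibration.
    const fx = 570, fy = 570, cx = 320, cy = 180;

    function backProject(u, v, altitude) {
      // Ray through pixel (u, v), scaled by the distance to the ground.
      const x = ((u - cx) / fx) * altitude;
      const y = ((v - cy) / fy) * altitude;
      return { x: x, y: y, z: -altitude }; // tag offset in the drone frame
    }

    // Example: a tag seen at pixel (400, 120) while hovering at 1.5 m.
    console.log(backProject(400, 120, 1.5));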

This module exposes a high level API to plan and execute missions, by focusing on where the drone should go instead of its low-level movements. Here is a simple example, with the drone taking off, travelling along a 2 x 2 meter square and then landing:

    var autonomy = require('ardrone-autonomy');
    var mission  = autonomy.createMission();

    mission.takeoff()
           .zero()       // Set the current state as the reference
           .altitude(1)  // Climb to an altitude of 1 meter
           .forward(2)
           .right(2)
           .backward(2)
           .left(2)
           .hover(1000)  // Hover in place for 1 second
           .land();

    mission.run(function (err, result) {
        if (err) {
            console.trace("Oops, something bad happened: %s", err.message);
            mission.client().stop();
            mission.client().land();
        } else {
            console.log("We are done!");
            process.exit(0);
        }
    });

Here is a list of known apps built using Autonomy. Please let me know if you build something and I'll be happy to add you to the list.
*panorama: autonomously fly to a given altitude and take pictures to form a 360° photo panorama.

If you encounter issues, please add them to the issue tracker, or ping me on Twitter (@eschnou) or on the #nodecopter IRC channel.

This work is based on the Visual Navigation for Flying Robots course. My eternal gratitude to their team for posting the lectures and slides on the web; I learned a lot from them. Also a big thank you to @felixge, who came up with this crazy idea of flying a drone with Javascript and built the fantastic node-ar-drone library. If you like this project, please consider donating. The less time I need to work, the more I can spend on open source projects :-) Please Donate To Bitcoin Address: [[address]]

Copyright (c) 2013 by Laurent Eschenauer laurent@eschenauer.be

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: