Andresjhe-tfg

Final Experiments

The final experiments are about combining all the different areas I have researched and developed into a single exercise. The hardest part has been tuning the PID variables and handling the differences between the "ideal world" (the Gazebo simulator) and the "real world" (the ArDrone 2). I used my laptop and the ICS (Intel Compute Stick, described below) as computing units for image processing and control, sending the commands to the drone via Wi-Fi.

The final experiment consists of:

  1. Taking off in a controlled manner on top of a color beacon
  2. Navigating through a room using AprilTags as beacons
  3. Searching for the color beacon
  4. Landing on the color beacon

PC as computing unit

Using my laptop as the computing unit:

PC perspective

Exterior perspective

ICS as computing unit

Because the ArDrone 2 couldn't steer properly with the ICS onboard, I had no choice but to run these tests with the ICS off-board. This probably added some latency, but nothing that severely altered the results.

Here you can find a successful test including take-off, navigation and landing:

Searching and Landing

Exterior Perspective

PC perspective

Navigation using AprilTags

Exterior Perspective

PC perspective

Calibration Tool

I have designed a calibration tool, based on ColorTuner and Cameraview and built with OpenCV, to store and modify the color filter quickly and easily, which makes real-life testing much simpler. It is optimized for Precise_Landing: it supports not just two HSV-based color filters, but also extra controls for erosion and dilation, so you can check whether the noise is completely cancelled.
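
For illustration, here is a minimal sketch of the idea behind the tool, not the tool itself: OpenCV trackbars drive an HSV range plus erosion and dilation kernel sizes, applied live to a camera feed. The window name and camera index are placeholders.

# Minimal sketch of an HSV calibration tool with erosion/dilation controls.
# The camera index and window name are placeholders.
import cv2
import numpy as np

def nothing(_):
    pass

cv2.namedWindow("filter")
for name, maximum in [("H min", 179), ("S min", 255), ("V min", 255),
                      ("H max", 179), ("S max", 255), ("V max", 255),
                      ("Erode", 10), ("Dilate", 10)]:
    cv2.createTrackbar(name, "filter", 0, maximum, nothing)

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    lo = np.array([cv2.getTrackbarPos(n, "filter") for n in ("H min", "S min", "V min")])
    hi = np.array([cv2.getTrackbarPos(n, "filter") for n in ("H max", "S max", "V max")])
    mask = cv2.inRange(hsv, lo, hi)
    e = cv2.getTrackbarPos("Erode", "filter")   # erode to cancel speckle noise
    d = cv2.getTrackbarPos("Dilate", "filter")  # dilate to restore blob size
    if e:
        mask = cv2.erode(mask, np.ones((e, e), np.uint8))
    if d:
        mask = cv2.dilate(mask, np.ones((d, d), np.uint8))
    cv2.imshow("filter", mask)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
cap.release()
cv2.destroyAllWindows()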

AprilTag performance on ICS

Using the ICS, I have given AprilTags another chance and improved the average processing time from 800 ms (on the MK802IV) to 150 ms. Although this is not as fast as my current laptop, it may be good enough for the purposes of my final thesis.
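
For reference, this is roughly how such minimum/average/maximum timings can be collected; the detect() callable below is a stand-in for the actual AprilTags detection call, not its real API.

# Rough timing harness; detect() stands in for the AprilTags detection call.
import time

def benchmark(detect, frames):
    times_ms = []
    for frame in frames:
        t0 = time.time()
        detect(frame)
        times_ms.append((time.time() - t0) * 1000.0)
    return min(times_ms), sum(times_ms) / len(times_ms), max(times_ms)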

Here you can find the raw performance, showing the minimum, average and maximum values during the AprilTags processing (laptop left / ICS right):

VisualStates

I have successfully ported the code in charge of landing on color beacons to the JdeRobot tool VisualStates. I had to make some tweaks to fit its requirements.

You can watch a video here:

New Platform: Intel Compute Stick + Battery Bank + ArDrone 2

I have been testing the limits of the ArDrone 2, and it seems that 150 g is the maximum payload you can add before it becomes too unstable. I found a 2000 mAh battery bank (60 g) and mounted it onboard together with the ICS (60 g) to test whether the ArDrone 2 could still fly with both on board.

This is the result:

Components[edit]

Intel Compute Stick: x64 power in a tiny format

I have had some problems with the MK802IV's performance as a feasible "CPU" for the ArDrone 2.0. Although it is possible to install and execute C++ and OpenCV on it, it has been really difficult to optimize the code to run with low latency.

Since an upgrade was needed, I have been experimenting with the Intel Compute Stick (ICS). It is an HDMI stick powered by an Atom processor with an x64 architecture, capable of executing instructions much faster than, and with an energy cost similar to, the ARM chip inside the MK802IV. The caveat is that the ICS has only 1 GB of RAM, which could become an issue we will have to handle.

Installing Ubuntu

The ICS is a peculiar device when it comes to installing an OS. Despite having an x64 processor, the bootloader (in charge of launching the OS) has to be a 32-bit (x86) one.

Isorespin tool

I have to credit Linuxium again, who provides a tool to respin any Linux image with the necessary drivers, including the one for the integrated Wi-Fi card.

The tool is called isorespin and can be found here.

Installing JdeRobot

Installing JdeRobot has been much easier than on the MK802IV, since having the ICS is like having a regular computer. The process is straightforward and the same as installing it on your own machine. Just follow the instructions shown in [1].

Test performed

uav_viewer connected to the ArDrone via ardrone_server

Color Detection

After trying to optimize AprilTags on the MK802IV, I've realized it's not possible to get detection times under 100 ms, so for now the processing power rules it out. I'm going to replace AprilTags detection with color filter detection.

New Model

Since AprilTags uses printed tags as beacons, I'm going to create a color beacon as a substitute. I picked green and orange, since these two colors rarely appear together in the real world. I also had to change the virtual car model's color from green to blue, which makes it almost impossible to confuse the beacon with the car (previously green).

New beacon:

New Car with Color Beacon:

New Color Filter Detector

This time I'm using a color filter detector, which is much faster and more appropriate for the processing power of the MK802IV.
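
As a rough sketch of the approach (the HSV ranges below are illustrative guesses, not my calibrated values): each color is thresholded with inRange, and the beacon center is taken as the midpoint between the two colors' centers of mass.

# Sketch of a two-color beacon detector. HSV ranges are illustrative only.
import cv2
import numpy as np

GREEN_LO, GREEN_HI = np.array([40, 100, 100]), np.array([80, 255, 255])
ORANGE_LO, ORANGE_HI = np.array([10, 100, 100]), np.array([25, 255, 255])

def beacon_center(frame_bgr):
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    centers = []
    for lo, hi in ((GREEN_LO, GREEN_HI), (ORANGE_LO, ORANGE_HI)):
        mask = cv2.inRange(hsv, lo, hi)
        m = cv2.moments(mask)
        if m["m00"] == 0:          # this color is not visible
            return None
        centers.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    (x1, y1), (x2, y2) = centers   # midpoint of the two centers of mass
    return ((x1 + x2) / 2.0, (y1 + y2) / 2.0)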

Color filtering in action:

Eroding and Dilating

One problem I found with this beacon is that, to detect its center, I rely on the two colors' centers of mass. To compute them I'm applying a method called dilation, which expands the filtered regions and makes each color solid and distinguishable. Since I don't want to lose detail or add too much noise, I also apply the inverse method, erosion. This solves a problem I found with the drone's shadow once it gets too close to the beacon.
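
A minimal sketch of this step, assuming the mask comes from a color filter like the one above; the kernel sizes, and even the order of the two operations, are exactly the parameters to tune here.

# Morphology sketch: erode to cancel shadow/noise pixels, dilate to make
# each color blob solid again. Kernel sizes are illustrative guesses.
import cv2
import numpy as np

def clean_mask(mask, erode_size=3, dilate_size=5):
    mask = cv2.erode(mask, np.ones((erode_size, erode_size), np.uint8))
    mask = cv2.dilate(mask, np.ones((dilate_size, dilate_size), np.uint8))
    return mask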

First Example: Eroding One Color

This is my first try using eroding:

Two-Color Filter: Dilating and Eroding

Without dilating and eroding

The shadow makes it difficult to detect the beacon when we get really close.

After colors have been filtered

Not a significant improvement yet. The filtered image is sharper, but the shadows are still adding noise to it.

Before and after colors have been filtered

The improvement is really noticeable. Not only have the shadows almost disappeared, but the filtered images are even sharper than before, which will make recognizing the beacon easier.

Still Image:

Video:

Simulating Navigation

I'm applying the PID controller for navigation that I used with AprilTags to my new two-color beacon. For now it is good enough, although I will need to tweak the derivative and integral coefficients to make it more reliable.
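
For context, this is the general shape of such a controller; the gains below are placeholders rather than my tuned values. The error fed into it would be the pixel offset between the beacon center and the image center, with one controller per axis.

# Generic PID sketch; gains are placeholders, not the tuned values.
class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_err = None

    def update(self, err, dt):
        self.integral += err * dt
        deriv = 0.0 if self.prev_err is None else (err - self.prev_err) / dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# One controller per axis; the output feeds the drone's velocity commands.
pid_x = PID(kp=0.002, ki=0.0001, kd=0.0005)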

Getting "Real" [edit]

I've been trying to run my landing application with the real drone, combining everything I've developed so far: the landing application and the MK802IV for processing.

Real Drone

I taped the MK802IV to the back of the ArDrone, powering it from the drone's USB port. You can see it clearly in the following photos:

Latency problem

I realized the drone's behaviour hasn't been as expected. I've been trying to narrow down the possible causes, and found that the MK802IV isn't as powerful as expected. These are some of the problems I could be facing:

  • Wi-Fi latency: I pinged the ArDrone to check for any delay in the connection. With an average of 4.891 ms, latency is not the problem, so I discarded this possibility.


  • AprilTag performance - Part I: I'm dealing with a very high delay between processed images, around 1 image per second on average, which is really bad for our objective of following and landing using AprilTags.


Optimizations

One of the compiler options I tried is the -O3 optimization level, which spends more time compiling in exchange for better runtime performance. I've been getting better results, with sub-second averages, but still between 700 and 900 ms. That is still not acceptable for a real scenario.
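
For reference, the flag is passed on the compiler invocation (or through the build system's C++ flags); the file and binary names below are made up for illustration.

g++ -O3 -o apriltags_test main.cpp `pkg-config --cflags --libs opencv`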

Programming challenge

I helped set up a tournament, for which I made a program that draws an on-screen graph showing how close each contender's program stays to our simulated "mouse" in Gazebo, and for how long. Contenders need to follow it as closely as possible for the longest amount of time. It is a cat-and-mouse game in which the drone plays the cat.

I had to learn Python and matplotlib. For Python I used Lynda, which I totally recommend if you need to pick up a new programming language and already have experience in others. For matplotlib I recommend the official tutorials, which are really good, together with the examples, so you can start plotting right away.

I also helped test the tournament platform and improve JdeRobot so it would be ready to work with Gazebo 5.

It has been a very fun experience, and I recommend that anyone try to organize or participate in this kind of tournament.

Drone distance measurer

This program asks a referee, via an Ice interface, for the distance to the objective.

Requirements

You will need Python and the matplotlib package for Python installed on your system:

sudo apt-get install python python-matplotlib

The interface consists of:

  • Countdown timer
  • Score display
  • Graph

The countdown timer shows a 2-minute countdown. During this time, players have to stay below a threshold distance in order to score time inside that area.

The score display shows how many seconds the drone has stayed close enough to the mouse. The highest score wins.

The graph shows the threshold line separating the scoring area, painted in green, from the non-scoring area, painted in red.
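
To give an idea of the graph, here is a minimal sketch with fake data; the threshold value and the sampling rate are placeholders, not the real contest parameters.

# Sketch of the score graph: distance vs. time with the scoring threshold.
import matplotlib.pyplot as plt

THRESHOLD = 2.0                              # meters (placeholder)
times = list(range(120))                     # 2-minute session, 1 sample/s
distances = [3.0 - 0.02 * t for t in times]  # fake data for the demo

score = sum(1 for d in distances if d <= THRESHOLD)  # seconds below threshold

plt.plot(times, distances)
plt.axhline(THRESHOLD, color="black")
plt.fill_between(times, 0, THRESHOLD, color="green", alpha=0.2)  # scoring area
plt.fill_between(times, THRESHOLD, max(distances), color="red", alpha=0.2)
plt.xlabel("time (s)")
plt.ylabel("distance to mouse (m)")
plt.title("score: %d s" % score)
plt.show()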

Randomizer

For testing purposes I created a program that simulates the referee and returns random numbers whenever the drone distance measurer asks for one.
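
A minimal stand-in for the idea, with the Ice plumbing omitted; the class and method names are hypothetical, not the real interface.

# Hypothetical stand-in referee for testing: returns a random distance
# on each request (the real one answers over an Ice interface).
import random

class FakeReferee:
    def get_distance(self):
        return random.uniform(0.0, 10.0)  # meters, arbitrary range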

Gazebo 5

Installing Gazebo

Installing from source files

  • Dependencies needed:

Ubuntu 14.04

sudo apt-get install build-essential libtinyxml-dev libboost-all-dev cmake mercurial pkg-config libprotoc-dev libprotobuf-dev protobuf-compiler libqt4-dev libtar-dev libcurl4-openssl-dev libcegui-mk2-dev libopenal-dev libtbb-dev libswscale-dev libavformat-dev libavcodec-dev libogre-1.8-dev libgts-dev libltdl3-dev playerc++ libxml2-dev libfreeimage-dev freeglut3-dev

Ubuntu 12.04

sudo apt-get install build-essential libtinyxml-dev libboost-all-dev cmake mercurial pkg-config libprotoc-dev libprotobuf-dev protobuf-compiler libqt4-dev libtar-dev libcurl4-openssl-dev libcegui-mk2-dev libopenal-dev libtbb-dev libswscale-dev libavformat-dev libavcodec-dev libogre-dev libgts-dev libltdl3-dev playerc++ libxml2-dev libfreeimage-dev freeglut3-dev
  • Bullet

You'll need the OSRF repository to install some dependencies, such as the Bullet physics engine:

sudo sh -c 'echo "deb http://packages.osrfoundation.org/gazebo/ubuntu `lsb_release -cs` main" > /etc/apt/sources.list.d/gazebo-latest.list'
wget http://packages.osrfoundation.org/gazebo.key -O - | sudo apt-key add -
sudo apt-get update
sudo apt-get install libbullet2.82-dev
  • Gazebo library
sudo apt-get install libgazebo5-dev
  • SDFormat

Download

cd ~; hg clone https://bitbucket.org/osrf/sdformat
cd ~/sdformat
hg up sdf_2.0

Build and install

mkdir build
cd build
cmake ..
make -j8
sudo make install

In make -jX, X is the number of cores you want to use during the make process.

  • Gazebo

Download

cd ~; hg clone https://bitbucket.org/osrf/gazebo
cd ~/gazebo

Build and install

mkdir build
cd build
cmake ..
make -j8
sudo make install

In make -jX, X is the number of cores you want to use during the make process.

Open Gazebo

gazebo

Sometimes the installation will give you an error like: gazebo: error while loading shared libraries: libgazebo_common.so.1: cannot open shared object file: No such file or directory.

Complete the installation (in case of an error while opening Gazebo):

echo '/usr/local/lib' | sudo tee /etc/ld.so.conf.d/gazebo.conf 
sudo ldconfig

Once installed, Gazebo 4 looks like this:

I started playing with Gazebo 4 and found some interesting models, like a quadcopter and a ragdoll:

Plugins in Gazebo 5

One of Gazebo's features is the ability to create your own plugins. I'm going to create a plugin that uses Cameraview from JdeRobot, so Gazebo will send images of the simulated world.

Playing around with HelloWorld plugin

I'm going to create a world plugin that prints Hello World in the Gazebo server console. Here are the files I used.
  • The hello_world.cc file uses the core set of functions from gazebo.hh and prints Hello World!
The last line is very important: it registers this as a plugin in the Gazebo registry.
GZ_REGISTER_WORLD_PLUGIN(WorldPluginTutorial)
  • The hello.world file is an SDF file that attaches the hello_world plugin. Gazebo will open this file and execute hello_world.cc as a result.
Finally, I added the path where I built the HelloWorld plugin:
export GAZEBO_PLUGIN_PATH=${GAZEBO_PLUGIN_PATH}:~/tfg/trunk/gazebo_plugin_tutorial/build
To execute the plugin, run:
$ gzserver hello.world --verbose
The console prints:
Gazebo multi-robot simulator, version 4.1.0
Copyright (C) 2012-2014 Open Source Robotics Foundation.
Released under the Apache 2 License.
http://gazebosim.org

[Msg] Waiting for master.
[Msg] Connected to gazebo master @ http://127.0.0.1:11345
[Msg] Publicized address: 192.168.0.193
Hello World!

Playing around with actor.world

Actor.world is found in Gazebo 5's default world library. It consists of a person model that moves in several directions.

It has a very common structure, and I'm going to describe the most important fields of this Gazebo world.

<animation> defines the kind of movement the model's body parts are going to perform. With <interpolate_x> set to true, it creates a smooth animation while the model is moving:
<animation name="walking">
   <filename>walk.dae</filename>
   <scale>1.000000</scale>
   <interpolate_x>true</interpolate_x>
</animation>
In the <script> field we can insert all the waypoints we want the model to visit. There is also a <loop> field for repeating the whole <script>:
<script>
   <loop>true</loop>
   <delay_start>0.000000</delay_start>
   <auto_start>true</auto_start>
   <trajectory id="0" type="walking">
     <!-- Waypoints here -->
   </trajectory>
</script>
<waypoint> defines a single waypoint destination. The time to reach the <pose> destination is set in the <time> field:
<waypoint>
   <time>0.000000</time>
   <pose>0.000000 -5.000000 0.000000 0.000000 0.000000 4.712388</pose>
</waypoint>

This is a video of the default behaviour:

Modifying actor.world

By modifying the waypoints and adding more actor models, I created a new scenario in which three people walk in totally different directions. With this we'll be able to build scenarios with simulated people moving around a city or a building.

Here's a video example:

Actor animations

Inside the Gazebo 5 default installation folder there's a folder called models. It contains animations for the actor model in SDF format, whose specification can be found on the official website.

These are the animations you can set for the actor model:

  • Sitting down / standing up animations
  • Standing / sitting, both of which are static poses
  • Talking, which has two variations and simulates two models in a conversation
  • Running
  • Walking
  • Moon-walking (walking backwards)

Car Plugin

Creating a car model

I created a car model in Gazebo. For now it looks like this:

Car with AprilTag

I have been modifying the car for my purposes, which include a drone landing on top of it. I inserted an AprilTag on it so I could build a PID controller and use the tag as a visual localization beacon.

Car Teleoperator

I made a plugin for Gazebo so I could control the car, and a teleoperator so I could send orders to the car through the Gazebo plugin.

Here you can see it in action:

Visual Lander

I have created a program for the ArDrone using JdeRobot. It uses AprilTags to localize the car and then begins a tracking and landing attempt.


Following a moving object

Using Optic Flow

Researching Ethernet over USB

I've been looking up information about Ethernet over USB because I may want to connect an Android device to a Parrot ArDrone via USB. This way the drone's processor only has to collect sensor data, while the Android device handles the other tasks, using the Ethernet-over-USB interface to exchange data between the two devices.

My first thought was that the easiest approach would be to connect both USB ports with an A-to-A cable. However, this is not possible, since the USB architecture is designed around a master (host) and a slave. If both sides try to act as masters, communication is impossible.

Possible solutions

  1. Use a dual-role USB controller between both devices. The controller assigns the host and slave roles during the communication. This requires an adapter, and I would have to implement my own Ethernet-over-USB solution.
  2. Use a USB-to-Ethernet adapter on both sides. This method requires two adapters but simplifies the communication.
  3. Tethering mode. Both devices swap roles so that communication becomes possible, but due to the USB architecture this is not possible without an A to micro-B USB cable: the micro-B connector supports OTG and the extra signalling.
  4. Wireless connection between the drone and the Android device. This requires fewer adapters, but the Android device needs onboard Wi-Fi.

The solutions I'm going to use are a USB bridge cable (an A-to-A cable with a USB controller in the middle that acts as a middleman, allowing a connection between two USB host controllers) and a USB-to-Ethernet adapter.

PC to MK802IV connection

Booting into Ubuntu 12.04

The MK802IV will boot automatically from the SD card (which contains Ubuntu) after 10 seconds.

Ubuntu Login

User: Linuxium
Password: p

USB to USB

Edit the file /etc/network/interfaces on each endpoint. You can set any private IP address you want.

PC configuration:

auto usb0
iface usb0 inet static
address 190.0.0.1
netmask 255.255.255.0
gateway 190.0.0.1

ifconfig on PC

usb0      Link encap:Ethernet  HWaddr 06:61:d8:d9:a1:5d  
          inet6 addr: fe80::461:d8ff:fed9:a15d/64 Scope:Link
          UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
          RX packets:206 errors:0 dropped:0 overruns:0 frame:0
          TX packets:300 errors:0 dropped:0 overruns:0 carrier:0
          collisions:0 txqueuelen:1000 
          RX bytes:36880 (36.8 KB)  TX bytes:60359 (60.3 KB)

MK802IV configuration:

auto usb0
iface usb0 inet static
address 190.0.0.2
netmask 255.255.255.0
gateway 190.0.0.2

ifconfig on MK802IV

usb0      Link encap:Ethernet  HWaddr f2:be:56:e7:f6:ea  
          inet addr:190.0.0.2  Bcast:190.0.0.255  Mask:255.255.255.0
          inet6 addr: fe80::f0be:56ff:fee7:f6ea/64 Scope:Link
          UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
          RX packets:133 errors:0 dropped:0 overruns:0 frame:0
          TX packets:119 errors:0 dropped:0 overruns:0 carrier:0
          collisions:0 txqueuelen:1000 
          RX bytes:24582 (24.5 KB)  TX bytes:22110 (22.1 KB)

Testing the connection with a ping:

$ ping 190.0.0.2
PING 190.0.0.2 (190.0.0.2) 56(84) bytes of data.
64 bytes from 190.0.0.2: icmp_req=1 ttl=51 time=191 ms
64 bytes from 190.0.0.2: icmp_req=2 ttl=51 time=190 ms
64 bytes from 190.0.0.2: icmp_req=3 ttl=51 time=190 ms
64 bytes from 190.0.0.2: icmp_req=4 ttl=51 time=192 ms
64 bytes from 190.0.0.2: icmp_req=5 ttl=51 time=190 ms
64 bytes from 190.0.0.2: icmp_req=6 ttl=51 time=193 ms
64 bytes from 190.0.0.2: icmp_req=7 ttl=51 time=202 ms
64 bytes from 190.0.0.2: icmp_req=8 ttl=51 time=190 ms

Installing JdeRobot

First I looked into JdeRobot's installation instructions. If you just want to install all the JdeRobot components, follow the instructions and download the dependencies. The guide also gives some tips about what each dependency is and where to find information on how to use it. I had a lot of trouble installing some of them properly, so don't get anxious if you feel lost; just read carefully again and try to understand what is happening.

Once you've installed all of them, you just need to download the latest version, compile it, and install it. It's really important that all the dependencies are installed properly, because otherwise you probably won't be able to finish the build. I'm currently using JdeRobot 5.6.4.

My advice for newbies: use the mailing list if you need any help; people are really helpful and there are others like you trying to learn JdeRobot.

My final project

Hello, and welcome to my Final Project's diary. I'm going to post here the steps I take toward all the goals of my Final Project. I hope it helps you if you are looking for guidance, whether you are trying to do the same or just want to develop with JdeRobot.