Drivers

From jderobot

In this section we describe the main drivers distributed with JdeRobot that provide access to different sensors and actuators. They are also valuable examples of how to configure the available drivers, so we will describe how to use them and how they work.

Several Gazebo plugins act as drivers: they connect a robot in the Gazebo simulator with other software components in the JdeRobot platform. Through ICE interfaces it is possible to retrieve information from several sensor devices, such as laser, encoders, motors, camera or sonar, and to send commands to the robot's actuators (if it has any).

Drones

ArDrone from Parrot

'ardrone_server' is a JdeRobot component for controlling and accessing the sensors of the Parrot ArDrone 1 and 2. It is inspired by the ardrone_autonomy package from ROS (Fuerte).

It provides the following ICE Interfaces:

Configuration file

The driver offers six interfaces:

  1. cmd_vel, interface for velocity commands (forward, backward,...).
  2. navdata, interface for sensor data transmission (altitude, velocities,...).
  3. ardrone_extra, interface for extra ArDrone functions (record video to USB,...) and basic maneuvers (takeoff, land and reset).
  4. Camera, standard interface for image transmission in JdeRobot.
  5. remoteConfig, standard interface for XML file transmission in JdeRobot.
  6. Pose3D, standard interface for pose information (x, y, z, h and quaternion).

An example of the configuration you will most likely find by default is the following:

  • All interfaces: ardrone_interfaces.cfg
ArDrone.Camera.Endpoints=default -h 0.0.0.0 -p 9999
ArDrone.Camera.Name=ardrone_camera
ArDrone.Camera.FramerateN=15
ArDrone.Camera.FramerateD=1
ArDrone.Camera.Format=RGB8
ArDrone.Camera.ArDrone2.ImageWidth=640
ArDrone.Camera.ArDrone2.ImageHeight=360
ArDrone.Camera.ArDrone1.ImageWidth=320
ArDrone.Camera.ArDrone1.ImageHeight=240
# If you want a mirror image, set to 1
ArDrone.Camera.Mirror=0


ArDrone.Pose3D.Endpoints=default -h 0.0.0.0 -p 9998
ArDrone.Pose3D.Name=ardrone_pose3d

ArDrone.RemoteConfig.Endpoints=default -h 0.0.0.0 -p 9997
ArDrone.RemoteConfig.Name=ardrone_remoteConfig

ArDrone.Navdata.Endpoints=default -h 0.0.0.0 -p 9996
ArDrone.Navdata.Name=ardrone_navdata

ArDrone.CMDVel.Endpoints=default -h 0.0.0.0 -p 9995
ArDrone.CMDVel.Name=ardrone_cmdvel

ArDrone.Extra.Endpoints=default -h 0.0.0.0 -p 9994
ArDrone.Extra.Name=ardrone_extra

ArDrone.NavdataGPS.Endpoints=default -h 0.0.0.0 -p 9993
ArDrone.NavdataGPS.Name=ardrone_navdatagps

Where:

  • Endpoints indicates the address where our server will be listening for requests.
  • Name is the name of the ICE interface.
  • ImageWidth & ImageHeight are the resolution of the images taken by the drone.
  • FramerateN and FramerateD define the frame rate as a fraction: frames per second = FramerateN / FramerateD.
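These .cfg files are plain key=value property lists. As a rough illustration (this parser is ours, not the actual Ice property API), the effective camera frame rate can be computed from FramerateN and FramerateD like this:

```python
# Minimal parser for JdeRobot/Ice-style .cfg files:
# one "key=value" per line, '#' starts a comment.

def parse_cfg(text):
    """Parse 'key=value' lines into a dict, skipping comments and blanks."""
    props = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        props[key.strip()] = value.strip()
    return props

sample = """
# If you want a mirror image, set to 1
ArDrone.Camera.FramerateN=15
ArDrone.Camera.FramerateD=1
ArDrone.Camera.Mirror=0
"""

props = parse_cfg(sample)
# Effective frame rate is the fraction FramerateN / FramerateD.
fps = int(props["ArDrone.Camera.FramerateN"]) / int(props["ArDrone.Camera.FramerateD"])
print(fps)  # 15.0
```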

ArDrone2 in Gazebo

'Quadrotor2' is version 2.0 of the quadrotor plugin. You can see a complete description of features and changes at:

Multiple instances: quadrotor2 allows multiple models just by editing the world files. Each spawned quadrotor must be named with an extra suffix -p<port number>. This port overrides the one defined in the Ice config file, which even allows reusing the same config file.
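The -p<port number> naming convention can be sketched as follows (the function name and the default port are illustrative, not part of the plugin):

```python
import re

def port_from_model_name(name, default_port=9000):
    """Extract the trailing -p<port> suffix from a spawned quadrotor's
    model name; fall back to the configured default if no suffix is given."""
    m = re.search(r"-p(\d+)$", name)
    return int(m.group(1)) if m else default_port

print(port_from_model_name("quadrotor2-p9010"))  # 9010
print(port_from_model_name("quadrotor2"))        # 9000
```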

Configuration file

Quadrotor.Adapter.Endpoints=default -h localhost -p 9000
Quadrotor.CMDVel.Name=CMDVel
Quadrotor.Navdata.Name=Navdata
Quadrotor.Extra.Name=Extra
Quadrotor.Camera.Name=Camera
Quadrotor.Pose3D.Name=Pose3D

Cameras

RGB cameras and video files

Cameraserver

'Cameraserver' is a component to serve N cameras, either real or simulated from a video file. It uses GStreamer internally to handle and process the video sources.

It provides only one ICE interface.

Configuration file

To use cameraserver we just have to edit the component's configuration to set the video sources and the formats in which our cameras are served. We also have to set the network address where our component will be listening for new connections, or choose the locator service.

A configuration file example may be like this:

#network configuration
CameraSrv.Endpoints=default -h 127.0.0.1 -p 9999

#default service mode
CameraSrv.DefaultMode=1

#cameras configuration
CameraSrv.NCameras=1

#camera 0
CameraSrv.Camera.0.Name=cameraA
CameraSrv.Camera.0.ShortDescription=Camera plugged into /dev/video0
CameraSrv.Camera.0.Uri=0
CameraSrv.Camera.0.FramerateN=15
CameraSrv.Camera.0.FramerateD=1
CameraSrv.Camera.0.ImageWidth=320
CameraSrv.Camera.0.ImageHeight=240
CameraSrv.Camera.0.Format=RGB8
CameraSrv.Camera.0.Invert=False

The first block defines the network configuration, that is, the address where our server will be listening for requests. The next block defines the number of cameras our server will provide, followed by the configuration of each camera. Notice that each camera has its parameters under the prefix CameraSrv.Camera.X., with X in the interval [0..NCameras). In this example we have only one camera. A camera has several parameters:

  • Name: Name used to serve this camera. The interface for this camera will have this name.
  • ShortDescription: A short description that a client may use to retrieve more information about the camera than just its name.
  • Uri: String that defines the video source.
  • FramerateN: Frame rate numerator.
  • FramerateD: Frame rate denominator.
  • ImageWidth: Width of the served image.
  • ImageHeight: Height of the served image.
  • Format: A string defining the format of the served image. Cameraserver uses libcolorspacesmm to manage the image formats. Currently accepted formats are RGB888 for 24-bit RGB, and YUY2.
  • Invert: A boolean value that controls the image orientation. If you need to run the camera in inverted mode (maybe on a real robot), just set this parameter to True.

Cameraserver can serve several types of sources. Each source is named using the Uri parameter with a syntax like:

type-of-source://source-descriptor

where the type of source can be one of these:

  • RGB cameras: the source descriptor names the device, e.g. 0 -> /dev/video0
  • file: for video files. The source descriptor names the file, e.g. /home/user/file.avi
  • http or https: for files located on a web server. The source descriptor names the remote resource, e.g. http://webserver.com/file.avi
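A minimal sketch of how such Uri values could be classified (the function and the 'v4l' label are ours, not cameraserver's actual parser):

```python
def classify_uri(uri):
    """Return (source_type, descriptor) for a cameraserver-style Uri value.
    A bare device number selects a local camera (/dev/video<N>)."""
    scheme, sep, rest = uri.partition("://")
    if sep:  # explicit scheme: file://, http://, https://
        return scheme, rest
    return "v4l", "/dev/video" + uri  # plain number -> local device

print(classify_uri("0"))                               # ('v4l', '/dev/video0')
print(classify_uri("file:///home/user/file.avi"))      # ('file', '/home/user/file.avi')
print(classify_uri("http://webserver.com/file.avi"))   # ('http', 'webserver.com/file.avi')
```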


Cameraserver_py

'Cameraserver_py' is a component to serve a camera (it only serves RGB8 at the moment). It uses OpenCV internally to handle and process the video source.

It provides only one ICE interface.

Configuration file

To use cameraserver_py we just have to edit the component's configuration to set the video source and FPS of our camera. We also have to set the network address where our component will be listening.

A configuration file example may be like this:

cameraServer:
  Proxy: "default -h 0.0.0.0 -p 9999"
  Uri: 0 #0 corresponds to /dev/video0, 1 to /dev/video1, and so on...
  FrameRate: 12 #FPS read from the device
  Name: cameraA

  • Name: Name used to serve this camera. The interface for this camera will have this name.
  • Uri: String that defines the video source.
  • FrameRate: Frame rate in frames per second.
  • Proxy: Network configuration (the address where the server listens).
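Since the file above is a small, flat "key: value" document, it can be read even without a full YAML library. The parser below is an illustrative sketch, not the component's actual loader:

```python
def parse_flat_yaml(text):
    """Parse the tiny 'key: value' subset used by cameraserver_py's config
    (one section, flat keys, '#' inline comments). Not a full YAML parser."""
    conf = {}
    for line in text.splitlines():
        line = line.split("#", 1)[0].rstrip()  # drop inline comments
        if not line.strip() or ":" not in line:
            continue
        key, _, value = line.partition(":")
        conf[key.strip()] = value.strip().strip('"')
    return conf

sample = '''cameraServer:
  Proxy: "default -h 0.0.0.0 -p 9999"
  Uri: 0 #0 corresponds to /dev/video0
  FrameRate: 12
  Name: cameraA
'''

conf = parse_flat_yaml(sample)
print(conf["Proxy"])           # default -h 0.0.0.0 -p 9999
print(int(conf["FrameRate"]))  # 12
```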

RGBD cameras (Xtion, Kinect...)

The 'OpenniServer' driver offers an entry point to connect depth sensors to JdeRobot tools through ICE interfaces. Nowadays, it is compatible with the Microsoft Kinect and the ASUS Xtion (both RGBD sensors). It also offers the possibility of building a point cloud from the images taken by the sensor.

It provides the following ICE Interfaces:

Configuration file

This driver uses two different interfaces which have to be configured in order to connect the ICE adapters properly. To do so, we have several configuration files (*.cfg) that take care of this. An example of the configuration you will most likely find by default is the following:

openniServer.Endpoints=default -h 0.0.0.0 -p 9999
#with registry
#cameras configuration
openniServer.PlayerDetection=0
openniServer.Mode=0
openniServer.ImageRegistration=1
openniServer.Hz=20

NamingService.Enabled=0
NamingService.Proxy=NamingServiceJdeRobot:default -h 0.0.0.0 -p 10000

#mode=0 -> fps: 30, x: 320, y: 240
#mode=2 -> fps: 60, x: 320, y: 240
#mode=4 -> fps: 30, x: 640, y: 480
#mode=6 -> fps: 25, x: 320, y: 240
#mode=8 -> fps: 25, x: 640, y: 480

#camera 1
openniServer.deviceId=0
openniServer.CameraRGB.Name=cameraA
openniServer.CameraRGB.Format=RGB8
openniServer.CameraRGB.fps=25
openniServer.CameraRGB.PlayerDetection=0
openniServer.CameraRGB.Mirror=0

#openniServer.calibration=camera-0.cfg
openniServer.CameraDEPTH.Name=cameraB
openniServer.CameraDEPTH.Format=DEPTH8_16
openniServer.CameraDEPTH.fps=10
openniServer.CameraDEPTH.PlayerDetection=0
openniServer.CameraDEPTH.Mirror=0

openniServer.PointCloud.Name=pointcloud1
openniServer.pointCloud.Fps=15

#Activation flags
openniServer.CameraRGB=1
openniServer.CameraIR=1
openniServer.CameraDEPTH=1
openniServer.pointCloudActive=0
openniServer.Pose3DMotorsActive=0
openniServer.KinectLedsActive=0
openniServer.ExtraCalibration=0
openniServer.Debug=1
openniServer.Fps=20


# Levels: 0(DEBUG), 1(INFO), 2(WARNING), 3(ERROR)

openniServer.Log.File.Name=./log/openniServer.txt
openniServer.Log.File.Level=0
openniServer.Log.Screen.Level=0

Where:

  • Endpoints indicates the address where our server will be listening for requests
  • deviceId indicates which camera will be connected
  • Name is the name of the ICE interface
  • Format is the expected image format from the sensor
  • Fps are the frames per second offered by the sensor
  • The activation-flags block allows activating or deactivating the different sensor options, or even the sensor itself.

Note that there are two almost identical blocks with the tags openniServer.CameraRGB.XXX and openniServer.CameraDEPTH.XXX, which means that each sensor (RGB camera and DEPTH camera) has to be configured separately.
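The commented mode table in the config above maps openniServer.Mode to a capture preset. As a quick reference, it can be encoded as a lookup table (the dictionary and helper are illustrative, not part of the driver):

```python
# Mode presets transcribed from the comments in the openniServer config:
# mode -> (fps, image width, image height)
OPENNI_MODES = {
    0: (30, 320, 240),
    2: (60, 320, 240),
    4: (30, 640, 480),
    6: (25, 320, 240),
    8: (25, 640, 480),
}

def describe_mode(mode):
    """Human-readable summary of an openniServer.Mode value."""
    fps, width, height = OPENNI_MODES[mode]
    return "%dx%d @ %d fps" % (width, height, fps)

print(describe_mode(4))  # 640x480 @ 30 fps
```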


Simulated RGB cameras in Gazebo

Videos from YouTube

'YouTubeServer'


FlyingKinect2 in Gazebo

Version 2.0 of flyingKinect, written from scratch.
It takes all the design patterns and features of Quadrotor2, resulting in a new development scheme:

Configuration file

Ice.MessageSizeMax=2097152
Kinect.Endpoints=default -h localhost -p 9997

Mobile indoor robots

TurtleBot robot

'kobuki_driver' is a JdeRobot driver for controlling and accessing the sensors of the TurtleBot robot from Yujin Robot. It is inspired by the kobuki_driver from ROS (Groovy).

For now the TurtleBot robot supports two ICE interfaces:

But you can attach other devices to the robot, like cameras, lasers, mechanical arms, etc.

Configuration file

In order to use the kobuki_driver, you will need to configure the motors and encoders endpoints properly. You also have to connect the robot to your computer (laptop) via USB and launch the driver as explained here.

An example of configuration file is the following:

kobuki.Motors.Endpoints=default -h 0.0.0.0 -p 9999
kobuki.Pose3D.Endpoints=default -h 0.0.0.0 -p 9997

Where:

  • Endpoints indicates the address where our server will be listening for requests.


Pioneer robot in Gazebo

This driver allows Gazebo to load a simulated Pioneer robot in the Gazebo simulator, connecting all of its interfaces through ICE and waiting for a tool to bind to them.

It provides the following ICE Interfaces

Configuration file

This driver uses seven different interfaces which have to be configured in order to connect the ICE adapters properly. To do so, we have several configuration files (*.cfg) that take care of this. An example of the configuration you will most likely find by default is the following:

  • Left camera: cam_pioneer_left.cfg
CameraGazebo.Endpoints=default -h localhost -p 9995

#camera 1
CameraGazebo.Camera.0.Name=cameraA
CameraGazebo.Camera.0.ImageWidth=320
CameraGazebo.Camera.0.ImageHeight=240
CameraGazebo.Camera.0.Format=RGB8
  • Right camera:cam_pioneer_right.cfg
CameraGazebo.Endpoints=default -h localhost -p 9994

#camera 1
CameraGazebo.Camera.0.Name=cameraA
CameraGazebo.Camera.0.ImageWidth=320
CameraGazebo.Camera.0.ImageHeight=240
CameraGazebo.Camera.0.Format=RGB8
  • Motors: pioneer2dxMotors.cfg
Motors.Endpoints=default -h localhost -p 9999
  • Encoders: pioneer2dxEncoders.cfg
Encoders.Endpoints=default -h localhost -p 9997
  • Laser: pioneer2dx_laser.cfg
Laser.Endpoints=default -h localhost -p 9996
  • Pose3DEncoders & Pose3DMotors: pioneer2dx_pose3dencoders.cfg
Pose3DEncoders1.Endpoints=default -h localhost -p 9993
Pose3DEncoders2.Endpoints=default -h localhost -p 9992
Pose3DMotors1.Endpoints=default -h localhost -p 9991
Pose3DMotors2.Endpoints=default -h localhost -p 9990
  • Pose3D: pioneer2dxPose3d.cfg
Pose3D.Endpoints=default -h localhost -p 9989

Where:

  • Endpoints indicates the address where our server will be listening for requests
  • Name (camera) is the name of the camera interface

With this configuration, you only have to know the name and the port of the endpoint in order to bind it with a JdeRobot tool. See Running jderobot pioneer to see how to launch an example of this.
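As a sketch of that binding step: a client reaches each interface through an Ice stringified proxy of the form <name>:<endpoint>. The helper and the dictionary labels below are illustrative (note that both cameras are served under the name cameraA, just on different ports):

```python
# Ports transcribed from the Pioneer .cfg files above; the host is
# wherever Gazebo runs. Dictionary keys are our own labels.
PIONEER_PORTS = {
    "camera_left": 9995,   # served as interface name "cameraA"
    "camera_right": 9994,  # also served as "cameraA"
    "Motors": 9999,
    "Encoders": 9997,
    "Laser": 9996,
    "Pose3D": 9989,
}

def proxy_string(name, port, host="localhost"):
    """Build the stringified proxy a JdeRobot tool would pass to Ice."""
    return "%s:default -h %s -p %d" % (name, host, port)

print(proxy_string("Motors", PIONEER_PORTS["Motors"]))
# Motors:default -h localhost -p 9999
```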

Turtlebot robot in Gazebo

This driver allows Gazebo to load a simulated TurtleBot robot in the Gazebo simulator, connecting all of its interfaces through ICE and waiting for a tool to bind to them.

It provides the following ICE Interfaces:

Configuration file

This driver uses seven different interfaces which have to be configured in order to connect the ICE adapters properly. To do so, we have several configuration files (*.cfg) that take care of this. An example of the configuration you will most likely find by default is the following:

  • Left camera: cam_turtlebot_left.cfg
CameraGazebo.Endpoints=default -h localhost -p 8995

#camera 1
CameraGazebo.Camera.0.Name=cameraA
CameraGazebo.Camera.0.ImageWidth=320
CameraGazebo.Camera.0.ImageHeight=240
CameraGazebo.Camera.0.Format=RGB8
  • Right camera:cam_turtlebot_right.cfg
CameraGazebo.Endpoints=default -h localhost -p 8994

#camera 1
CameraGazebo.Camera.0.Name=cameraA
CameraGazebo.Camera.0.ImageWidth=320
CameraGazebo.Camera.0.ImageHeight=240
CameraGazebo.Camera.0.Format=RGB8
  • Motors: turtlebotMotors.cfg
Motors.Endpoints=default -h localhost -p 8999
  • Encoders: turtlebotEncoders.cfg
Encoders.Endpoints=default -h localhost -p 8997
  • Laser: turtlebot_laser.cfg
Laser.Endpoints=default -h localhost -p 8996
  • Pose3DEncoders & Pose3DMotors: turtlebot_pose3dencoders.cfg
Pose3DEncoders1.Endpoints=default -h localhost -p 8993
Pose3DEncoders2.Endpoints=default -h localhost -p 8992
Pose3DMotors1.Endpoints=default -h localhost -p 8991
Pose3DMotors2.Endpoints=default -h localhost -p 8990
  • Pose3D: turtlebotPose3d.cfg
Pose3D.Endpoints=default -h localhost -p 8998

Where:

  • Endpoints indicates the address where our server will be listening for requests
  • Name (camera) is the name of the camera interface

With this configuration, you only have to know the name and the port of the endpoint in order to bind it with a JdeRobot tool. See Running jderobot turtlebot to see how to launch an example of this.

Cars

Formula1 car in Gazebo

This driver allows Gazebo to load a simulated Formula 1 robot in the Gazebo simulator, connecting all of its interfaces through ICE and waiting for a tool to bind to them.

It provides the following ICE Interfaces

Configuration file

This driver uses four different interfaces which have to be configured in order to connect the ICE adapters properly. To do so, we have several configuration files (*.cfg) that take care of this. An example of the configuration you will most likely find by default is the following:

  • Left camera: cam_f1_left.cfg
CameraGazebo.Endpoints=default -h localhost -p 8995

#camera 1
CameraGazebo.Camera.0.Name=cameraA
CameraGazebo.Camera.0.ImageWidth=320
CameraGazebo.Camera.0.ImageHeight=240
CameraGazebo.Camera.0.Format=RGB8
  • Right camera:cam_f1_right.cfg
CameraGazebo.Endpoints=default -h localhost -p 8994

#camera 1
CameraGazebo.Camera.0.Name=cameraA
CameraGazebo.Camera.0.ImageWidth=320
CameraGazebo.Camera.0.ImageHeight=240
CameraGazebo.Camera.0.Format=RGB8
  • Motors: f1Motors.cfg
Motors.Endpoints=default -h localhost -p 8999
  • Laser: f1_laser.cfg
Laser.Endpoints=default -h localhost -p 8996
  • Pose3D: f1Pose3D.cfg
Pose3D.Endpoints=default -h localhost -p 8998

Where:

  • Endpoints indicates the address where our server will be listening for requests
  • Name (camera) is the name of the camera interface

With this configuration, you only have to know the name and the port of the endpoint in order to bind it with a JdeRobot tool.

Taxi in Gazebo

This driver allows Gazebo to load a simulated taxi in the Gazebo simulator, connecting all of its interfaces through ICE and waiting for a tool to bind to them.

It provides only one ICE interface:

Configuration file

This driver uses only one interface, which has to be configured in order to connect the ICE adapter properly. To do so, we have a configuration file (*.cfg) that takes care of this. An example of the configuration you will most likely find by default is the following:

  • Motors: carMotors.cfg
Motors.Endpoints=default -h localhost -p 8999

Where:

  • Endpoints indicates the address where our server will be listening for requests

With this configuration, you only have to know the name and the port of the endpoint in order to bind it with a JdeRobot tool.



Laser sensors

Hokuyo Laser

'Laser_server' is a component to serve distances measured with a Hokuyo laser (the maximum measurable distance is 5.6 meters).

It provides the following ICE Interface:

Configuration file

To use laser_server we just have to edit the component's configuration to set the network address where our component will be listening, the DeviceId, the minimum and maximum measurement angles (in degrees), and the clustering. A configuration file example may be like this:

Laser.Endpoints=default -h 0.0.0.0 -p 9998

#Specifies laser type, this currently only works with hokuyo
Laser.Model=hokuyo

#0 corresponds to /dev/ttyACM0, 1 to /dev/ttyACM1, and so on...
Laser.DeviceId=0

#Indicates the beginning and end of the capture of the laser in degrees (0 for the front)
Laser.MinAng=-90
Laser.MaxAng=90

Laser.FaceUp=1

#Number of adjacent ranges to be clustered into a single measurement.
#0 -> 513 ranges in measurement
#3 -> 171 ranges in measurement
#...
Laser.Clustering=0

  • Endpoints: network address where our component will be listening
  • Model: indicates the laser model, for future extensions. Do not change it.
  • DeviceId: indicates the id of the sensor (/dev/ttyACMx)
  • MinAng and MaxAng: indicate the beginning and end of the laser capture, in degrees (0 for the front)
  • Clustering: number of adjacent ranges to be clustered into a single measurement
  • FaceUp: indicates whether the laser is mounted face up or not
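Following the comments in the config above (0 -> 513 ranges, 3 -> 171 ranges), the number of range readings per measurement can be estimated like this; the exact rounding for clustering values that do not divide 513 evenly is our assumption:

```python
def clustered_ranges(total=513, clustering=0):
    """Number of range readings after grouping `clustering` adjacent
    ranges into one measurement; 0 (or 1) means no clustering, matching
    the commented examples in the laser_server config."""
    step = max(1, clustering)
    return total // step

print(clustered_ranges(clustering=0))  # 513
print(clustered_ranges(clustering=3))  # 171
```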

Execution

To run the driver, we must first set read and write permissions for all users on the device:

sudo chmod 666 /dev/ttyACM0

Once done, run the driver:

laser_server --Ice.Config=laser_server.cfg

RPLidar

Simulated laser in Gazebo