# Status reports

## Repository

Get the team repo like this:

```
git clone gitosis@10.0.0.217:vvv10repo.git
```

For this to work, give Alexis your public ssh key.

## Sunday 25.07.2010

Started closing the loop between marker positions and hand positions. In the first experiments we found some offsets that we hope to resolve on Monday. Our code for moving the iCub to watching positions, our reaching code, and the high-level state machine are all still working.

The ARToolKit program reliably detects the positions of our 4 markers at distances of up to 40-50cm from the eyes.

We need a bit of extra light (thanks to Paul for the desk lamps).

Since we have two markers on each lego piece, we have a program that receives their positions and reports the position of the lego piece even if only one marker is visible, and an average if both are. A speed/acceleration filter reduces false positives.
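A hypothetical sketch of that fusion logic (our actual module is not shown here; the function names and the 0.5 m/s speed threshold are illustrative assumptions):

```python
# Sketch of the marker-fusion idea: each lego piece carries two markers;
# report the average when both are visible, fall back to a single marker
# otherwise. A crude speed check rejects physically implausible jumps.

def fuse_markers(m1, m2):
    """Piece position from up to two marker estimates.

    m1, m2: (x, y, z) tuples in metres, or None if the marker was not seen.
    Returns None if neither marker was seen.
    """
    if m1 is not None and m2 is not None:
        return tuple((a + b) / 2.0 for a, b in zip(m1, m2))
    return m1 if m1 is not None else m2

def plausible(prev, new, dt, max_speed=0.5):
    """Reject estimates implying a speed above max_speed (m/s) --
    a simple false-positive filter like the one mentioned above."""
    if prev is None or new is None:
        return True
    dist = sum((a - b) ** 2 for a, b in zip(prev, new)) ** 0.5
    return dist / dt <= max_speed
```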

Looking at the lego pieces on the table:


We also checked the relative position between the forward kinematics of the arm and the estimated position of a marker. After moving the arm to many positions, we measured an error of ±1.5cm. Picture of the experiment:
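The check can be sketched like this (a hypothetical illustration, not our actual code; the sample coordinates are made up):

```python
# Compare the arm's forward-kinematics hand position against the marker
# position estimated from vision, over several arm poses, and look at
# the per-trial error.

def position_errors(fk_positions, marker_positions):
    """Per-trial Euclidean error (metres) between FK and vision estimates."""
    errors = []
    for fk, m in zip(fk_positions, marker_positions):
        errors.append(sum((a - b) ** 2 for a, b in zip(fk, m)) ** 0.5)
    return errors

fk = [(0.10, 0.20, 0.30), (0.15, 0.25, 0.35)]      # from forward kinematics
seen = [(0.11, 0.20, 0.30), (0.15, 0.24, 0.35)]    # from the marker tracker
print(position_errors(fk, seen))  # each ~0.01 m, i.e. ~1 cm
```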

## Saturday 24.07.2010

We have tried to extract the best quality images possible from the Dragonfly2 cameras on the iCub.

There are 3 limiting factors:

* Firewire bus bandwidth
* Ethernet bandwidth
* CPU on the PC104

1. Firewire bus bandwidth:

The PC104 has one 1394a (400Mbps) bus. This has to be divided between two cameras.

An example of a bandwidth calculation:

```
# Firewire bandwidth = xresolution * yresolution * bytes/pixel * frames/sec * 8 bits/byte / (1024 * 1024 bits/Mbit)
# Firewire bandwidth = 640 * 480 * 3 * 30 * 8 / 1024 / 1024 = 210.9375 Mbits/sec
```

So, if you are working at 640x480 RGB8 (3 bytes/pixel) @ 30fps, you need 210Mbps per camera. That would be 420Mbps for two cameras, which exceeds the maximum.
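All the bandwidth figures in this report follow the same formula, so a small helper (our own convenience function, not part of the iCub software) makes them easy to reproduce:

```python
def bandwidth_mbps(width, height, bytes_per_pixel, fps):
    """Video bandwidth in Mbit/s, where 1 Mbit = 1024 * 1024 bits."""
    bits_per_second = width * height * bytes_per_pixel * fps * 8
    return bits_per_second / 1024.0 / 1024.0

# 640x480 RGB8 (3 bytes/pixel) at 30 fps, as in the calculation above:
print(bandwidth_mbps(640, 480, 3, 30))  # 210.9375 Mbit/s per camera
```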

Another option is YUV422, which uses 1.5 bytes per pixel. We have added a video type to the icubmoddev executable to support it. You then need 105Mbps per camera.

Even better is to read RAW data from the sensor, and apply the debayer pattern on the PC104.

The Dragonfly2 cameras have a 12-bit ADC, so they support RAW8 and RAW16 modes: RAW8 sends 8 bits per pixel, RAW16 sends 16. We have added a RAW16 mode to icubmoddev, with debayering done on the PC104 computer.

The RAW16 mode should get the highest possible quality out of the cameras, but you need to adjust brightness, contrast, white balance, etc. in your own software. (The normal adjustments in the framegrabber program tell the camera how to apply the debayer pattern before transferring RGB or YUV data.)
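To illustrate what the debayer step does (this is not the PC104 code), here is a minimal half-resolution demosaic assuming an RGGB Bayer layout; the actual pattern and interpolation used on the Dragonfly2 may differ:

```python
import numpy as np

def demosaic_half(raw):
    """Half-resolution demosaic of an RGGB Bayer image.

    raw: 2D array holding the sensor's RGGB mosaic:
        R G R G ...
        G B G B ...
    Each 2x2 cell becomes one RGB pixel; the two greens are averaged.
    A real debayer interpolates to full resolution instead.
    """
    r = raw[0::2, 0::2].astype(np.float32)
    g = (raw[0::2, 1::2].astype(np.float32) + raw[1::2, 0::2]) / 2.0
    b = raw[1::2, 1::2].astype(np.float32)
    return np.stack([r, g, b], axis=-1)
```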

The bandwidth needed for these RAW modes is:

```
RAW8:  640 * 480 * 1 * 30 * 8 / 1024 / 1024 = 70.3 Mbits/sec per camera
RAW16: 640 * 480 * 2 * 30 * 8 / 1024 / 1024 = 140.6 Mbits/sec per camera
```

The Dragonfly2 camera supports up to 60fps. The bandwidth of the firewire bus is enough for one camera at 640x480 @ 60fps using RAW16, and for two cameras at 640x480 @ 60fps using RAW8.

2. Ethernet bandwidth

The iCub has one gigabit ethernet connection to the PC104 computer, so we have 1000Mbps available.

icubmoddev converts the images to RGB8 before sending them over the network using YARP, so you always need 3 bytes/pixel.

```
RGB8: 640 * 480 * 3 * 30 * 8 / 1024 / 1024 = 210.93 Mbits/sec per camera
```

For two cameras at 640x480 @ 30fps we are sending 421Mbits/sec. Serializing all this data uses a lot of CPU power!

3. CPU consumption

The most efficient camera mode from the CPU point of view is RGB8 since icubmoddev only does a memcpy from the camera buffer to the internal buffer.

The YUV mode does a mode conversion (YUV422->RGB8) before doing the memcpy.

The RAW8 and RAW16 modes apply a debayer pattern before doing the memcpy.

### The mode we leave the cameras in

We were looking for a nice balance between all of these limitations, so we set the cameras like this:

```
Resolution: 640x480   Format: Format7 0   Color coding: RGB8
```

We leave the framerate setting empty because, in Format7, the framerate is automatically selected to use the maximum bandwidth. The packet data size is set to 44%, so that, even allowing for a little overhead, the two cameras share almost all the bandwidth. The resulting framerate is 19fps.

```
Framerate = 19fps
```

The CPU usage for the icubmoddev process for each camera is ~ 30-40%.

The Network bandwidth is: 133.6Mbps per camera.

So we are using 267Mbps for both cameras, or 26.7% of the gigabit ethernet bandwidth.

### Camera calibration parameters

We also adjusted the camera calibration parameters in the XML file used by the 'camcalib' program. We recommend using the ports:

```
/icub/camcalib/left/out
/icub/camcalib/right/out
```

Since the camcalib software runs on another computer, this avoids having multiple connections pulling camera images off the robot.

### Wishlist

What we would like best is for the PC104 computer to receive RAW8 (or RAW16) data and send it as-is over the network, reducing both the network bandwidth and the CPU cost of serializing the images. The clients would then apply the debayer function themselves. (Or this could be done transparently in the FrameGrabber interface, as suggested by Lorenzo.)

## Thursday 22.07.2010

Finally managed to calibrate the cameras of the iCub at 640x480 and get precise values
from the ARToolKit program. More light is good! A flat CalTab is important! See Vvv10_camera_calibration.

Also got the kinematic tree of the iCub loaded, managed to detect the markers at a useful distance
and tested the marker to world coordinate transformations.

The iCub looking for lego pieces:

Here is what it looks like in rviz:

And an example rectified image from the camera. The marker in the hand was detected well.