Australian Centre for Field Robotics

Rose St. Building J04
The University of Sydney
NSW 2006 Australia











Sensor Data Integrity: Multi-Sensor Perception for Unmanned Ground Vehicles



Access to The Marulan Datasets

The link above gives access to large datasets gathered with a multi-sensor Unmanned Ground Vehicle (UGV), which are described in the journal paper and the technical report presented below.
NB: The directories described in the technical report for each source of information (i.e. each sensor) are provided here as compressed archives (in .zip or .tar.bz2 format). Simply uncompress the downloaded files to obtain the exact same organisation described in the report. For .tar.bz2 files, you may need to first uncompress the bzip2 file (e.g. using the bunzip2 command on Linux or Mac) and then extract the content of the tar archive (e.g. with 'tar xvf file.tar' on Linux or Mac), as shown in the example below.
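For example, the following commands sketch a typical extraction on Linux or Mac (the archive names below are placeholders; substitute the actual names of the files you downloaded):

    # .zip archives: extract with any standard zip tool
    unzip sensor_data.zip

    # .tar.bz2 archives: uncompress, then extract, as described above
    bunzip2 sensor_data.tar.bz2     # produces sensor_data.tar
    tar xvf sensor_data.tar         # extracts the directory structure

    # or, equivalently, in one step with a tar that supports bzip2
    tar xjvf sensor_data.tar.bz2

Either way, the resulting directories follow the organisation described in the technical report.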

What is this?

The Marulan datasets are large, accurately calibrated and time-synchronised, gathered in controlled environmental conditions using an unmanned ground vehicle equipped with a wide variety of sensors.
These sensors include: multiple laser scanners, a millimetre wave radar scanner, a colour camera and an infra-red camera, in addition to a centimetre-accuracy dGPS/INS system for localisation.
The environmental conditions (artificially created during the data gathering) include the presence of dust, smoke and rain. In addition, some datasets were collected at night.

For any questions related to these datasets, contact Thierry Peynot at the Australian Centre for Field Robotics (ACFR) by email: t.peynot AT acfr DOT usyd.edu.au

NB: Any use of these data should clearly cite the following IJRR journal paper (or any related publication).



The Marulan Data Sets: Multi-Sensor Perception in Natural Environment with Challenging Conditions

T. Peynot, S. Scheding and S. Terho
International Journal of Robotics Research (IJRR), November 2010, Vol. 29, No. 13, pp. 1602-1607

Click here for abstract and full paper.

bibtex:
@Article{Peynot-IJRR-2010,
  author  = {T. Peynot and S. Scheding and S. Terho},
  title   = {{The Marulan Data Sets: Multi-Sensor Perception in Natural Environment with Challenging Conditions}},
  journal = {International Journal of Robotics Research},
  year    = {2010},
  volume  = {29},
  number  = {13},
  pages   = {1602--1607},
  month   = {November},
  doi     = {10.1177/0278364910384638}
}


For a more extensive description of the datasets, including content and format of the files, refer to the technical report below.



Technical Report with Full Description of the Datasets:

Sensor Data Integrity: Multi-Sensor Perception for Unmanned Ground Vehicles
T. Peynot, S. Terho and S. Scheding
Australian Centre for Field Robotics (ACFR), The University of Sydney, 2009
Technical Report ACFR-TR-2009-002

Abstract
This document describes large, accurately calibrated and time-synchronised datasets, gathered in controlled environmental conditions, using an unmanned ground vehicle equipped with a wide variety of sensors. These sensors include: multiple laser scanners, a millimetre wave radar scanner, a colour camera and an infra-red camera. Full details of the sensors are given, as well as the calibration parameters needed to locate them with respect to each other and to the platform. This report also specifies the format and content of the data, and the conditions in which the data have been gathered. Data collection was performed with the vehicle in two different situations: static and dynamic. The static tests consisted of sensing a fixed 'reference' terrain, containing simple known objects, from a motionless vehicle. For the dynamic tests, data were acquired from a moving vehicle in various environments, mainly rural, including an open area, a semi-urban zone and a natural area with different types of vegetation. For both categories, data have been gathered in controlled environmental conditions, which included the presence of dust, smoke and rain. Most of the environments involved were static, except for a few specific datasets which involved the presence of a walking pedestrian. Finally, this document presents illustrations of the effects of adverse environmental conditions on sensor data, as a first step towards reliability and integrity in autonomous perceptual systems.

[PDF] (17MB)

Acknowledgment

This work was supported by the US Air Force Research Laboratory through the Asian Office of Aerospace Research and Development (grant AOARD-08-4059), and the ARC Centre of Excellence programme, funded by the Australian Research Council (ARC) and the New South Wales State Government.




Thierry Peynot
Last modified: Thu Sep 3 19:01:29 EST 2009