The TUM Gait from Audio, Image and Depth (GAID) database

Introduction

Recognizing people by the way they walk - also known as gait recognition - has been studied extensively in recent years. Most gait recognition methods focus solely on data extracted from an RGB video stream. With this work, we provide a means for multimodal gait recognition by introducing the freely available TUM Gait from Audio, Image and Depth (GAID) database.

The database contains simultaneously recorded RGB video, depth, and audio. With 305 people recorded in three variations, it is one of the largest gait databases to date. To further investigate the challenges of time variation, a subset of 32 people was recorded a second time.

Database Description

A detailed description of the database can be found in [1]. This reference also includes experiment definitions as well as baseline results.

To load a raw depth file from the database, you can use the following code in MATLAB:
>> % open the raw depth file for reading
>> fid = fopen('test.raw', 'r');
>> % read one 640x480 frame of 16-bit depth values and transpose it to image orientation (480x640)
>> depth_image = fread(fid, [640, 480], 'uint16')';
>> fclose(fid);
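
As a quick sanity check, the loaded frame can be visualized directly in MATLAB. This is only a convenience sketch and reuses the depth_image variable from the snippet above:
>> % display the depth frame with automatic intensity scaling
>> imagesc(depth_image);
>> axis image;
>> colorbar;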

Distribution

The database is made freely available for research purposes. To get an impression of the database, the data for the person with ID=1 can be downloaded without a password.

The complete set of data is distributed via a password-protected download. To obtain the password, the release agreement must be signed by a legal representative of your institute (e.g. the head of the institute). You can return the signed copy via mail, e-mail, or fax. The database is split into several files, each containing one feature set, or modality, for all 305 subjects. Currently, we provide:

  • RGB image sequences (jpg files, ~30 GB)
  • Depth image sequences (16-bit raw files, ~16 GB)
  • Four-channel audio (16 kHz wav files, ~1.6 GB)
  • Tracked RGB image sequences (jpg files, ~3.6 GB)
  • Tracked depth image sequences (16-bit raw files, ~0.3 GB)

We provide the tracked versions of the RGB and depth image sequences for convenience. These are obtained by tracking each person and cropping both the RGB and depth data around the tracked person.
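
The four-channel audio can be read with standard MATLAB functionality. The following is a minimal sketch; the file name is chosen only for illustration:
>> % read a four-channel recording; y has one column per channel,
>> % and fs should be 16000 according to the database description
>> [y, fs] = audioread('example.wav');
>> % play back the first channel
>> soundsc(y(:, 1), fs);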

Download

The TUM GAID database is no longer available. Please do not use the release agreement.


References

  • [1] Martin Hofmann, Jürgen Geiger, Sebastian Bachmann, Björn Schuller, Gerhard Rigoll: "The TUM Gait from Audio, Image and Depth (GAID) Database: Multimodal Recognition of Subjects and Traits," Journal of Visual Communication and Image Representation, Special Issue on Visual Understanding and Applications with RGB-D Cameras, vol. 25, no. 1, pp. 195-206, Elsevier, 2014.
  • [2] Martin Hofmann, Sebastian Bachmann, Gerhard Rigoll: "2.5D Gait Biometrics using the Depth Gradient Histogram Energy Image," IEEE Fifth International Conference on Biometrics: Theory, Applications and Systems (BTAS 2012), Washington, DC, USA, Sept. 23-26, 2012.