
Deep Detection of People and their Mobility Aids for a Hospital Robot

This website presents supplementary material to our paper submission (preprint) for the European Conference on Mobile Robots (ECMR), 2017.


Robots operating in populated environments encounter many different types of people, some of whom may require particularly cautious interaction because of physical impairments or advanced age. Robots therefore need to recognize such needs to provide appropriate assistance, guidance or other forms of support. In this paper, we propose a depth-based perception pipeline that estimates the position and velocity of people in the environment and categorizes them according to the mobility aids they use: pedestrian, person in a wheelchair, person in a wheelchair with a person pushing them, person with crutches, and person using a walker. We present a fast region proposal method that feeds a Region-based Convolutional Network (Fast R-CNN [1]). With this, we speed up the object detection process by a factor of seven compared to a dense sliding window approach. We furthermore propose a probabilistic position, velocity and class estimator to smooth the CNN's detections and account for occlusions and misclassifications. In addition, we introduce a new hospital dataset with over 17,000 annotated RGB-D images. Extensive experiments confirm that our pipeline successfully keeps track of people and their mobility aids, even in challenging situations with multiple people from different categories and frequent occlusions.
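The probabilistic position, velocity and class estimator mentioned above can be illustrated with two standard building blocks: a constant-velocity Kalman filter that smooths the detected positions and estimates velocities, and a discrete Bayes filter that fuses the per-frame CNN class scores over time. The sketch below is a minimal illustration under assumptions, not the paper's exact formulation; the time step, noise magnitudes and class-transition probability are all placeholder values.

```python
import numpy as np

def cv_kalman_step(x, P, z, dt=0.1, q=0.1, r=0.1):
    """One predict/update cycle of a constant-velocity Kalman filter.

    State x = [px, py, vx, vy]; z is a 2D position detection.
    dt, q (process noise) and r (measurement noise) are assumed values.
    """
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], dtype=float)
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], dtype=float)
    Q, R = q * np.eye(4), r * np.eye(2)
    # Predict with the constant-velocity motion model.
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with the detected position.
    y = z - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ y
    P = (np.eye(4) - K @ H) @ P
    return x, P

def update_class_belief(belief, likelihood, stay=0.95):
    """Fuse per-frame CNN class scores into a smoothed class belief.

    A person rarely changes mobility-aid category, so the transition
    model keeps most probability mass on the current class (the value
    of `stay` is an assumption).
    """
    n = len(belief)
    T = np.full((n, n), (1.0 - stay) / (n - 1))
    np.fill_diagonal(T, stay)
    belief = T @ belief            # transition step
    belief = belief * likelihood   # measurement step (CNN scores)
    return belief / belief.sum()
```

Running each tracked person's detections through these two filters yields smoothed trajectories and class estimates that can bridge short occlusions and occasional misclassifications, since a single wrong CNN output only slightly perturbs the accumulated belief.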


The following video shows the performance of our pipeline on our MobilityAids dataset and explains some of the underlying concepts.

Here we show a real-world experiment with our robot Canny, guiding visitors to the professor's office. If the robot perceives that the person is using a walking aid, it guides them to the elevator at a lower travel velocity. People without walking aids are guided to the closer staircase.

MobilityAids Dataset

We collected a hospital dataset with over 17,000 annotated RGB-D images, containing people categorized according to the mobility aids they use: pedestrians, people in wheelchairs, people in wheelchairs with people pushing them, people with crutches and people using walking frames. The images were collected in the facilities of the Faculty of Engineering of the University of Freiburg and in a hospital in Frankfurt.

The following image shows example frames of the dataset. On the right, we show successful classifications of our pipeline. On the left, you can see some failure cases with our approach.

Dataset Download

This dataset is provided for research purposes only. Any commercial use is prohibited. If you use the dataset, please cite our paper:

@inproceedings{vasquez2017ecmr,
  author = {Andres Vasquez and Marina Kollmitz and Andreas Eitel and Wolfram Burgard},
  title = {Deep Detection of People and their Mobility Aids for a Hospital Robot},
  booktitle = {Proceedings of the IEEE European Conference on Mobile Robots (ECMR)},
  year = {2017}
}
Download RGB images (960x540, 12.2 GB)
Download depth images (960x540, 3.8 GB)
Download depth-jet images (960x540, 1.4 GB)

Download annotations for RGB
Download annotations for RGB test set 2 (occlusions)
Download annotations for depth
Download annotations for depth test set 2 (occlusions)

Download image set textfiles
Download camera calibration
Download README file
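The depth-jet images offered above are depth maps rendered with a jet colormap, a common way to feed depth data into CNNs pretrained on RGB images. As a rough illustration of what such a conversion could look like, here is a sketch using a piecewise-linear jet approximation; the depth range, the colormap approximation and the invalid-pixel handling are assumptions, not the exact encoding used to produce the dataset files.

```python
import numpy as np

def depth_to_jet(depth_m, d_min=0.5, d_max=8.0):
    """Map a metric depth image (H x W) to a jet-colored uint8 RGB image.

    d_min/d_max define the assumed normalization range in meters.
    Invalid (zero) depth readings are rendered as black pixels.
    """
    d = np.clip((depth_m - d_min) / (d_max - d_min), 0.0, 1.0)
    # Piecewise-linear jet ramps: blue -> cyan -> yellow -> red.
    r = np.clip(1.5 - np.abs(4 * d - 3), 0.0, 1.0)
    g = np.clip(1.5 - np.abs(4 * d - 2), 0.0, 1.0)
    b = np.clip(1.5 - np.abs(4 * d - 1), 0.0, 1.0)
    rgb = np.stack([r, g, b], axis=-1)
    rgb[depth_m <= 0] = 0.0  # invalid depth -> black
    return (rgb * 255).astype(np.uint8)
```

With an encoding along these lines, near obstacles appear blue and distant structure red, so depth gradients become color gradients that RGB-trained convolutional filters can pick up.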