Resource-Efficient Methods for Feasibility Studies of Scenarios for Long-Term HRI Studies

Nate Derbinsky, Wan Ching Ho, Ismael Duque, Joe Saunders, Kerstin Dautenhahn, in ACHI 2013 pp. 95-100, February 24 - March 1, 2013.

What can a robot do for you? Evaluating the needs of the elderly in the UK

Lehmann, H., Syrdal, D., Dautenhahn, K., Gelderblom, G.J., Bedaf, S. M. A., Amirabdollahian, F. (2013), The Sixth International Conference on Advances in Computer-Human Interactions (ACHI 2013), February 24 – March 1, 2013 – Nice, France.

Deliverable 6.2 Identification and discussion of relevant ethical norms for the development and use of robots to support the elderly in their own homes.

As required by Task T6.4, this deliverable identifies and begins to explore ethical norms that may be used to guide ACCOMPANY development and evaluation. It asks how multi-functioning humanoid robots, such as the Care-O-bot used by ACCOMPANY, improve on single-function robots and on non-robotic telecare and telehealth technology, given: a) how expensive they are to develop and b) existing ACCOMPANY plans and data collection. The deliverable proposes that one positive advantage of a multi-functioning humanoid robot is that it provides ‘presence’.

Deliverable 3.2 Initial design and implementation of the memory visualisation and narrative generation

Deliverable 3.2 documents the first design and implementation of the visualisation of part of a robotic companion’s memory. This feature is a specific module within the long-term memory system under development in WP3. Memory visualisation for the robotic companion (Care-O-bot® 3) aims to support the user in remembering past events from the human-robot interaction history, including those that the user found ‘memorable’.

Deliverable 1.3 Phase one scenarios and report on system functionality

Phase one scenarios and report on system functionality: This deliverable reports on the first outcome of task T1.4 (Iterative detailing of scenario).

Deliverable 4.2 Data fusion for robust detection and identification of objects and users

In this deliverable report, we introduce the system that has been developed in the completion of T4.1: Data fusion for robust detection and identification of objects and users. For object recognition, we fuse data from different modalities to improve the quality of the data available for object modelling and detection. Concretely, the image data from a colour camera is combined with depth information from stereo vision, which is in turn refined with depth data from a time-of-flight sensor. The result is a dense, high-resolution coloured point cloud.

Adding Rotational Robustness to the Surface-Approximation Polynomials Descriptor

Authors: R. Bormann, J. Fischer, G. Arbeiter, and A. Verl. Humanoids 2012, November 2012.

ACCOMPANY hosted showcasing event in association with KT Equal

Representatives of 60 stakeholder groups attended the recent "Showcasing research to promote active ageing: from Rehabilitation robots to Assistive technologies and beyond" event, hosted by KT Equal and the ACCOMPANY and SCRIPT projects at the University of Hertfordshire.


Bayesian Fusion of Ceiling Mounted Camera and Laser Range Finder on a Mobile Robot for People Detection and Localization

Authors: Ninghang Hu, Gwenn Englebienne & Ben Kröse, IROS 2012, Oct 2012
DOI 10.1007/978-3-642-34014-7_4


Contextual Analysis of the needs of elderly for independent living: is there a role for robot physical therapy?

Authors: Gallego Pérez, J., Karreman, D. E., & Evers, V. IROS 2012, October 2012.
