
Academic consortium to lead research project exploring trust in autonomous systems

A distinguished academic consortium dedicated to researching trust in autonomous systems has been announced. Led by Heriot-Watt University, home to the National Robotarium, the £3 million project brings together expertise in robotics, cognitive science and psychology with colleagues from Imperial College London and the University of Manchester.

Autonomous systems that make decisions and perform tasks without human intervention are already deployed in industry. However, their use is largely limited to controlled settings, such as on automated production lines. The systems struggle when the task becomes more complex or the environment is uncontrolled, for example, when drones are used for offshore windfarm inspection.

Heriot-Watt University, home to the National Robotarium, will lead a £3m project to explore trust in autonomous systems / Picture: Heriot-Watt University


The project is part of the UKRI Trustworthy Autonomous Systems (TAS) programme, funded through the UKRI Strategic Priorities Fund and delivered by the Engineering and Physical Sciences Research Council (EPSRC). The TAS programme brings together the research communities and key stakeholders to drive forward cross-disciplinary fundamental research to ensure that autonomous systems are safe, reliable, resilient, ethical and trusted.

The project, led by Professor Helen Hastie from Heriot-Watt University and the Edinburgh Centre for Robotics, will explore solutions to manage trust in autonomous systems, covering scenarios that require interaction with humans. Examples include self-driving cars, autonomous wheelchairs and ‘cobots’ in the workplace. The group’s work will help design the autonomous systems of the future, ensuring they are widely used and accepted in a variety of industry-relevant applications.

Professor Hastie said: “The challenge of managing trust between the human and the system is particularly difficult because there can be a lack of mutual understanding of the task and the environment. The new consortium will perform foundational research on how humans, robots and autonomous systems can work together by building a shared reality through human-robot interaction.

“By adopting a multidisciplinary approach, grounded in psychology and cognitive science, systems will learn situations where trust is typically lost unnecessarily, adapting this prediction for specific people and contexts. We will explore how to best establish, maintain and repair trust by incorporating the subjective view of humans towards autonomous systems, with the goal being to increase adoption and maximise their positive societal and economic benefits.

“Trust will be managed through transparent interaction, increasing the confidence of those using autonomous systems, allowing them to be adopted in scenarios never before thought possible. This might include jobs that currently endanger humans, such as pandemic-related tasks or those in hazardous environments.”

The TAS programme is a collaborative UK-based platform comprising Research Nodes and a Hub, united by the purpose of developing world-leading best practice for the design, regulation and operation of autonomous systems. The central aim of the programme is to ensure that autonomous systems are ‘socially beneficial’, protect people’s personal freedoms and safeguard physical and mental wellbeing.

TAS comprises seven distinct research Nodes: trust, responsibility, resilience, security, functionality, verifiability, and governance and regulation. Each Node will receive just over £3 million in funding from UKRI to conduct its research.

The academic consortium includes Professor Yiannis Demiris of Imperial College London, Professor Angelo Cangelosi of the University of Manchester and Professor Thusha Rajendran from Heriot-Watt University.
