I joined the Edinburgh Centre for Robotics in August 2018, when I started as a PDRA in the School of Informatics at the University of Edinburgh. I was hired to work on legged locomotion for the ORCA Hub with Prof Michael Mistry and Prof Sethu Vijayakumar. I have also worked on planning, manipulation, and teleoperation, collaborated with other research groups, and contributed to research activities on CogIMon, THING, MEMMO, NCNR, and FAIR-SPACE.


National Centre for Nuclear Robotics – NCNR

Nuclear facilities require a wide variety of robotics capabilities, giving rise to a range of extreme robotics and AI (RAI) challenges. NCNR brings together a diverse consortium of experts in robotics, AI, sensors, radiation and resilient embedded systems to address these complex problems.

In high-gamma environments, human entry is not possible at all. In alpha-contaminated environments, human entry in air-fed suits is possible, but it generates significant secondary waste (contaminated suits) and reduces worker capability. We have a duty to eliminate the need for humans to enter such hazardous environments wherever technologically possible.

Hence, nuclear robots will typically be remote from human controllers, creating significant opportunities for advanced telepresence. However, limited bandwidth and situational awareness demand increased intelligence and autonomous control capabilities on the robot, especially for performing complex manipulation tasks. Shared control, where human and AI collaboratively control the robot, will be critical because i) safety-critical environments demand a human in the loop, yet ii) complex remote actions are too difficult for a human to perform reliably and efficiently.
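As a minimal illustration of the shared-control idea, the sketch below blends an operator's command with an autonomous controller's command through a single arbitration weight. This is a toy linear-blending scheme under my own assumptions, not the NCNR controller; the names (`blend_commands`, `confidence`) are hypothetical.

```python
import numpy as np

def blend_commands(human_cmd: np.ndarray,
                   auto_cmd: np.ndarray,
                   confidence: float) -> np.ndarray:
    """Linear arbitration between human and autonomous velocity commands.

    `confidence` in [0, 1] is the autonomy's self-assessed confidence;
    higher values shift authority toward the autonomous controller while
    the human always retains some influence.
    """
    alpha = np.clip(confidence, 0.0, 0.9)  # cap so the human is never fully overridden
    return (1.0 - alpha) * human_cmd + alpha * auto_cmd

# Example: teleoperated end-effector velocity (m/s) nudged toward a planned grasp.
human = np.array([0.10, 0.00, -0.05])
auto = np.array([0.08, 0.02, -0.06])
print(blend_commands(human, auto, confidence=0.7))
```

In practice the arbitration weight would come from task context, operator-intent estimation, or safety monitors rather than a fixed constant.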

Before decommissioning can begin, and while it is progressing, characterization is needed. This can include 3D modelling of scenes, detection and recognition of objects and materials, as well as detection of contaminants, measurement of types and levels of radiation, and other sensing modalities such as thermal imaging. This will necessitate novel sensor design, advanced algorithms for robotic perception, and new kinds of robots to deploy sensors into hard-to-reach locations.

To carry out remote interventions, both situational awareness for the remote human operator and guidance of autonomous/semi-autonomous robotic actions will need to be informed by real-time multi-modal vision and sensing, including real-time 3D modelling and semantic understanding of objects and scenes, active vision in dynamic scenes, and vision-guided navigation and manipulation.

The nuclear industry is high-consequence, safety-critical and conservative. It is therefore critically important to rigorously evaluate how well human operators can control remote technology to safely and efficiently perform the tasks that industry requires.

All NCNR research will be driven by a set of industry-defined use cases (WP1). Each use case is linked to industry-defined testing environments and acceptance criteria for performance evaluation in WP11. WPs 2–9 deliver a variety of fundamental RAI research, including radiation-resilient hardware, novel design of both robots and radiation sensors, advanced vision and perception algorithms, mobility and navigation, grasping and manipulation, and multi-modal telepresence and shared control.

The project is based on modular design principles. WP10 develops standards for modularisation and module interfaces, which will be met by a diverse range of robotics, sensing and AI modules delivered by WPs 2–9. WP10 will then integrate multiple modules onto a set of pre-commercial robot platforms, which will be evaluated against end-user acceptance criteria in WP11.

WP12 is devoted to technology transfer, in collaboration with numerous industry partners and the Shield Investment Fund, which specialises in venture capital investment in RAI technologies, taking novel ideas through to fully fledged commercial deployments. Shield has ring-fenced £10 million of capital to run alongside all NCNR Hub research, to fund spin-out companies and the industrialisation of Hub IP.

We have rich international involvement, including the NASA Jet Propulsion Laboratory and the Carnegie Mellon National Robotics Engineering Center as collaborators in the USA, and collaboration with the Japan Atomic Energy Agency to help us carry out test deployments of NCNR robots in the unique Fukushima mock-up testing facilities at the Naraha Remote Technology Development Center.

subTerranean Haptic INvestiGator – THING

THING will advance the perceptual capabilities of highly mobile legged platforms through haptic perception and active exploration. In this light, THING will deliver:

  1. Novel foot designs for enhanced tactile perception and locomotion,
  2. Improved perceptual capability, enriching existing modalities (lidar, vision) with haptic information,
  3. Heightened physical sense of the environment, including friction and ground stability (difficult to estimate through vision alone; see the sketch after this list), and
  4. Enhanced mobility through improved perception, prediction, and control.
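
As an illustration of the kind of physical quantity haptic sensing can recover where vision cannot, the sketch below computes a lower bound on the ground friction coefficient from a single measured foot-contact force, using the standard friction-cone condition |f_t| ≤ μ·f_n. This is a textbook calculation under simple rigid-contact assumptions, not THING's estimator; the function name is hypothetical.

```python
import numpy as np

def friction_lower_bound(foot_force: np.ndarray, normal: np.ndarray) -> float:
    """Lower bound on the ground friction coefficient from one stance-phase
    force sample: a non-slipping contact must satisfy |f_t| <= mu * f_n,
    hence mu >= |f_t| / f_n at every measured instant.
    """
    normal = normal / np.linalg.norm(normal)
    f_n = float(foot_force @ normal)      # normal force component
    f_t = foot_force - f_n * normal       # tangential force component
    if f_n <= 1e-6:
        raise ValueError("foot not in firm contact")
    return float(np.linalg.norm(f_t) / f_n)

# Example: measured foot force (N) on flat ground (surface normal = +z).
print(friction_lower_bound(np.array([12.0, 3.0, 80.0]),
                           np.array([0.0, 0.0, 1.0])))  # ~0.15
```

Aggregating such bounds over many contacts during locomotion would tighten the estimate and flag slippery terrain before a slip occurs.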

Offshore Energy Asset Integrity Management – ORCA

The international offshore energy industry currently faces the triple challenge of an oil price expected to remain below $50 a barrel, significant and expensive decommissioning commitments on old infrastructure (especially in the North Sea), and small margins on the traded commodity price per kWh of offshore renewable energy. Further, the offshore workforce is ageing, as new generations of suitable graduates prefer not to work in hazardous offshore locations. Operators therefore seek more cost-effective, safe methods and business models for the inspection, repair and maintenance of their topside and marine offshore infrastructure. Robotics and artificial intelligence are seen as key enablers in this regard, as having fewer staff offshore reduces cost and increases both safety and workplace appeal.

The long-term industry vision is thus a completely autonomous offshore energy field, operated, inspected and maintained from the shore. The time is now right to further develop, integrate and de-risk these technologies into certifiable evaluation prototypes, because there is a pressing need to keep UK offshore oil and renewable energy fields economic, and to develop more productive and agile products and services that UK startups, SMEs and the supply chain can export internationally. This will maintain a key economic sector currently worth £40 billion and 440,000 jobs to the UK economy, with a supply chain adding a further £6 billion in exports of goods and services.

The ORCA Hub is an ambitious initiative that brings together internationally leading experts from 5 UK universities with over 30 industry partners (>£17.5M investment). Led by the Edinburgh Centre for Robotics (HWU/UoE), in collaboration with Imperial College London and the Universities of Oxford and Liverpool, this multi-disciplinary consortium brings its unique expertise in subsea (HWU), ground (UoE, Oxf) and aerial robotics (ICL), as well as human-machine interaction (HWU, UoE), innovative sensors for Non-Destructive Evaluation and low-cost sensor networks (ICL, UoE), and asset management and certification (HWU, UoE, LIV).

The Hub will provide game-changing remote solutions using robotics and AI that integrate readily with existing and future assets and sensors, and that can operate and interact safely in autonomous or semi-autonomous modes in complex and cluttered environments. We will develop robotic solutions for accurate mapping of, navigation around, and interaction with offshore assets, supporting the deployment of sensor networks for asset monitoring. Human-machine systems will co-operate with remotely located human operators through an intelligent interface that manages the cognitive load of users in these complex, high-risk situations. Robots and sensors will be integrated into a broad asset integrity information and planning platform that supports self-certification of the assets and robots.

Future AI and Robotics for Space – FAIR-SPACE

Advances in robotics and autonomous systems are changing the way space is explored in ever more fundamental ways. Both human and scientific exploration missions are impacted by these developments. Where human exploration is concerned, robots act as proxy explorers: deploying infrastructure for human arrival, assisting human crews during in-space operations, and managing assets left behind. As humans extend their reach into space, they will increasingly rely on robots enabled by artificial intelligence to handle many support functions and repetitive tasks, allowing crews to apply themselves to problems that call for human cognition and judgment. Where scientific exploration is concerned, robotic spacecraft will continue to go out into Earth orbit and the far reaches of deep space, venturing to remote and hostile worlds, and returning valuable samples and data for scientific analysis.

The aim of FAIR-SPACE is to go beyond the state of the art in robotic sensing and perception, mobility and manipulation, on-board and on-ground autonomous capabilities, and human-robot interaction, enabling space robots to perform more complex tasks on long-duration missions with minimal dependence on the ground crew. More intelligent and dexterous robots will be more self-sufficient, able to detect and respond to on-board anomalies autonomously while requiring far less teleoperation.

The research will see novel technologies developed for robotic platforms used in orbit or on planetary surfaces: future on-orbit robots tasked with repairing satellites, assembling large space telescopes, manufacturing in space, and removing space junk; and future surface robots, also known as planetary rovers, for surveying, observation, resource extraction, and deploying infrastructure for human arrival and habitation. A further case study will target human-robot interoperability aboard the International Space Station.

The research will merge the best available off-the-shelf hardware and software solutions with trail-blazing innovations and new standards and frameworks, aiming to develop a constellation of space robotics prototypes and tools. The goal is to accelerate the prototyping of autonomous systems in a scalable way, so that the innovations and methodologies developed can be rapidly spun out for wide adoption across the space sector worldwide.

FAIR-SPACE directly addresses two of the priorities in the Industrial Strategy Green Paper: robotics & artificial intelligence, and satellite & space technologies. The clear commitment offered by the industrial partners demonstrates the need for a national asset that will help translate academic outputs into innovative products and services. Our impact plan will ensure we maximise co-working with user organisations, align our work with other programmes (e.g. Innovate UK), and effectively transfer our research outputs and technology to sectors beyond space, such as nuclear, deep mining and offshore energy. FAIR-SPACE will therefore not only help in wealth creation but also help develop a UK robotics community with a leading international profile.

Memory of Motion – MEMMO

What if we could generate complex movements in real time for a robot with any combination of arms and legs interacting with a dynamic environment? MEMMO's ambition is to create such a motion-generation technology, revolutionizing the motion capabilities of robots and unlocking a large range of industrial and service applications. Based on optimal-control theory, we develop a unified yet tractable approach to motion generation for complex robots with arms and legs. The approach relies on three innovative components:

  1. A massive amount of pre-computed optimal motions is generated offline and compressed into a “memory of motion”.
  2. These trajectories are recovered during execution and adapted to new situations with real-time model predictive control, which allows generalization to dynamically changing environments (see the sketch after this list).
  3. Available sensor modalities (vision, inertial, haptic) are exploited for feedback control that goes beyond the basic robot state, with a focus on robust and adaptive behavior.
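
As a minimal sketch of the “memory of motion” idea, the toy class below stores pre-computed trajectories indexed by task parameters and returns the nearest one as a warm start for an online solver. It uses plain nearest-neighbour lookup under my own assumptions and stands in for MEMMO's actual compression and retrieval machinery; all names are hypothetical.

```python
import numpy as np

class MotionMemory:
    """Toy 'memory of motion': store (task parameter, trajectory) pairs offline,
    then retrieve the nearest stored trajectory to warm-start an online solver."""

    def __init__(self):
        self.keys = []          # task parameters, e.g. goal positions
        self.trajectories = []  # pre-computed optimal trajectories

    def add(self, task_param, trajectory) -> None:
        self.keys.append(np.asarray(task_param, dtype=float))
        self.trajectories.append(np.asarray(trajectory, dtype=float))

    def warm_start(self, task_param) -> np.ndarray:
        """Return the trajectory whose task parameter is closest to the query;
        an MPC solver would take this as its initial guess and refine it online."""
        dists = [np.linalg.norm(k - task_param) for k in self.keys]
        return self.trajectories[int(np.argmin(dists))]

# Example: 1-D reaching motions indexed by goal position.
memory = MotionMemory()
for goal in (0.2, 0.5, 0.8):
    memory.add([goal], np.linspace(0.0, goal, 20))  # stand-in for an optimal trajectory
guess = memory.warm_start(np.array([0.55]))  # nearest stored motion, refined by MPC online
```

Warm-starting matters because a good initial guess lets the real-time solver converge in the few milliseconds available per control cycle, instead of solving each optimal-control problem from scratch.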

To demonstrate the generality of the approach, MEMMO is organized around 3 relevant industrial applications where MEMMO technologies have huge innovation potential. For each application, we will demonstrate the proposed technology in relevant industrial or medical environments, following specifications designed by the end-user partners of the project.

  1. A high-performance humanoid robot will perform advanced locomotion and industrial tooling tasks in a 1:1 scale demonstrator of a real aircraft assembly.
  2. An advanced exoskeleton paired with a paraplegic patient will demonstrate dynamic walking on flat ground, slopes, and stairs, in a rehabilitation centre under medical supervision.
  3. A challenging inspection task on a real construction site will be performed with a quadruped robot. While challenging, these demonstrators are feasible, as assessed by preliminary results obtained by the MEMMO partners, who are all experts or stakeholders in their domains.

Cognitive Interaction in Motion – CogIMon

Compliant control in humans is exploited in a variety of sophisticated skills. These include solitary actions such as softly catching, sliding, and pushing large objects, as well as joint actions performed in teams, such as manipulating large-scale objects or mutual adaptation through physical coupling for learning, walking, or executing joint tasks. We refer to this advanced ability to organize versatile motion under varying contact and impedance conditions as cognitive compliant interaction in motion. The CogIMon project aims at a step-change in human-robot interaction toward the systemic integration of robust, dependable interaction capabilities for teams of humans and compliant robots, in particular the compliant humanoid COMAN. We focus on interaction that requires active and adaptive regulation of motion and behaviour by both the human(s) and the robot(s), and that involves whole-body variable-impedance actuation, adaptability, prediction, and flexibility. This goal shall be achieved through sophisticated real-world robot demonstrations of interactive compliant soft catching and throwing, interaction with multiple COMANs under changing contact and team constellations, and model-driven, fully engineered multi-arm handling shared between KUKA LWR robots and humans working alongside them.
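
For readers unfamiliar with compliant control, the sketch below shows the textbook Cartesian impedance law, F = K(x_d − x) + D(ẋ_d − ẋ); varying the stiffness K and damping D online is what “variable impedance” refers to. This is a generic illustration under simple assumptions, not the CogIMon controller, and the gain values are made up for the example.

```python
import numpy as np

def impedance_force(x, xd, x_des, xd_des, K, D) -> np.ndarray:
    """Textbook Cartesian impedance law: F = K (x_des - x) + D (xd_des - xd).
    Low K/D yields soft, compliant behaviour (e.g. soft catching); high K/D
    yields stiff, accurate tracking. Varying them online trades accuracy
    against compliance to unexpected contact."""
    return K @ (x_des - x) + D @ (xd_des - xd)

# Soft catching: low stiffness and moderate damping absorb the impact.
K_soft = np.diag([50.0, 50.0, 50.0])   # stiffness, N/m (illustrative values)
D_soft = np.diag([20.0, 20.0, 20.0])   # damping, N*s/m (illustrative values)
F = impedance_force(x=np.zeros(3), xd=np.array([0.0, 0.0, -1.0]),
                    x_des=np.zeros(3), xd_des=np.zeros(3),
                    K=K_soft, D=D_soft)
print(F)  # damping force opposing the incoming velocity of the caught object
```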