ITNOW's Popular Computer Science (or PopCompSci) brings the most exciting stories from around the world of computer science together for a taste of the unexpected ways in which tech is impacting our lives. Here we tell you how, working in partnership with charities, AI and robotics experts from the University of Glasgow have demonstrated a robotic guide dog designed to help partially sighted people navigate indoor spaces.
Partially sighted and blind people could soon be helped to find their way around indoor spaces with the help of a pioneering robot guide dog. Along with aiding navigation, the four-legged robot provides spoken commentary and answers questions.
James Adams, Director of the Royal National Institute of Blind People (RNIB) Scotland, said: ‘Technology innovations like this are reshaping the future of accessibility, and this partnership demonstrates their burgeoning potential to create a more inclusive world.’
Called RoboGuide, the prototype integrates a range of cutting-edge technologies into an off-the-shelf robot body. This, it's claimed, helps overcome the challenges that have prevented robots from being more widely used to assist blind and partially sighted people.
Dr Olaoluwa Popoola of the University of Glasgow’s James Watt School of Engineering, the RoboGuide project’s principal investigator, said: ‘One significant drawback of many current four-legged, two-legged and wheeled robots is that the technology which allows them to find their way around can limit their usefulness as assistants for the visually impaired.
‘Robots which use GPS to navigate, for example, can perform well outdoors, but often struggle in indoor settings, where signal coverage can weaken. Others, which use cameras to “see”, are limited by line of sight, making it harder for them to safely guide people around objects or around bends.’
Under the hood
The RoboGuide system uses sophisticated sensors mounted on the robot’s exterior to accurately map and assess its surroundings. Software developed by the team helps it learn the optimal routes between locations and interpret the sensor data in real time. This allows the robot to avoid the many obstacles it might encounter while guiding a human.
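The project has not published its navigation software, but the idea of learning optimal routes between mapped locations can be illustrated with a classic shortest-path search. The sketch below (the room names, distances and function are all hypothetical, for illustration only) runs Dijkstra's algorithm over a small graph of indoor waypoints of the kind a robot's sensors might build:

```python
import heapq


def shortest_route(floor_graph, start, goal):
    """Dijkstra's shortest-path search over a weighted waypoint graph.

    floor_graph maps each waypoint name to a list of
    (neighbour, distance_in_metres) pairs.
    Returns the list of waypoints on the shortest route, or None.
    """
    best = {start: 0.0}          # shortest known distance to each node
    came_from = {}               # predecessor on the best route found
    queue = [(0.0, start)]       # min-heap ordered by distance
    done = set()

    while queue:
        dist, node = heapq.heappop(queue)
        if node in done:
            continue
        done.add(node)
        if node == goal:
            break
        for neighbour, step in floor_graph.get(node, []):
            new_dist = dist + step
            if new_dist < best.get(neighbour, float("inf")):
                best[neighbour] = new_dist
                came_from[neighbour] = node
                heapq.heappush(queue, (new_dist, neighbour))

    if goal not in best:
        return None
    # Walk the predecessor chain back from goal to start.
    route = [goal]
    while route[-1] != start:
        route.append(came_from[route[-1]])
    return list(reversed(route))


# A toy indoor map: nodes are locations, weights are distances in metres.
floor_map = {
    "entrance": [("corridor", 5.0)],
    "corridor": [("entrance", 5.0), ("lift", 3.0), ("cafe", 8.0)],
    "lift": [("corridor", 3.0), ("library", 10.0)],
    "cafe": [("corridor", 8.0)],
    "library": [("lift", 10.0)],
}

print(shortest_route(floor_map, "entrance", "library"))
# → ['entrance', 'corridor', 'lift', 'library']
```

In a real system the graph would be built and updated continuously from the sensor data, with blocked edges removed as obstacles appear, but the underlying route computation is the same.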
The RoboGuide also incorporates large language model (LLM) technology, lending it the ability to understand questions and comments from users and provide verbal responses in return.
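The team has not described the dialogue pipeline in detail, but the general shape of such an interface can be sketched as a loop that passes a user's question, together with the robot's current navigation state, to a language model and speaks the reply. In the hypothetical sketch below, a simple keyword stub stands in for the LLM call, and all names and fields are illustrative assumptions:

```python
def answer_question(question, nav_state):
    """Stand-in for an LLM call: map a spoken question plus the robot's
    navigation state to a spoken reply. A real system would send both
    to a language model; this stub only matches a few keywords."""
    q = question.lower()
    if "where" in q:
        return f"We are currently at the {nav_state['location']}."
    if "how far" in q:
        return (f"The {nav_state['destination']} is about "
                f"{nav_state['remaining_m']} metres away.")
    return "Sorry, I didn't catch that. Could you repeat the question?"


# Hypothetical navigation state the robot might maintain while guiding.
nav_state = {"location": "main corridor",
             "destination": "library",
             "remaining_m": 12}

print(answer_question("Where are we?", nav_state))
# → We are currently at the main corridor.
print(answer_question("How far is it?", nav_state))
# → The library is about 12 metres away.
```

The point of grounding the model in the navigation state is that answers stay tied to what the robot actually knows about its surroundings, rather than being generated from the question text alone.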
Dr Wasim Ahmad of the James Watt School of Engineering is a co-investigator on the project. He said: ‘…We hope to create a robust commercial product to support the visually impaired wherever they want extra help.’