Scientists at the University of Bristol have developed the new Bi-Touch system, which allows robots to carry out manual tasks by sensing what to do from a digital helper. The findings, published in IEEE Robotics and Automation Letters, show how an AI agent interprets its environment through tactile and proprioceptive feedback and then controls the robot's behaviour, enabling precise sensing, gentle interaction, and effective object manipulation.

The development could revolutionise industries such as fruit picking and domestic service, and could eventually help recreate the sense of touch in artificial limbs. Drawing on recent advances in AI and robotic tactile sensing, the team built a tactile dual-arm robotic system together with a virtual world (simulation) containing two robot arms equipped with tactile sensors. The robot learns bimanual skills through Deep Reinforcement Learning.
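To make the learning setup concrete, the sketch below shows a toy version of the kind of environment such an agent might train in: each step yields one observation that concatenates tactile and proprioceptive signals from two arms, and the reward penalises hard contact to encourage gentle interaction. This is a hypothetical illustration, not the authors' code; the dimensions, class, and reward shaping are assumptions for the example.

```python
import numpy as np

# Hypothetical sketch (not the Bi-Touch implementation): a toy bimanual
# environment whose observations combine tactile and proprioceptive
# feedback, the two signal types the agent is described as learning from.

TACTILE_DIM = 16   # assumed tactile features per arm's sensor
JOINT_DIM = 7      # assumed number of joints per arm

class ToyBiTouchEnv:
    """Two simulated arms; each step returns one flat observation vector."""

    def __init__(self, seed=0):
        self.rng = np.random.default_rng(seed)

    def reset(self):
        return self._observe()

    def _observe(self):
        per_arm = []
        for _ in range(2):  # left and right arm
            tactile = self.rng.uniform(0.0, 1.0, TACTILE_DIM)    # contact map
            joints = self.rng.uniform(-np.pi, np.pi, JOINT_DIM)  # joint angles
            per_arm.append(np.concatenate([tactile, joints]))
        return np.concatenate(per_arm)

    def step(self, action):
        # Penalise tactile readings above a pressure threshold: a simple
        # stand-in for rewarding "gentle interaction" during manipulation.
        obs = self._observe()
        arm_len = TACTILE_DIM + JOINT_DIM
        tactile = np.concatenate([obs[:TACTILE_DIM],
                                  obs[arm_len:arm_len + TACTILE_DIM]])
        reward = -float(np.mean(np.maximum(tactile - 0.8, 0.0)))
        return obs, reward, False, {}

env = ToyBiTouchEnv()
obs = env.reset()
print(obs.shape)  # one flat vector of 2 * (16 + 7) = 46 features
```

A Deep-RL algorithm would then map this combined observation to arm actions, improving its policy from the contact-pressure reward rather than from vision alone.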