
Researchers Are Now Giving The Sense Of Touch To Robots Through Deep Learning


Ever since its inception, artificial intelligence has been used to give machines better vision. Thanks to AI and deep learning, machines are now developed enough to understand their surroundings through vision. With robots already equipped with sight, researchers and AI enthusiasts are now focusing on other senses, such as touch.

The Sense of Touch

Jason Toy, an AI enthusiast, technologist and the founder of Somatic, a company that specializes in deep learning image and natural language processing models optimised for mobile applications, has set out to open a new realm of AI advancement with his latest project.

The project primarily focuses on training AI systems to interact with their surroundings through the sense of touch. With it, Toy aims to move beyond machines understanding their surroundings through visual imagery alone and to include object recognition by touch, giving them the ability to understand the characteristics of objects, such as contours, textures, shape and hardness, through physical contact. This is to be accomplished by adding sensorimotor neural systems and tactile feedback to robotic systems.

Called SenseNet: 3D Objects Database and Tactile Simulator, the project can be used in a reinforcement learning environment. It opens up the application of robotics in wider domains, such as using a robotic hand in factories for bin packing, parts retrieval, order fulfilment and sorting, as well as tasks that require robots to handle objects sensitively, such as preparing food, performing household tasks and assembling components.
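
To make the recognition-by-touch idea concrete, it can be framed as an ordinary classification problem over tactile pressure readings. The sketch below trains a toy network to label objects from flattened pressure maps; it is illustrative only, and the values of TACTILE_DIM and NUM_CLASSES are assumptions rather than anything specified by SenseNet.

```python
# Illustrative sketch: classify objects from flattened tactile pressure maps.
# TACTILE_DIM and NUM_CLASSES are assumed values, not taken from SenseNet.
import torch
import torch.nn as nn

TACTILE_DIM = 30 * 30   # assumed size of one fingertip pressure map
NUM_CLASSES = 10        # assumed number of object categories

model = nn.Sequential(
    nn.Linear(TACTILE_DIM, 128),
    nn.ReLU(),
    nn.Linear(128, NUM_CLASSES),
)
loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

def train_step(pressure_maps, labels):
    """One gradient step on a batch of (pressure map, object label) pairs."""
    logits = model(pressure_maps)          # shape: (batch, NUM_CLASSES)
    loss = loss_fn(logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Usage with random stand-in data:
maps = torch.rand(32, TACTILE_DIM)
labels = torch.randint(0, NUM_CLASSES, (32,))
print(train_step(maps, labels))
```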

Behind The Technology

The project used Reinforcement Learning Coach, a framework by Intel for training and evaluating reinforcement learning agents. The framework works within a Python environment and lets developers model the interaction between the agent and the environment. RL Coach provides visualisation tools for dynamically displaying training and test results, and it enables testing of the agent in multiple environments.

The framework allows agent tests to be performed for specialised applications, including robotics and gaming. The agent can be optimised using the data collected during training, which is available in the framework's dashboard.
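
The agent-environment interaction such frameworks model boils down to a short loop of observations, actions and rewards. The sketch below shows that loop in the generic Gym style; DummyTouchEnv is a stand-in written for this illustration and is not part of RL Coach or SenseNet.

```python
# Minimal agent-environment loop in the Gym style that RL Coach and similar
# toolkits build on. DummyTouchEnv is a toy stand-in for this sketch only.
import random

class DummyTouchEnv:
    """Toy environment: the 'agent' probes until it accumulates enough contact."""
    def reset(self):
        self.contact = 0
        return self.contact                      # initial observation

    def step(self, action):
        self.contact += action                   # pretend each probe adds contact
        reward = 1.0 if action else 0.0
        done = self.contact >= 5
        return self.contact, reward, done, {}    # observation, reward, done, info

env = DummyTouchEnv()
for episode in range(3):
    obs, done, ret = env.reset(), False, 0.0
    while not done:
        action = random.choice([0, 1])           # random policy stands in for the agent
        obs, reward, done, _ = env.step(action)
        ret += reward
    print(f"episode {episode}: return {ret:.1f}")
```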

The SenseNet Dataset

The SenseNet dataset is an open source dataset of shapes, together with a touch simulator, built by Toy. The GitHub repository for SenseNet includes training examples, classification tests, benchmarks, Python code samples and more that can benefit AI researchers interested in this domain. The simulator lets researchers load and manipulate the objects.
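
As a rough illustration of how such a simulator is typically driven, the sketch below probes a loaded object with random fingertip actions and records the tactile observations. The SenseEnv class and its methods follow the Gym-style convention such simulators use, but the exact names and signatures are assumptions; the Python samples in the repository itself are the authoritative reference.

```python
# Rough sketch of driving a tactile simulator; the SenseEnv interface shown
# here is an assumed Gym-style API, not a verified SenseNet signature.
import random

from sensenet import SenseEnv   # assumed import path

env = SenseEnv()                 # assumed: loads a 3D object into the simulator
observation = env.reset()
tactile_trace = []
for _ in range(50):
    action = random.randrange(env.action_space())   # assumed: count of discrete fingertip moves
    observation, reward, done, info = env.step(action)
    tactile_trace.append(observation)               # collect touch readings for later training
    if done:
        break
```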

Other Research

Intel has been dedicating time and resources to accelerating the progress of AI to solve difficult challenges across various sectors.

Another research effort aimed at giving robots the sense of touch is GelSight, a sensor technology being developed at MIT's Computer Science and Artificial Intelligence Laboratory. GelSight maps 3D objects through physical contact and pressure.

Outlook

Technology is advancing rapidly, in terms of both hardware and software. Sensors and software together create data that can be used to train robots to interact with objects more like humans do. In the future, we may see more advanced and innovative features on robots, which could make them more capable than their creators.


