Tactile sensors are an emerging class of sensors that change how robots interact with the environment around them and deliver data-driven results. Recently, Meta (the parent company of Facebook) announced new tactile sensing hardware for robots, including a GelSight-style fingertip sensor that turns touch into camera images and the ReSkin skin, which relies on embedded magnetic particles; both use machine learning to extract pertinent information from touch.

Much of Meta's backend work is handled by artificial intelligence, which is also one of the essential aspects of designing a robot. Although Facebook had not previously focused on robotics, the boom in the field, combined with its AI technologies, has driven the company to work on robots. Its AI researchers are studying the loop of perception, planning, reasoning, and action, in which the robot receives feedback from the environment and the objects around it.
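As a rough illustration of that loop, the sketch below shows what a minimal perceive-plan-act cycle with environment feedback might look like. The Environment class, plan function, and action names are hypothetical placeholders for this example, not part of any Meta system.

```python
# Minimal sketch of a perception-plan-act loop with environment feedback.
# All classes and action names here are hypothetical placeholders.

class Environment:
    def observe(self):
        # Return a sensor reading (e.g., a tactile or camera frame).
        return {"contact": False, "object_position": (0.4, 0.1)}

    def apply(self, action):
        # Execute the action and report its outcome as feedback.
        return {"success": True}


def plan(observation):
    # Toy planner: move toward the object unless contact is already made.
    if observation["contact"]:
        return "grasp"
    return "move_towards", observation["object_position"]


def run_loop(env, steps=10):
    for _ in range(steps):
        obs = env.observe()           # perception
        action = plan(obs)            # planning / reasoning
        feedback = env.apply(action)  # action
        if not feedback["success"]:
            break                     # use feedback to adjust or stop


run_loop(Environment())
```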

Robots could be used to maintain data centers automatically, enable telepresence, and handle other tasks where tactile sensors do useful work, which gives the company good reason to begin and extend its research into the use of tactile sensors in robots.

Tactile sensors in robots

Humans build a subconscious understanding of their environment and everything around them after many years of interacting with it. Robots and AI systems lack this experience, and there is no clear path to getting them to that level. Touch sensing, one form of tactile sensing, offers a pathway toward giving robots something like that 'subconscious' understanding.


The GelSight-style sensor converts any touch problem into a vision problem: an array of LEDs illuminates a soft pad, and a camera inside the finger pad captures a detailed image or video of how the pad deforms against the object it is pressed into. The same approach underpins the DIGIT sensor, which provides a low-cost option for tactile sensing in robots.
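As a simplified illustration of treating touch as a vision problem, the sketch below differences a no-contact reference image of the pad against the current camera frame to locate the contact region. The image size, threshold, and simulated deformation are assumed values, not taken from GelSight or DIGIT specifications.

```python
import numpy as np

# Hypothetical frames: a reference image of the unloaded pad and the
# current image captured while the pad is pressed against an object.
reference = np.random.rand(240, 320).astype(np.float32)
current = reference.copy()
current[100:140, 150:200] += 0.3   # simulated bright region where the pad deforms

def contact_map(reference, current, threshold=0.1):
    """Estimate where the pad is touching by differencing the two images."""
    diff = np.abs(current - reference)
    return diff > threshold        # boolean mask of the contact region

mask = contact_map(reference, current)
print("contact pixels:", int(mask.sum()))
```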

Meta also announced the ReSkin sensor, an open-source, low-cost tactile sensing skin that helps robots make sense of touch.

ReSkin is a flexible sheet about 2 mm thick with many magnetic particles mixed randomly through it. The sheet sits on top of a magnetometer; when it comes into contact with an object, the sheet deforms, the embedded magnetic particles shift, and the signals measured by the magnetometer change. Even if the skin is damaged, it can easily be replaced by peeling it off and attaching a new sheet. Sensors of this kind help a robot identify objects in its environment by touch.
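The sketch below illustrates the underlying idea in simplified form: a shift in the field measured by a 3-axis magnetometer under the sheet is treated as evidence of contact and scaled into a rough force estimate. The baseline field, gain, and noise floor are made-up calibration values for this example, not ReSkin parameters.

```python
import numpy as np

# Hypothetical readings from a 3-axis magnetometer under the skin,
# in arbitrary units; the gain and noise floor are assumed placeholders.
BASELINE = np.array([12.0, -3.5, 48.0])   # field with no contact
GAIN = 0.05                               # assumed force per unit field change

def estimate_contact(reading, baseline=BASELINE, gain=GAIN, noise_floor=0.5):
    """Detect contact and roughly estimate force from the field shift."""
    shift = np.linalg.norm(reading - baseline)
    if shift < noise_floor:
        return {"contact": False, "force": 0.0}
    return {"contact": True, "force": gain * shift}

print(estimate_contact(np.array([12.1, -3.4, 48.2])))   # pad at rest
print(estimate_contact(np.array([15.0, -1.0, 44.0])))   # sheet deformed by touch
```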

Conclusion

Tactile sensors in robots are opening new ways for robots to ease tasks for humans. The involvement of big companies like Meta shows how broad the topic is and will aid further discovery of how these sensors can be used in building robots.