Technology has allowed us to immerse ourselves in a world of sights and sounds without leaving home, but something is missing: touch.
Tactile sensations are an incredibly important part of how people perceive their reality. Haptics, or devices that can produce extremely specific vibrations to mimic the sensation of touch, are a way to revive that third sense. However, when it comes to tactile sensations, people are extremely discerning about whether something feels “right,” and virtual textures don’t always hit the mark.
Now researchers at the USC Viterbi School of Engineering have developed a new method for computers to achieve that true texture, with the help of humans.
The framework, called the preference-based model, uses our ability to distinguish the fine details of textures as a tool for customizing their virtual counterparts.
The study was published in IEEE Transactions on Haptics by three USC Viterbi Ph.D. students in computer science, Shihan Lu, Mianlun Zheng and Matthew Fontaine, along with Stefanos Nikolaidis, associate professor of computer science at USC Viterbi, and Heather Culbertson, WiSE Gabilan Associate Professor in Computer Science at USC Viterbi.
“We ask users to compare their sensations between the real texture and the virtual texture,” explained Lu, the study’s first author. “Then the model iteratively updates the virtual texture so that it eventually matches the real one.”
According to Fontaine, the idea first emerged in the fall of 2019, when the three took a class on haptic interfaces and virtual environments taught by Culbertson. They drew inspiration from Picbreeder, an art app that evolves images based on user preferences, iterating again and again until it achieves the desired result.
“We thought: what if we could do that for textures?” Fontaine recalled.
Using this preference-based model, the user is first given a real texture, and the model randomly generates three virtual textures, built from dozens of variables, from which the user chooses the one that feels closest to the real thing. Over time, the search adjusts the distribution of these variables, homing in on what the user prefers. According to Fontaine, this method has an advantage over directly recording and “playing back” textures, because there is always a gap between what the computer records and what we feel.
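The choose-one-of-three loop described above can be sketched in miniature. The toy version below is an assumption-laden illustration, not the authors' published algorithm: the parameter vector, the halfway update toward the chosen candidate, and the shrinking sampling spread are all my inventions, and a simulated user stands in for a real one by always picking the candidate closest to a fixed target texture.

```python
import random

random.seed(0)  # deterministic demo

NUM_PARAMS = 12      # stands in for the "dozens of variables"
NUM_CANDIDATES = 3   # the user compares three candidates per round

def sample_texture(mean, spread):
    """Draw one candidate texture from the current search distribution."""
    return [random.gauss(m, spread) for m in mean]

def search_round(mean, spread, pick_best):
    """One iteration: propose candidates, ask the 'user', update the search."""
    candidates = [sample_texture(mean, spread) for _ in range(NUM_CANDIDATES)]
    chosen = pick_best(candidates)  # user picks the closest-feeling texture
    # Move the distribution toward the preferred texture and narrow it,
    # so later rounds sample ever closer to what the user prefers.
    new_mean = [0.5 * m + 0.5 * c for m, c in zip(mean, chosen)]
    return new_mean, spread * 0.93

# Simulated user whose "real texture" is the all-ones parameter vector.
target = [1.0] * NUM_PARAMS
def simulated_user(cands):
    return min(cands, key=lambda c: sum((a - b) ** 2 for a, b in zip(c, target)))

mean, spread = [0.0] * NUM_PARAMS, 0.5
for _ in range(50):
    mean, spread = search_round(mean, spread, simulated_user)

error = sum((m - t) ** 2 for m, t in zip(mean, target))
print(f"squared distance to the real texture after 50 rounds: {error:.3f}")
```

The point of the sketch is the feedback structure: the user never types numbers, only comparisons, and the search distribution does the rest.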
“You tune the parameters of how it feels, rather than just mimicking what we can record,” Fontaine said. “There will be error in the way you recorded that texture, and in the way you reproduce it.”
All the user has to do is choose which texture fits best, and adjust the amount of friction with a simple slider. Friction is important to how we perceive textures, and it can vary from person to person. “It’s very easy,” Lu said.
Their work comes just as a market for specific, accurate virtual textures is taking shape. Everything from video games to clothing design now incorporates haptic technology, and existing virtual texture databases can be improved with this user-preference method.
“There is a growing popularity of haptic devices in video games, fashion design and surgery simulation,” Lu said. “Even at home, we’ve started to see users with (haptic) devices that are becoming as popular as laptops. With first-person video games, for example, they’ll feel like they’re really interacting with their environment.”
Lu has previously done other work on immersion technology, but with sound: in particular, making virtual textures even more immersive by adding the appropriate sounds when a tool interacts with them.
“When we interact with the environment through a tool, tactile feedback is just one modality, one kind of sensory feedback,” Lu said. “Audio is another kind of sensory feedback, and both are very important.”
The texture search model also lets someone pull a virtual texture from a database, such as the University of Pennsylvania’s Haptic Texture Toolkit, and refine it until they get the desired result.
“You can take previous virtual textures that others have searched for, and then keep customizing them from there,” Lu said. “You don’t have to search from scratch every time.”
This is especially handy for virtual textures used in teaching dentistry or surgery, which need to be extremely accurate, Lu believes.
“Surgical training is definitely a huge area that requires very realistic textures and tactile feedback,” Lu said. “Fashion design also requires great precision in texture design before garments go into production.”
In the future, the model may not even need real textures, Lu explained. Some things in our lives feel so intuitively familiar that we could fine-tune a texture to match that memory just by looking at a photo, without having a real texture in front of us for reference.
“When we see a table, we can imagine what it will feel like when we touch it,” Lu said. “Using this prior knowledge of the surface, you can just give users visual feedback and let them choose what fits.”