Baylor researchers, led by Bryan Shaw, found that oral tactile visualization of complex 3D structures is as accurate as eyesight

WACO, Texas (May 28, 2021) – Approximately 36 million people worldwide are blind, including 1 million children, and another 216 million have moderate to severe visual impairment. Yet STEM (science, technology, engineering, and math) education continues to rely on three-dimensional imagery, most of which is inaccessible to blind students. A groundbreaking study by Bryan Shaw, Ph.D., professor of chemistry and biochemistry at Baylor University, aims to use small, candy-like models to make science more accessible to people who are blind or visually impaired.

The Baylor-led study, published May 28 in the journal Science Advances, uses millimeter-sized gelatin models – similar to gummy bears – to improve visualization of protein molecules through oral stereognosis, the perception of 3D shapes via the tongue and lips. The aim of the study was to create smaller, more practical tactile models of the 3D images that represent protein molecules. Proteins were chosen because their structures are among the most numerous, complex, and high-resolution 3D images presented in STEM education.

“Your tongue is your finest tactile sensor – about twice as sensitive as your fingertips – but it is also a hydrostat, similar to an octopus arm. It can wiggle into grooves your fingers won’t touch, but no one really uses their tongue or lips in tactile learning. We thought of making very small, high-resolution 3D models and visualizing them with our mouths,” Shaw said.

The study included 396 total participants – 31 fourth- and fifth-graders and 365 college students. Participants were tested on their ability to identify specific structures by mouth, by hand, and by sight, and all were blindfolded during the oral and manual tactile tests.

Each participant had three minutes to assess, or visualize, the structure of a study protein with their fingertips, followed by one minute with a test protein. After those four minutes, they were asked whether the test protein was the same model as the study protein or a different one. The entire process was then repeated using the mouth, rather than the fingers, to recognize the shapes.

Students recognized structures by mouth with 85.59% accuracy, similar to visual recognition using computer animation. The tests used both edible gelatin models and non-edible 3D-printed models; the gelatin models were identified correctly at rates comparable to the non-edible ones.

“You can visualize the shapes of these tiny objects with your mouth as accurately as you can with your eyes. That was really surprising,” Shaw said.

The models, which can be used for students with or without visual impairment, provide an affordable, portable, and convenient way to make 3D images more accessible. The study methods aren’t limited to molecular models of protein structures – oral visualization could be done with any 3D model, Shaw said.

While gelatin models were the only edible models tested, Shaw’s team has also created high-resolution models from other edible materials, including taffy and chocolate. Certain surface features of the models, such as a protein’s pattern of positive and negative surface charge, could be represented by different designs on the model.

“This methodology could be applied to images and models of anything – cells, organelles, 3D surfaces in math, or 3D artwork – any 3D rendering. It’s not limited to STEM; it’s useful for the humanities, too,” said Katelyn Baumer, Ph.D. candidate and lead author of the study.

Shaw’s lab sees oral visualization through tiny models as a useful addition to the multisensory learning tools available to students, especially those with exceptional visual needs. Models like the one in this study can make STEM more accessible to students with blindness or low vision.

“Students with blindness are systematically excluded from chemistry and much of STEM. Just take a look around our labs and you can see why – there’s braille on the elevator button to the lab and braille on the lab door, but the accessibility ends there. Baylor is the perfect place to make STEM more accessible. Baylor could become a haven for people with disabilities to learn STEM,” Shaw said.

Shaw isn’t new to high-profile research related to visual impairment. He was recognized for his work on the White Eye Detector app, a mobile app he created with Greg Hamerly, Ph.D., associate professor of computer science at Baylor, to help parents screen for pediatric eye diseases. Shaw’s inspiration for the app came after his son Noah was diagnosed with retinoblastoma at the age of four months.



Baylor University is a private Christian university and a nationally ranked research institution. The university provides a vibrant campus community for more than 19,000 students by blending interdisciplinary research with an international reputation for excellence in education and a faculty commitment to teaching and scholarship. Chartered in 1845 by the Republic of Texas through the efforts of Baptist pioneers, Baylor is the oldest continuously operating university in Texas. Located in Waco, Baylor welcomes students from all 50 states and more than 90 countries to study a broad range of degrees within its 12 nationally recognized academic divisions.
