Advancements in surface haptics technology have given rise to interactive applications that display tactile content on touch surfaces, such as images, signs, diagrams, plots, charts, graphs, maps, networks, and tables. In these applications, users manually explore the touch surface using intuitive strategies to interact with the tactile data. The user's exploration strategy, the complexity of the tactile data, and the tactile rendering method all affect the user's haptic perception, which plays a critical role in the design and prototyping of such applications. In this study, we conducted experiments with human participants to investigate the recognition rate and time for five tactile shapes rendered by electrovibration on a touchscreen using three different methods and displayed in prototypical and non-prototypical orientations. The results showed that the correct recognition rate of the shapes was higher when the haptically active area was larger. However, as the number of edges increased, the recognition time increased and the recognition rate dropped significantly, falling to a value only slightly higher than the chance rate of 20% for the non-prototypical octagon. We also recorded the participants' finger movements on the touchscreen to examine their haptic exploration strategies. Our analyses revealed that the participants first used global scanning to extract the coarse features of the displayed shapes and then applied local scanning to identify finer details, but needed another global scan for final confirmation in the case of non-prototypical shapes, possibly due to the current limitations of electrovibration technology in displaying tactile stimuli to a user. We observed that it was highly difficult to follow the edges of shapes and to recognize shapes with more than five edges under electrovibration when a single finger was used for exploration.
Locating and grasping objects is a critical task in people's daily lives. For people with visual impairments, this task can be a daily struggle. The support of augmented reality frameworks in smartphones can overcome the limitations of current object detection applications designed for people with visual impairments. We present AIGuide, a self-contained smartphone application that leverages augmented reality technology to help users locate and pick up objects around them. We conducted a user study to investigate the effectiveness of AIGuide as a visual prosthetic for providing guidance, compare it to other assistive technology form factors, investigate the use of multimodal feedback, and provide feedback about the overall experience. We gathered performance data and participants' reactions and analyzed videos to understand users' interactions with the nonvisual smartphone user interface. Our results show that AIGuide is a promising technology to help people with visual impairments locate and acquire objects in their daily routine. The benefits of AIGuide may be enhanced with appropriate interaction design.