TY - CHAP
A1 - Götzelmann, Timo
A1 - Pavkovic, Aleksander
T1 - Towards Automatically Generated Tactile Detail Maps by 3D Printers for Blind Persons
BT - 14th International Conference, ICCHP 2014, Paris, France, July 9-11, 2014, Proceedings, Part II
T2 - Computers Helping People with Special Needs
N2 - This paper introduces an approach for the (semi)automatic generation of detailed tactile maps available worldwide, including buildings and blind-specific features, based on recognized illustrators’ guidelines and standards. These guidelines for tactile maps are investigated in order to define a formal rule set and to automatically filter map data accordingly. Using this rule set, our approach automatically abstracts map data in order to generate a 2.1D tactile model with multiple height levels (layers) which can be printed by common consumer 3D printers. Based on the popular OpenStreetMap data, our automated approach makes it possible to generate arbitrary detail maps that blind persons are individually interested in, without the need for manual adaptation of the tactile map. Thus, this approach contributes to the goal of increasing the autonomy of blind persons.
KW - Tactile Maps
KW - Accessibility
KW - Haptic
KW - 3D printer
KW - 3D-Drucker
KW - Blindenkarte
Y1 - 2014
SN - 978-3-319-08599-9
U6 - https://doi.org/10.1007/978-3-319-08599-9_1
PB - Springer
ER -
TY - JOUR
A1 - Götzelmann, Timo
T1 - Visually Augmented Audio-Tactile Graphics for Visually Impaired People
JF - ACM Transactions on Accessible Computing (TACCESS)
N2 - Tactile graphics play an essential role in knowledge transfer for blind people. The tactile exploration of these graphics is often challenging because of the cognitive load caused by physiological constraints and their complexity. Coupling physical tactile graphics with electronic devices makes it possible to support tactile exploration with auditory feedback. However, these systems often have strict constraints regarding their mobility or the process of coupling both components. Additionally, visually impaired people cannot appropriately benefit from their residual vision. This article presents a concept for 3D printed tactile graphics which allows audio-tactile graphics to be used with common smartphones or tablet computers. Capacitive markers simplify the coupling of the tactile graphics with the mobile device. Tactile graphics integrating these markers can be printed in a single pass by off-the-shelf 3D printers without any post-processing and allow the use of multiple elevation levels for graphical elements. Based on this generic concept of visually augmented audio-tactile graphics, we present a case study for maps. A prototypical implementation was tested in a user study with visually impaired people. All participants were able to interact with the 3D printed tactile maps using a standard tablet computer. To study the effect of visually augmenting graphical elements, we conducted another comprehensive user study. We tested multiple types of graphics and obtained evidence that visual augmentation may offer clear advantages for the exploration of tactile graphics. Even participants with minor residual vision could solve the tasks more quickly and accurately with visual augmentation.
Y1 - 2018
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:bvb:92-opus4-5571
VL - 2018
IS - Volume 11, Issue 2, Article No. 8
PB - ACM
ER -
TY - JOUR
A1 - Götzelmann, Timo
T1 - Autonomous Selection and Printing of 3D Models for People Who Are Blind
JF - ACM Transactions on Accessible Computing (TACCESS)
N2 - 3D models are an important means for understanding spatial contexts. Today these models can be materialized by 3D printing, which is increasingly used at schools for people with visual impairments. In contrast to sighted people, however, people with visual impairments have so far been able neither to search for nor to print 3D models without assistance. This article describes our work to develop an aid for people with visual impairments that would facilitate autonomous searching for and printing of 3D models. In our initial study, we determined the requirements for accomplishing this task by means of a questionnaire and developed a first approach that allowed personal computer-based 3D printing. An extended approach allowed searching and printing using common smartphones. In our architecture, technical details of 3D printers are abstracted by a separate component that can be accessed via Wi-Fi independently of the actual 3D printer used. It comprises a search of the models in an annotated database and 3D model retrieval from the internet. The whole process can be controlled by voice interaction. The feasibility of autonomous 3D printing for people with visual impairments is shown with a first user study. Our second user study examines the usability of the user interface when searching for 3D models on the internet and preparing them for materialization. The participants were able to define important printing settings, whereas other printing parameters could be determined algorithmically.
Y1 - 2018
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:bvb:92-opus4-5587
VL - 2018
IS - Volume 11, Issue 3, Article No. 14
SP - 1
EP - 25
PB - ACM
ER -
TY - CHAP
A1 - Kreimeier, Julian
A1 - Götzelmann, Timo
T1 - Real World VR Proxies to Support Blind People in Mobility Training
T2 - Proceedings of Mensch und Computer 2018 (MuC'18)
N2 - Mobility training is an essential part of blind people’s education, enabling them to move in public spaces. However, to safely learn new routes in public space, a sighted trainer must assist the blind person. With the increasing availability of VR hardware, it is possible to transfer real spatial environments to virtual representations. The digitized environments can be used as a basis for this training without the safety problems posed by real-world hazards. This helps to cope with the limited resources of sighted assistants and enables blind people to become more independent. We propose to capture real public spaces (such as sidewalks, train stations, etc.) and make them ascertainable in this way. Orientation and mobility can be trained in this digital model via multimodal sensory feedback while involving intuitive locomotion and white cane exploration. This paper sketches the related work and proposes our novel approach. Furthermore, we suggest additional improvements for our ongoing research.
KW - Virtual Reality
KW - blind
KW - haptic interaction
KW - visually impaired
KW - orientation and mobility
Y1 - 2018
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:bvb:92-opus4-5592
VL - 2018
PB - Gesellschaft für Informatik e.V.
CY - Bonn
ER -
TY - JOUR
A1 - Kreimeier, Julian
A1 - Bielmeier, Tobias
A1 - Götzelmann, Timo
T1 - Evaluation of Capacitive Markers Fabricated by 3D Printing, Laser Cutting and Prototyping
JF - Journal of Inventions: Special Issue Innovations in 3-D Printing
N2 - With Tangible User Interfaces, the computer user is able to interact in a fundamentally different and more intuitive way than with conventional 2D displays. By grasping real physical objects, information can also be conveyed haptically, i.e., the user not only sees information on a 2D display, but can also grasp physical representations. To recognize such objects (“tangibles”), it is practical to use capacitive sensing, as employed in most touch screens. Thus, real objects can be located and identified by the touch screen display automatically. Recent work has already addressed such capacitive markers, but focused on their coding scheme and automated fabrication by 3D printing. This paper goes beyond fabrication by 3D printers and, for the first time, applies the concept of capacitive codes to laser cutting and another immediate prototyping approach using modeling clay. Besides evaluating additional properties, we adapt recent research results regarding the optimized detection of tangible objects on capacitive screens. Our comprehensive study shows that the detection performance is affected by the type of capacitive signal processing (i.e., the device) and the geometry of the marker. 3D printing proved to be the most reliable technique, though laser cutting and immediate prototyping of markers showed promising results. Based on our findings, we discuss the individual strengths of each capacitive marker type.
Y1 - 2018
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:bvb:92-opus4-5603
VL - 2018
IS - Volume 3, Issue 1, Article 9
PB - MDPI
ER -
TY - CHAP
A1 - Götzelmann, Timo
T1 - CapMaps: Capacitive Sensing 3D Printed Audio-Tactile Maps
T2 - Proc. 15th International Conference on Computers Helping People with Special Needs
N2 - Tactile maps can be useful tools for blind people in navigation and orientation tasks. Apart from static maps, there are techniques to augment tactile maps with audio content. They can be used to interact with the map content, to offer extra information and to reduce the tactile complexity of a map. Studies show that audio-tactile maps can be more efficient and satisfying for the user than purely tactile maps without audio feedback. A major challenge of audio-tactile maps is the linkage of tactile elements with audio content and interactivity. This paper introduces a novel approach to link 3D printed tactile maps with mobile devices, such as smartphones and tablets, in a flexible way to enable interactivity and audio support. Since conductive filament is integrated into the printed maps, the approach fits seamlessly into the 3D printing process. This allows the tactile map to be recognized automatically by a single press at its corner. Additionally, the arrangement of the tactile map on the mobile device is flexible and detected automatically, which eases the use of these maps. The practicability of this approach is shown by a dedicated feasibility study.
Y1 - 2016
UR - https://doi.org/10.1007/978-3-319-41267-2_20
SN - 978-3-319-41266-5
VL - 2016
SP - 146
EP - 152
PB - Springer
CY - Cham
ER -
TY - CHAP
A1 - Götzelmann, Timo
A1 - Eichler, Laura
T1 - BlindWeb Maps – An Interactive Web Service for the Selection and Generation of Personalized Audio-Tactile Maps
T2 - Proc. 15th International Conference on Computers Helping People with Special Needs
N2 - Tactile maps may contribute to the orientation of blind people or alternatively be used for navigation. In the past, the generation of these maps was a manual task, which considerably limited their availability. Nowadays, similar to visual maps, tactile maps can also be generated semi-automatically by tools and web services. The existing approaches enable users to generate maps by entering a specific address or point of interest. This can in principle be done by a blind user. However, these approaches actually show an image of the map on the user’s display, which cannot be read by screen readers. Consequently, the blind user does not know what is on the map before it is printed. Ideally, the map selection process should give the user more information and freedom to select the desired excerpt. This paper introduces a novel web service for blind people to interactively select and automatically generate tactile maps. It adapts the interaction concept for map selection to the requirements of blind users whilst supporting multiple printing technologies. The integrated audio review of the map’s contents provides earlier feedback on whether the currently selected map excerpt corresponds to the desired information need. Changes can be initiated before the map is printed, which, especially for 3D printing, saves much time. The user is able to select map features to be included in the tactile map. Furthermore, the map rendering can be adapted to different zoom levels and supports multiple printing technologies. Finally, an evaluation with blind users was used to refine our approach.
Y1 - 2016
UR - https://doi.org/10.1007/978-3-319-41267-2_19
SN - 978-3-319-41266-5
VL - 2016
SP - 139
EP - 145
PB - Springer
CY - Cham
ER -
TY - CHAP
A1 - Götzelmann, Timo
A1 - Althaus, Christoph
T1 - TouchSurfaceModels: Capacitive Sensing Objects through 3D Printers
T2 - Proc. 9th ACM International Conference on PErvasive Technologies Related to Assistive Environments
N2 - Nowadays, 3D models can be downloaded from the internet and increasingly printed by low-cost 3D printers. In the future, blind people could benefit from this trend. Unfortunately, many of these models are rather complex and not appropriate for purely tactile exploration. To obtain quantitative data about how 3D printable models for blind people should be constructed, the tactile exploration can be recorded on video. However, the analysis of these videos is quite time-consuming and expensive. Additionally, inaccuracies and masking effects may impede the use of this technique. In this paper we introduce a novel approach to automatically equip existing 3D models with a mesh of conductive wires which enables a touch-sensitive surface for the printed 3D objects. These touch-sensing 3D models can be printed in a single pass by off-the-shelf 3D printers and used as an alternative to video recording. This allows exact registration of when and where the 3D object has been touched. In our multi-touch solution, particular attention has been paid to limiting the number of wires needed between the 3D object and the sensing electronics. Finally, our approach is evaluated by a feasibility study.
Y1 - 2016
UR - https://doi.org/10.1145/2910674.2910690
SN - 978-1-4503-4337-4
VL - 2016
PB - ACM
CY - New York, NY, USA
ER -
TY - CHAP
A1 - Götzelmann, Timo
T1 - Interactive Tactile Maps for Blind People using Smartphones’ Integrated Cameras
T2 - Proc. 9th ACM International Conference on Interactive Tabletops and Surfaces (ITS'14)
N2 - Tactile maps may support blind persons in orientation and in understanding geographical relations, but their availability is still very limited. However, recent technologies such as 3D printers make it possible to autonomously print individual tactile maps which can be linked with interactive applications. Besides geographical depictions, textual annotation of maps is crucial. However, this often adds much complexity to tactile maps. To limit tactile complexity, interactive approaches may help to complement maps with the auditory modality. The presented approach integrates barcodes into tactile maps to allow their detection by standard smartphone cameras. More detailed map data is obtained automatically to support the exploration of the tactile map with audio. Our experimental implementation shows the feasibility in principle and provides the basis for ongoing comprehensive user studies.
Y1 - 2014
UR - https://doi.org/10.1145/2669485.2669550
SN - 978-1-4503-2587-5
VL - 2014
SP - 381
EP - 385
PB - ACM
CY - New York, NY, USA
ER -
TY - CHAP
A1 - Götzelmann, Timo
T1 - SmartTactMaps: A Smartphone-Based Approach to Support Blind Persons in Exploring Tactile Maps
T2 - Proc. 8th ACM International Conference on PErvasive Technologies Related to Assistive Environments
N2 - Despite the increasing digitalization of our society, many blind persons still have very limited access to predominantly pictorial information such as maps. In this paper we introduce a novel approach to improve the accessibility of maps for blind users by utilizing the abilities of standard smartphones. A major issue of tactile maps is the limited discriminability of the human tactile sense. Textual annotation of maps is crucial, but adds much complexity to tactile maps. Additionally, only a few Braille labels can be accommodated to maintain legibility. In our approach we link smartphones with adapted tactile maps, which transforms the physical maps into interactive surfaces using both the tactile and the auditory modality. We integrate machine-readable metadata into these maps which can be recognized by the smartphone's camera to immediately obtain detailed map descriptions from a free global database. During tactile exploration of the map, blind users can request auditory explanations by interacting with the mobile application. An experimental application and a user study demonstrate the feasibility of our approach.
Y1 - 2015
UR - https://doi.org/10.1145/2769493.2769497
SN - 978-1-4503-3452-5
VL - 2015
PB - ACM
CY - New York, NY, USA
ER -
TY - CHAP
A1 - Götzelmann, Timo
A1 - Vázquez, Pere-Pau
T1 - InclineType: An Accelerometer-based Typing Approach for Smartwatches
T2 - Proc. 16th International Conference on Human Computer Interaction
N2 - Small mobile devices such as smartwatches are a rapidly growing market. However, they share the issue of limited input and output space, which could impede the success of these devices in the future. Hence, suitable alternatives to the concepts and metaphors known from smartphones have to be found. In this paper we present InclineType, a tilt-based keyboard input for smartwatches that uses a 3-axis accelerometer. The user may directly select letters by moving his/her wrist and enters them by tapping on the touchscreen. Thanks to the distribution of the letters along the edges of the screen, the keyboard occupies only a small amount of space on the smartwatch.
In order to optimize user input, our concept proposes multiple techniques to stabilize the user interaction. Finally, a user study shows that users get familiar with this technique with almost no previous training, reaching speeds of about 6 wpm on average.
Y1 - 2015
UR - https://doi.org/10.1145/2829875.2829929
SN - 978-1-4503-3463-1
VL - 2015
PB - ACM
CY - New York, NY, USA
ER -
TY - CHAP
A1 - Kreimeier, Julian
A1 - Götzelmann, Timo
T1 - FeelVR: Haptic Exploration of Virtual Objects
T2 - Proceedings of the 11th PErvasive Technologies Related to Assistive Environments Conference (PETRA '18)
N2 - The interest in virtual and augmented reality has increased rapidly in recent years. Recently, haptic interaction and its applications have come into focus. In this paper, we suggest the exploration of virtual objects using off-the-shelf VR game controllers. These are held like a pen in both hands and are used to palpate and identify the virtual object. Our study largely coincides with comparable previous work and shows that a ready-to-use VR system can basically be used for haptic exploration. The results indicate that virtual objects are recognized more effectively with closed eyes than with open eyes. In both cases, objects with a bigger morphological difference were identified most frequently. The limitations due to the quality and quantity of tactile feedback should be tackled in future studies that utilize currently developed wearable haptic devices and haptic rendering involving all fingers or even both hands. Thus, objects could be identified more intuitively, and haptic feedback devices for interacting with virtual objects would be further disseminated.
Y1 - 2018
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:bvb:92-opus4-5611
VL - 2018
SP - 122
EP - 125
PB - ACM
CY - New York
ER -
TY - CHAP
A1 - Kreimeier, Julian
A1 - Hammer, Sebastian
A1 - Friedmann, Daniel
A1 - Karg, Pascal
A1 - Bühner, Clemens
A1 - Bankel, Lukas
A1 - Götzelmann, Timo
T1 - Evaluation of Different Types of Haptic Feedback Influencing the Task-based Presence and Performance in Virtual Reality
T2 - Proceedings of the 12th ACM International Conference on PErvasive Technologies Related to Assistive Environments (PETRA'19)
N2 - Haptic feedback may support immersion and presence in virtual reality (VR) environments. The emerging market of consumer devices offers the first devices that are expected to increase the degree to which one feels actually present in a virtual environment. In this paper we introduce a novel evaluation that examines the influence of different types of haptic feedback on presence and performance regarding manual tasks in VR. To this end, we conducted a comprehensive user study involving 14 subjects, who performed throwing, stacking and object identification tasks in VR with visual (i.e., sensory substitution), vibrotactile or force feedback. We measured the degree of presence and task-related performance metrics. Our results indicate that, regarding presence, vibrotactile feedback outperforms force feedback, which in turn performs better than visual feedback only. In addition, force feedback significantly lowered the execution time for the throwing and the stacking task. In object identification tasks, the vibrotactile feedback increased the detection rates compared to the visual and force feedback, but also increased the time required for identification.
Despite the inadequacies of the still young consumer technology, there were strong indications of connections between presence, task fulfillment and the type of haptic feedback.
Y1 - 2019
UR - https://doi.org/10.1145/3316782.3321536
SN - 978-1-4503-6232-0
SP - 289
EP - 298
PB - ACM
CY - New York, NY, USA
ER -
TY - CHAP
A1 - Kreimeier, Julian
A1 - Götzelmann, Timo
T1 - First Steps Towards Walk-In-Place Locomotion and Haptic Feedback in Virtual Reality for Visually Impaired
T2 - CHI Conference on Human Factors in Computing Systems Extended Abstracts (CHI'19 Extended Abstracts)
N2 - This paper presents first results of a user study in which people with visual impairments (PVI) explored a virtual environment (VE) by walking on a virtual reality (VR) treadmill. As recently suggested, we have now acquired first results from our feasibility study investigating this walk-in-place interaction. This represents a new, more intuitive way of, for example, virtually exploring unknown spaces in advance. Our prototype consists of off-the-shelf VR components (i.e., treadmill, headphones, glasses, and controller) providing a simplified white cane simulation and was tested by six visually impaired subjects. Our results indicate that this interaction is still difficult, but promising and an important step toward making VR more and better usable for PVI. As an impact on the CHI community, we would like to make this research field known to a wider audience by sharing our intermediate results and suggestions for improvements, some of which we are already working on.
Y1 - 2019
UR - https://doi.org/10.1145/3290607.3312944
SN - 978-1-4503-5971-9
VL - 2019
SP - 1
EP - 6
PB - ACM
CY - New York, NY, USA
ER -
TY - CHAP
A1 - Kröner, Alexander
A1 - Götzelmann, Timo
T1 - Fitts’ Gesetz als programmtechnische und empirische Aufgabe für Studierende der Informatik
T2 - Lecture Notes in Informatics (LNI)
N2 - Fitts’ law is the subject of many computer science courses. Besides the original source, several formulas exist that provide refined approximations of the phenomenon. This paper describes an approach to using this topic for teaching purposes with third-semester students. First, the students implement the subject matter as a programming task with several variables to be tested. They then carry out a verification themselves by means of a user study, which is subsequently evaluated. In a joint discussion, the results are examined using different formulas as well as empirical fundamentals such as the outlier problem. This teaching concept, which has been refined over several student cohorts, links multiple topics of human-computer interaction with practical activities for the students. Its aim was to involve students in the software implementation as well as the execution and evaluation of the study, thereby giving them a holistic view of the connections between different subjects in the field of human-computer interaction.
Y1 - 2017
UR - https://dx.doi.org/10.18420/in2017_23
SN - 978-3-88579-669-5
SN - 1617-5468
VL - 2017
SP - 295
EP - 305
PB - Gesellschaft für Informatik
CY - Bonn
ER -
TY - CHAP
A1 - Götzelmann, Timo
T1 - A 3D Printable Hand Exoskeleton for the Haptic Exploration of Virtual 3D Scenes
T2 - Proc. 10th Int. Conf. on PErvasive Technologies Related to Assistive Environments
N2 - Virtual reality is currently experiencing a comeback.
A considerable market has developed for VR computer games and educational applications. Some solutions integrate tracked devices which allow users to move freely within a certain space. Virtual 3D models can be explored visually, and the implemented collision detection allows users to get feedback, for instance by sound or vibration. In research projects, there are several approaches which provide actual feedback to the fingers of a hand when the user virtually touches the surface of a 3D model. However, no product currently sold on the consumer market offers this direct feedback for the whole hand. In this paper we introduce a low-cost hand exoskeleton which is usable in conjunction with commodity hardware. It covers each of the five fingers of the user's hand; its design is open source, low cost, and can be customized and 3D printed by individuals. It aims at improving the haptic perception of users, is based on a popular physical computing platform and is designed to be assembled even by electronically inexperienced users. We show the integration of the wireless exoskeleton's lean interface into an exemplary VR environment and describe a calibration process which is flexible for customization.
Y1 - 2017
UR - https://doi.org/10.1145/3056540.3064950
SN - 978-1-4503-5227-7
VL - 2017
SP - 63
EP - 66
PB - ACM
CY - New York, NY, USA
ER -
TY - CHAP
A1 - Götzelmann, Timo
A1 - Branz, Lisa
A1 - Heidenreich, Claudia
A1 - Otto, Markus
T1 - A Personal Computer-based Approach for 3D-Printing Accessible to Blind People
T2 - Proc. 10th Int. Conf. on PErvasive Technologies Related to Assistive Environments
N2 - Tactile materials play a major role in making information available to blind people and in supporting their understanding of spatial matters. Due to the complex manual manufacturing process, there is still a lack of suitable models for the visually impaired. Millions of 3D models are currently available on the internet and can be searched by dedicated retrieval sites. Most of them can be printed by 3D printers; however, this often is not a trivial task even for sighted users. Blind people's self-dependence could be drastically increased if they were able to autonomously print 3D models at home. This paper analyzes the individual tasks required to actually print 3D models and adapts them into steps accessible to blind people. We introduce a workflow for the combined use of 3D printing software and consumer hardware. We verified our approach by a formal user study with visually impaired people, which showed its feasibility.
Y1 - 2017
UR - https://doi.org/10.1145/3056540.3064954
SN - 978-1-4503-5227-7
SP - 1
EP - 4
PB - ACM
CY - New York, NY, USA
ER -
TY - JOUR
A1 - Götzelmann, Timo
T1 - 3D-Druck für blinde Menschen
BT - vom statischen Druck zu interaktiven Objekten
JF - Informatik Spektrum
N2 - Besides conventional tactile printing techniques for blind people, 3D printing is becoming increasingly widespread. While initial approaches aimed to achieve qualitatively similar printing results with this alternative printing technology, newer approaches exploit its potential to create interactive prints. Starting from this development, this article provides an overview of essential approaches for creating diverse tactile materials by means of 3D printers. In particular, it highlights the shift from static to interactive approaches.
For the latter, a coupling between the tactile 3D prints and electronic entities is required, which can be realized with different kinds of sensor technology. Future developments could make it possible to sense the user's interaction with the complete surface of 3D prints and thus open up complex new interaction possibilities which can be helpful to blind as well as sighted people.
Y1 - 2017
UR - https://doi.org/10.1007/s00287-017-1068-8
VL - 2017
IS - Volume 40, Issue 6
SP - 511
EP - 515
PB - Springer
ER -
TY - CHAP
A1 - Götzelmann, Timo
T1 - LucentMaps: 3D Printed Audiovisual Tactile Maps for Blind and Visually Impaired People
T2 - Proc. 18th International ACM SIGACCESS Conference on Computers and Accessibility
N2 - Tactile maps support blind and visually impaired people in orientation and in familiarizing themselves with unfamiliar environments. Interactive approaches complement these maps with auditory feedback. However, these approaches commonly focus on blind people. We present an approach which also incorporates visually impaired people by visually augmenting relevant parts of tactile maps. These audiovisual tactile maps can be used in conjunction with common tablet computers and smartphones. By integrating conductive elements into 3D printed tactile maps, the maps can be recognized by a single touch on the mobile device's display, which eases the handling for blind and visually impaired people. To allow multiple elevation levels in our transparent tactile maps, we conducted a study to reconcile the technical and physiological requirements of off-the-shelf 3D printers, capacitive touch inputs and the human tactile sense. We propose an interaction concept for 3D printed audiovisual tactile maps, verify its feasibility and test it with a user study. Our discussion includes economic considerations crucial for a broad dissemination of tactile maps for both blind and visually impaired people.
Y1 - 2016
UR - https://doi.org/10.1145/2982142.2982163
SN - 978-1-4503-4124-0
VL - 2016
SP - 90
PB - ACM
CY - New York, NY, USA
ER -
TY - CHAP
A1 - Götzelmann, Timo
A1 - Schneider, Daniel
T1 - CapCodes: Capacitive 3D Printable Identification and On-screen Tracking for Tangible Interaction
T2 - Proceedings of the 9th Nordic Conference on Human-Computer Interaction (NordiCHI'16)
N2 - Electronic markers can be used to link physical representations and virtual content for tangible interaction, such as the visual markers commonly used on tabletops. Another possibility is to leverage the capacitive touch input of smartphones, tablets and notebooks. However, existing approaches either do not couple physical and virtual representations or require significant post-processing. This paper presents and evaluates a novel approach using a coding scheme for the automatic identification of tangibles by touch input when they are touched and shifted. The codes can be generated automatically and integrated into a great variety of existing 3D models from the internet. The resulting models can then be printed completely in one cycle by off-the-shelf 3D printers; post-processing is not needed. Besides identification, the object's position and orientation can be tracked by touch devices. Our evaluation examined multiple variables and showed that CapCodes can be integrated into existing 3D models and that the approach could also be applied to untouched use for larger tangibles.
Y1 - 2016
UR - https://doi.org/10.1145/2971485.2971518
SN - 978-1-4503-4763-1
VL - 2016
PB - ACM
CY - New York, NY, USA
ER -
TY - CHAP
A1 - Ullmann, Daniela
A1 - Kreimeier, Julian
A1 - Götzelmann, Timo
A1 - Kipke, Harald
T1 - BikeVR
BT - a virtual reality bicycle simulator towards sustainable urban space and traffic planning
T2 - Proceedings of Mensch und Computer 2020
N2 - As awareness of the ongoing climate change grows, eco-friendly means of transport for all citizens are moving further into focus. In order to be able to implement specific measures, it is necessary to better understand and emphasize sustainable transportation like walking and cycling through focused research. When developing novel traffic concepts and urban spaces for non-motorized traffic participants like cyclists and pedestrians, traffic and urban planning must focus on their needs. To capture rarely considered qualitative factors (such as stress, the perception of time and the attractiveness of the environment) in this context, we present an audiovisual VR bicycle simulator which allows the user to cycle through a virtual urban environment by physically pedaling and steering. Virtual Reality (VR) is a suitable tool in this context, as study participants encounter identical and almost freely definable (virtual) urban spaces with adjustable traffic scenarios. Our preliminary prototype proved to be promising and will be further optimized and evaluated.
Y1 - 2020
SN - 978-1-4503-7540-5
U6 - https://doi.org/10.1145/3404983.3410417
PB - Association for Computing Machinery
CY - New York, NY
ER -
TY - CHAP
A1 - Götzelmann, Timo
T1 - Concept of the Joint Use of Smartphone Camera and Projector for Keyboard Inputs
N2 - The efficiency of text input on today’s smartphones is significantly limited by the small size of the virtual keyboard displayed for alphanumeric input. Future smartphones will integrate projectors which allow multimedia content as well as the smartphone’s dialogs to be projected. This paper introduces a concept for projecting the whole smartphone display onto a surface, allowing the user to enter text by interacting with the projected virtual keyboard. The projection is analyzed by standard image processing algorithms. Finally, an experimental implementation shows the feasibility of this concept.
KW - Human Computer Interaction
KW - Smartphone
KW - Projector
KW - Virtual keyboard
KW - Limited input space
KW - Mensch-Maschine-Kommunikation
KW - Smartphone
KW - Projektionsapparat
KW - Tastatur
Y1 - 2013
UR - http://iscse2013.gediz.edu.tr/
SN - 2147-9097
N1 - Link to full text: http://iscse2013.gediz.edu.tr/docs/ISCSE2013_PROCEEDINGS.pdf
PB - Gediz University Press
CY - Gediz
ER -
TY - CHAP
A1 - Dotenco, Sergiu
A1 - Götzelmann, Timo
A1 - Gallwitz, Florian
T1 - Smartphone Input Using an Integrated Projector and a Monocular Camera
T2 - Lecture Notes in Computer Science
N2 - Touch input on modern smartphones can be tedious, especially if the touchscreen is small. Smartphones with integrated projectors can be used to overcome this limitation by projecting the screen contents onto a surface, allowing the user to interact with the projection by means of simple hand gestures. In this work, we propose a novel approach for projector smartphones that allows the user to remotely interact with the smartphone screen via its projection. We detect the user's interaction using the built-in camera and forward detected hand gestures as touch input events to the operating system.
In order to avoid costly computations, we additionally use the built-in motion sensors. We verify the proposed method using an implementation for the consumer smartphone Samsung Galaxy Beam equipped with a deflection mirror.
KW - Mobile computing
KW - Touch input
KW - Projector
KW - Smartphone
KW - Limited input space
KW - App
KW - Programmierung
KW - Projektionsapparat
Y1 - 2014
UR - http://link.springer.com/chapter/10.1007/978-3-319-07227-2_13
SN - 978-3-319-07226-5
VL - Volume 8512
PB - Springer
ER -
TY - CHAP
A1 - Schäff, Christian
A1 - Pugliese, Gaston
A1 - Götzelmann, Timo
T1 - Behavior Based Web User Identification
T2 - GI-Edition / Seminars
N2 - This paper examines different approaches for the identification of users by their personal behavior and discusses techniques which could be used in the context of websites. Such web tracking approaches have the potential to identify users even if they use multiple or shared devices. For web pages, mouse and touch input are widely used. Therefore, we propose a survey to evaluate the feasibility of identifying users by their interaction behavior.
KW - User identification
KW - Web tracking
KW - User input
KW - Interaction
KW - Biometric
KW - Authentifikation
KW - Web-Seite
KW - Mensch-Maschine-Kommunikation
Y1 - 2014
SN - 978-3-88579-447-9
SN - 1614-3213
N1 - Full text available in the central library in: Gesellschaft für Informatik GI-Edition: Informatiktage 2014. ISBN 978-3-88579-447-9. Shelf mark: 001/SQ 1200-2014+1
VL - Volume S-13
PB - KöllenDruck+Verlag
CY - Bonn
ER -