TY - CHAP
A1 - Götzelmann, Timo
T1 - 3D Printable Hand Exoskeleton for the Haptic Exploration of Virtual 3D Scenes
T2 - Proc. 10th Int. Conf. on PErvasive Technologies Related to Assistive Environments
N2 - Virtual reality is currently experiencing a comeback. A considerable market has developed for VR computer games and educational applications. Some solutions integrate tracked devices which allow users to move freely within a certain space. Virtual 3D models can be explored visually, and the implemented collision detection gives users feedback, for instance by sound or vibration. Several research approaches provide actual feedback to the fingers of a hand when the user virtually touches the surface of a 3D model. However, no product currently sold on the consumer market offers this direct feedback for the whole hand. In this paper we introduce a low-cost hand exoskeleton which is usable in conjunction with commodity hardware. It covers each of the five fingers of the user's hand; its design is open source, low cost, and can be customized and 3D printed by individuals. It aims at improving the haptic perception of users, is based on a popular physical computing platform, and is designed to be assembled even by electronically inexperienced users. We show the integration of the wireless exoskeleton's lean interface into an exemplary VR environment and describe a calibration process which is flexible for customizations.
Y1 - 2017
UR - https://doi.org/10.1145/3056540.3064950
SN - 978-1-4503-5227-7
VL - 2017
SP - 63
EP - 66
PB - ACM
CY - New York, NY, USA
ER -
TY - CHAP
A1 - Götzelmann, Timo
A1 - Branz, Lisa
A1 - Heidenreich, Claudia
A1 - Otto, Markus
T1 - A Personal Computer-based Approach for 3D-Printing Accessible to Blind People
T2 - Proc. 10th Int. Conf. on PErvasive Technologies Related to Assistive Environments
N2 - Tactile materials play a major role in making information available to blind people and support their understanding of spatial matters. Due to the complex manual manufacturing process, there is still a lack of suitable models for the visually impaired. Millions of 3D models are currently available on the internet and can be searched via dedicated retrieval sites. Most of them can be printed on 3D printers; however, this often is not a trivial task even for sighted users. Blind people's self-dependence could be drastically increased if they were able to autonomously print 3D models at home. This paper analyses the individual tasks required to actually print 3D models and adapts them into steps accessible to blind people. We introduce a workflow for the combined use of 3D printing software and consumer hardware. We verified our approach by a formal user study with visually impaired people which showed its feasibility.
Y1 - 2017
UR - https://doi.org/10.1145/3056540.3064954
SN - 978-1-4503-5227-7
SP - 1
EP - 4
PB - ACM
CY - New York, NY, USA
ER -
TY - CHAP
A1 - Götzelmann, Timo
T1 - LucentMaps: 3D Printed Audiovisual Tactile Maps for Blind and Visually Impaired People
T2 - Proc. 18th International ACM SIGACCESS Conference on Computers and Accessibility
N2 - Tactile maps support blind and visually impaired people in orientation and in familiarizing themselves with unfamiliar environments. Interactive approaches complement these maps with auditory feedback. However, these approaches commonly focus on blind people. We present an approach which includes visually impaired people by visually augmenting relevant parts of tactile maps.
These audiovisual tactile maps can be used in conjunction with common tablet computers and smartphones. By integrating conductive elements into 3D printed tactile maps, map elements can be recognized by a single touch on the mobile device's display, which eases the handling for blind and visually impaired people. To allow multiple elevation levels in our transparent tactile maps, we conducted a study to reconcile the technical and physiological requirements of off-the-shelf 3D printers, capacitive touch inputs and the human tactile sense. We propose an interaction concept for 3D printed audiovisual tactile maps, verify its feasibility and test it in a user study. Our discussion includes economic considerations crucial for a broad dissemination of tactile maps for both blind and visually impaired people.
Y1 - 2016
UR - https://doi.org/10.1145/2982142.2982163
SN - 978-1-4503-4124-0
VL - 2016
SP - 90
PB - ACM
CY - New York, NY, USA
ER -
TY - CHAP
A1 - Götzelmann, Timo
A1 - Schneider, Daniel
T1 - CapCodes: Capacitive 3D Printable Identification and On-screen Tracking for Tangible Interaction
T2 - Proceedings of the 9th Nordic Conference on Human-Computer Interaction (NordiCHI'16)
N2 - Electronic markers can be used to link physical representations and virtual content for tangible interaction, such as the visual markers commonly used for tabletops. Another possibility is to leverage the capacitive touch inputs of smartphones, tablets and notebooks. However, existing approaches either do not couple physical and virtual representations or require significant post-processing. This paper presents and evaluates a novel approach using a coding scheme for the automatic identification of tangibles by touch inputs when they are touched and shifted. The codes can be generated automatically and integrated into a great variety of existing 3D models from the internet. The resulting models can then be printed in a single cycle by off-the-shelf 3D printers; no post-processing is needed. Besides identification, the object's position and orientation can be tracked by touch devices. Our evaluation examined multiple variables and showed that CapCodes can be integrated into existing 3D models and that the approach could also be applied to untouched use for larger tangibles.
Y1 - 2016
UR - https://doi.org/10.1145/2971485.2971518
SN - 978-1-4503-4763-1
VL - 2016
PB - ACM
CY - New York, NY, USA
ER -
TY - CHAP
A1 - Ullmann, Daniela
A1 - Kreimeier, Julian
A1 - Götzelmann, Timo
A1 - Kipke, Harald
T1 - BikeVR
BT - a virtual reality bicycle simulator towards sustainable urban space and traffic planning
T2 - Proceedings of Mensch und Computer 2020
N2 - As awareness of the ongoing climate change grows, eco-friendly means of transport for all citizens are moving further into focus. In order to implement specific measures, it is necessary to better understand and emphasize sustainable transportation like walking and cycling through focused research. When developing novel traffic concepts and urban spaces for non-motorized traffic participants like cyclists and pedestrians, traffic and urban planning must focus on their needs. To capture rarely studied qualitative factors (such as stress, the perception of time and the attractiveness of the environment) in this context, we present an audiovisual VR bicycle simulator which allows the user to cycle through a virtual urban environment by physically pedaling and steering.
Virtual Reality (VR) is a suitable tool in this context, as study participants encounter identical and almost freely definable (virtual) urban spaces with adjustable traffic scenarios. Our preliminary prototype proved to be promising and will be further optimized and evaluated.
Y1 - 2020
SN - 978-1-4503-7540-5
UR - https://doi.org/10.1145/3404983.3410417
PB - ACM
CY - New York, NY, USA
ER -
TY - CHAP
A1 - Götzelmann, Timo
T1 - Concept of the Joint Use of Smartphone Camera and Projector for Keyboard Inputs
N2 - The efficiency of text input on today's smartphones is significantly limited by the small size of the virtual keyboard displayed for alphanumeric input. Future smartphones will integrate projectors which allow projecting multimedia content as well as the smartphone's dialogs. This paper introduces a concept for projecting the whole smartphone display onto a surface, allowing the user to enter text by interacting with the projection of the virtual keyboard. This projection is analyzed by standard image processing algorithms. Finally, an experimental implementation shows the feasibility of the concept.
KW - Human Computer Interaction
KW - Smartphone
KW - Projector
KW - Virtual keyboard
KW - Limited input space
KW - Keyboard
Y1 - 2013
UR - http://iscse2013.gediz.edu.tr/
SN - 2147-9097
N1 - Full text: http://iscse2013.gediz.edu.tr/docs/ISCSE2013_PROCEEDINGS.pdf
PB - Gediz University Press
CY - Gediz
ER -
TY - CHAP
A1 - Dotenco, Sergiu
A1 - Götzelmann, Timo
A1 - Gallwitz, Florian
T1 - Smartphone Input Using an Integrated Projector and a Monocular Camera
T2 - Lecture Notes in Computer Science
N2 - Touch input on modern smartphones can be tedious, especially if the touchscreen is small. Smartphones with integrated projectors can be used to overcome this limitation by projecting the screen contents onto a surface, allowing the user to interact with the projection by means of simple hand gestures. In this work, we propose a novel approach for projector smartphones that allows the user to remotely interact with the smartphone screen via its projection. We detect the user's interaction using the built-in camera and forward detected hand gestures as touch input events to the operating system. In order to avoid costly computations, we additionally use the built-in motion sensors. We verify the proposed method using an implementation for the consumer smartphone Samsung Galaxy Beam equipped with a deflection mirror.
KW - Mobile computing
KW - Touch input
KW - Projector
KW - Smartphone
KW - Limited input space
KW - App
KW - Programming
Y1 - 2014
UR - http://link.springer.com/chapter/10.1007/978-3-319-07227-2_13
SN - 978-3-319-07226-5
VL - 8512
PB - Springer
ER -
TY - CHAP
A1 - Schäff, Christian
A1 - Pugliese, Gaston
A1 - Götzelmann, Timo
T1 - Behavior Based Web User Identification
T2 - GI-Edition / Seminars
N2 - This paper examines different approaches for the identification of users by their personal behavior and discusses techniques which could be used in the context of websites. Such web tracking approaches have the potential to identify users even if they use multiple or shared devices. For web pages, mouse and touch input are widely used. Therefore, we propose a survey to evaluate the feasibility of identifying users by their interaction behavior.
KW - User identification
KW - Web tracking
KW - User input
KW - Interaction
KW - Biometric
KW - Authentication
KW - Web page
KW - Human-machine communication
Y1 - 2014
SN - 978-3-88579-447-9
SN - 1614-3213
N1 - Full text available in the central library in: Gesellschaft für Informatik, GI-Edition: Informatiktage 2014. ISBN 978-3-88579-447-9. Shelf mark: 001/SQ 1200-2014+1
VL - S-13
PB - Köllen Druck+Verlag
CY - Bonn
ER -