TY - CHAP
A1 - Herder, Jens
A1 - Takeda, Shinpei
A1 - Vermeegen, Kai
A1 - Davin, Till
A1 - Berners, Dominique
A1 - Ryskeldiev, Bektur
A1 - Zimmer, Christian
A1 - Druzetic, Ivana
A1 - Geiger, Christian
T1 - Mixed Reality Art Experiments - Immersive Access to Collective Memories
T2 - ISEA2019, Proceedings, 25th International Symposium on Electronic Art, Gwangju, South Korea, June 22-28, 2019
N2 - We report on several experiments applying mixed reality technology to accessing collective memories of the atomic bombs, the Holocaust, and the Second World War. We discuss the impact of Virtual Reality, Augmented Virtuality, and Augmented Reality for specific memorial locations. We show how a virtual studio can be used to demonstrate an augmented reality application for a specific location in a remote session within a video conference. Augmented Virtuality is used to recreate the local environment, thus providing context and helping participants recollect emotions related to a certain place. This technique demonstrates the advantages of using virtual reality (VR) and augmented reality (AR) environments for rapid prototyping and pitching project ideas in a live remote setting.
KW - Mixed Reality
KW - Augmented Reality
KW - Augmented Virtuality
KW - Rapid Prototyping
KW - Video Conferencing
Y1 - 2019
UR - http://www.isea-archives.org/docs/2019/ISEA2019_Proceedings.pdf
SP - 334
EP - 341
PB - ISEA
CY - Gwangju
ER -
TY - CHAP
A1 - Brettschneider, Nico
A1 - Herder, Jens
A1 - de Mooij, Jeroen
A1 - Ryskeldiev, Bektur
ED - Herder, Jens
T1 - Audio vs. Visual Avatars as Guides in Virtual Environments
T2 - 21st International Conference on Human and Computer, HC-2018, March 27–28, 2019, Shizuoka University, Hamamatsu, Japan
N2 - Through constant technical progress, multi-user virtual reality is transforming into a social activity that is no longer used only by remote users, but also in large-scale location-based experiences. We evaluate the use of realtime-tracked avatars in co-located business-oriented applications in a "guide-user scenario" in comparison to audio-only instructions. The present study examined the effect of an avatar guide on the user-related factors of Spatial Presence, Social Presence, User Experience, and Task Load in order to propose design guidelines for co-located collaborative immersive virtual environments. To this end, an application was developed and a user study with 40 participants was conducted to compare both guiding techniques, a realtime-tracked avatar guide and a non-visualised guide, under otherwise constant conditions. Results reveal that the avatar guide enhanced and stimulated communicative processes while facilitating interaction possibilities and creating a higher sense of mental immersion for users. Furthermore, the avatar guide appeared to make the storyline more engaging and exciting while helping users adapt to the medium of virtual reality. Even though no assertion could be made concerning the Task Load factor, the avatar guide achieved a higher subjective value on User Experience. Based on these results, avatars can be considered valuable social elements in the design of future co-located collaborative virtual environments.
KW - Virtual Reality
KW - Co-located Collaborations
KW - Networked Immersive Virtual Environments
KW - Head-mounted Display
KW - Avatars
KW - Lehre
Y1 - 2019
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:hbz:due62-opus-23859
UR - https://vsvr.medien.hs-duesseldorf.de/publications/hc2018-avatar/
PB - Hochschule Düsseldorf
CY - Düsseldorf
ER -
TY - CHAP
A1 - Herder, Jens
A1 - Brettschneider, Nico
A1 - de Mooij, Jeroen
A1 - Ryskeldiev, Bektur
T1 - Avatars for Co-located Collaborations in HMD-based Virtual Environments
T2 - IEEE VR 2019, 26th IEEE Conference on Virtual Reality and 3D User Interfaces, Osaka, March 2019
N2 - Multi-user virtual reality is transforming into a social activity that is no longer used only by remote users, but also in large-scale location-based experiences. The use of realtime-tracked avatars in co-located business-oriented applications with a "guide-user scenario" is examined with respect to the user-related factors of Spatial Presence, Social Presence, User Experience, and Task Load. A user study was conducted to compare both techniques, a realtime-tracked avatar guide and a non-visualised guide. Results reveal that the avatar guide enhanced and stimulated communicative processes while facilitating interaction possibilities and creating a higher sense of mental immersion and engagement for users.
KW - Virtual Reality
KW - Co-located Collaborations
KW - Head-mounted Display
KW - Avatars
KW - Social Presence
KW - Lehre
Y1 - 2019
UR - https://vsvr.medien.hs-duesseldorf.de/publications/ieeevr2019/
U6 - https://doi.org/10.1109/VR.2019.8798132
SP - 968
EP - 969
PB - IEEE
CY - Osaka
ER -
TY - CHAP
A1 - Ryskeldiev, Bektur
A1 - Igarashi, Toshiharu
A1 - Zhang, Junjian
A1 - Ochiai, Yoichi
A1 - Cohen, Michael
A1 - Herder, Jens
T1 - Spotility: Crowdsourced Telepresence for Social and Collaborative Experiences in Mobile Mixed Reality
T2 - ACM Conference on Computer Supported Cooperative Work and Social Computing (CSCW '18)
N2 - Live video streaming is becoming increasingly popular as a form of interaction in social applications. One of its main advantages is the ability to immediately create and connect a community of remote users on the spot. In this paper we discuss how this feature can be used for crowdsourced completion of simple visual search tasks (such as finding specific objects in libraries and stores, or navigating around live events) and for social interactions through mobile mixed reality telepresence interfaces. We present a prototype application that allows users to create a mixed reality space with photospherical imagery as a background and to interact with other connected users through viewpoint, audio, and video sharing, as well as realtime annotations in the mixed reality space. Believing in the novelty of our system, we conducted a short series of interviews with industry professionals on possible applications of our system. We discuss proposed use-cases for user evaluation, as well as outline future extensions of our system.
KW - Groupware
KW - Mixed reality
KW - Mobile computing
KW - Remote collaboration
KW - Spatial media
KW - M
KW - FHD
Y1 - 2018
UR - http://vsvr.medien.hs-duesseldorf.de/publications/cscw2018-spotility-abstract.html
SN - 978-1-4503-6018-0
U6 - https://doi.org/10.1145/3272973.3274100
N1 - Full text available via the ACM Digital Library
SP - 373
EP - 376
PB - ACM
CY - New York
ER -
TY - CHAP
A1 - Ryskeldiev, Bektur
A1 - Ochiai, Yoichi
A1 - Cohen, Michael
A1 - Herder, Jens
T1 - Distributed Metaverse: Creating Decentralized Blockchain-based Model for Peer-to-peer Sharing of Virtual Spaces for Mixed Reality Applications
T2 - Proceedings of the 9th Augmented Human International Conference
N2 - Mixed reality telepresence is becoming an increasingly popular form of interaction in social and collaborative applications. We are interested in how created virtual spaces can be archived, mapped, shared, and reused among different applications. Therefore, we propose a decentralized blockchain-based peer-to-peer model of distribution, with virtual spaces represented as blocks. We demonstrate the integration of our system in a collaborative mixed reality application and discuss the benefits and limitations of our approach.
KW - Blockchain
KW - Groupware
KW - Mixed Reality
KW - Mobile Computing
KW - Photospherical Imagery
Y1 - 2018
UR - http://vsvr.medien.hs-duesseldorf.de/publications/ah2018-blockchain-streamspace-abstract.html
SN - 978-1-4503-5415-8
U6 - https://doi.org/10.1145/3174910.3174952
SP - 7
EP - 9
PB - ACM
ER -
TY - JOUR
A1 - Ryskeldiev, Bektur
A1 - Cohen, Michael
A1 - Herder, Jens
T1 - StreamSpace: Pervasive Mixed Reality Telepresence for Remote Collaboration on Mobile Devices
JF - Journal of Information Processing
N2 - We present a system that exploits mobile rotational tracking and photospherical imagery to allow users to share their environment with remotely connected peers “on the go.” We surveyed related interfaces and developed a unique groupware application that shares a mixed reality space with spatially oriented live video feeds. Users can collaborate through realtime audio, video, and drawings in a virtual space. The developed system was tested in a preliminary user study, which confirmed an increase in spatial and situational awareness among viewers as well as a reduction in cognitive workload. Believing that our system provides a novel style of collaboration in mixed reality environments, we discuss future applications and extensions of our prototype.
KW - Ubiquitous Computing
KW - Telepresence
KW - Remote Collaboration
KW - Mixed Reality
KW - Live Video Streaming
Y1 - 2018
U6 - https://doi.org/10.2197/ipsjjip.26.177
VL - 26
SP - 177
EP - 185
PB - J-STAGE
ER -
TY - CHAP
A1 - Herder, Jens
A1 - Ladwig, Philipp
A1 - Vermeegen, Kai
A1 - Hergert, Dennis
A1 - Busch, Florian
A1 - Klever, Kevin
A1 - Holthausen, Sebastian
A1 - Ryskeldiev, Bektur
T1 - Mixed Reality Experience - How to Use a Virtual (TV) Studio for Demonstration of Virtual Reality Applications
T2 - GRAPP 2018 - 13th International Conference on Computer Graphics Theory and Applications
N2 - The article discusses the question of “How to convey the experience in a virtual environment to third parties?” and explains the different technical implementations that can be used for live streaming and recording of a mixed reality experience. The real-world applications of our approach include education, entertainment, e-sports, tutorials, and cinematic trailers, all of which can benefit from our research by finding a suitable solution for their needs. We explain and outline our mixed reality systems and discuss the experience of recorded demonstrations of different VR applications, including the need for calibrated camera lens parameters based on realtime encoder values.
KW - Virtual Reality
KW - Mixed Reality
KW - Augmented Virtuality
KW - Virtual (TV) Studio
KW - Camera Tracking
KW - Lehre
Y1 - 2018
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:hbz:due62-opus-15823
UR - http://vsvr.medien.hs-duesseldorf.de/publications/grapp2018-mr-openvr-abstract.html
SN - 978-989-758-287-5
N1 - Secondary publication - the original publication is available at: http://www.scitepress.org/DigitalLibrary
SP - 281
EP - 287
PB - INSTICC
CY - Setúbal, Portugal
ER -
TY - CHAP
A1 - Ryskeldiev, Bektur
A1 - Cohen, Michael
A1 - Herder, Jens
T1 - Applying rotational tracking and photospherical imagery to immersive mobile telepresence and live video streaming groupware
T2 - Proceedings SA '17, SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications, Article No. 5
N2 - Mobile live video streaming is becoming an increasingly popular form of interaction both in social media and in remote collaboration scenarios. However, in most cases the streamed video does not take mobile devices' spatial data into account (e.g., the viewers do not know the spatial orientation of a streamer), or uses such data only in specific scenarios (e.g., to navigate around a spherical video stream).
KW - spatial media
KW - rotational tracking
KW - mixed reality
KW - live streaming
KW - social media
KW - telepresence
KW - mobile computing
KW - groupware
KW - photospherical imagery
Y1 - 2017
UR - https://dl.acm.org/citation.cfm?doid=3132787.3132813
SN - 978-1-4503-5410-3
U6 - https://doi.org/10.1145/3132787.3132813
PB - ACM
CY - New York
ER -