CONDROID: USING AN ANDROID PHONE AS A 3D INPUT DEVICE FOR A MULTIUSER 3D DRAWING APPLICATION SETUP IN A COLLABORATIVE VIRTUAL ENVIRONMENT OVER THE WEB
Keywords:
Collaborative virtual environment, 3D gesture recognition, Accelerometers, Android application, Semaphores, Remote synchronization, 3D drawing application

Abstract
3D collaborative virtual environments (CVEs) are gaining popularity. One problem with 3D CVEs is the lack of a natural 3D input device. Smartphones are increasingly being used as 3D input devices for a variety of purposes. We present ConDroid, a system that uses a smartphone as a 3D input device for a 3D drawing application set up in a collaborative virtual environment over the web, with support for multiple simultaneous users and for large or small displays. We tackle the problem of remote synchronization in CVEs using two concepts we call SUMD (synchronicity using minimal data) and remote semaphores with deadlock avoidance. For the CVE architecture, we build on the active replication model, in which an atomic broadcast delivers updates to all clients to keep them synchronized, and apply the SUMD and remote-semaphore approach on top of it. Our work comprises an Android application for 3D input, a 3D drawing Windows application projected on a large (or small) display with remote collaboration capability, and a middleware server application. Our approach can be adopted by others to use smartphones as 3D input devices for computers, as well as to design CVEs with quick and effective synchronization using the concepts of SUMD and remote semaphores.
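As a rough illustration of the remote-semaphore and SUMD ideas described above, the sketch below shows how a client might request exclusive access to a shared 3D object from the middleware server, stream only minimal deltas while drawing, and release the lock when done; a timeout on the grant is one way to avoid deadlock when a lock holder disconnects. The class name, message format, and timeout handling are hypothetical and are not taken from the paper.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.Socket;

/**
 * Hypothetical client-side helper for a "remote semaphore":
 * the middleware server grants the lock on a shared 3D object
 * to one client at a time, and a timeout on the grant avoids
 * deadlock if a client disconnects while holding the lock.
 */
public class RemoteSemaphoreClient {
    private final Socket socket;
    private final PrintWriter out;
    private final BufferedReader in;

    public RemoteSemaphoreClient(String host, int port) throws Exception {
        socket = new Socket(host, port);
        out = new PrintWriter(socket.getOutputStream(), true);
        in = new BufferedReader(new InputStreamReader(socket.getInputStream()));
    }

    /** Ask the server for the lock on objectId; returns true if granted. */
    public boolean acquire(String objectId, long timeoutMs) throws Exception {
        out.println("ACQUIRE " + objectId + " " + timeoutMs);
        return "GRANTED".equals(in.readLine());
    }

    /** Release the lock so other clients can edit the object. */
    public void release(String objectId) {
        out.println("RELEASE " + objectId);
    }

    /** Send only a minimal delta (SUMD): the object id plus the changed coordinates. */
    public void sendDelta(String objectId, float dx, float dy, float dz) {
        out.println("DELTA " + objectId + " " + dx + " " + dy + " " + dz);
    }
}

In this sketch, a client would call acquire before starting a stroke on a shared object, send small DELTA updates while drawing, and call release when finished; the server would then atomically broadcast each delta to the other clients, in the spirit of the active replication model.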