WSEAS Transactions on Environment and Development


Print ISSN: 1790-5079
E-ISSN: 2224-3496

Volume 14, 2018


A Portable Device for Five Sense Augmented Reality Experiences in Museums

AUTHORS: João D. P. Sardo, João A. R. Pereira, Ricardo J. M. Veiga, Jorge Semião, Pedro J. S. Cardoso, João M. F. Rodrigues


ABSTRACT: Augmented reality (AR) is currently focused on two senses: sight and hearing. The work presented here is part of the Mobile Five Senses Augmented Reality system for Museums (M5SAR) project, whose goal is to develop an AR system that acts as a guide in museums and in cultural and historical events, complementing or replacing traditional guides, directional signs or maps, while enhancing the user’s experience by adding multisensorial information to multiple museum objects. Existing solutions of this kind either lack portability or fail to cover all the human senses at the same time. This paper presents a new device capable of extending augmented reality experiences to all five human senses, using a portable device that can reproduce stimuli of touch, taste and smell. The proposed apparatus is meant to be paired with a mobile application that controls which sensorial interface is activated and when, relaying that information to the portable device. The application, running on the user’s smartphone or tablet, sends the activation instructions via Bluetooth, using a communication protocol; these instructions are received by the device’s core microcontroller, which then activates the requested physical interfaces to deliver the multisensorial media to the user.

KEYWORDS: Augmented reality, multisensorial display, portable device, five-sense experience, museums
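The activation path described in the abstract (mobile application, Bluetooth link, core microcontroller, physical interfaces) can be illustrated with a minimal Arduino-style C++ sketch. This is a hypothetical example, not the authors’ firmware: it assumes the smartphone application sends one command byte per activation instruction over a serial Bluetooth link (for instance an HC-05-class module bridged to the microcontroller’s UART), and the command codes, pin assignments and interface wiring shown here are illustrative only.

// Hypothetical firmware sketch: read one command byte from the Bluetooth
// link and switch the corresponding sensorial interface on or off.
// Pin numbers and command codes are assumptions, not the M5SAR protocol.
const int PIN_TOUCH = 3;   // e.g., vibration/haptic driver
const int PIN_SMELL = 5;   // e.g., scent atomizer
const int PIN_TASTE = 6;   // e.g., taste interface actuator

void setup() {
  Serial.begin(9600);            // Bluetooth module bridged to the hardware UART
  pinMode(PIN_TOUCH, OUTPUT);
  pinMode(PIN_SMELL, OUTPUT);
  pinMode(PIN_TASTE, OUTPUT);
}

void loop() {
  if (Serial.available() > 0) {
    char cmd = Serial.read();    // one byte per activation instruction
    switch (cmd) {
      case 'T': digitalWrite(PIN_TOUCH, HIGH); break;  // touch on
      case 't': digitalWrite(PIN_TOUCH, LOW);  break;  // touch off
      case 'S': digitalWrite(PIN_SMELL, HIGH); break;  // smell on
      case 's': digitalWrite(PIN_SMELL, LOW);  break;  // smell off
      case 'G': digitalWrite(PIN_TASTE, HIGH); break;  // taste on
      case 'g': digitalWrite(PIN_TASTE, LOW);  break;  // taste off
      default:  break;                                 // ignore unknown bytes
    }
  }
}

A real protocol would likely add message framing, checksums and acknowledgements; the single-byte scheme above is only meant to show where the Bluetooth-received instruction meets the device’s physical interfaces.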


WSEAS Transactions on Environment and Development, ISSN / E-ISSN: 1790-5079 / 2224-3496, Volume 14, 2018, Art. #37, pp. 347-362


Copyright © 2018. The author(s) retain the copyright of this article. This article is published under the terms of the Creative Commons Attribution License 4.0.
