Designing natural gesture interaction for archaeological data in immersive environments

Authors

  • Niccolò Albertini Scuola Normale Superiore
  • Andrea Brogni Scuola Normale Superiore https://orcid.org/0000-0001-8714-8160
  • Riccardo Olivito Scuola Normale Superiore
  • Emanuele Taccola Università di Pisa
  • Baptiste Caramiaux Goldsmiths, University of London
  • Marco Gillies Goldsmiths, University of London

DOI:

https://doi.org/10.4995/var.2017.5872

Keywords:

Cyber-Archaeology, Gesture Recognition, Virtual Reality

Abstract

Archaeological data are heterogeneous, making it difficult to correlate and combine different types. Datasheets and pictures, stratigraphic data and 3D models, time and space mixed together: these are only a few of the categories a researcher has to deal with. New technologies may be able to help in this process, and solving research-related problems calls for innovative solutions. In this paper, we describe the whole process of designing and developing a prototype application that uses an immersive Virtual Reality system to access archaeological excavation 3D data through the Gesture Variation Follower (GVF) algorithm. This makes it possible to recognise which gesture is being performed and how it is performed. Archaeologists have participated actively in the design of the interface and of the set of gestures used for triggering the different tasks. Interactive machine learning techniques have been used for the real-time detection of the gestures. As a case study, the agora of Segesta (Sicily, Italy) has been selected. Due to its complex architectural features and the still ongoing fieldwork activities, Segesta represents an ideal context in which to test and develop a research approach integrating both traditional and more innovative tools and methods.
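To make the GVF idea concrete for readers unfamiliar with it, the sketch below illustrates the core mechanism described by Caramiaux et al. (2014): a particle filter in which each particle hypothesises which recorded gesture template is being performed and at what phase, speed, and scale, updating these hypotheses as motion samples arrive. This is a minimal illustrative sketch, not the authors' implementation: the 1-D feature space, the class and variable names (e.g. GestureFollower), and all noise constants are assumptions.

```python
import numpy as np

class GestureFollower:
    """Toy GVF-style follower: tracks which template is being performed
    and how (phase, speed, scale) with a particle filter."""

    def __init__(self, templates, n_particles=500, seed=0):
        self.templates = [np.asarray(t, dtype=float) for t in templates]
        self.n = n_particles
        self.rng = np.random.default_rng(seed)
        # Per-particle state: template id, phase in [0, 1], speed, scale.
        self.g = self.rng.integers(0, len(self.templates), self.n)
        self.phase = np.zeros(self.n)
        self.speed = np.ones(self.n)
        self.scale = np.ones(self.n)
        self.w = np.full(self.n, 1.0 / self.n)

    def _sample(self, gi, ph):
        # Value of template gi at normalised phase ph (linear interpolation).
        t = self.templates[gi]
        return np.interp(ph * (len(t) - 1), np.arange(len(t)), t)

    def step(self, obs, dt=0.01, sigma_obs=0.1):
        # 1) Propagate: advance phase, jitter speed and scale.
        self.speed += self.rng.normal(0.0, 0.01, self.n)
        self.scale += self.rng.normal(0.0, 0.01, self.n)
        self.phase = np.clip(self.phase + self.speed * dt, 0.0, 1.0)
        # 2) Weight: how well does each particle predict the new sample?
        pred = np.array([self.scale[i] * self._sample(self.g[i], self.phase[i])
                         for i in range(self.n)])
        self.w *= np.exp(-0.5 * ((obs - pred) / sigma_obs) ** 2)
        self.w /= self.w.sum() + 1e-300
        # 3) Resample when the effective sample size collapses.
        if 1.0 / np.sum(self.w ** 2) < self.n / 2:
            idx = self.rng.choice(self.n, size=self.n, p=self.w)
            for name in ("g", "phase", "speed", "scale"):
                setattr(self, name, getattr(self, name)[idx])
            self.w[:] = 1.0 / self.n

    def estimate(self):
        # Posterior over templates, plus weighted means of the variations.
        probs = np.bincount(self.g, weights=self.w,
                            minlength=len(self.templates))
        return probs, float(self.w @ self.phase), float(self.w @ self.speed)
```

With two recorded 1-D templates, e.g. follower = GestureFollower([circle_curve, swipe_curve]), feeding each incoming motion sample to follower.step(x) and reading follower.estimate() yields both the most likely gesture and its current phase and speed: this joint output is what lets an interface react not only to which gesture is performed but also to how it is performed.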

References

Anthony, L., & Wobbrock, J.O. (2010). A lightweight multistroke recognizer for user interface prototypes. Proceedings of Graphics Interface 2010, 245–252. http://doi.acm.org/10.1145/4713060.1839258

Bau, O., & Mackay, W.E. (2008). OctoPocus: a dynamic guide for learning gesture-based command sets. Proceedings of the 21st annual ACM symposium on User interface software and technology, 37–46. http://doi.org/10.1145/1449715.1449724

Bevilacqua, F., Zamborlin, B., Sypniewski, A., Schnell, N., Guédy, F., & Rasamimanana, N. (2010). Continuous realtime gesture following and recognition. In S. Kopp & I. Wachsmuth (Eds.), Gesture in Embodied Communication and Human-Computer Interaction: 8th International Gesture Workshop, GW 2009, Bielefeld, Germany, February 25-27, 2009, Revised Selected Papers (pp. 73–84). Berlin, Heidelberg: Springer Berlin Heidelberg. http://doi.org/10.1007/978-3-642-12553-9_7

Bragdon, A., Nelson, E., Li, Y., & Hinckley, K. (2011). Experimental analysis of touch-screen gesture designs in mobile environments. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 403–412. http://doi.org/10.1145/1978942.1979000

Brickmann, J., Exner, E.T., Keil, M., & Marhöfer, J.R. (2000). Molecular Graphics – Trends and Perspectives. Molecular Modeling Annual, 6(2), 328–340. http://doi.org/10.1007/s0089400060328

Brogni, A., Caldwell, D.G., & Slater, M. (2011). Touching Sharp Virtual Objects Produces a Haptic Illusion. In R. Shumaker (Ed.), Virtual and Mixed Reality – New Trends: International Conference, Virtual and Mixed Reality 2011, Held as Part of HCI International 2011, Orlando, FL, USA, July 9-14, 2011, Proceedings, Part I (pp. 234–242). Berlin, Heidelberg: Springer Berlin Heidelberg. http://doi.org/10.1007/978-3-642-22021-0_26

Buxton, B. (2010). Sketching user experiences: getting the design right and the right design. San Francisco, CA: Morgan Kaufmann.

Caramiaux, B., Altavilla, A., Pobiner, S.G., & Tanaka, A. (2015). Form follows sound: designing interactions from sonic memories. Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, 3943–3952. http://doi.org/10.1145/2702123.2702515

Caramiaux, B., Montecchio, N., Tanaka, A., & Bevilacqua, F. (2014). Adaptive Gesture Recognition with Variation Estimation for Interactive Systems. ACM Trans. Interact. Intell. Syst., 4(4), 1–34. http://doi.org/10.1145/2643204

Cruz-Neira, C., Sandin, D.J., DeFanti, T.A., Kenyon, R.V., & Hart, J.C. (1992). The CAVE: audio visual experience automatic virtual environment. Communications of the ACM, 35(6), 64–72. http://doi.org/10.1145/129888.129892

Fdili Alaoui, S., Caramiaux, B., Serrano, M., & Bevilacqua, F. (2012). Movement qualities as interaction modality. Proceedings of the Designing Interactive Systems Conference, 761–769. http://doi.org/10.1145/2317956.2318071

Fiebrink, R., Cook, P.R., & Trueman, D. (2011). Human model evaluation in interactive supervised learning. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 147–156. http://doi.org/10.1145/1978942.1978965

Forte, M. (2010). Cyber-archaeology. Oxford, England: Archaeopress.

Forte, M. (2014). 3D archaeology. New perspectives and challenges. The example of Çatalhöyük. Journal of Eastern Mediterranean Archaeology and Heritage Studies, 2(1), 1–29. http://doi.org/10.13140/2.1.3285.0568

Forte, M., & Siliotti, A. (1997). Virtual archaeology: re-creating ancient worlds. London: Harry N. Abrams B.V.

Gillies, M., Kleinsmith, A., & Brenton, H. (2015). Applying the CASSM framework to improving end user debugging of interactive machine learning. Proceedings of the 20th International Conference on Intelligent User Interfaces, 181–185. http://doi.org/10.1145/2678025.2701373

Grossman, T., & Balakrishnan, R. (2005). A probabilistic approach to modeling two-dimensional pointing. ACM Transactions on Computer-Human Interaction, 12(3), 435–459. http://doi.org/10.1145/1096737.1096741

Kane, S.K., Wobbrock, J.O., & Ladner, R.E. (2011). Usable gestures for blind people: understanding preference and performance. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 413–422. http://doi.org/10.1145/1978942.1979001

Kirsh, D. (2013). Embodied cognition and the magical future of interaction design. ACM Transactions on Computer-Human Interaction, 20(1), 1–30. http://doi.org/10.1145/2442106.2442109

Kratz, L., Morris, D., & Saponas, T.S. (2012). Making gestural input from arm-worn inertial sensors more practical. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 1747–1750. http://doi.org/10.1145/2207676.2208304

Lackner, J.R. (1988). Some proprioceptive influences on the perceptual representation of body shape and orientation. Brain, 111(2), 281–297. http://dx.doi.org/10.1093/brain/111.2.281

Long, A.C., Landay, J.A., Rowe, L.A., & Michiels, J. (2000). Visual similarity of pen gestures. Proceedings of the SIGCHI conference on Human Factors in Computing Systems, 360–367. http://doi.org/10.1145/332040.332458

Lucchese, G., Field, M., Ho, J., Gutierrez-Osuna, R., & Hammond, T. (2012). GestureCommander: continuous touch-based gesture prediction. CHI '12 Extended Abstracts on Human Factors in Computing Systems, 1925–1930. http://doi.org/10.1145/2212776.2223730

Mackay, W.E., & Fayard, A.L. (1999). Video brainstorming and prototyping: techniques for participatory design. CHI '99 Extended Abstracts on Human Factors in Computing Systems, 118–119. http://doi.org/10.1145/632716.632790

Mori, A., Uchida, S., Kurazume, R., Taniguchi, R., Hasegawa, T., & Sakoe, H. (2006). Early recognition and prediction of gestures. Proceedings of the 18th International Conference on Pattern Recognition (ICPR 2006), 560–563. http://doi.org/10.1109/ICPR.2006.467

Muller, M.J., & Kuhn, S. (1993). Participatory design. Communications of the ACM, 36(6), 24–28. http://doi.org/10.1145/153571.255960

Nielsen, M., Störring, M., Moeslund, T.B., & Granum, E. (2004). A Procedure for Developing Intuitive and Ergonomic Gesture Interfaces for HCI. In A. Camurri & G. Volpe (Eds.), Gesture-Based Communication in Human-Computer Interaction: 5th International Gesture Workshop, GW 2003, Genova, Italy, April 15-17, 2003, Selected Revised Papers (pp. 409–420). Berlin, Heidelberg: Springer Berlin Heidelberg. http://doi.org/10.1007/978-3-540-24598-8_38

Nieuwenhuizen, K., Aliakseyeu, D., & Martens, J.-B. (2009). Insight into Goal-Directed Movements: Beyond Fitts' Law. In T. Gross, J. Gulliksen, P. Kotzé, L. Oestreicher, P. Palanque, R. O. Prates & M. Winckler (Eds.), Human-Computer Interaction – INTERACT 2009: 12th IFIP TC 13 International Conference, Uppsala, Sweden, August 24-28, 2009, Proceedings, Part I (pp. 274–287). Berlin, Heidelberg: Springer Berlin Heidelberg. http://doi.org/10.1145/1753326.1753457

Norman, D.A. (1990). The design of everyday things (1st Doubleday/Currency Ed.). New York: Doubleday.

Olivito, R., Taccola, E., & Albertini, N. (2015). A hand-free solution for the interaction in an immersive virtual environment: the case of the agora of Segesta. The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, 40-5/W4, 31–36. http://doi.org/10.5194/isprsarchives-XL-5-W4-31-2015

Ouyang, T., & Li, Y. (2012). Bootstrapping personal gesture shortcuts with the wisdom of the crowd and handwriting recognition. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 2895–2904. http://doi.org/10.1145/2207676.2208695

Pietroni, E., & Pescarin, S. (2010). VR cooperative environments for the interpretation and reconstruction of the archaeological landscape. Virtual Archaeology Review, 1(2), 25–29. http://dx.doi.org/10.4995/var.2010.4680

Pietroni, E., & Rufa, C. (2012). Natural interaction in Virtual Environments for Cultural Heritage: Giotto in 3D and Etruscanning study case. Virtual Archaeology Review, 3(7), 86–91. http://dx.doi.org/10.4995/var.2012.4394

Tanaka, A., Bau, O., & Mackay, W. (2013). The A20: Interactive instrument techniques for sonic design exploration. In K. Franinović & S. Serafin (Eds.), Sonic Interaction Design (pp. 255–270). Cambridge, MA: MIT Press.

Taylor, J.L. (2009). Proprioception. Encyclopedia of Neuroscience (pp. 1143–1149). Oxford: Academic Press. http://dx.doi.org/10.1016/B978-008045046-9.01907-0

Varona, J., Jaume-i-Capó, A., Gonzàlez, J., & Perales, F. J. (2009). Toward natural interaction through visual recognition of body gestures in real-time. Interacting with Computers, 21(1-2), 3–10. http://doi.org/10.1016/j.intcom.2008.10.001

Wilson, A.D., & Bobick, A.F. (1999). Parametric hidden Markov models for gesture recognition. IEEE Transactions on Pattern Analysis and Machine Intelligence, 21(9), 884–900. http://doi.org/10.1109/34.790429

Wilson, A.D., & Bobick, A.F. (2000). Realtime online adaptive gesture recognition. Proceedings of the 15th International Conference on Pattern Recognition. http://doi.org/10.1109/ICPR.2000.905317

Wobbrock, J.O., Wilson, A.D., & Li, Y. (2007). Gestures without libraries, toolkits or training: a $1 recognizer for user interface prototypes. Proceedings of the 20th annual ACM symposium on User Interface Software and Technology, 159–168. http://doi.org/10.1145/1294211.1294238

Wobbrock, J.O., Morris, M.R., & Wilson, A.D. (2009). User-defined gestures for surface computing. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 1083–1092. http://doi.org/10.1145/1518701.1518866

Published

2017-05-22

How to Cite

Albertini, N., Brogni, A., Olivito, R., Taccola, E., Caramiaux, B., & Gillies, M. (2017). Designing natural gesture interaction for archaeological data in immersive environments. Virtual Archaeology Review, 8(16), 12–21. https://doi.org/10.4995/var.2017.5872

Issue

Vol. 8 No. 16 (2017)

Section

Articles