Designing natural gesture interaction for archaeological data in immersive environments


  • Niccolò Albertini Scuola Normale Superiore
  • Andrea Brogni Scuola Normale Superiore
  • Riccardo Olivito Scuola Normale Superiore
  • Emanuele Taccola Università di Pisa
  • Baptiste Caramiaux Goldsmiths, University of London
  • Marco Gillies Goldsmiths, University of London



Cyber-Archaeology, Gesture Recognition, Virtual Reality


Archaeological data are heterogeneous, making it difficult to correlate and combine different types. Datasheets and pictures, stratigraphic data and 3D models, time and space mixed together: these are only a few of the categories a researcher has to deal with. New technologies may be able to help in this process, and solving research-related problems calls for innovative solutions. In this paper, we describe the whole process of designing and developing a prototype application that uses an immersive Virtual Reality system to access archaeological excavation 3D data through the Gesture Variation Follower (GVF) algorithm. This makes it possible to recognise which gesture is being performed and how it is performed. Archaeologists have participated actively in the design of the interface and of the set of gestures used for triggering the different tasks. Interactive machine learning techniques have been used for the real-time detection of the gestures. As a case study, the agora of Segesta (Sicily, Italy) has been selected. Indeed, due to its complex architectural features and the still ongoing fieldwork activities, Segesta represents an ideal context in which to test and develop a research approach integrating both traditional and more innovative tools and methods.
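The GVF algorithm referenced above (Caramiaux et al., 2014) uses particle filtering over gesture templates to estimate, in real time, which gesture is unfolding and how it varies in speed, scale, and rotation. As a loose illustration of the underlying idea of template following — identifying a gesture before it is complete and estimating its progress — here is a minimal prefix-matching sketch in Python based on dynamic time warping. This is not the authors' particle-filter implementation; all function names and templates are illustrative:

```python
import math

def dtw_distance(a, b):
    """Classic dynamic time warping cost between two 2D point sequences."""
    INF = float("inf")
    n, m = len(a), len(b)
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = math.dist(a[i - 1], b[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # insertion
                                 D[i][j - 1],      # deletion
                                 D[i - 1][j - 1])  # match
    return D[n][m]

def follow(partial, templates):
    """Return (best_label, progress) for a partially observed gesture.

    The observed prefix is compared against every prefix of every template;
    the best length-normalised match identifies the gesture, and the matched
    fraction of the template serves as a progress estimate."""
    best_label, best_progress, best_cost = None, 0.0, float("inf")
    for label, tpl in templates.items():
        for k in range(2, len(tpl) + 1):
            d = dtw_distance(partial, tpl[:k]) / k  # normalise by prefix length
            if d < best_cost:
                best_label, best_progress, best_cost = label, k / len(tpl), d
    return best_label, best_progress

# Two toy gesture templates: a horizontal and a vertical stroke.
templates = {
    "swipe_right": [(float(x), 0.0) for x in range(10)],
    "swipe_up": [(0.0, float(y)) for y in range(10)],
}
# Half of a rightward swipe is recognised mid-gesture, at ~50% progress.
label, progress = follow([(float(x), 0.0) for x in range(5)], templates)
```

Unlike this brute-force sketch, GVF maintains a set of weighted particles per template and updates them incrementally with each new sample, which is what makes continuous, real-time estimation of gesture variations practical.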




Anthony, L., & Wobbrock, J.O. (2010). A lightweight multistroke recognizer for user interface prototypes. Proceedings of Graphics Interface 2010, 245–252.

Bau, O., & Mackay, W.E. (2008). OctoPocus: a dynamic guide for learning gesture-based command sets. Proceedings of the 21st annual ACM symposium on User interface software and technology, 37–46.

Bevilacqua, F., Zamborlin, B., Sypniewski, A., Schnell, N., Guédy, F., & Rasamimanana, N. (2010). Continuous realtime gesture following and recognition. In S. Kopp & I. Wachsmuth (Eds.), Gesture in Embodied Communication and Human-Computer Interaction: 8th International Gesture Workshop, GW 2009, Bielefeld, Germany, February 25-27, 2009, Revised Selected Papers (pp. 73–84). Berlin, Heidelberg: Springer Berlin Heidelberg.

Bragdon, A., Nelson, E., Li, Y., & Hinckley, K. (2011). Experimental analysis of touch-screen gesture designs in mobile environments. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 403–412.

Brickmann, J., Exner, E.T., Keil, M., & Marhöfer, J.R. (2000). Molecular Graphics – Trends and Perspectives. Molecular modeling annual, 6(2), 328–340.

Brogni, A., Caldwell, D.G., & Slater, M. (2011). Touching Sharp Virtual Objects Produces a Haptic Illusion. In R. Shumaker (Ed.), Virtual and Mixed Reality – New Trends: International Conference, Virtual and Mixed Reality 2011, Held as Part of HCI International 2011, Orlando, FL, USA, July 9-14, 2011, Proceedings, Part I (pp. 234–242). Berlin, Heidelberg: Springer Berlin Heidelberg.

Buxton, B. (2010). Sketching user experiences: getting the design right and the right design. San Francisco, CA: Morgan Kaufmann.

Caramiaux, B., Altavilla, A., Pobiner, S.G., & Tanaka, A. (2015). Form follows sound: designing interactions from sonic memories. Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, 3943–3952.

Caramiaux, B., Montecchio, N., Tanaka, A., & Bevilacqua, F. (2014). Adaptive Gesture Recognition with Variation Estimation for Interactive Systems. ACM Trans. Interact. Intell. Syst., 4(4), 1–34.

Cruz-Neira, C., Sandin, D.J., DeFanti, T.A., Kenyon, R.V., & Hart, J.C. (1992). The CAVE: audiovisual experience automatic virtual environment. Communications of the ACM, 35(6), 64–72.

Fdili Alaoui, S., Caramiaux, B., Serrano, M., & Bevilacqua, F. (2012). Movement qualities as interaction modality. Proceedings of the Designing Interactive Systems Conference, 761–769.

Fiebrink, R., Cook, P.R., & Trueman, D. (2011). Human model evaluation in interactive supervised learning. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 147–156.

Forte, M. (2010). Cyber-archaeology. Oxford, England: Archaeopress.

Forte, M. (2014). 3D archaeology. New perspectives and challenges. The example of Çatalhöyük. Journal of Eastern Mediterranean Archaeology and Heritage Studies, 2(1), 1–29.

Forte, M., & Siliotti, A. (1997). Virtual archaeology: re-creating ancient worlds. London: Harry N Abrams B.V.

Gillies, M., Kleinsmith, A., & Brenton, H. (2015). Applying the CASSM framework to improving end user debugging of interactive machine learning. Proceedings of the 20th International Conference on Intelligent User Interfaces, 181–185.

Grossman, T., & Balakrishnan, R. (2005). A probabilistic approach to modeling two-dimensional pointing. ACM Transactions on Computer-Human Interaction, 12(3), 435–459.

Kane, S.K., Wobbrock, J.O., & Ladner, R.E. (2011). Usable gestures for blind people: understanding preference and performance. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 413–422.

Kirsh, D. (2013). Embodied cognition and the magical future of interaction design. ACM Transactions on Computer-Human Interaction, 20(1), 1–30.

Kratz, L., Morris, D., & Saponas, T.S. (2012). Making gestural input from arm-worn inertial sensors more practical. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 1747–1750.

Lackner, J.R. (1988). Some proprioceptive influences on the perceptual representation of body shape and orientation. Brain, 111(2), 281–297.

Long, A.C., Landay, J.A., Rowe, L.A., & Michiels, J. (2000). Visual similarity of pen gestures. Proceedings of the SIGCHI conference on Human Factors in Computing Systems, 360–367.

Lucchese, G., Field, M., Ho, J., Gutierrez-Osuna, R., & Hammond, T. (2012). GestureCommander: continuous touch-based gesture prediction. CHI '12 Extended Abstracts on Human Factors in Computing Systems, 1925–1930.

Mackay, W.E., & Fayard, A.L. (1999). Video brainstorming and prototyping: techniques for participatory design. CHI '99 Extended Abstracts on Human Factors in Computing Systems, 118–119.

Mori, A., Uchida, S., Kurazume, R., Taniguchi, R.-I., Hasegawa, T., & Sakoe, H. (2006). Early recognition and prediction of gestures. Proceedings of the 18th International Conference on Pattern Recognition (ICPR 2006), 560–563.

Muller, M.J., & Kuhn, S. (1993). Participatory design. Communications of the ACM, 36(6), 24–28.

Nielsen, M., Störring, M., Moeslund, T.B., & Granum, E. (2004). A Procedure for Developing Intuitive and Ergonomic Gesture Interfaces for HCI. In A. Camurri & G. Volpe (Eds.), Gesture-Based Communication in Human-Computer Interaction: 5th International Gesture Workshop, GW 2003, Genova, Italy, April 15-17, 2003, Selected Revised Papers (pp. 409–420). Berlin, Heidelberg: Springer Berlin Heidelberg.

Nieuwenhuizen, K., Aliakseyeu, D., & Martens, J.-B. (2009). Insight into Goal-Directed Movements: Beyond Fitts’ Law. In T. Gross, J. Gulliksen, P. Kotzé, L. Oestreicher, P. Palanque, R. O. Prates & M. Winckler (Eds.), Human-Computer Interaction – INTERACT 2009: 12th IFIP TC 13 International Conference, Uppsala, Sweden, August 24-28, 2009, Proceedings, Part I (pp. 274–287). Berlin, Heidelberg: Springer Berlin Heidelberg.

Norman, D.A. (1990). The design of everyday things (1st Doubleday/Currency Ed.). New York: Doubleday.

Olivito, R., Taccola, E., & Albertini, N. (2015). A hand-free solution for the interaction in an immersive virtual environment: the case of the agora of Segesta. The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, 40-5/W4, 31–36.

Ouyang, T., & Li, Y. (2012). Bootstrapping personal gesture shortcuts with the wisdom of the crowd and handwriting recognition. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 2895–2904.

Pietroni, E., & Pescarin, S. (2010). VR cooperative environments for the interpretation and reconstruction of the archaeological landscape. Virtual Archaeology Review, 1(2), 25–29.

Pietroni, E., & Rufa, C. (2012). Natural interaction in Virtual Environments for Cultural Heritage: Giotto in 3D and Etruscanning study case. Virtual Archaeology Review, 3(7), 86–91.

Tanaka, A., Bau, O., & Mackay, W. (2013). The A20: Interactive instrument techniques for sonic design exploration. In K. Franinović & S. Serafin (Eds.), Sonic Interaction Design (pp. 255–270). Cambridge, MA: MIT Press.

Taylor, J.L. (2009). Proprioception. In Encyclopedia of Neuroscience (pp. 1143–1149). Oxford: Academic Press.

Varona, J., Jaume-I-Capò, A., Gonzàlez, J., & Perales, F. J. (2009). Toward natural interaction through visual recognition of body gestures in real-time. Interacting with Computers, 21(1-2), 3–10.

Wilson, A.D., & Bobick, A.F. (1999). Parametric hidden Markov models for gesture recognition. IEEE Transactions on Pattern Analysis and Machine Intelligence, 21(9), 884–900.

Wilson, A.D., & Bobick, A.F. (2000). Realtime online adaptive gesture recognition. Proceedings of the 15th International Conference on Pattern Recognition.

Wobbrock, J.O., Wilson, A.D., & Li, Y. (2007). Gestures without libraries, toolkits or training: a $1 recognizer for user interface prototypes. Proceedings of the 20th annual ACM symposium on User Interface Software and Technology, 159–168.

Wobbrock, J.O., Morris, M.R., & Wilson, A.D. (2009). User-defined gestures for surface computing. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 1083–1092.




How to Cite

Albertini, N., Brogni, A., Olivito, R., Taccola, E., Caramiaux, B., & Gillies, M. (2017). Designing natural gesture interaction for archaeological data in immersive environments. Virtual Archaeology Review, 8(16), 12–21.