Understanding human gestures through Immersive environments (AR/VR)

People

Principal investigator

Researcher

Description

In gesture research, one main challenge is recognizing when a user starts a specific gesture and when it ends. Distinguishing an intentional gesture from an unintentional movement is a further challenge. However, with the proliferation of Augmented Reality (AR), Virtual Reality (VR) and the Internet of Things (IoT), gestures are becoming prevalent due to their ease of use and adaptability.

A gestural command can be classified as either a posture or a gesture. A posture is a static configuration of the hand(s), while a gesture includes movement of the hand. This project aims to recognize postures and gestures using data gathered through virtual hands. This virtual hand data is collected from real users' hand movements and normalized to a single hand scale. The aim of the project is to model, understand and recognize these postures and gestures when they are performed again in AR/VR space. The project comprises the following sub-projects, from which students can select. During this project, students will work with the Microsoft HoloLens 2, Oculus Quest, Myo EMG band and wearable accelerometer and gyroscope sensors/devices, and will apply ML/DL techniques.
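As a minimal sketch of the normalization step, assuming a hypothetical 21-point landmark layout in which index 0 is the wrist and index 9 is the middle-finger knuckle (the project's actual tracking format may differ), each captured hand can be translated to the wrist and rescaled so one reference distance equals 1:

```python
import math

def normalize_hand(landmarks, wrist_idx=0, ref_idx=9):
    """Normalize 3D hand landmarks to a common scale.

    Translates all points so the wrist sits at the origin, then scales
    so the wrist-to-reference-joint distance is 1. The indices assume a
    hypothetical 21-point layout; adapt them to the tracker's format.
    """
    wx, wy, wz = landmarks[wrist_idx]
    centered = [(x - wx, y - wy, z - wz) for x, y, z in landmarks]
    rx, ry, rz = centered[ref_idx]
    scale = math.sqrt(rx * rx + ry * ry + rz * rz) or 1.0
    return [(x / scale, y / scale, z / scale) for x, y, z in centered]
```

Normalizing this way makes postures recorded from hands of different sizes and positions directly comparable.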

 

Project 1: Movement classification in hand gestures

This project looks at classifying and labelling commonly identifiable movements in gestures. During this project, students will gain exposure and skills to evaluate, analyze and interpret research data and use those data to support and justify conclusions or decisions related to the project aims. To achieve this, students will apply a variety of machine learning (ML) and deep learning (DL) models for clustering, dimensionality reduction, regression and classification. In addition, students will gain experience with conducting critical evaluation, synthesizing research and technical literature relevant to the project, and communicating the findings in an appropriate form. If you are interested and want more information, contact Assoc. Prof. Armin Haller and/or Madhawa Perera.
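To illustrate the classification side in its simplest form, here is a plain nearest-neighbour classifier over flattened feature vectors; the project would use richer ML/DL models, and the feature layout shown is an assumption for illustration only:

```python
import math

def euclidean(a, b):
    """Euclidean distance between two equal-length feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def nearest_neighbour(train, query):
    """Return the label of the training example closest to `query`.

    `train` is a list of (feature_vector, label) pairs, e.g. flattened,
    normalized landmark coordinates labelled with a movement class.
    """
    features, label = min(train, key=lambda pair: euclidean(pair[0], query))
    return label
```

For example, given labelled "swipe" and "pinch" examples, a query vector lying near the "swipe" example would be classified as "swipe".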

Project 2: Hand gesture attribute classification

Gestures and postures consist of various attributes. In this project, we specifically look at classifying hand postures and/or gestures and their attributes. As in Project 1, students will apply a variety of machine learning (ML) and deep learning (DL) models for clustering, dimensionality reduction, regression and classification. If you are interested and want more information, contact Assoc. Prof. Armin Haller and/or Madhawa Perera.

Project 3: Semantic gesture labelling

This is a software engineering research project in which the student will develop a data visualization tool for gesture data. It will require creatively identifying and implementing a solution to a complex problem in the domain of Gesture Elicitation Studies (GES). This will give students exposure to working on a substantial research and development project that utilizes contemporary engineering techniques and literature in the research field. If you are interested and want more information, contact Assoc. Prof. Armin Haller.

Project 4: Real-time gesture recognition using multimodal data for AR applications

This project involves EMG, accelerometer and gyroscope sensor data. Students will work on the complete life cycle of a data science research project: they will be involved in data collection and analysis, and will learn to identify, collate, summarize and critically evaluate relevant literature, data and sources. During the analysis, they will compare and select the most appropriate ML/DL methods for the application and data set. Again, if you are interested and want more information, get in touch with Assoc. Prof. Armin Haller and/or Madhawa Perera.
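To sketch how synchronized multimodal streams might be reduced to per-window features before ML/DL model selection (the window length and the particular features below are illustrative assumptions, not the project's actual pipeline):

```python
def window_features(emg, accel, window=50):
    """Slice synchronized EMG and accelerometer streams into fixed-size
    windows and compute one simple feature per modality per window:
    mean absolute value for EMG, variance for acceleration magnitude.
    Returns a list of [mav, var] feature vectors, one per full window.
    """
    features = []
    for start in range(0, min(len(emg), len(accel)) - window + 1, window):
        e = emg[start:start + window]
        a = accel[start:start + window]
        mav = sum(abs(v) for v in e) / window             # EMG mean absolute value
        mean_a = sum(a) / window
        var = sum((v - mean_a) ** 2 for v in a) / window  # accel variance
        features.append([mav, var])
    return features
```

Feature vectors of this kind, one per time window, are what would then be fed to the candidate ML/DL classifiers for comparison.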

Requirements

Strong programming skills using modern Web APIs

Desired, but optional experience in:

  • Machine learning
  • AR/VR development experience
  • Semantic Web (RDF, OWL, SPARQL)
  • Hypermedia

Background Literature

  • Madhawa Perera, Armin Haller, Sergio José Rodríguez Méndez, Matt Adcock:
    HDGI: A Human Device Gesture Interaction Ontology for the Internet of Things. ISWC (2) 2020: 111-126
  • Madhawa Perera, Armin Haller, Matt Adcock:
    A Roadmap for Semantically-Enabled Human Device Interactions. SAWSemStats at ISWC 2019

Keywords

Internet of Things

Augmented Reality

Virtual Reality

Semantic Web

Knowledge Graph

Ontology

RDF/OWL

SPARQL

Updated:  10 August 2021/Responsible Officer:  Head of School/Page Contact:  CECS Webmaster