FYP 2017/18

Here are my student projects for next year (from September 2017), exploring how mobile devices (and sensors) can determine, direct and model human behaviour (typically within the physical environment).  Importantly, we have most of the required technology within our department (e.g. telepresence robot, Oculus Rift VR hardware). Each project can focus on one or more of:

  1. Software on device/s to motivate novel user interaction
  2. Analysis of data from device/s
  3. Simulation of human behaviour – and using the results to adapt 1.


HEALTHJOURNEY will use smartphone technology to reduce anxiety in children and parents before surgery. A hospital would like to create an engaging information platform, tailored to parents, that delivers information in multiple formats, such as virtual reality/360 panoramic images, videos, animation and written text.

OPGAME: A hospital would like to use mobile technology to reduce stress levels in children prior to an operation (e.g. during anaesthesia).  Methods such as sound and visual feedback through engaging, age-appropriate gamification could be used!

BLOCKSIM is a cyber-physical project that aims to link simple Lego building blocks with simulation.  How can users build a simulation model (e.g. movement around the campus, hospital or museum) using only physical blocks?
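
To make the idea concrete for prospective students, here is a minimal Python sketch of one possible block-to-model mapping. It assumes the physical layer can already report each block's ID, colour and grid position, and the colour convention is purely an invented example, not part of the project brief.

```python
# Minimal sketch: map detected physical blocks to elements of a simulation model.
# The colour-to-element convention below is an assumption for illustration only.
BLOCK_MEANING = {"red": "entrance", "blue": "corridor", "green": "exhibit", "yellow": "exit"}

def blocks_to_model(detected_blocks):
    """Turn (block_id, colour, grid_position) tuples into simulation locations."""
    model = []
    for block_id, colour, (x, y) in detected_blocks:
        model.append({"id": block_id,
                      "type": BLOCK_MEANING.get(colour, "unknown"),
                      "position": (x, y)})
    return model

# Example: a tiny three-block layout reported by the (hypothetical) sensing layer.
print(blocks_to_model([(1, "red", (0, 0)), (2, "blue", (0, 1)), (3, "green", (0, 2))]))
```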

ROBOHEALTH is a project that requires you to develop software for a telepresence robot so that it can provide health advice or basic triage.  How can robots assess and direct patients?

ROBO-INTERACT explores how the “face” of a telepresence robot can change in response to human interaction.  The project could use a robot to interview a human or provide museum guidance to visitors (e.g. working with the London Museum of Water and Steam).

ROBO-ANALYTICS uses data collected by a telepresence robot to decide how to interact with humans in the local environment.  Does behaviour change in response to human-robot interaction? An educational or museum setting may be used.

CHATBOTSIM is a project that combines an online bot and simulation.  How can online automated discussions support better understanding of human behaviour?  Chatbot interaction could be used to generate a simulation model (e.g. hospital attendance locations).
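
As a rough illustration (not a prescribed design), the sketch below shows how a summary of chatbot answers might parameterise a toy attendance simulation; the locations, weights and arrival rates are all invented placeholders.

```python
# Minimal sketch: turn answers gathered by a chatbot into parameters for a
# toy hospital-attendance simulation. All names and numbers are illustrative.
import random

def run_attendance_sim(answers, hours=8, seed=1):
    """Simulate arrivals at the locations a chatbot conversation identified."""
    random.seed(seed)
    counts = {location: 0 for location in answers["locations"]}
    arrivals_per_hour = answers["expected_arrivals_per_hour"]
    for _ in range(hours):
        for _ in range(random.randint(0, 2 * arrivals_per_hour)):
            # Each simulated patient picks a location, weighted by the
            # preferences the chatbot elicited.
            location = random.choices(answers["locations"],
                                      weights=answers["preference_weights"])[0]
            counts[location] += 1
    return counts

# Example "transcript summary" a chatbot might produce.
chat_answers = {
    "locations": ["A&E", "GP surgery", "pharmacy"],
    "preference_weights": [1, 3, 2],
    "expected_arrivals_per_hour": 5,
}
print(run_attendance_sim(chat_answers))
```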

CAMPUSHEALTH is a project that uses mobile technology to improve interaction with health services – campus pharmacy, GPs, urgent care, etc.  How can mobile technology better route students to the most effective service?
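
One very simple starting point, sketched below purely for illustration, is a rule-based router; the keywords and service names are assumptions and certainly not clinical guidance.

```python
# Minimal sketch: a rule-based router a CAMPUSHEALTH prototype might start
# from before moving to anything data-driven. Keywords are illustrative only.
def route_to_service(symptom_description: str) -> str:
    text = symptom_description.lower()
    if any(word in text for word in ("chest pain", "unconscious", "severe bleeding")):
        return "999 / A&E"
    if any(word in text for word in ("sprain", "cut", "minor burn")):
        return "Urgent care centre"
    if any(word in text for word in ("rash", "hay fever", "headache")):
        return "Campus pharmacy"
    return "Campus GP appointment"

print(route_to_service("I have a mild rash on my arm"))  # -> Campus pharmacy
```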


HEALTHVIZ is a project that extracts data from health “open-data” sites for use in new forms of graph visualisation (utilising Web, VR or mobile technology) – exploring how data is extracted and transformed before visualisation.
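
For a feel of the extract/transform step, here is a minimal sketch that assumes an open-data CSV has already been downloaded locally (the filename and column names are hypothetical) and converts it into a node/edge structure that a Web, VR or mobile view could render.

```python
# Minimal sketch of the extract/transform step, assuming an open-data CSV has
# already been downloaded as "admissions.csv" with columns "region" and
# "admissions" (both the filename and the columns are hypothetical).
import csv
import json

def csv_to_graph(path="admissions.csv"):
    """Convert rows into a node/edge structure a graph visualisation could render."""
    nodes, edges = [{"id": "NHS"}], []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            nodes.append({"id": row["region"], "size": int(row["admissions"])})
            # Link every region to a single hub node for a simple radial layout.
            edges.append({"source": "NHS", "target": row["region"]})
    return {"nodes": nodes, "edges": edges}

if __name__ == "__main__":
    print(json.dumps(csv_to_graph(), indent=2))
```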

MOBILE DISASTER MANAGEMENT is a project that aims to use mobile technology to predict the needs of stakeholders in a disaster-management scenario.

PLANETHEALTH links a small mobile behaviour change app that you will develop (e.g. one that encourages reduced plastic use, recycling or collection) to a wider environmental simulation describing possible impact.

TAGSCOPE combines mobile smartphone sensing technology with smart tagging of museum artefact displays.  The project will require you to develop a smartphone app that can provide a virtual representation of the internal structure of an artefact and/or its use in a wider system (e.g. a water pump and the supply of water to 18th-century London).
