Category: Technology

FYP 2017/18

Here are my student projects for next year (from September 2017), exploring how mobile devices (and sensors) can determine, direct and model human behaviour (typically within the physical environment). Importantly, we have most of the required technology within our department (e.g. telepresence robot, Oculus Rift VR hardware). Each project can focus on one or more of:

  1. Software on device/s to motivate novel user interaction
  2. Analysis of data from device/s
  3. Simulation of human behaviour – and adapting 1.

 

HEALTHJOURNEY will use smartphone technology to reduce anxiety in children and parents before surgery. A hospital would like to create an engaging information platform, tailored to parents, that delivers information in multiple formats, such as virtual reality/360 panoramic images, videos, animation and written text.

OPGAME: A hospital would like to use mobile technology to reduce stress levels in children prior to an operation (e.g. during anaesthesia).  Methods such as sound and visual feedback, delivered through engaging, age-appropriate gamification, could be used!

BLOCKSIM is a cyber-physical project that aims to link simple Lego building blocks with simulation.  How can users build a simulation model (e.g. movement around the campus, hospital or museum) using only physical blocks?

ROBOHEALTH is a project that requires you to develop software for a telepresence robot that then provides health advice or basic triage.  How can robots assess and direct patients?

ROBO-INTERACT explores how the “face” of a telepresence robot can change in response to human interaction.  The project could use a robot to interview a human or provide museum guidance to visitors (e.g. working with the London Museum of Water and Steam).

ROBO-ANALYTICS uses data collected by a telepresence robot to decide how to interact with humans in the local environment.  Does behaviour change in response to human-robot interaction? An educational or museum setting may be used.

CHATBOTSIM is a project that combines an online bot and simulation.  How can online automated discussions support better understanding of human behaviour?  Chatbot interaction could be used to generate a simulation model (e.g. hospital attendance locations).

CAMPUSHEALTH is a project that uses mobile technology to improve interaction with health services – campus pharmacy, GPs, Urgent care etc.  How can mobile technology better route students to the most effective service?

 

HEALTHVIZ is a project that extracts data from health “open-data” sites for use in new forms of graph visualisation (utilising Web, VR or mobile technology) – exploring how data is extracted and transformed before visualisation.

MOBILE DISASTER MANAGEMENT is a project that aims to predict the needs of stakeholders in a disaster management scenario.

PLANETHEALTH links a small mobile behaviour change app that you will develop (e.g. one that motivates limited plastic use, recycling or collection) to a wider environmental simulation describing possible impact.

TAGSCOPE combines mobile smartphone sensing technology with smart tagging of museum artefact displays.  The project will require you to develop a smartphone app that can provide a virtual representation of the internal structure of an artefact and/or its use in a wider system (e.g. a water pump and the supply of water to 18th-century London).


Final year projects for 2015/16

My projects this year will typically use a number of state-of-the-art technologies, including semantic web tools, NoSQL data stores and machine learning.  These technologies will support a number of highly adaptive and reactive project areas.

Intelligent Adaptive Advertising

A typical electronic advertising screen (in a shop window) will display advertisements at a particular place for a particular time period. This project will investigate how advertising can adapt to groups of people walking in front of a pervasive screen. Machine learning approaches will be used to determine the impact of changes (e.g. movement of the advertisement viewers). The project will utilise the Microsoft Kinect sensor to determine human positioning (infer context) and adapt the screen accordingly. The Kinect API will be used to gather data and Amazon’s cloud infrastructure will be used for predictive analytics.  How can motion capture and predictive analytics be used to optimise advertising?

Mobile Trading Environments
A number of financial markets exist that allow users to trade shares, currencies (FX), futures, options and bonds etc. (e.g. LMAX or TradeWeb). These markets typically rely on the matching of buyers with sellers or on real-time access to particular asset prices. This project will investigate how micro-markets can be initiated using mobile technology (e.g. smartphones). The project will require you to develop both client and server code to support trading users. Further experience of smartphone development will be gained, along with NoSQL data stores and messaging. A particular market or group of markets could be chosen by the student (e.g. a typical financial market or an advertising market). The project will start by choosing a novel micro-market to address.

Haptic controlled Robotics

Controlling the environment using your own movement can have a number of benefits. This project will investigate how a mobile device can be used to control a remote robot. This project could be undertaken by a two-person team – one student focusing on the sensing and the other on the robot control. A number of sensors and mobile devices are available for use. The project will start with experimental work, identifying the sensitivity of the device and designing an associated event model. The project could use event processing, cloud and Web service technology.

Augmented Reality Educational Gaming
This project will investigate how one or more games can be used to extend the educational experience and environment. The educational context can be chosen by the student – primary, secondary, higher or other education. It is envisaged that the student will develop a game using the Android and Google Cardboard APIs. How can games make novel use of the smartphone sensors? How could a game be used to help students joining a university at the start of year 1? How can gaming be used to test and simulate ideas (virtual lab or business)? One or more of these questions could form part of the project.  The project will start by focusing on the problem being addressed and experimenting with augmented reality glasses.

Augmented Heritage Experience Applications

This project will investigate how smart phones can be used to interact with artistic or heritage content (typically supplied by museums).  The project will explore how content can be produced (by a museum or artists) and then delivered to users in the physical environment (e.g. accessing a painting from where it was painted).  The mobile application should consider the multi-disciplinary user and provide innovative media tailored to the user.  The project will start by selecting the context (e.g. museum or art) and investigating current content.  New media could be created in tools such as Blender or Unity3D.

Analysing public mood and emotion using social media

A number of academic papers have investigated public mood or emotional state through the analysis of Twitter feeds. Some have looked at the correlation with financial markets. This project will extend some of this work and look at a mix of social media and markets. One use of such an approach could be the prediction of the FTSE100 index. The project will involve the development of server-based (web service or cloud) software that is able to read and analyse a number of data feeds – producing models of the source data and associated predictions. The project could (if the student is interested) also have a strong visualisation or semantic web component.

Agent based simulation using open data

This project will involve the construction of an agent-based simulation model from one of the many open data sites (e.g. health, economic or traffic data) in order to predict a future state or states.  The project will need to use open data to build a simulation model using manual or automated approaches (such as clustering or classification).  The model will then be executed in a recognised simulation tool or within a student-coded simulation model. The coded simulation model could be produced in a functional language such as Clojure (if you are interested in learning a new language).

Agent based hospital simulation using NFC/RFID data

This project will involve the construction of an agent-based simulation model from a physical model of a hospital (already built and containing Phidget sensors) in order to predict future movement and/or optimise movement.  The project will use a physical model of Mount Vernon hospital to build a simulation model using manual or automated approaches (such as clustering or classification).  The model will then be executed in a recognised simulation tool or within a student-coded simulation model. The coded simulation model could be produced in a functional language such as Clojure (if you are interested in learning a new language).

Hadoop and Spark

A nice Hadoop tutorial – http://blog.cloudera.com/blog/2014/01/how-to-create-a-simple-hadoop-cluster-with-virtualbox/ – for moving on from a single test machine.  The next step is having a play around with Spark – https://amplab.cs.berkeley.edu/2013/10/23/got-a-minute-spin-up-a-spark-cluster-on-your-laptop-with-docker/ (a nice way to kick-start a Spark cluster with Docker).

Which do you prefer? Why and what for?

EPSRC Research Grant – Digital Personhood Futures Trading

Professor Panos Louveiris and I (collaborating with Dr Audrey Guinchard from the University of Essex) have received £850K of funding from the EPSRC to investigate how novel trading platforms are able to facilitate the use of personal data.  The project is titled Digital Personhood: Digital Prosumer — Establishing a ‘Futures Market’ for Digital Personhood Data and will involve the design and implementation of the legal, technical and business underpinnings for such ventures.

3D Printing

3D printers are now available to hobby designers. Products like RepRap and MakerBot offer sub-£1k design tools. You can design items for everyday use or prototype more futuristic design concepts. I like the way some of the printers (RepRap) are able to re-create themselves.

Take a look at http://www.guardian.co.uk/technology/2010/dec/19/3d-printer-kit-makerbot or http://reprap.org/wiki/Main_Page.

Java and XML

XPath Intro

XPath is a language for addressing parts of an XML document (http://www.w3.org/TR/xpath).

Getting Started

XML and XPath combine well for simple database applications.  The XML files are equivalent to the relational database and XPath is the query language.  The Java code below shows the beginnings of a “gateway” to XML data (in the same vein as row and table data gateways – see Fowler – RDG and TDG).  I call this type of approach an XML File Gateway (XFG); in its fuller form it should include methods to insert, update, delete and query the underlying XML files.
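As a sketch of what a fuller XFG might look like, the interface below names the four operations described above. The class and method names are my own (hypothetical, not a published API), and the in-memory list stands in for the DOM/XPath file handling a real gateway would use:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical shape of an XML File Gateway (XFG):
// insert, update, delete and query over stored page fragments.
interface XmlFileGateway {
    void insert(String pageFragment);
    boolean update(String title, String newFragment);
    boolean delete(String title);
    List<String> query(String keyword);
}

// Minimal in-memory stand-in: a real implementation would read and
// write the XML file with DOM, and evaluate XPath in query().
class InMemoryXfg implements XmlFileGateway {
    private final List<String> pages = new ArrayList<>();

    public void insert(String pageFragment) { pages.add(pageFragment); }

    public boolean update(String title, String newFragment) {
        for (int i = 0; i < pages.size(); i++) {
            if (pages.get(i).contains("<title>" + title + "</title>")) {
                pages.set(i, newFragment);
                return true;
            }
        }
        return false;
    }

    public boolean delete(String title) {
        return pages.removeIf(p -> p.contains("<title>" + title + "</title>"));
    }

    public List<String> query(String keyword) {
        List<String> hits = new ArrayList<>();
        for (String p : pages) if (p.contains(keyword)) hits.add(p);
        return hits;
    }
}

public class XfgDemo {
    public static void main(String[] args) {
        XmlFileGateway gw = new InMemoryXfg();
        gw.insert("<page><title>JDBC Lab</title><subject>Java</subject></page>");
        gw.insert("<page><title>ULink Site</title><subject>VLE</subject></page>");
        System.out.println(gw.query("Java").size());  // prints 1
        System.out.println(gw.delete("ULink Site"));  // prints true
    }
}
```

The substring matching here is only a placeholder; the point of the sketch is the gateway interface, not the storage.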

 

Imagine we want to keep a directory of intranet web sites that are not accessible to search engines.  We could use XML to store descriptions of the web pages and XPath to search the XML file.

 

To find out more about XPath – read http://www.w3schools.com/Xpath/.

Consider the following XML document:

<!-- Hidden Web Definition - what other tags should you include? -->

 

<HiddenWeb>

    <page year="2008">

        <author>Bell</author>

        <title>JDBC Lab</title>

        <uri>

           http://people.brunel.ac.uk/~csstdjb/lab5.pdf

        </uri>

        <subject>Java</subject>

    </page>

 

    <page year="2004">

        <author>ULink</author>

        <title>ULink Site</title>

        <uri>http://brunel.ac.uk/intranets/u-link/</uri>

        <subject>VLE</subject>

    </page>

  

   <!– more pages defined … –>

   

</HiddenWeb>
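A few example XPath expressions against this document (each is something you could try against HiddenWeb.xml):

```
//page                        all page elements
//page[@year='2008']          pages whose year attribute is 2008
//page/title/text()           the text of every page title
//page[subject='Java']/uri    the uri elements of pages about Java
```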

 

Next step, write some Java code to search through the XML file (called HiddenWeb.xml in this case):

 

 

// Simple XPath demo code
// DJB Jan 2009
//
import java.io.IOException;
import org.w3c.dom.*;
import org.xml.sax.SAXException;
import javax.xml.parsers.*;
import javax.xml.xpath.*;

public class HiddenWebGateway {

  public static String SearchForURI() throws ParserConfigurationException, SAXException,
          IOException, XPathExpressionException {

    DocumentBuilderFactory domFactory = DocumentBuilderFactory.newInstance();
    domFactory.setNamespaceAware(true); // Important!
    DocumentBuilder docBuilder = domFactory.newDocumentBuilder();
    Document doc = docBuilder.parse("HiddenWeb.xml");

    XPathFactory factory = XPathFactory.newInstance();
    XPath xpath = factory.newXPath();
    XPathExpression query_expr
     = xpath.compile("//page[subject='Java']/uri/text()");

    // Evaluate the expression and collect the matching text nodes
    Object theResult = query_expr.evaluate(doc, XPathConstants.NODESET);
    NodeList nodes = (NodeList) theResult;

    String concatResult = "";
    for (int i = 0; i < nodes.getLength(); i++) {
        concatResult = concatResult + nodes.item(i).getNodeValue();
    }
    return concatResult;
  }

  public static void main(String[] args) {
    try {
        System.out.println("Results: " + SearchForURI());
    } catch (Exception e) {
        System.err.println("Caught exception: " + e.getMessage());
    }
  }
}

 

Some ideas

The Java code above searches for web page URIs that have the subject “Java”.  Note: XPath support is included in Java 1.5 and above.

 

Some other things to try:

–  Put the Java code in a Servlet, with the XPath query expression passed as a parameter.

–  Extend or redesign the XML (what is the best way of doing this?).

–  Automatically generate the XML – crawler-type functionality – from the source web pages.

–  Develop a query UI.

–  Others…
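On the Servlet idea, a first step is making the subject a parameter of the search method. The sketch below (class and method names are my own, and the XML is inlined as a string so the example is self-contained) binds the parameter with an XPathVariableResolver rather than string concatenation, so user input cannot break the expression:

```java
import java.io.StringReader;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.*;
import org.w3c.dom.Document;
import org.w3c.dom.NodeList;
import org.xml.sax.InputSource;

public class ParamGateway {

    // The HiddenWeb document from above, inlined for self-containment.
    static final String XML =
        "<HiddenWeb>" +
        "<page year='2008'><author>Bell</author><title>JDBC Lab</title>" +
        "<uri>http://people.brunel.ac.uk/~csstdjb/lab5.pdf</uri>" +
        "<subject>Java</subject></page>" +
        "<page year='2004'><author>ULink</author><title>ULink Site</title>" +
        "<uri>http://brunel.ac.uk/intranets/u-link/</uri>" +
        "<subject>VLE</subject></page>" +
        "</HiddenWeb>";

    // Return the URIs of pages matching a caller-supplied subject.
    public static String searchForURI(String subject) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new InputSource(new StringReader(XML)));

        XPath xpath = XPathFactory.newInstance().newXPath();
        // Resolve $subject to the method argument at evaluation time.
        xpath.setXPathVariableResolver(v ->
                "subject".equals(v.getLocalPart()) ? subject : null);
        XPathExpression expr =
                xpath.compile("//page[subject=$subject]/uri/text()");

        NodeList nodes = (NodeList) expr.evaluate(doc, XPathConstants.NODESET);
        StringBuilder result = new StringBuilder();
        for (int i = 0; i < nodes.getLength(); i++) {
            result.append(nodes.item(i).getNodeValue().trim());
        }
        return result.toString();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(searchForURI("Java"));
        System.out.println(searchForURI("VLE"));
    }
}
```

In a Servlet, the same method would simply be called with `request.getParameter("subject")`.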