Category: FYP

FYP 2017/18

Here are my student projects for next year (from September 2017), exploring how mobile devices (and sensors) can determine, direct and model human behaviour (typically within the physical environment).  Importantly, we have most of the required technology within our department (e.g. telepresence robot, Oculus Rift VR hardware). Each project can focus on one or more of:

  1. Software on device/s to motivate novel user interaction
  2. Analysis of data from device/s
  3. Simulation of human behaviour – and adapting 1.

 

HEALTHJOURNEY will use smartphone technology to reduce anxiety in children and parents before surgery. A hospital would like to create an engaging information platform, tailored to parents, that delivers information in multiple formats, such as virtual reality/360 panoramic images, videos, animation and written text.

OPGAME: A hospital would like to use mobile technology to reduce stress levels in children prior to an operation (e.g. during anaesthesia).  Methods such as sound and visual feedback, delivered through engaging, age-appropriate gamification, could be used!

BLOCKSIM is a cyber-physical project that aims to link simple Lego building blocks with simulation.  How can users build a simulation model (e.g. movement around the campus, hospital or museum) using only physical blocks?

ROBOHEALTH is a project that requires you to develop software for a telepresence robot that then provides health advice or basic triage.  How can robots assess and direct patients?

ROBO-INTERACT explores how the “face” of a telepresence robot can change in response to human interaction.  The project could use a robot to interview a human or provide museum guidance to visitors (e.g. working with the London Museum of Water and Steam).

ROBO-ANALYTICS uses data collected by a telepresence robot to decide how to interact with humans in the local environment.  Does behaviour change in response to human-robot interaction? An educational or museum setting may be used.

CHATBOTSIM is a project that combines an online bot and simulation.  How can online automated discussions support better understanding of human behaviour?  Chatbot interaction could be used to generate a simulation model (e.g. hospital attendance locations).

CAMPUSHEALTH is a project that uses mobile technology to improve interaction with health services – campus pharmacy, GPs, Urgent care etc.  How can mobile technology better route students to the most effective service?

 

HEALTHVIZ is a project that extracts data from health “open-data” sites for use in new forms of graph visualisation (utilising Web, VR or mobile technology) – exploring how data is extracted and transformed before visualisation.

MOBILE DISASTER MANAGEMENT is a project that aims to predict the needs of stakeholders in a disaster management scenario.

PLANETHEALTH links a small mobile behaviour change app that you will develop (e.g. one that motivates limited plastic use, recycling or collection) to a wider environmental simulation describing possible impact.

TAGSCOPE combines mobile smartphone sensing technology with smart tagging of museum artefact displays.  The project will require you to develop a smartphone app that can provide a virtual representation of the internal structure of an artefact and/or its use in a wider system (e.g. a water pump and the supply of 18th century London with water).


FYP 2016/17

Here are my projects for next year (from September 2016).  I am always happy to discuss projects and your ideas.  See my previous post on getting started.

MOBISIM combines mobile smartphone technology with agent based simulation.  The project will require you to explore how simulations can execute on smart phones in the physical environment.  Importantly, a smart phone is able to detect movement in the local area and this capability can be used as part of the simulation.  Consumer behaviour in a retail space or patient behaviour in a hospital provides possible areas of investigation.  Students will have the opportunity to learn about agent based simulation and mobile sensing technology.
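To give a flavour of the agent based approach (the `Shopper` agent, grid size and movement rule below are all invented for this sketch, not part of any existing platform), a minimal model steps random-walking shoppers across a retail grid and records footfall per cell:

```python
import random

class Shopper:
    """A minimal agent: a shopper taking a random walk on a store grid."""
    def __init__(self, x, y):
        self.x, self.y = x, y

    def step(self, width, height):
        dx, dy = random.choice([(-1, 0), (1, 0), (0, -1), (0, 1)])
        # Clamp the move so the agent stays within the grid.
        self.x = min(max(self.x + dx, 0), width - 1)
        self.y = min(max(self.y + dy, 0), height - 1)

def run(num_agents=20, width=10, height=10, ticks=50, seed=1):
    random.seed(seed)  # seeded so experiments are repeatable
    agents = [Shopper(random.randrange(width), random.randrange(height))
              for _ in range(num_agents)]
    # Count how often each cell is visited -- a simple "footfall" measure.
    footfall = {}
    for _ in range(ticks):
        for a in agents:
            a.step(width, height)
            footfall[(a.x, a.y)] = footfall.get((a.x, a.y), 0) + 1
    return footfall

if __name__ == "__main__":
    counts = run()
    print(sum(counts.values()))  # one visit recorded per agent per tick
```

In a real MOBISIM project the random walk would be replaced by movement inferred from the phone's sensors, with the footfall map driving (or validating) the simulation.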

BLOCKSIM is a cyber-physical project that aims to link simple Lego building blocks with agent based simulation.  It is often difficult, especially for end users, to design models using current tools and techniques.  This project will allow an end-user to design a model with Lego before transformation into a computational model. 3D printing physical objects, instead of or to augment Lego, is also a possibility. Students will have the opportunity to learn about agent based simulation and image analysis techniques.

VRSIM is a virtual reality project that requires you to develop an immersive interface for agent based simulation models.  Typical simulators use a simple grid of images, but VR could allow the user to enter, interrogate and adapt the model in a 3D world. A healthcare context may be explored, with people moving around a hospital environment.  Simulation can then be used to re-design new hospitals or wider systems including primary care. Students will have the opportunity to learn about agent based simulation and virtual reality technology.

DISTRIBUTED AGENT SIM is a project for students interested in more server side development (including protocols for software interoperation).  We currently have an agent based simulation platform that is web based (PHP) and easy to use.  The project aims to create a distributed simulation environment that is able to connect a number of separate, diverse simulations (e.g. infectious diseases and hospitals).  The project could also investigate how simulations can operate on a cloud platform. Students will learn about agent based simulation and messaging technologies.

FUTURE TRADING SIM is a project that explores a “design fiction” where we are all trading our own personal data footprints (purchases, movement, viewing/listening).  Agent based simulation can be used to understand possible futures, determine what is required for its growth and explore associated economic models.  A particular problem when simulating a future state is access to representative data.  Students will learn about agent based simulation and data trading.

ROBOSIM is a project that combines an online bot and agent based simulation.  How can online discussions gather data that is able to build better understanding of human behaviour?  One approach is to use this captured data as a basis for agent based modelling and simulation.  Students will learn about artificial intelligence (driving chatbot interaction) and agent based simulation approaches.

HEALTHSTRAT is a project that uses agent based simulation to build a health strategy game.  The game itself can explore local (e.g. hospital) or planetary health (see https://www.rockefellerfoundation.org/planetary-health/).  The game will allow participants to learn more about healthcare strategy as well as to test new or adapted strategies.  Game design techniques can be used to interact with a simulation. Students will learn about agent based simulation, healthcare systems and strategy.

PICTAG is a project that captures pictures and comments from art installations (or heritage locations).  The project will utilise visual and non-visual tagging (e.g. QR Codes, NFC or iBeacon) to track visitors and allow them to send media and comments.  The challenge of this project is the subsequent analysis and visualisation of contributions!  Heat maps, simulation, sentiment analysis and emotional tracking are all possible directions. Game design and/or social media techniques can also be used to construct a resulting virtual exhibit.
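A minimal sketch of the analysis side (the events, location names and keyword lists below are made up for illustration): counting scans per tagged location gives the raw data for a heat map, and a crude keyword match gives a first-cut sentiment score:

```python
from collections import Counter

# Hypothetical tag-scan events: (visitor_id, tag_location, comment).
events = [
    ("v1", "pump-hall", "amazing engine"),
    ("v2", "pump-hall", "loved the steam demo"),
    ("v1", "gallery", "nice prints"),
    ("v3", "pump-hall", "too crowded"),
]

def heat_map(events):
    """Count visits per tagged location -- the raw material for a heat map."""
    return Counter(loc for _, loc, _ in events)

def naive_sentiment(comment, positive=("amazing", "loved", "nice"),
                    negative=("crowded", "boring")):
    """A deliberately crude keyword score: +1 per positive word, -1 per negative."""
    words = comment.lower().split()
    return sum(w in positive for w in words) - sum(w in negative for w in words)

if __name__ == "__main__":
    print(heat_map(events).most_common(1))  # [('pump-hall', 3)]
    print(naive_sentiment("too crowded"))   # -1
```

A real project would replace the keyword lists with a proper sentiment library and plot the heat map over a floor plan, but the aggregation shape stays the same.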

HEALTHVIZ is a project that extracts data from health “open-data” sites for use in new forms of visualisation (utilising underlying graph storage).  Most open data is in tabular format and not directly connected.  HCI techniques will be used to design the visualisation and the user experience (UX).

MOBILE DISASTER MANAGEMENT is a project that aims to predict the needs of stakeholders in a disaster management scenario.  Open-data will be used to design, build and drive a mobile app that supports various response services.  The project can be more focused towards data analysis or mobile app design and usability.

Final year projects for 2015/16

My projects this year will typically use a number of state-of-the-art technologies, including semantic web tools, NOSQL data stores and machine learning.  These technologies will support a number of highly adaptive and reactive project areas.

Intelligent Adaptive Advertising

A typical electronic advertising screen (in a shop window) will display advertisements at a particular place for a particular time period. This project will investigate how advertising can adapt to groups of people walking in front of a pervasive screen. Machine learning approaches will be used to determine the impact of changes (e.g. movement of the advertisement viewers). The project will utilise the Microsoft Kinect sensor to determine human positioning (infer context) and adapt the screen accordingly. The Kinect API will be used to gather data and Amazon’s cloud infrastructure will be used for predictive analytics.  How can motion capture and predictive analytics be used to optimise advertising?
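One way to picture the adaptation step (the thresholds and advert names here are hypothetical, and the Kinect API calls that would supply the positions are not shown): viewer positions from skeleton tracking feed a simple rule that picks the content format:

```python
def choose_advert(viewer_positions, near_threshold=2.0):
    """
    Pick an advert format from (approximate) viewer positions.
    Each position is an (x, z) tuple in metres relative to the screen, as
    might be derived from Kinect skeleton tracking; z is distance from
    the screen. The thresholds and advert names are assumptions.
    """
    if not viewer_positions:
        return "attract-loop"          # nobody watching: run an attract loop
    near = [p for p in viewer_positions if p[1] < near_threshold]
    if len(near) >= 3:
        return "group-advert"          # a close group: show group content
    if near:
        return "detailed-advert"       # one or two close viewers: more detail
    return "large-text-advert"         # distant viewers: bold, simple content
```

A learning component would then replace the hand-written thresholds, using observed viewer movement (stay or walk away) as the reward signal.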

Mobile Trading Environments
A number of financial markets exist that allow users to trade shares, currencies (FX), futures, options and bonds etc. (e.g. LMAX or TradeWeb). These markets typically rely on the matching of buyers with sellers or the real-time access to particular asset prices. This project will investigate how micro-markets can be initiated using mobile technology (e.g. Smart Phones). The project will require you to develop both client and server code to support trading users. Further experience of smart phone development will be gained, along with NOSQL data stores and messaging. A particular market or group of markets could be chosen by the student (e.g. a typical financial market or an advertising market). The project will start by choosing a novel micro market to address.
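At the heart of any such micro-market is an order-matching engine. The sketch below (a toy price-time matcher for a single asset; real venues such as LMAX are far more involved) shows the basic mechanics of matching buyers with sellers:

```python
import heapq

class MicroMarket:
    """A minimal matching engine for one asset: best price matched first."""
    def __init__(self):
        self.bids = []  # max-heap via negated price: (-price, qty)
        self.asks = []  # min-heap: (price, qty)

    def submit(self, side, price, qty):
        """Match an incoming order; rest any remainder on the book.
        Returns the list of (trade_price, traded_qty) fills."""
        trades = []
        if side == "buy":
            # Take the cheapest asks priced at or below our limit.
            while qty and self.asks and self.asks[0][0] <= price:
                ask_price, ask_qty = heapq.heappop(self.asks)
                traded = min(qty, ask_qty)
                trades.append((ask_price, traded))
                qty -= traded
                if ask_qty > traded:
                    heapq.heappush(self.asks, (ask_price, ask_qty - traded))
            if qty:
                heapq.heappush(self.bids, (-price, qty))
        else:
            # Take the highest bids priced at or above our limit.
            while qty and self.bids and -self.bids[0][0] >= price:
                neg_bid, bid_qty = heapq.heappop(self.bids)
                traded = min(qty, bid_qty)
                trades.append((-neg_bid, traded))
                qty -= traded
                if bid_qty > traded:
                    heapq.heappush(self.bids, (neg_bid, bid_qty - traded))
            if qty:
                heapq.heappush(self.asks, (price, qty))
        return trades
```

In the project this engine would sit server-side, with smart phone clients submitting orders over a messaging layer and trades persisted to the NOSQL store.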

Haptic controlled Robotics

Controlling the environment using your own movement can have a number of benefits. This project will investigate how a mobile device can be used to control a remote robot. This project could be undertaken by a two person team – one student focusing on the sensing and the other on the robot control. A number of sensors and mobile devices are available for use. The project will start with experimental work, identifying the sensitivity of the device and designing an associated event model. The project could use event processing, cloud and Web service technology.
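The event model might, at its simplest, map device tilt to robot commands, with a dead zone to absorb sensor noise (the angles, thresholds and command names below are assumptions for the sketch):

```python
def tilt_to_command(pitch, roll, dead_zone=10.0):
    """
    Map device tilt (degrees, e.g. from a phone accelerometer) to a robot
    drive command. Angles inside the dead zone are ignored so that sensor
    noise does not move the robot. Command names are invented here; a real
    project would emit events to whatever protocol the robot speaks.
    """
    if abs(pitch) <= dead_zone and abs(roll) <= dead_zone:
        return "stop"
    # The dominant axis wins, so diagonal tilts give a single clear command.
    if abs(pitch) >= abs(roll):
        return "forward" if pitch > 0 else "backward"
    return "right" if roll > 0 else "left"
```

Tuning the dead zone is exactly the kind of sensitivity experiment the project description suggests starting with.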

Augmented Reality Educational Gaming
This project will investigate how one or more games can be used to extend the educational experience and environment. The educational context can be chosen by the student – primary, secondary, higher or other education. It is envisaged that the student will develop a game using Android and Google Cardboard APIs. How can games make novel use of the smart phone sensors? How could a game be used to help students joining a University at the start of year 1? How can gaming be used to test and simulate ideas (virtual lab or business)? One or more of these questions could form part of the project.  The project will start by focusing on the problem being addressed and experimenting with augmented reality glasses.

Augmented Heritage Experience Applications

This project will investigate how smart phones can be used to interact with artistic or heritage content (typically supplied by museums).  The project will explore how content can be produced (by museums or artists) and then delivered to users in the physical environment (e.g. accessing a painting from where it was painted).  The mobile application should consider the multi-disciplinary user and provide innovative media tailored to the user.  The project will start by selecting the context (e.g. museum or art) and investigating current content.  New media could be created in tools such as Blender or Unity3D.

Analysing public mood and emotion using social media

A number of academic papers have investigated the public mood or emotional state through the analysis of twitter feeds. Some have looked at the correlation to financial markets. This project will extend some of this work and look at a mix of social media and markets. One use of such an approach could be the prediction of the FTSE100 index. The project will involve the development of server based (web service or cloud) software that is able to read and analyse a number of data feeds – producing models of the source data and associated predictions. The project could (if the student is interested) also have a strong visualisation or semantic web component.
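A first experiment could look like the sketch below: a crude keyword mood score per day, then a Pearson correlation against index returns (the word lists are placeholders; the published work uses far richer lexicons and models):

```python
def mood_score(tweets, positive=("happy", "calm", "good"),
               negative=("fear", "worried", "bad")):
    """Crude daily mood: positive-word hits minus negative-word hits,
    summed over all tweets. The word lists here are placeholders."""
    score = 0
    for tweet in tweets:
        words = tweet.lower().split()
        score += sum(w in positive for w in words)
        score -= sum(w in negative for w in words)
    return score

def pearson(xs, ys):
    """Pearson correlation between a daily mood series and index returns."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)
```

The interesting research question is then whether the mood series leads the market series (a lagged correlation), not merely whether they move together.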

Agent based simulation using open data

This project will involve the construction of an agent based simulation model from one of the many open data sites (e.g. health, economic or traffic data) in order to predict a future state or states.  The project will need to use open data to build a simulation model using manual or automated approaches (such as clustering or classification).  The model will then be executed in a recognised simulation tool or within a student coded simulation model. The coded simulation model could be produced in a functional language such as Clojure (if you are interested in learning a new language).
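As an example of the automated route (a toy, assuming the open data reduces to a single numeric feature such as daily arrival counts), a simple 1-D k-means pass can group records into a few agent classes before simulation:

```python
def k_means_1d(values, k=2, iters=20):
    """
    A toy 1-D k-means, sketching how open data (e.g. patient arrival
    rates) might be clustered into a few agent classes before simulation.
    Assumes k >= 2; a real project would use a proper library and more
    than one feature.
    """
    # Initialise centroids spread evenly across the data range.
    lo, hi = min(values), max(values)
    centroids = [lo + (hi - lo) * i / (k - 1) for i in range(k)]
    for _ in range(iters):
        # Assign each value to its nearest centroid.
        clusters = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda i: abs(v - centroids[i]))
            clusters[nearest].append(v)
        # Move each centroid to the mean of its cluster (keep it if empty).
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids, clusters
```

Each resulting cluster centroid then parameterises one agent type (e.g. "low-frequency" versus "high-frequency" attenders) in the simulation model.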

Agent based hospital simulation using NFC/RFID data

This project will involve the construction of an agent based simulation model from a physical model of a hospital (already built and containing Phidget sensors) in order to predict future movement and/or optimise movement.  The project will use a physical model of Mount Vernon hospital to build a simulation model using manual or automated approaches (such as clustering or classification).  The model will then be executed in a recognised simulation tool or within a student coded simulation model. The coded simulation model could be produced in a functional language such as Clojure (if you are interested in learning a new language).

Starting your FYP

There is no standard way to prepare for your FYP, but before we meet I suggest that you:

– Decide on some technologies and reacquaint yourself with the language/IDE etc.

– Find some papers on topics close to your project (initially try Communications of the ACM CACM)

– Have a go at writing some aims and objectives for your project.  Remember, your problem should be manageable.

– Attempt some design work – use cases, screen sketches.

You do not need to attempt all of these as your project will become clearer when we have our first meeting in late September.

Final Year Project for 2014/15

Health Innovation Projects 2014

Email if you are interested in any of these projects (with the project name in the email title):

Mobile collection of device effectiveness
Patient-device movement around the physical environment using GPS
Patient-device movement around the physical environment using RFID tags
Monitoring patients with home robots
Patient-device movement around the physical environment using NFC tags
Medical Device Data Simulator
Cloud storage of real-time medical device data
Real-time medical device data storage in a NOSQL document store
Real-time medical device data storage in a NOSQL graph store
Real-time medical device data storage in a NOSQL property-value/triple store
Synchronisation of data between mobile and cloud storage
Storage and querying of device data with semantic web (Web 3.0) technologies
Data security using mobile medical apps
Relational data hub: Cleaning and normalising medical device data
Predicting medical outcomes from Big Data
Medical dashboard for medical device tracking
Medical dashboard for patient pathways tracking
Medical dashboard data verbaliser for improved communication
Identifying past and future medical events from Big Data
Predicting patient behaviour from Big Data
Agent based simulation of medical device networks
Agent based simulation of patient outcomes

FYP Project 2013/14

Health Innovation Projects (HIP@DISC) – Connections everywhere with a Web of devices and heterogeneous “Big Data” stores (Ambitious Individual Projects addressing a common theme, meeting as a group)

Smart cities (www.ibm.com/thesmartercity) allow citizens to more actively engage with their surroundings using smart phones, social media and sensing devices. The network of interconnected devices has been termed the Internet of Things (IoT). This project aims to explore the augmentation of such cities by combining sensor technologies from a number of devices – Merck Serono medical devices, RFID/NFC/QR tags, Microsoft Kinect and Smart Phones – and social media. A smart health scenario is chosen, within a smart city, where the citizen can monitor their own health whilst generating data trails that record their lifestyle. Such a scenario is premised on software and services that are able to collect, store and analyse data in a natural and pervasive manner (embedded into everyday life).
HIP@DISC will provide a number of innovative FYPs that address connection to specific devices, visualisation of data, cloud big-data storage and intelligent data analysis.
Although each project is individual (and assessed as such), the projects offer a real chance to interconnect and experiment on a wider environment with both technology and usability opportunities. The group will also work together on literature gathering and analysis. The supervision will be carried out by Prof. Barnett, Dr. Bell and Prof. Young, and will include meetings with Merck Serono providing industry context and perspective. You are likely to have meetings with academics over the course of the year in order to fully support different activities and parts of the project.
Some possible project ideas:
• Medical device data acquisition using smartphones (3 separate projects).
• Smart phone development using GPS tracking to collect mobility information and interact with health service providers on the move (seeking services in a Smart City).
• Smartphone capture of wellbeing and happiness (including direct entry and sensor based inference).
• RFID, NFC or QR codes to detect physical proximity, either to track patient movement or the use of specific objects (devices or medicines).
• Diet and medical tracking with self-management on a smartphone.
• Kinect sensor (NUI) allowing user tracking in their physical environment at home (with interaction with virtual health professionals)
• Big Data storage in the Cloud – storing data for other projects in the cloud
• Data visualisation of cloud based Big Data (using Smart Phones, PCs or Ambient screens)
• Data visualisation of real time data of the Web of Devices (using Smart Phones, PCs or Ambient screens)
• Intelligent data analysis of Big Data (using clustering or other algorithms).
• Intelligent data analysis of Social Media Data (using clustering or other algorithms).
• Pharma data analysis (and Reporting) of drug performance
• Smart Phone recommender systems for Smart Health
• Co-design of the smart health mobile services (promoting interaction between citizen and health provider)

FYP Projects 2012/13

BigSpace – Mixed modality and augmented reality (Ambitious Individual Projects addressing a common theme, meeting as a group)

A smart city agenda has been popularised by IBM (www.ibm.com/thesmartercity) in recent years, as cities with state-of-the-art infrastructure are being constructed. This project aims to investigate the augmentation of such cities by combining sensor technologies from a number of devices – the smart phone, RFIDs and Microsoft Kinect sensors. A cityscape will be chosen and a number of applications will be built that sense and support users and groups of users as they move through the environment. A likely setting is the Brunel campus and the city of London. Student, staff and visitor documents, information and events could be made available using a number of devices; e.g. QR codes could be read and the encoded data placed on a map that is later accessed by smart phone immersive applications or on ambient screens using Kinect interaction. Campus data could include schedules, maps, food, social and teaching data. A number of scales will be chosen, e.g. a moving individual at a particular location, an individual on the move, or a group of users in a space (e.g. an 8m square). The projects will investigate how intelligent software is able to improve the citizen experience (either on the smart phone screen or ambient screens). A contrast between the more constrained campus and the wider cityscape could be investigated by using the same applications in restaurant, museum, transport and shopping scenarios.

Although each project is individual (and assessed as such), the projects offer a real chance to interconnect and experiment on a wider environment with both technology and usability opportunities. The group will also work together on literature gathering and analysis. The supervision will be carried out by Dr. Bell. You are likely to have meetings with academics over the course of the year in order to fully support different activities and parts of the project.

Some possible project ideas:

  • Smart phone development using GPS tagging to provide information and interact (seeking services in a Smart City)
  • QR codes to label places for information provision, tagging, messaging, quiz (Student users, City Visitors)
  • Kinect sensor (NUI) allowing the user to interact with their physical environment (shop, museum, government)
  • Kinect monitoring of the environment or people within it (health, queuing, adaptive services)
  • RFID sensor tagging of objects for selling, information provision (Intelligent food shopping, Audio information about the environment)
  • An App store for a Smart City
  • Virtual reality experience linking physical and virtual worlds (museum, wider environment)
  • Monitoring and security in a Smart City
  • Accessing and rating Government services in a Smart City
  • Transport in a Smart City (public, sharing, information)
  • Recommender systems in a Smart City
  • Business models for the smart city
  • Co-design of the smart or mobile services

See

http://channel9.msdn.com/coding4fun/kinect

http://www.hcplive.com/pop-medicine/Microsoft-Kinect-Helps-Research-How-To-Prevent-Patient-Falls/

http://www.mddionline.com/blog/devicetalk/xbox-kinect-medical-device-your-living-room

Each student to carry out their own individual project under the “group” theme.

FYP Projects 2011/12

Intelligent Adaptive Advertising

A typical electronic advertising screen (in a shop window) will display advertisements at a particular place for a particular time period. This project will investigate how advertising can adapt to groups of people positioned in front of a screen. Learning approaches could be used to determine the impact of changes (e.g. movement of the advertisement viewers). The project will utilise the Microsoft Kinect sensor to determine group position (infer context) and adapt the screen accordingly. The Kinect API has only recently been released by Microsoft for academic use. It is assumed that the student will use their own Kinect sensor.

Brain controlled Robotics (Optional Group Project)

Controlling the environment using your own thoughts can have a number of benefits (particularly for the disabled user). This project will investigate how an EEG (a brain electrical sensor) can be used to control a remote device. This project could be undertaken by a two person team – one student focusing on the EEG sensing and the other on the robot control. A number of EEG sensors are available for use. The project will start with experimental work, identifying the sensitivity of the device and designing an associated event model. The project could use complex event processing, JMS and Web service technology.

Educational Gaming
The project will investigate how one or more games can be used to extend the educational experience and environment. The educational context can be chosen by the student – primary, secondary, higher or other education. It is envisaged that the student will develop a game using Facebook, Android, JME, Microsoft XNA or other environments. How can games make novel use of the smart phone sensors? How could a game be used to help students joining a University at the start of year 1? How can games be used to test and simulate ideas (virtual lab or business)? Can gaming be coupled with your educational CV development? One or more of these questions could form part of the project.

Mobile Trading Environments

A number of financial retail markets exist that allow users to trade shares, currencies, futures, options and bonds etc. (e.g. LMAX or TradeWeb). These markets typically rely on the matching of buyers with sellers or the real-time access to particular prices. This project will investigate how micro-markets can be developed using mobile technology (e.g. Smart Phones). The project will require you to develop both client and server code to support trading users. Further experience of smart phone development will be gained, along with service based XML and messaging. A particular market or group of markets could be chosen by the student.

Analysing public mood and emotion using social media

A number of academic papers have investigated the public mood or emotional state through the analysis of twitter feeds. Some have looked at the correlation to financial markets. This project will extend some of this work and look at a mix of social media and markets. One use of such an approach could be the prediction of the FTSE100 index. The project will involve the development of server based (web service or cloud) software that is able to read and analyse a number of data feeds – producing models of the source data and associated predictions. The project could (if the student is interested) also have a strong visualisation or semantic web component.

Healthcare Market Simulation (Group Project)

Current healthcare providers (e.g. GP or hospitals) and their suppliers of products and services (e.g. home monitoring) find it difficult to fully understand how future marketplaces will operate. One way of investigating this future state is through business simulations or gaming – working with their partners and customers in the business network to explore future opportunities and innovation.
A server side platform has been built to support this exploration, using XML interfaces that are able to connect a diverse set of user interface applications and support varied categories of user. This project is a group project where each student will work on a specific client application, simulation or game based application. The server side application (and game/simulation engine) will be made available as the year progresses (with an interface being defined early in the project development). Consequently, each application will be able to interoperate with the others.
Students will need to both understand the specific business requirements of different stakeholders in the network (suppliers, customers, integrators etc.) as well as build some leading edge technical solutions. The business domain being investigated is a healthcare remote care model where device suppliers, healthcare staff, patients and integrators come together to investigate viable future business models. The project will offer a number of project opportunities, including:
(a) Developing a game, application or simulator using Facebook, iPhone, Android, Blackberry, Xbox360 or PC platforms,
(b) Analysing the proposed healthcare environment and
(c) generating business rule based code.
Examples could include selecting/offering a healthcare service, dose management, first aid, insulin guidance, drip-rate calculators and journey planning. Each project will utilise common server side XML technology in order to provide a reasonable level of interoperation. The group are also likely to work together on literature gathering, analysis and reviewing each other's work. The supervision will be carried out by Prof. Young, Dr. Bell or Dr. Kent. You are likely to have meetings with all three academics (possibly including some Healthcare workers) over the course of the year in order to fully support different activities and parts of the project.

GROUP:
The group will meet together with 1 or more of the supervision team. 1 to 1 meetings will occur in term 2 when more individual project discussion is required.

BigSpace – Mixed modality augmented reality (Group Project)

A smart city agenda has been popularised by IBM (www.ibm.com/thesmartercity) in recent years, as cities with state-of-the-art infrastructure are being constructed. This project aims to investigate the augmentation of such cities by combining sensor technologies from a number of devices – the smart phone, RFIDs and Microsoft Kinect sensors. A cityscape will be chosen and a number of applications will be built that sense and support users and groups of users as they move through the environment. A likely setting is the Brunel campus, where student and staff data could be made available using a number of devices; e.g. QR codes could be read and the encoded data placed on a map that is later accessed by smart phone immersive applications or on ambient screens using Kinect interaction. Campus data could include schedules, maps, food, social and teaching data. A number of scales will be chosen, e.g. a moving individual at a particular location, an individual on the move, or a group of users in a space (e.g. an 8m square). The projects will investigate how intelligent software is able to improve the citizen experience (either on the smart phone screen or ambient screens). Although each project is individual (and assessed as such), the projects offer a real chance to interconnect and experiment on a wider environment with both technology and usability opportunities. The group will also work together on literature gathering and analysis. The supervision will be carried out by Prof. Louvieris and Dr. Bell. You are likely to have meetings with academics over the course of the year in order to fully support different activities and parts of the project.

Undergraduate Projects

Although I have reached my quota of FYP projects (2009) – here is my list from the last 2 years:

Cloud Mobility
Mobile Class Messaging
Mobile Chef
Wiimote, Gaming Glove or EEG Data Immersion
Mobile Science Lab
XNA or J2ME Educational Game
Searching the Semantic Web (Web 3.0 Search)
HAOS – A haptic operating system
Game Based Ubiquitous Simulator
Machinima: Cinema meets Software Engineering
Web 2.0,3.0… Visualisation
Mobile Service Discovery
Business Grid Services – or – Mobile Business Grid Services
Applications for Ambient Objects