Here are my student projects for next year (from September 2017), exploring how mobile devices (and sensors) can determine, direct and model human behaviour (typically within the physical environment). Importantly, we have most of the required technology within our department (e.g. telepresence robot, Oculus Rift VR hardware). Each project can focus on one or more of:
- Software on device/s to motivate novel user interaction
- Analysis of data from device/s
- Simulation of human behaviour – and adapting the software in the first item above
HEALTHJOURNEY will use smartphone technology to reduce anxiety in children and parents before surgery. A hospital would like to create an engaging information platform, tailored to parents, that delivers information in multiple formats, such as virtual reality/360 panoramic images, videos, animation and written text.
OPGAME: A hospital would like to use mobile technology to reduce stress levels in children prior to an operation (e.g. during anaesthesia). Methods such as sound and visual feedback through engaging, age-appropriate gamification could be used!
BLOCKSIM is a cyber-physical project that aims to link simple Lego building blocks with simulation. How can users build a simulation model (e.g. movement around the campus, hospital or museum) using only physical blocks?
ROBOHEALTH is a project that requires you to develop software for a telepresence robot that then provides health advice or basic triage. How can robots assess and direct patients?
ROBO-INTERACT explores how the “face” of a telepresence robot can change in response to human interaction. The project could use a robot to interview a human or provide museum guidance to visitors (e.g. working with the London Museum of Water and Steam).
ROBO-ANALYTICS uses data collected by a telepresence robot to decide how to interact with humans in the local environment. Does behaviour change in response to human-robot interaction? An educational or museum setting may be used.
CHATBOTSIM is a project that combines an online bot and simulation. How can online automated discussions support better understanding of human behaviour? Chatbot interaction could be used to generate a simulation model (e.g. hospital attendance locations).
CAMPUSHEALTH is a project that uses mobile technology to improve interaction with health services – campus pharmacy, GPs, Urgent care etc. How can mobile technology better route students to the most effective service?
HEALTHVIZ is a project that extracts data from health “open-data” sites for use in new forms of graph visualisation (utilising Web, VR or mobile technology) – exploring how data is extracted and transformed before visualisation.
MOBILE DISASTER MANAGEMENT is a project that aims to predict the needs of stakeholders in a disaster management scenario.
PLANETHEALTH links a small mobile behaviour change app that you will develop (e.g. one that motivates limited plastic use, recycling or collection) to a wider environmental simulation describing possible impact.
TAGSCOPE combines mobile smartphone sensing technology with smart tagging of museum artefact displays. The project will require you to develop a smartphone app that can provide a virtual representation of the internal structure of an artefact and/or its use in a wider system (e.g. a water pump supplying 18th-century London with water).
Here are my projects for next year (from September 2016). I am always happy to discuss projects and your ideas. See my previous post on getting started.
MOBISIM combines mobile smartphone technology with agent based simulation. The project will require you to explore how simulations can execute on smart phones in the physical environment. Importantly, a smart phone is able to detect movement in the local area and this capability can be used as part of the simulation. Consumer behaviour in a retail space or patient behaviour in a hospital provides possible areas of investigation. Students will have the opportunity to learn about agent based simulation and mobile sensing technology.
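To give a flavour of what a minimal agent based simulation involves, here is a sketch in Python: random-walk "shopper" agents moving over a coarse grid standing in for a retail floor, with footfall counted per cell. The grid size, movement rule and agent count are illustrative assumptions, not part of any existing platform.

```python
import random

random.seed(42)

GRID_W, GRID_H = 10, 10  # a coarse model of a retail floor

class Agent:
    """A shopper taking a random walk across the floor grid."""
    def __init__(self, x, y):
        self.x, self.y = x, y

    def step(self):
        dx, dy = random.choice([(-1, 0), (1, 0), (0, -1), (0, 1), (0, 0)])
        # clamp the move so the agent stays within the grid bounds
        self.x = min(max(self.x + dx, 0), GRID_W - 1)
        self.y = min(max(self.y + dy, 0), GRID_H - 1)

def run(n_agents=20, steps=50):
    """Run the walk and return footfall counts per cell."""
    agents = [Agent(random.randrange(GRID_W), random.randrange(GRID_H))
              for _ in range(n_agents)]
    visits = {}
    for _ in range(steps):
        for a in agents:
            a.step()
            visits[(a.x, a.y)] = visits.get((a.x, a.y), 0) + 1
    return visits

visits = run()
busiest = max(visits, key=visits.get)
print("busiest cell:", busiest, "with", visits[busiest], "visits")
```

In a MOBISIM-style project the random walk would be replaced by movement sensed from real phones, but the same footfall map could drive the analysis.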
BLOCKSIM is a cyber-physical project that aims to link simple Lego building blocks with agent based simulation. It is often difficult, especially for end users, to design models using current tools and techniques. This project will allow an end user to design a model with Lego before transformation into a computational model. 3D-printed physical objects could also be used instead of, or to augment, the Lego. Students will have the opportunity to learn about agent based simulation and image analysis techniques.
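One way the Lego-to-model transformation could start is by classifying each board cell from its sampled colour. The sketch below assumes an overhead photo has already been downsampled to one RGB value per stud position; the colour-to-element mapping is a hypothetical example, not a fixed design.

```python
# Assumed mapping from dominant cell colour to model element; in a real
# pipeline the RGB values would come from an overhead photo of the Lego
# board, downsampled so each cell covers one stud position.
COLOUR_TO_ELEMENT = {
    (255, 0, 0): "agent_spawn",   # red brick
    (0, 0, 255): "obstacle",      # blue brick
    (0, 255, 0): "exit",          # green brick
}

def nearest_element(rgb):
    """Classify a cell by the closest known brick colour (squared distance)."""
    def dist(known):
        return sum((a - b) ** 2 for a, b in zip(rgb, known))
    return COLOUR_TO_ELEMENT[min(COLOUR_TO_ELEMENT, key=dist)]

def board_to_model(cells):
    """Turn a grid of sampled RGB values into a simulation layout."""
    return [[nearest_element(rgb) for rgb in row] for row in cells]

# Slightly off-colour samples, as a camera would produce
board = [[(250, 10, 5), (10, 10, 240)],
         [(5, 240, 10), (255, 0, 0)]]
print(board_to_model(board))
```

The resulting layout grid could then seed an agent based model directly (spawn points, walls, exits).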
VRSIM is a virtual reality project that requires you to develop an immersive interface for agent based simulation models. Typical simulators use a simple grid of images, but VR could allow the user to enter, interrogate and adapt the model in a 3D world. A healthcare context may be explored, with people moving around a hospital environment. Simulation can then be used to re-design new hospitals or wider systems including primary care. Students will have the opportunity to learn about agent based simulation and virtual reality technology.
DISTRIBUTED AGENT SIM is a project for students interested in more server side development (including protocols for software interoperation). We currently have an agent based simulation platform that is web based (PHP) and easy to use. The project aims to create a distributed simulation environment that is able to connect a number of separate, diverse simulations (e.g. infectious diseases and hospitals). The project could also investigate how simulations can operate on a cloud platform. Students will learn about agent based simulation and messaging technologies.
FUTURE TRADING SIM is a project that explores a “design fiction” where we are all trading our own personal data footprints (purchases, movement, viewing/listening). Agent based simulation can be used to understand possible futures, determine what is required for its growth and explore associated economic models. A particular problem when simulating a future state is access to representative data. Students will learn about agent based simulation and data trading.
ROBOSIM is a project that combines an online bot and agent based simulation. How can online discussions gather data that builds a better understanding of human behaviour? One approach is to use the captured data as the basis for agent based modelling and simulation. Students will learn about artificial intelligence (driving chatbot interaction) and agent based simulation approaches.
HEALTHSTRAT is a project that uses agent based simulation to build a health strategy game. The game itself can explore local (e.g. hospital) or planetary health (see https://www.rockefellerfoundation.org/planetary-health/). The game will allow participants to learn more about healthcare strategy as well as to test new or adapted strategies. Game design techniques can be used to interact with a simulation. Students will learn about agent based simulation, healthcare systems and strategy.
PICTAG is a project that captures pictures and comments from Art installations (or heritage locations). The project will utilise visual and non-visual tagging (e.g. QR Codes, NFC or iBeacon) to track visitors and allow them to send media and comments. The challenge of this project is the subsequent analysis and visualisation of contributions! Heat maps, simulation, sentiment analysis, emotional tracking are all possible directions. Game design and/or social media techniques can also be used to construct a resulting virtual exhibit.
HEALTHVIZ is a project that extracts data from health “open-data” sites for use in new forms of visualisation (utilising underlying graph storage). Most open data is in tabular format and not directly connected. HCI techniques will be used to design the visualisation and the user experience (UX).
MOBILE DISASTER MANAGEMENT is a project that aims to predict the needs of stakeholders in a disaster management scenario. Open-data will be used to design, build and drive a mobile app that supports various response services. The project can be more focused towards data analysis or mobile app design and usability.
BigSpace – Mixed modality and augmented reality (Ambitious Individual Projects addressing a common theme, meeting as a group)
A smart city agenda has been popularised by IBM (www.ibm.com/thesmartercity) in recent years, as cities with state-of-the-art infrastructure are being constructed. This project aims to investigate the augmentation of such cities by combining sensor technologies from a number of devices – the smart phone, RFIDs and Microsoft Kinect sensors. A cityscape will be chosen and a number of applications will be built that sense and support users and groups of users as they move through the environment. A likely project could be the Brunel campus and the city of London. Student, staff and visitor documents/information/events could be made available using a number of devices, e.g. QR codes could be read and the encoded data placed on a map that is later accessed by smart phone immersive applications or on ambient screens using Kinect interaction. Campus data could include schedules, maps, food, social and teaching data. A number of scales will be chosen, e.g. an individual at a particular location, an individual on the move, or a group of users in a space (e.g. an 8m square). The projects will investigate how intelligent software is able to improve the citizen experience (either on the smart phone screen or ambient screens). A contrast between the more constrained campus and the wider cityscape could be investigated by using the same applications in restaurant, museum, transport and shopping scenarios. Although each project is individual (and assessed as such), the projects offer a real chance to interconnect and experiment on a wider environment with both technology and usability opportunities. The group will also work together on literature gathering and analysis. The supervision will be carried out by Dr. Bell. You are likely to have meetings with academics over the course of the year in order to fully support different activities and parts of the project.
Some possible project ideas:
- Smart phone development using GPS tagging to provide information and interact (seeking services in a Smart City)
- QR codes to label places for information provision, tagging, messaging, quiz (Student users, City Visitors)
- Kinect sensor (NUI) allowing the user to interact with their physical environment (shop, museum, government)
- Kinect monitoring of the environment or people within it (health, queuing, adaptive services)
- RFID sensor tagging of objects for selling, information provision (Intelligent food shopping, Audio information about the environment)
- An App store for a Smart City
- Virtual reality experience linking physical and virtual worlds (museum, wider environment)
- Monitoring and security in a Smart City
- Accessing and rating Government services in a Smart City
- Transport in a Smart City (public, sharing, information)
- Recommender systems in a Smart City
- Business models for the smart city
- Co-design of the smart or mobile services
Each student to carry out their own individual project under the “group” theme.
Intelligent Adaptive Advertising
A typical electronic advertising screen (in a shop window) displays each advertisement at a set time and for a set duration. This project will investigate how advertising can adapt to groups of people positioned in front of a screen. Learning approaches could be used to determine the impact of changes (e.g. movement of the advertisement viewers). The project will utilise the Microsoft Kinect sensor to determine group position (infer context) and adapt the screen accordingly. The Kinect API has only recently been released by Microsoft for academic use. It is assumed that the student will use their own Kinect sensor.
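A minimal sketch of the adaptation rule, assuming the Kinect skeleton stream has already been reduced to each viewer's distance from the screen in metres; the category names and thresholds below are hypothetical placeholders for what the project would actually learn or tune.

```python
def choose_advert(distances):
    """Pick advert content from viewer distances (metres from the screen).

    `distances` stands in for data a Kinect sensor would report after
    skeleton tracking; the thresholds are illustrative, not calibrated.
    """
    if not distances:
        return "ambient"              # nobody watching: low-key branding
    avg_distance = sum(distances) / len(distances)
    if len(distances) >= 3:
        return "group_offer"          # crowds get a shareable promotion
    if avg_distance < 1.5:
        return "detailed_product"     # close viewers can read small text
    return "attention_grabber"        # distant individuals need bold visuals

print(choose_advert([]))               # ambient
print(choose_advert([1.0]))            # detailed_product
print(choose_advert([2.0, 2.5, 3.0]))  # group_offer
```

A learning approach would then replace these hand-set thresholds with rules adjusted by observed viewer behaviour (e.g. whether people move closer after a change).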
Brain controlled Robotics (Optional Group Project)
Controlling the environment using your own thoughts can have a number of benefits (particularly for the disabled user). This project will investigate how an EEG (a brain electrical sensor) can be used to control a remote device. This project could be undertaken by a two person team – one student focusing on the EEG sensing and the other on the robot control. A number of EEG sensors are available for use. The project will start with experimental work, identifying the sensitivity of the device and designing an associated event model. The project could use complex event processing, JMS and Web service technology.
Educational Gaming
The project will investigate how one or more games can be used to extend the educational experience and environment. The educational context can be chosen by the student – primary, secondary, higher or other education. It is envisaged that the student will develop a game using Facebook, Android, JME, Microsoft XNA or other environments. How can games make novel use of the smart phone sensors? How could a game be used to help students joining a University at the start of year 1? How can games be used to test and simulate ideas (virtual lab or business)? Can gaming be coupled with your educational CV development? One or more of these questions could form part of the project.
Mobile Trading Environments
A number of financial retail markets exist that allow users to trade shares, currencies, futures, options and bonds etc. (e.g. LMAX or TradeWeb). These markets typically rely on the matching of buyers with sellers or the real-time access to particular prices. This project will investigate how micro-markets can be developed using mobile technology (e.g. Smart Phones). The project will require you to develop both client and server code to support trading users. Further experience of smart phone development will be gained, along with service based XML and messaging. A particular market or group of markets could be chosen by the student.
Analysing public mood and emotion using social media
A number of academic papers have investigated the public mood or emotional state through the analysis of twitter feeds. Some have looked at the correlation to financial markets. This project will extend some of this work and look at a mix of social media and markets. One use of such an approach could be the prediction of the FTSE100 index. The project will involve the development of server based (web service or cloud) software that is able to read and analyse a number of data feeds – producing models of the source data and associated predictions. The project could (if the student is interested) also have a strong visualisation or semantic web component.
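The simplest form of mood analysis is lexicon scoring, and the sketch below shows the idea on toy data. A real project would use an established resource (e.g. a validated affect lexicon or a trained classifier) and a live feed; the words, weights and example posts here are all illustrative.

```python
# Toy sentiment lexicon -- a real project would use a validated affect
# resource or a trained classifier; these words and weights are made up.
LEXICON = {"gain": 1, "rally": 1, "optimistic": 1, "growth": 1,
           "loss": -1, "crash": -1, "fear": -1, "falling": -1}

def mood_score(text):
    """Average lexicon score of the words in one post; 0 if none match."""
    hits = [LEXICON[w] for w in text.lower().split() if w in LEXICON]
    return sum(hits) / len(hits) if hits else 0.0

def daily_mood(posts):
    """Aggregate the mood of a day's feed into a single signal."""
    return sum(mood_score(p) for p in posts) / len(posts)

feed = ["Markets rally on optimistic growth figures",
        "Fear of a crash as shares keep falling"]
print(round(daily_mood(feed), 2))  # → 0.0 (one positive, one negative post)
```

The resulting daily signal is what would then be correlated against (or used to predict) index movements.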
Healthcare Market Simulation (Group Project)
Current healthcare providers (e.g. GP or hospitals) and their suppliers of products and services (e.g. home monitoring) find it difficult to fully understand how future marketplaces will operate. One way of investigating this future state is through business simulations or gaming – working with their partners and customers in the business network to explore future opportunities and innovation.
A server-side platform has been built to support this exploration, using XML interfaces that are able to connect a diverse set of user interface applications and support varied categories of user. This project is a group project where each student will work on a specific client application, simulation or game based application. The server-side application (and game/simulation engine) will be made available as the year progresses (with an interface being defined early in the project development). Consequently, each application will be able to interoperate with the others.
Students will need both to understand the specific business requirements of different stakeholders in the network (suppliers, customers, integrators etc.) and to build some leading-edge technical solutions. The business domain being investigated is a healthcare remote care model where device suppliers, healthcare staff, patients and integrators come together to investigate viable future business models. The project offers a number of opportunities, including:
(a) Developing a game, application or simulator using Facebook, iPhone, Android, Blackberry, Xbox360 or PC platforms,
(b) Analysing the proposed healthcare environment and
(c) generating business rule based code.
Examples could include selecting/offering a healthcare service, dose management, first aid, insulin guidance, drip-rate calculators and journey planning. Each project will utilise common server-side XML technology in order to provide a reasonable level of interoperation. The group are also likely to work together on literature gathering, analysis and reviewing each other's work. The supervision will be carried out by Prof. Young, Dr Bell or Dr. Kent. You are likely to have meetings with all three academics (possibly including some Healthcare workers) over the course of the year in order to fully support different activities and parts of the project.
The group will meet together with one or more of the supervision team. One-to-one meetings will occur in term 2 when more individual project discussion is required.
BigSpace – Mixed modality augmented reality (Group Project)
A smart city agenda has been popularised by IBM (www.ibm.com/thesmartercity) in recent years, as cities with state-of-the-art infrastructure are being constructed. This project aims to investigate the augmentation of such cities by combining sensor technologies from a number of devices – the smart phone, RFIDs and Microsoft Kinect sensors. A cityscape will be chosen and a number of applications will be built that sense and support users and groups of users as they move through the environment. A likely project could be the Brunel campus, where student and staff data could be made available using a number of devices, e.g. QR codes could be read and the encoded data placed on a map that is later accessed by smart phone immersive applications or on ambient screens using Kinect interaction. Campus data could include schedules, maps, food, social and teaching data. A number of scales will be chosen, e.g. an individual at a particular location, an individual on the move, or a group of users in a space (e.g. an 8m square). The projects will investigate how intelligent software is able to improve the citizen experience (either on the smart phone screen or ambient screens). Although each project is individual (and assessed as such), the projects offer a real chance to interconnect and experiment on a wider environment with both technology and usability opportunities. The group will also work together on literature gathering and analysis. The supervision will be carried out by Prof. Louvieris and Dr. Bell. You are likely to have meetings with academics over the course of the year in order to fully support different activities and parts of the project.
This year I am supervising a number of mobile and pervasive projects (using RFID, gaming and Android) to investigate banking object tracking, ambient experience enhancement, educational systems and others.
e-Participation in Schools: This project will investigate how stakeholder decision making can be improved through the use of web-based e-tools. The study will look at a school environment – students, parents, governors, teachers etc. – and aim to improve decision making. The research will likely involve the comparison of open-source products or the development of a suitable web-based system.
Software Engineering Teams: The project will study the motivations of software engineering teams and the relationship between team working, success and personality factors. Some of the primary data will have already been collected using standard personality testing.
Computer Control using brain activity: The project will investigate how EEG devices can be used to interact with computer software. An EEG device is available for use. This project will involve a reasonable amount of software development. One example could be to use the EEG device as a means to visualise SAP Business data without direct interaction.
Haptic interaction within the home or business: The project will investigate how haptic devices (such as the Wiimote) can be used to interact with and control applications within a home or business environment. The project will involve a reasonable amount of programming.
Mobile Class Messaging (MCM) aims to improve lecture or class interaction using a mobile device. Instead of requiring that mobile phones be turned off, the mobile phone will be used (in silent mode) to interact with the lecturer/teacher. Possible scenarios include monitoring levels of understanding, questioning, voting on the direction and content of lecture, games, and class tests. The project will require a reasonable amount of programming.
Cloud Mobility: Accessing cloud computing from mobile devices opens a number of challenges, ranging from the user interface to the discovery of appropriate functionality within the cloud environment. The project will investigate how a particular business or scientific problem can be addressed with such a novel architecture. The project will require a reasonable amount of programming.
Wiimote, Gaming Glove or EEG Robot Control: This project aims to utilise commodity gaming devices to improve the way in which we interact with, analyse and visualise a system in use (namely a robot control system).
Swarm Searching on the Semantic Web: This project will investigate how swarm algorithms can be used to improve ontology searching. A domain can be chosen by the student – or an existing ontology can be used.
Although I have reached my quota of FYP projects (2009) – here is my list from the last 2 years:
- Mobile Class Messaging
- Wiimote, Gaming Glove or EEG Data Immersion
- Mobile Science Lab
- XNA or J2ME Educational Game
- Searching the Semantic Web (Web 3.0 Search)
- HAOS – A haptic operating system
- Game Based Ubiquitous Simulator
- Machinima: Cinema meets Software Engineering
- Web 2.0, 3.0… Visualisation
- Mobile Service Discovery
- Business Grid Services – or – Mobile Business Grid Services
- Applications for Ambient Objects