TSB Research Grant called TEA-PoCT

Professor Terry Young and I have received £350K of funding from the
Technology Strategy Board (TSB) to investigate how novel mobile and Web-based
knowledge platforms can support the economic modelling and co-design of
medical diagnostic devices. The project is titled TEA-PoCT (Tools for
evaluation around point of care testing) and will extend models developed by
the MATCH Programme (www.match.ac.uk) with collaborators at the University of Birmingham.

FYP Projects 2012/13

BigSpace – Mixed modality and augmented reality (Ambitious Individual Projects addressing a common theme, meeting as a group)

A smart city agenda has been popularised by IBM (www.ibm.com/thesmartercity) in recent years, as cities with state-of-the-art infrastructure are being constructed. This project aims to investigate the augmentation of such cities by combining sensor technologies from a number of devices – the smart phone, RFID tags and the Microsoft Kinect sensor. A cityscape will be chosen and a number of applications will be built that sense and support users and groups of users as they move through the environment. Likely settings are the Brunel campus and the city of London. Student, staff and visitor documents/information/events could be made available using a number of devices; for example, QR codes could be read and the encoded data placed on a map that is later accessed by smart phone immersive applications or on ambient screens using Kinect interaction (a small QR-payload sketch follows the project ideas list below). Campus data could include schedules, maps, food, social and teaching data.

A number of scales will be chosen, e.g. an individual at a particular location, an individual on the move, or a group of users in a space (e.g. an 8m square). The projects will investigate how intelligent software can improve the citizen experience (either on the smart phone screen or on ambient screens). A contrast between the more constrained campus and the wider cityscape could be investigated by using the same applications in restaurant, museum, transport and shopping scenarios. Although each project is individual (and assessed as such), the projects offer a real chance to interconnect and experiment on a wider environment, with both technology and usability opportunities. The group will also work together on literature gathering and analysis. Supervision will be carried out by Dr. Bell. You are likely to have meetings with academics over the course of the year in order to fully support different activities and parts of the project.

Some possible project ideas:

  • Smart phone development using GPS tagging to provide information and interact (seeking services in a Smart City)
  • QR codes to label places for information provision, tagging, messaging, quiz (Student users, City Visitors)
  • Kinect sensor (NUI) allowing the user to interact with their physical environment (shop, museum, government)
  • Kinect monitoring of the environment or people within it (health, queuing, adaptive services)
  • RFID sensor tagging of objects for selling, information provision (Intelligent food shopping, Audio information about the environment)
  • An App store for a Smart City
  • Virtual reality experience linking physical and virtual worlds (museum, wider environment)
  • Monitoring and security in a Smart City
  • Accessing and rating Government services in a Smart City
  • Transport in a Smart City (public, sharing, information)
  • Recommender systems in a Smart City
  • Business models for the smart city
  • Co-design of the smart or mobile services
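
As promised above, here is a minimal sketch (in Java) of how a decoded QR payload might be turned into a map marker for the campus applications. The pipe-separated "label|lat|lon|url" payload format, the class and method names, and the example values are all assumptions for illustration; actual QR decoding would be done on the phone by a library such as ZXing.

    // Minimal sketch: turning a decoded QR payload into a map marker.
    // Assumes a hypothetical "label|lat|lon|url" payload format; real QR
    // decoding (e.g. via the ZXing library on the phone) is out of scope here.
    public class CampusPoi {
        public final String label;
        public final double lat;
        public final double lon;
        public final String url;

        public CampusPoi(String label, double lat, double lon, String url) {
            this.label = label;
            this.lat = lat;
            this.lon = lon;
            this.url = url;
        }

        // Parse the text recovered from a QR code into a point of interest.
        public static CampusPoi fromQrPayload(String payload) {
            String[] parts = payload.split("\\|");
            if (parts.length != 4) {
                throw new IllegalArgumentException("Unexpected QR payload: " + payload);
            }
            return new CampusPoi(parts[0],
                                 Double.parseDouble(parts[1]),
                                 Double.parseDouble(parts[2]),
                                 parts[3]);
        }

        public static void main(String[] args) {
            // Example values only; a real tag would carry campus data such as a timetable link.
            CampusPoi poi = CampusPoi.fromQrPayload(
                "Lecture Centre|51.5331|-0.4723|http://example.org/timetable");
            System.out.println(poi.label + " at " + poi.lat + "," + poi.lon);
        }
    }

A marker record like this could then be plotted by the smart phone map application or pushed to an ambient screen.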

See

http://channel9.msdn.com/coding4fun/kinect

http://www.hcplive.com/pop-medicine/Microsoft-Kinect-Helps-Research-How-To-Prevent-Patient-Falls/

http://www.mddionline.com/blog/devicetalk/xbox-kinect-medical-device-your-living-room

Each student will carry out their own individual project under the “group” theme.

3D Printing

3D printers are now available to hobby designers. Products like RepRap and MakerBot offer sub-£1k printers. You can design items for everyday use or prototype more futuristic design concepts. I like the way some of the printers (RepRap) are able to re-create themselves.

Take a look at http://www.guardian.co.uk/technology/2010/dec/19/3d-printer-kit-makerbot or http://reprap.org/wiki/Main_Page.

FYP Projects 2011/12

Intelligent Adaptive Advertising

A typical electronic advertising screen (in a shop window) will display advertisements at a set time and for a set duration. This project will investigate how advertising can adapt to groups of people positioned in front of the screen. Learning approaches could be used to determine the impact of changes (e.g. movement of the advertisement viewers). The project will utilise the Microsoft Kinect sensor to determine group position (infer context) and adapt the screen accordingly. The Kinect API has only recently been released by Microsoft for academic use. It is assumed that the student will use their own Kinect sensor.
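
As a minimal sketch of the adaptation logic, the Java fragment below chooses an advert from the number of tracked viewers and how close the group is to the screen. It assumes the Kinect SDK has already supplied viewer positions in metres (sensor access is not shown), and the thresholds and advert categories are placeholders to be tuned experimentally.

    // Sketch: choose an advert based on how many viewers are tracked and how
    // close the group is to the screen. Viewer positions are assumed to come
    // from the Kinect skeleton/depth stream (not shown here).
    import java.util.Arrays;
    import java.util.List;

    public class AdaptiveAdSelector {

        // Each viewer position is {x, z} in metres, with z the distance from the screen.
        public static String chooseAdvert(List<double[]> viewerPositions) {
            if (viewerPositions.isEmpty()) {
                return "DEFAULT_LOOP";            // nobody tracked: keep the normal rotation
            }
            double sumZ = 0;
            for (double[] position : viewerPositions) {
                sumZ += position[1];
            }
            double meanDistance = sumZ / viewerPositions.size();

            if (viewerPositions.size() >= 3 && meanDistance < 2.5) {
                return "GROUP_OFFER";             // several close viewers: group promotion
            } else if (meanDistance < 1.5) {
                return "DETAILED_PRODUCT_INFO";   // one or two close viewers: detailed content
            } else {
                return "ATTENTION_GRABBER";       // distant viewers: bold, simple content
            }
        }

        public static void main(String[] args) {
            List<double[]> viewers = Arrays.asList(
                new double[]{0.2, 1.2}, new double[]{-0.4, 1.4}, new double[]{0.1, 2.0});
            System.out.println(chooseAdvert(viewers));  // prints GROUP_OFFER
        }
    }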

Brain controlled Robotics (Optional Group Project)

Controlling the environment using your own thoughts can have a number of benefits (particularly for users with disabilities). This project will investigate how an EEG sensor (which measures the brain's electrical activity) can be used to control a remote device. The project could be undertaken by a two-person team – one student focusing on the EEG sensing and the other on the robot control. A number of EEG sensors are available for use. The project will start with experimental work, identifying the sensitivity of the device and designing an associated event model. The project could use complex event processing, JMS and Web service technology.
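
A minimal sketch of one possible event model is given below (Java). It assumes the headset exposes a normalised 0–100 "attention" reading; the thresholds and command set are placeholders to be calibrated during the experimental phase, and headset access plus the JMS/Web-service transport to the robot are left out.

    // Sketch: map a stream of EEG attention readings onto discrete robot
    // commands. The 0-100 attention scale and the thresholds are assumptions
    // to be replaced once the real headset's sensitivity has been measured.
    public class EegRobotController {

        enum Command { STOP, FORWARD, TURN }

        static Command toCommand(int attention) {
            if (attention >= 80) return Command.FORWARD;  // strong focus drives forward
            if (attention >= 50) return Command.TURN;     // moderate focus turns
            return Command.STOP;                          // low focus halts the robot
        }

        public static void main(String[] args) {
            int[] sampleReadings = {20, 55, 85, 90, 40};  // simulated headset output
            for (int reading : sampleReadings) {
                // In the full project each command would be published (e.g. over
                // JMS or a Web service) to the robot-control client.
                System.out.println("attention=" + reading + " -> " + toCommand(reading));
            }
        }
    }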

Educational Gaming

The project will investigate how one or more games can be used to extend the educational experience and environment. The educational context can be chosen by the student – primary, secondary, higher or other education. It is envisaged that the student will develop a game using Facebook, Android, JME, Microsoft XNA or other environments. How can games make novel use of the smart phone sensors? How could a game be used to help students joining a University at the start of year 1? How can games be used to test and simulate ideas (virtual lab or business)? Can gaming be coupled with your educational CV development? One or more of these questions could form part of the project.

Mobile Trading Environments

A number of retail financial markets exist that allow users to trade shares, currencies, futures, options, bonds, etc. (e.g. LMAX or TradeWeb). These markets typically rely on matching buyers with sellers or on real-time access to particular prices. This project will investigate how micro-markets can be developed using mobile technology (e.g. smart phones). The project will require you to develop both client and server code to support trading users (a minimal matching sketch follows below). Further experience of smart phone development will be gained, along with service-based XML and messaging. A particular market or group of markets could be chosen by the student.
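
To make the matching idea concrete, here is a minimal price-time-priority sketch of the server-side matching step (Java). It assumes a single instrument and simple limit orders; the class and field names are illustrative, and the mobile clients would submit orders to something like this over the messaging/XML layer mentioned above.

    // Sketch of a single-instrument order book that matches buyers with sellers
    // on price-time priority. Order fields and the matching rule are illustrative.
    import java.util.PriorityQueue;

    public class MicroMarket {

        static class Order {
            final String trader;
            final double price;
            int quantity;
            final long time;
            Order(String trader, double price, int quantity, long time) {
                this.trader = trader; this.price = price;
                this.quantity = quantity; this.time = time;
            }
        }

        // Best bid = highest price first; best ask = lowest price first; ties broken by time.
        private final PriorityQueue<Order> bids = new PriorityQueue<>(
            (a, b) -> a.price != b.price ? Double.compare(b.price, a.price)
                                         : Long.compare(a.time, b.time));
        private final PriorityQueue<Order> asks = new PriorityQueue<>(
            (a, b) -> a.price != b.price ? Double.compare(a.price, b.price)
                                         : Long.compare(a.time, b.time));

        public void submitBuy(Order buy)   { bids.add(buy);  match(); }
        public void submitSell(Order sell) { asks.add(sell); match(); }

        // Cross the book while the best bid meets or exceeds the best ask.
        private void match() {
            while (!bids.isEmpty() && !asks.isEmpty()
                   && bids.peek().price >= asks.peek().price) {
                Order bid = bids.peek(), ask = asks.peek();
                int traded = Math.min(bid.quantity, ask.quantity);
                System.out.println("TRADE " + traded + " @ " + ask.price
                        + " (" + bid.trader + " buys from " + ask.trader + ")");
                bid.quantity -= traded;
                ask.quantity -= traded;
                if (bid.quantity == 0) bids.poll();
                if (ask.quantity == 0) asks.poll();
            }
        }

        public static void main(String[] args) {
            MicroMarket market = new MicroMarket();
            market.submitSell(new Order("alice", 101.5, 10, 1));
            market.submitBuy(new Order("bob",   102.0,  6, 2));  // crosses: 6 @ 101.5
        }
    }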

Analysing public mood and emotion using social media

A number of academic papers have investigated the public mood or emotional state through the analysis of Twitter feeds. Some have looked at the correlation with financial markets. This project will extend some of this work and look at a mix of social media and markets. One use of such an approach could be the prediction of the FTSE100 index. The project will involve the development of server-based (web service or cloud) software that is able to read and analyse a number of data feeds – producing models of the source data and associated predictions. The project could (if the student is interested) also have a strong visualisation or semantic web component.
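
As a starting point for the analysis component, a minimal lexicon-based mood score over a batch of messages is sketched below (Java). The word lists are toy placeholders; fetching the feeds (e.g. via the Twitter API) and relating the score to a market index are the separate prediction parts of the project.

    // Sketch: score the overall mood of a batch of messages with a tiny
    // positive/negative word lexicon. The lexicon here is a toy placeholder.
    import java.util.Arrays;
    import java.util.List;

    public class MoodScorer {
        private static final List<String> POSITIVE =
            Arrays.asList("happy", "great", "win", "love", "up");
        private static final List<String> NEGATIVE =
            Arrays.asList("sad", "bad", "lose", "fear", "down");

        // Returns a score in [-1, 1]: +1 if all sentiment words are positive, -1 if all negative.
        public static double score(List<String> messages) {
            int pos = 0, neg = 0;
            for (String message : messages) {
                for (String word : message.toLowerCase().split("\\W+")) {
                    if (POSITIVE.contains(word)) pos++;
                    if (NEGATIVE.contains(word)) neg++;
                }
            }
            int total = pos + neg;
            return total == 0 ? 0.0 : (pos - neg) / (double) total;
        }

        public static void main(String[] args) {
            List<String> sample = Arrays.asList(
                "Great win for the team today!", "Feeling down about the news");
            System.out.println("mood score = " + score(sample));  // roughly 0.33
        }
    }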

Healthcare Market Simulation (Group Project)

Current healthcare providers (e.g. GPs or hospitals) and their suppliers of products and services (e.g. home monitoring) find it difficult to fully understand how future marketplaces will operate. One way of investigating this future state is through business simulations or gaming – working with their partners and customers in the business network to explore future opportunities and innovation.
A server-side platform has been built to support this exploration, using XML interfaces that are able to connect a diverse set of user interface applications and support varied categories of user. This project is a group project where each student will work on a specific client application, simulation or game-based application. The server-side application (and game/simulation engine) will be made available as the year progresses (with an interface being defined early in the project development). Consequently, each application will be able to interoperate with the others.
Students will need both to understand the specific business requirements of different stakeholders in the network (suppliers, customers, integrators etc.) and to build some leading-edge technical solutions. The business domain being investigated is a healthcare remote care model where device suppliers, healthcare staff, patients and integrators come together to investigate viable future business models. The project will offer a number of project opportunities, including:
(a) developing a game, application or simulator using Facebook, iPhone, Android, BlackBerry, Xbox 360 or PC platforms,
(b) analysing the proposed healthcare environment, and
(c) generating business-rule-based code.
Examples could include selecting/offering a healthcare service, dose management, first aid, insulin guidance, drip-rate calculators and journey planning. Each project will utilise common server-side XML technology in order to provide a reasonable level of interoperation (a small sketch of such a message is given below). The group is also likely to work together on literature gathering, analysis and reviewing each other's work. Supervision will be carried out by Prof. Young, Dr. Bell or Dr. Kent. You are likely to have meetings with all three academics (possibly including some healthcare workers) over the course of the year in order to fully support different activities and parts of the project.
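
To give a feel for the shared XML interface, the sketch below (Java, standard DOM parsing) reads one hypothetical simulation message. The <serviceOffer> element and its attributes are invented for illustration; the agreed schema will be published with the server-side engine early in the project.

    // Sketch: reading one (hypothetical) simulation message from the shared XML
    // interface. The <serviceOffer> schema shown here is invented; the agreed
    // interface will be defined alongside the server-side engine.
    import javax.xml.parsers.DocumentBuilderFactory;
    import java.io.ByteArrayInputStream;
    import org.w3c.dom.Document;
    import org.w3c.dom.Element;

    public class SimulationMessageReader {
        public static void main(String[] args) throws Exception {
            String xml =
                "<serviceOffer supplier=\"DeviceCo\" service=\"homeMonitoring\">" +
                "  <pricePerPatient currency=\"GBP\">12.50</pricePerPatient>" +
                "  <capacity>200</capacity>" +
                "</serviceOffer>";

            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder()
                    .parse(new ByteArrayInputStream(xml.getBytes("UTF-8")));

            Element offer = doc.getDocumentElement();
            String supplier = offer.getAttribute("supplier");
            String service = offer.getAttribute("service");
            double price = Double.parseDouble(
                    offer.getElementsByTagName("pricePerPatient").item(0).getTextContent());
            int capacity = Integer.parseInt(
                    offer.getElementsByTagName("capacity").item(0).getTextContent());

            System.out.println(supplier + " offers " + service + " at £" + price
                    + " per patient for up to " + capacity + " patients");
        }
    }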

GROUP:
The group will meet together with one or more of the supervision team. One-to-one meetings will occur in term 2 when more individual project discussion is required.

BigSpace – Mixed modality augmented reality (Group Project)

A smart city agenda has been popularised by IBM (www.ibm.com/thesmartercity) in recent years, as cities with state-of-the-art infrastructure are being constructed. This project aims to investigate the augmentation of such cities by combining sensor technologies from a number of devices – the smart phone, RFID tags and the Microsoft Kinect sensor. A cityscape will be chosen and a number of applications will be built that sense and support users and groups of users as they move through the environment. A likely setting is the Brunel campus, where student and staff data could be made available using a number of devices; for example, QR codes could be read and the encoded data placed on a map that is later accessed by smart phone immersive applications or on ambient screens using Kinect interaction. Campus data could include schedules, maps, food, social and teaching data. A number of scales will be chosen, e.g. an individual at a particular location, an individual on the move, or a group of users in a space (e.g. an 8m square). The projects will investigate how intelligent software can improve the citizen experience (either on the smart phone screen or on ambient screens). Although each project is individual (and assessed as such), the projects offer a real chance to interconnect and experiment on a wider environment, with both technology and usability opportunities. The group will also work together on literature gathering and analysis. Supervision will be carried out by Prof. Louvieris and Dr. Bell. You are likely to have meetings with academics over the course of the year in order to fully support different activities and parts of the project.