R1: Predicting sleep stages

Human activity monitoring is a well-established field, but most approaches are inconvenient and intrusive. Sleep stage monitoring likewise relies on polysomnography, which is conducted in hospitals or sleep labs and requires the patient to wear several sensors. It is not yet known whether the same level of results and accuracy can be achieved using machine learning algorithms on data collected from non-intrusive sensors. This thesis will therefore explore the potential of (R1) predicting sleep stages using single non-intrusive sensors with deep learning. More specifically:

• Relate sleep stages derived from EEG data to the sensors used during the research.
• Model the dynamics of the sensor readings with respect to sleep stages and stage transitions.

R2: Detecting mental and emotional states

Emotion recognition from biometrics is relevant in a wide range of application domains, including healthcare. Monitoring emotional states could help identify concerning behavior before it becomes serious. In this thesis, we will (R2) detect mental and emotional states that are precursors to suicide attempts, using deep learning algorithms applied to patient behavior and vital signs captured by non-intrusive sensors. More specifically:

• Collect data by interviewing psychiatric nurses and doctors to find out which tell-tale signs they look for while observing patients on suicide watch.
• Model the behavior of the patients in conjunction with vital sign monitoring to detect the critical state that requires intervention.

R3: Improving the prediction using multiple sensors

It is an open question whether we can improve the prediction of sleep stages and emotional states using multiple non-intrusive sensors (R3).
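As context for R1, a minimal sketch of the standard preprocessing step such a pipeline would share: in sleep scoring, signals are conventionally divided into 30-second epochs and each epoch is assigned one stage (Wake, N1, N2, N3, REM). The sketch below epochs a single non-intrusive sensor signal and computes toy per-epoch features; the sample rate, signal, and feature choices are illustrative assumptions, not the thesis method, and a deep network would replace the hand-crafted features.

```python
import numpy as np

# Standard sleep-scoring stages (AASM convention).
STAGES = ["Wake", "N1", "N2", "N3", "REM"]

def epoch_signal(signal, sample_rate_hz, epoch_seconds=30):
    """Split a 1-D sensor signal into fixed-length 30 s epochs."""
    samples_per_epoch = sample_rate_hz * epoch_seconds
    n_epochs = len(signal) // samples_per_epoch
    return signal[: n_epochs * samples_per_epoch].reshape(n_epochs, samples_per_epoch)

def epoch_features(epochs):
    """Toy per-epoch features (mean, std, energy) standing in for the
    learned representation a deep model would produce per epoch."""
    return np.stack(
        [epochs.mean(axis=1), epochs.std(axis=1), (epochs ** 2).mean(axis=1)],
        axis=1,
    )

# Illustrative input: 5 minutes of a fake 10 Hz respiration-like signal.
rng = np.random.default_rng(0)
n = 5 * 60 * 10
signal = np.sin(np.linspace(0, 60 * np.pi, n)) + 0.1 * rng.standard_normal(n)

epochs = epoch_signal(signal, sample_rate_hz=10)
features = epoch_features(epochs)
print(epochs.shape)    # (10, 300): ten 30 s epochs of 300 samples each
print(features.shape)  # (10, 3): one feature vector per epoch
```

A classifier (e.g. a recurrent or convolutional network) would then map each epoch, or a sequence of epochs to capture stage transitions, to one of the five stages.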
Project leader: Ann-Elisabeth Ludvigsen
Institution: EGDE CONSULTING AS