UC-DASH: UC-Davis AI-based Sensing in Healthcare

 

Call for close collaboration between healthcare and engineering teams

 

About

We develop and enhance machine-learning technologies to analyze healthcare data and uncover patterns and insights that humans could not find on their own. Drawing on advances in deep learning (DL) and classical machine learning (CML), we help healthcare practitioners make better clinical decisions and improve the quality of care they provide.

Researchers  

  • Professor Xin Liu
  • Huanle Zhang
  • Rex Liu
  • Albara Ah Ramli
  • Arefeh Yavary

Talk  

Human Activity Recognition through Wearable Sensors: DMD patient identification and ICU patient activity recognition, [video] [slides]

    Publications 

    • An Overview of Human Activity Recognition Using Wearable Sensors: Healthcare and Artificial Intelligence [pdf]
      Rex Liu*, Albara Ah Ramli*, Huanle Zhang, Erik Henricson, and Xin Liu
      International Conference on Internet of Things (ICIOT), 2021.
    • Gait Characterization in Duchenne Muscular Dystrophy (DMD) Using a Single-Sensor Accelerometer: Classical Machine Learning and Deep Learning Approaches [pdf]
      Albara Ah Ramli, Huanle Zhang, Jiahui Hou, Rex Liu, Xin Liu, Alina Nicorici, Daniel Aranki, Corey Owens, Poonam Prasad, Craig McDonald, and Erik Henricson
      arXiv preprint.
    • Early Mobility Recognition for Intensive Care Unit Patients Using Accelerometers [pdf]
      Rex Liu, Sarina A Fazio, Huanle Zhang, Albara Ah Ramli, Xin Liu, Jason Yeates Adams
      Artificial Intelligence of Things (AIoT) workshop at KDD 2021.
    • BWCNN: Blink to Word, a Real-Time Convolutional Neural Network Approach [pdf]
      Albara Ah Ramli, Rex Liu, Rahul Krishnamoorthy, Vishal I B, Xiaoxiao Wang, Ilias Tagkopoulos, Xin Liu
      International Conference on Internet of Things (ICIOT), 2020.
    • An Automated System for Early Diagnosis, Severity and Progression Identification in Duchenne Muscular Dystrophy: A Machine Learning and Deep Learning Approach [abstract] [pdf]
      Albara Ah Ramli, Alina Nicorici, Poonam Prasad, Jiahui Hou, Craig McDonald, Xin Liu, and Erik Henricson
      7th Annual UC Davis Health Human Genomics Symposium, 2020

     

    Ongoing projects

    • Duchenne Muscular Dystrophy Identification and Community-based Prescreening
      Team: Professor Erik Henricson, Albara Ah Ramli, Rex Liu, Huanle Zhang, and Professor Xin Liu
      Summary: Differences in the gait patterns of children with Duchenne muscular dystrophy (DMD) and typically developing (TD) peers are visible to the eye, but quantifying those differences outside the gait laboratory has been elusive. In this project, we have developed a system consisting of an iPhone app that collects raw data remotely using the phone’s built-in accelerometer, and a web-based tool that aggregates, stores, and analyzes the data. Using this system, we have collected mobility data based on a protocol of seven walking/running tasks: five 25-meter walk/run tests at speeds ranging from a slow walk to a run, a 6-minute walk test (6MWT), and a 100-meter run/walk (100MRW). We apply both classical machine learning and deep learning algorithms, to both extracted features and raw data, and demonstrate that AI tools operating on accelerometer data from a consumer-grade smartphone can identify DMD gait disturbance in children from toddlers to early teens. We are now promoting the system in the community to enable disease prescreening at a much broader scale. The system can be used for prescreening, identification, and progress monitoring of many mobility-related diseases, both in medical facilities and at home.
    • Early mobility recognition for Intensive Care Unit patients
      Team: Professor Sarina Fazio, Professor Jason Adams, Rex Liu, Huanle Zhang,  Albara Ah Ramli, Professor Xin Liu
      Summary: Due to long periods of inactivity and immobilization, Intensive Care Unit (ICU) patients become weak while recovering from critical illness. Clinicians are therefore keenly interested in recognizing early mobility (EM), an effective and safe intervention that improves ICU outcomes such as ventilator days and functional status. When clinicians can accurately recognize a patient's EM activities, they can prescribe an optimal, personalized dose of mobility. However, EM research has been limited by the lack of accurate and effective systems for quantifying patients' EM activities in the ICU. Our project aims to build EM recognition capabilities for ICU patients based on multi-modality data, i.e., wearable sensor data and video. Our current system can identify four early activities from wearable data: repositioning, range-of-motion exercises, percussion therapy, and oral care. We are working on integrating video with the corresponding wearable sensor data in a privacy-preserving manner, in order to accurately identify 20 early activities for ICU patients.
    • Identifying Hemorrhagic Stroke based on Wearable Sensors in Real-time
      Team: Professor Kevin James Keenan, Albara Ah Ramli, Huanle Zhang, and Professor Xin Liu
      Summary: Hemorrhagic strokes make up about 13% of stroke cases, and these patients need prompt treatment at specialized medical centers. Our objective is to identify such patients quickly: because they exhibit abnormal movement behavior, wearable sensors with an accelerometer/gyroscope can help capture that behavior. By analyzing these sensor signals, we aim to develop ML models that diagnose hemorrhagic stroke in real time, so that ambulances can decide whether to go to the nearest hospital or to a specialized medical center, treating patients quickly and appropriately.
    • Gait Identification of Stroke Patients based on Multi-camera Systems and Force plates
      Team: Professor Carolynn Patten, Albara Ah Ramli, and Professor Xin Liu
      Summary: In current motion labs, doctors and researchers use multi-camera systems and force plates to analyze the gait patterns of stroke patients. To bring this research to the community, we need to significantly simplify the equipment requirements. Our ongoing project therefore investigates whether a subset of the existing system suffices, or whether a mobile-device-based system, like the one we developed for DMD patients, can be used instead. The project has broad implications for mobility research and healthy aging.
    • ADHD Actigraph Analysis
      Team: Professor Julie Beth Schweitzer, Professor Wilsaan M. Joiner, Rose De Kock, and Professor Xin Liu
      Summary: Children and adults with ADHD often exhibit distinctive motion patterns during cognitive tasks. Using actigraph data collected over time from more than 600 patients, we hope to identify motion patterns and answer questions such as whether motion can assist cognitive function and whether motion patterns can support more objective ADHD identification.
    • Sonomyography based on Ultrasound 
      Team: Professor Wilsaan M. Joiner, Justin Fitzgerald, Arefeh Yavary, Huanle Zhang, and Professor Xin Liu
      Summary: In this project, ultrasound signals are used to find tissue-deformation patterns in children with upper-limb, below-the-elbow limb differences; the signals can also serve as control signals for pediatric devices. We focus specifically on children who are missing a hand but retain some residual limb below the elbow. We aim to use machine learning and artificial intelligence to improve the quantification of tissue-deformation patterns, in order to design prosthetic devices and support everyday tasks.
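Several of the projects above share the same basic pipeline: window a raw tri-axial accelerometer stream, extract classical features per window, and train a classifier. A minimal sketch of that pipeline follows; the window size, feature set, and synthetic data are illustrative assumptions, not the parameters or data used in our studies.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def window(signal, size=100, step=50):
    """Split a (n_samples, 3) tri-axial accelerometer signal into
    overlapping fixed-length windows of shape (size, 3)."""
    return np.stack([signal[i:i + size]
                     for i in range(0, len(signal) - size + 1, step)])

def features(windows):
    """Simple per-window statistics: mean, standard deviation, and
    mean absolute magnitude per axis (an illustrative feature set)."""
    mean = windows.mean(axis=1)
    std = windows.std(axis=1)
    sma = np.abs(windows).mean(axis=1)
    return np.hstack([mean, std, sma])  # shape: (n_windows, 9)

# Synthetic stand-ins for two labeled gait recordings
# (e.g., TD vs. DMD); real data would come from the phone app.
rng = np.random.default_rng(0)
gait_a = rng.normal(0.0, 0.5, (1000, 3))  # low-variance movement
gait_b = rng.normal(0.0, 2.0, (1000, 3))  # high-variance movement

X = features(np.vstack([window(gait_a), window(gait_b)]))
y = np.array([0] * len(window(gait_a)) + [1] * len(window(gait_b)))

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
```

In practice the same windowing-plus-features front end can feed either a classical model, as here, or a deep network applied directly to the raw windows.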

     

    Other projects

    • Alzheimer’s disease platform
      Team: Professor Sarah Tomaszewski Farias, Professor Alyssa Mae Weakley, Albara Ah Ramli, and Professor Xin Liu
      Summary: A central cognitive rehabilitation tool is the dry-erase whiteboard, hung in a prominent location within the home, which clearly displays the day, date, daily schedule, and important reminders. The whiteboard promotes orientation and reduces reliance on caregivers, but requires habitual training delivered by an in-home caregiver. The Interactive Care (I-Care) platform extends the traditional whiteboard to a digital application, making remote caregiving possible. The platform consists of a large touch-screen computer for the user that takes the place of the “whiteboard” and an accompanying mobile/tablet app. I-Care enables secure and rapid communication between the care recipient and their caregiver and allows all stakeholders to manage the care recipient’s calendar, daily to-do lists (e.g., take morning medication), and goals (e.g., exercise 20 minutes). Customizable alarms allow users and caregivers to pre-record video, audio, or text messages that prompt when it is time to complete an activity. Once a task is completed, the care recipient can check it off, giving themselves a sense of accomplishment and keeping their caregiver in the loop. If a task has not been completed by a certain time, I-Care provides a friendly reminder to the care recipient and a notification to the caregiver. Smart technology in the form of a sensor-equipped pillbox will also be incorporated.
    • Eye blinks detection for Amyotrophic Lateral Sclerosis (ALS) patients using ML
      Team: Professor Ilias Tagkopoulos, Albara Ah Ramli, Rex Liu, and Professor Xin Liu
      Summary: Amyotrophic lateral sclerosis (ALS) is a progressive neurodegenerative disease of the brain and spinal cord that leads to paralysis of motor functions. Patients retain the ability to blink, which can be used for communication. We present an Artificial Intelligence (AI) system that uses eye blinks to communicate with the outside world, running in real time on Internet-of-Things (IoT) devices. The system uses a Convolutional Neural Network (CNN) to detect the blinking pattern, defined as a series of Open and Closed states; each pattern is mapped to a collection of words that express the patient's intent. To find the best trade-off between accuracy and latency, we evaluated several convolutional architectures, including ResNet, SqueezeNet, DenseNet, and InceptionV3. We found that InceptionV3, after hyper-parameter fine-tuning on this task, achieved the best performance, with 99.20% accuracy and 94 ms latency. This work demonstrates how the latest advances in deep learning architectures can be adapted for clinical systems that improve patients' quality of life regardless of the point of care.
    • COVID-19 symptoms tracker 
      Team: Professor Katherine Kim, Albara Ah Ramli, and Professor Xin Liu
      Summary: The goal is to discover symptom phenotypes (characteristics) and trajectories (changes over time) of coronavirus infection in order to: (1) understand which symptom profiles are predictive of infection; (2) prevent spread by those who are asymptomatic or have mild symptoms; and (3) prioritize testing and health monitoring. In the future, we will apply the system to other contagious or infectious diseases and seek to predict health outcomes, morbidity, and mortality. The system includes a progressive web app that runs in browsers and on iOS and Android platforms, and is interoperable with HIEs, EHRs, and decision-support tools via standards such as HL7 FHIR, mapped to ICD-10-CM and SNOMED terminologies.
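The blink-to-word step of the BWCNN project above can be sketched as a lookup from the CNN's per-frame Open/Closed states to a phrase. The pattern table and phrases below are illustrative placeholders, not the system's actual vocabulary.

```python
# Each key is a blink pattern: a run of 'C' (closed) states whose
# length is the number of blinks detected. Phrases are placeholders.
PATTERNS = {
    "C": "yes",
    "CC": "no",
    "CCC": "I need help",
}

def decode(states):
    """Count blinks (maximal runs of 'C') in a stream of per-frame
    eye states ('O' = open, 'C' = closed) emitted by the CNN
    classifier, then map the blink count to a phrase."""
    blinks = sum(1 for i, s in enumerate(states)
                 if s == "C" and (i == 0 or states[i - 1] == "O"))
    return PATTERNS.get("C" * blinks, "<unknown>")
```

For example, `decode("OOCOO")` sees one blink and returns "yes", while `decode("OCOOCO")` sees two and returns "no"; a real system would also debounce noisy frames before counting.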

     

    Collaborators

    • Professor Erik Henricson
      Assistant Professor, Department of Physical Medicine & Rehabilitation [Neuromuscular Disease Human Development]
      Project: Duchenne Muscular Dystrophy Diagnostic
    • Professor Kevin James Keenan
      Assistant Clinical Professor, Department of Neurology [Vascular Neurology]
      Project: Identifying acute stroke based on wearable sensors in real-time
    • Professor Wilsaan M. Joiner
      Associate Professor, Department of Neurobiology [Physiology and Behavior]
      Project: Sonomyography based on Ultrasound
    • Sarina Fazio (PhD, RN)
      Clinical Nurse Educator, Patient Care Services
      Project: Early mobility recognition for Intensive Care Unit patients
    • Professor Jason Yeates Adams
      Associate Professor, Pulmonary Medicine, Critical Care Medicine
      Project: Early mobility recognition for Intensive Care Unit patients
    • Professor Julie Beth Schweitzer
      Professor, Department of Psychiatry and Behavioral Sciences
      Project: ADHD Actigraph Analysis
    • Professor Carolynn Patten
      Professor, Department of Physical Medicine & Rehabilitation [Neurorehabilitation]
      Project: Identifying stroke from gait based on the multi-camera system and force plates
    • Professor Ilias Tagkopoulos
      Professor, Department of Computer Science
      Project: Eye blinks detection for Amyotrophic Lateral Sclerosis (ALS) patients using ML
    • Professor Katherine Kim
      Associate Professor, Department of Public Health
      Project: COVID-19 symptoms tracker
    • Professor Sarah Tomaszewski Farias
      Professor, Department of Neurology [Neuropsychology Cognitive Neuroscience Alzheimer's Disease Dementia]
      Project: Alzheimer’s patients' behaviors
    • Professor Alyssa Mae Weakley
      Assistant Professor, Department of Neurology [Neuropsychology, Neurodegenerative Conditions, Movement Disorders, Epilepsy]
      Project: Alzheimer’s patients' behaviors