Case Study 1: Personal Devices for Health and Wellbeing

In this Case Study we look at security, privacy, bias, and trust in digital health technologies that are available off the shelf for individuals to use as personal devices. Examples include fitness trackers and female-oriented technologies (FemTech) such as fertility trackers, menopause management apps, and related IoT devices. 

Digital Health and FemTech apps and devices collect a wide range of information about users, including:

  • User details (e.g., name, photo, age, gender)
  • Contact details (e.g., mobile, email, address)
  • Lifestyle (e.g., weight, diet, sleep)
  • Period (e.g., cycle length, ovulation days)
  • Pregnancy (e.g., test results, due dates, IVF)
  • Nursing (e.g., type, volume, pain)
  • Reproductive organs (e.g., cervical mucus, muscle strength)
  • Sexual activities (e.g., date, contraceptives, orgasm)
  • Medical information (e.g., medication type, blood pressure, lab reports, scans)
  • Physical symptoms (e.g., headache, constipation)
  • Emotional symptoms (e.g., happy, anxious)
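As an illustrative sketch only (not drawn from any specific app), the categories above can be modelled as a simple taxonomy, which is also useful later when auditing what a given app actually collects:

```python
# Hypothetical taxonomy of the data categories listed above. The
# category names and example fields mirror the text; no real app's
# schema is being reproduced here.
DATA_CATEGORIES = {
    "user": ["name", "photo", "age", "gender"],
    "contact": ["mobile", "email", "address"],
    "lifestyle": ["weight", "diet", "sleep"],
    "period": ["cycle length", "ovulation days"],
    "pregnancy": ["test results", "due dates", "IVF"],
    "nursing": ["type", "volume", "pain"],
    "reproductive_organs": ["cervical mucus", "muscle strength"],
    "sexual_activities": ["date", "contraceptives", "orgasm"],
    "medical": ["medication type", "blood pressure", "lab reports", "scans"],
    "physical_symptoms": ["headache", "constipation"],
    "emotional_symptoms": ["happy", "anxious"],
}

def sensitive_fields(categories):
    """Flatten the taxonomy into a single list of collected fields."""
    return [field for fields in categories.values() for field in fields]

print(len(DATA_CATEGORIES), "categories,",
      len(sensitive_fields(DATA_CATEGORIES)), "fields")
```

Even this minimal sketch makes the breadth of collection visible: eleven categories of personal data for a single user account.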

In addition to data directly concerning the user, these devices also ask for or automatically collect data about others, including: baby or child (e.g., nursing, sleep cycles, fetal movements), social media profiles, forums, or plugins (e.g., Facebook, Spotify), and partner (e.g., name, age, photo, details of partnered sexual activities). These technologies might even ask about the medical history of the user’s family (e.g., cancer).  

Finally, these systems also have access to device resources on the phone and the IoT device, e.g., the camera, microphone, files and storage, contacts and call logs, communication sensors (WiFi, Bluetooth, NFC), and motion and environmental sensors on the phone or the device (e.g., temperature, pressure, CO2). 
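One way to see this resource access concretely is to audit the permissions an app declares. The sketch below uses real Android permission identifier strings, but the app and its declared permissions are hypothetical, made up purely for illustration:

```python
# Hedged sketch of a permission audit. SENSOR_PERMISSIONS holds real
# Android permission identifiers corresponding to the device resources
# described above; "example_fertility_app" and its declarations are
# fictional.
SENSOR_PERMISSIONS = {
    "android.permission.CAMERA",
    "android.permission.RECORD_AUDIO",
    "android.permission.READ_CONTACTS",
    "android.permission.READ_CALL_LOG",
    "android.permission.BLUETOOTH_CONNECT",
    "android.permission.NFC",
    "android.permission.ACCESS_FINE_LOCATION",
    "android.permission.BODY_SENSORS",
}

declared = {
    "example_fertility_app": [
        "android.permission.CAMERA",
        "android.permission.READ_CONTACTS",
        "android.permission.INTERNET",
    ],
}

for app, perms in declared.items():
    # Flag any declared permission that grants access to a sensitive
    # device resource.
    flagged = sorted(set(perms) & SENSOR_PERMISSIONS)
    print(app, flagged)
```

In practice the declared permissions would be extracted from each app's manifest rather than hard-coded.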

This data can be used for several purposes, including delivering the app’s core functionality. However, most of these apps and IoT devices do not need such a wide range of information to deliver their services. The data can also be monetized, i.e., shared with third parties without valid consent from the user or even their knowledge. 

Collecting and sharing user data is neither a new nor a simple problem to tackle, particularly when different demographics (e.g., gender) are factored in. Several entities might be interested in such data, including: (Ex-)Partners and Family, Employers and Colleagues, Insurance Firms, Cyber-criminals, Advertising Companies, Political and Religious Organisations, Governments, Medical and Research Companies, and beyond. 

In this Case Study, we specifically aim to:

  • O1: conduct a pilot research study on personal digital health technologies and map their usage, applications, platforms, services, and security and privacy features.  
  • O2: perform system experiments on apps, websites, and IoT devices and measure their security and privacy practices (e.g., data collection, tracking, data sharing, data breaches, user consent, privacy policies). 
  • O3: conduct user studies to contextualize the complex risks and harms of such technologies (including data breaches and their consequences) via the lived experiences of citizens. 
  • O4: translate and disseminate our intersectional research results for experts across disciplines and for non-expert stakeholders, including end users. 
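To illustrate the kind of measurement O2 involves, the sketch below counts contacts to known tracker domains in a captured traffic log. The hostnames and counts are hypothetical; in a real experiment the traffic would be captured with an intercepting proxy and the tracker list taken from a public blocklist:

```python
from collections import Counter

# Hypothetical hostnames contacted by an app during one test session.
contacted = [
    "api.example-health-app.com",
    "graph.facebook.com",
    "app-measurement.com",
    "graph.facebook.com",
]

# Assumed tracker list; a real study would use a maintained blocklist.
known_trackers = {"graph.facebook.com", "app-measurement.com"}

# Count how often each tracker domain was contacted.
tracker_hits = Counter(h for h in contacted if h in known_trackers)
print(dict(tracker_hits))
```

Repeating such measurements across apps and over time gives comparable evidence of third-party data sharing for the analysis in O2.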

We aim to tackle the complex risks and harms associated with these technologies by taking an interdisciplinary approach and by working with multiple stakeholders, including academic researchers, industrial partners, and policy makers, as well as end users and citizens.