Sending the clinicians to the toilets to remove their shirts and try out home-made iPhone stethoscopes - this was the kind of spur-of-the-moment inventiveness that would later be hard to replicate over lockdown video meetings. We were a small new team at Huma (renamed from Medopad during this period) developing “digital biomarkers” - measures of human health (physiological or behavioural) that could be captured via digital devices. We were determined to stick to regular smartphones, although at times we strayed into considering attachments or straps of various kinds to help capture gait. A modern smartphone has a surprising number of sensors - as well as the microphone and camera, you’d usually have a 3-axis accelerometer and gyroscope, and a magnetometer (compass).

Many digital biomarkers already existed. For example, if you turn on the flash and press your finger against the camera lens, minute periodic changes in the detected red colour can be used to pick up your heart rate via photoplethysmography (PPG for short), and even to calculate measures such as heart rate variability. The challenge in developing digital biomarkers is to prove they work as well as a more traditional measurement, and ultimately to validate them as medical devices (Happitech’s software for detecting heart rate from smartphone cameras, for example, is now CE marked as a medical device).
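For the curious, the core of that PPG calculation is fairly simple once the signal is out of the camera. Here’s a minimal sketch in Python (numpy/scipy), assuming you’ve already extracted the mean red-channel intensity of each video frame while the flash is on and a fingertip covers the lens - the function name and filter settings are illustrative, not any particular product’s code:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, find_peaks

def estimate_heart_rate(red_means, fps=30.0):
    """Estimate heart rate and a simple HRV measure from a PPG trace.

    red_means: mean red-channel intensity of each video frame, recorded
               with the flash on and a fingertip covering the lens.
    fps:       camera frame rate in Hz.
    """
    signal = np.asarray(red_means, dtype=float)
    signal = signal - signal.mean()

    # Keep only plausible heart rates (~42-210 bpm = 0.7-3.5 Hz),
    # which removes baseline drift and high-frequency noise.
    sos = butter(3, [0.7, 3.5], btype="band", fs=fps, output="sos")
    filtered = sosfiltfilt(sos, signal)

    # Each cardiac cycle appears as one peak in the filtered signal.
    peaks, _ = find_peaks(filtered, distance=fps / 3.5)
    if len(peaks) < 2:
        return None, None

    intervals = np.diff(peaks) / fps          # seconds between beats
    heart_rate_bpm = 60.0 / intervals.mean()
    sdnn_ms = intervals.std() * 1000.0        # crude HRV measure (SDNN)
    return heart_rate_bpm, sdnn_ms

# Quick check with a synthetic 75 bpm pulse (1.25 Hz) sampled at 30 fps:
t = np.arange(0, 30, 1 / 30.0)
fake_ppg = np.sin(2 * np.pi * 1.25 * t) + 0.05 * np.random.randn(len(t))
print(estimate_heart_rate(fake_ppg, fps=30.0))
```

The arithmetic is the easy part; in practice the messy signal - cold fingers, varying pressure, camera auto-exposure, movement - is what makes proving equivalence to a traditional measurement hard.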

I was a few months into my tenure at Huma when Covid came to Europe (we’d already been liaising with a researcher in Singapore who was dealing with the outbreak much earlier). Once the lockdowns began and we’d been dispersed from the heights of Millbank Tower to our separate homes, digital biomarker work became considerably more challenging. Who had the test devices? How could we run user testing or trials with patients? One thread of our work became a project to measure breathing rate, a measure considered significant during the early panic of the pandemic, and also one of the four main vital signs (alongside temperature, pulse and blood pressure). Analysing the tiny movements of a phone held to your chest could pick up your in- and out-breaths, at least in ideal conditions (and potentially even the heartbeat, via techniques known as ballistocardiography or seismocardiography), and our challenge was to turn this into an accurate product that could be used “in the wild”. It took considerable creativity over the next few months to figure out where to position the phone (sternum, diaphragm, sitting or lying, left, right or middle - endless tests!), how to run trials (lots of individual testers recruited at home), how to compare to a “gold standard” (fun conversations with Dr Jack about CPEX testing, tidal volumes and hypoxic tents), how to eliminate obviously bad signals, and how to achieve an accuracy comparable to a clinician or a medical device. In a real-world setting it was hard to know if the user was holding the phone where and how they’d been asked to, or moving it around, or had been interrupted by a phone call, so there was a big emphasis on instructional material and ease of use. In the end the team produced a great product that achieved an accuracy within 1-2 breaths per minute, although I left before the lengthy process of medical device regulation was complete.
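To give a flavour of the signal processing involved - though emphatically not our validated algorithm - here’s a rough Python sketch of the general idea: band-pass the accelerometer signal to respiratory frequencies, apply a crude quality gate, and count peaks. The axis selection, thresholds and function name are illustrative assumptions:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, find_peaks

def estimate_breathing_rate(accel_xyz, fs=50.0):
    """Estimate breaths per minute from a phone held against the chest.

    accel_xyz: array of shape (n_samples, 3) of accelerometer readings.
    fs:        accelerometer sampling rate in Hz.
    Returns (breaths_per_minute, quality_ok).
    """
    accel = np.asarray(accel_xyz, dtype=float)
    centred = accel - accel.mean(axis=0)

    # Use the axis with the most variance - roughly the direction in
    # which the chest wall moves the phone as you breathe.
    axis = int(np.argmax(centred.var(axis=0)))
    signal = centred[:, axis]

    # Keep only plausible respiratory frequencies (~6-42 breaths/min).
    sos = butter(2, [0.1, 0.7], btype="band", fs=fs, output="sos")
    breathing = sosfiltfilt(sos, signal)

    # Crude quality gate: if little of the energy survives the band-pass,
    # the user was probably moving the phone or got interrupted.
    band_fraction = np.sum(breathing ** 2) / (np.sum(signal ** 2) + 1e-12)
    quality_ok = band_fraction > 0.2          # illustrative threshold

    # One peak per breath; require peaks at least ~1.4 s apart.
    peaks, _ = find_peaks(breathing, distance=fs * 1.4)
    if len(peaks) < 2:
        return None, False
    duration_min = len(signal) / fs / 60.0
    return len(peaks) / duration_min, quality_ok
```

A shippable product needs far more than this - per-device quirks, robust rejection of bad signals, and evidence against a gold standard - which is where most of the effort actually went.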

It had been an intense year at Huma, mostly during the lockdowns of 2020. As well as breathing rate, the team worked on a big trial to use an app as part of the Fenland study; on new measures of Covid risk (with a somewhat crazy attempt to link data from a hospital in Modena - northern Italy having been hit hard at the beginning of the pandemic - with a research team at Newcastle University's National Innovation Centre for Data, epidemiologists at Johns Hopkins, and ultimately the UK Biobank dataset); on collaborations with the frighteningly global Pandemic Alliance; on AI for lung cancer imaging; and on many new biomarkers, using accelerometers for gait or the camera looking at faces. There’s a common saying that 90% of data science is cleaning the data. What I learned from my year working on AI digital biomarkers is that 90% of health AI is perfecting the user interface and struggling with regulation. Progress can seem slow, but ultimately I still believe a more preventative healthcare system will depend on widely distributed, cheap health sensing.