Dr. Nowell Talks About RA Patient Engagement in Smartwatch Study

By Kaitlyn D’Onofrio - Last Updated: August 1, 2023

Ben Nowell, PhD, Director of Patient-Centered Research at the Global Healthy Living Foundation and its CreakyJoints patient community, and Principal Investigator of the ArthritisPower research registry, discusses a study presented at the American College of Rheumatology (ACR) 2020 Convergence about patient engagement in studies that use a smartwatch to capture data. This study focused on patients with rheumatoid arthritis (RA).


Dr. Nowell: Hello, my name is Ben Nowell, and I’m the director of patient-centered research at the Global Healthy Living Foundation and its CreakyJoints patient community. I’m also the principal investigator of our ArthritisPower research registry, which we started about seven years ago with rheumatologists and pharmacoepidemiologists at the University of Alabama at Birmingham. Today I’d like to provide an overview of our oral presentation, [which was] presented on November 9th this year [at the ACR 2020 Convergence], titled, “Participant Engagement and Adherence in an ArthritisPower Real World Study to Capture Smartwatch and Patient Reported Outcome Data Among Rheumatoid Arthritis Patients.” This is abstract #1979.

Given the widespread availability of smartphones like an iPhone or an Android, [and] smartwatches (of course, I’m not wearing any right now) like a Fitbit or Apple Watch, and other digital devices, it’s important to understand patients’ willingness to engage in research activities and research tasks with these devices, as well as to confirm how complete the data can be when we’re requiring multiple daily and weekly digital tasks of participants in a study. In this study, we characterized the participating patients’ engagement, their adherence to the daily and weekly digital tasks, and the completeness [of the data].

This is a larger study where we’re actually looking at the association between activity data collected passively and patient-reported outcome (PRO) measures that patients complete in the ArthritisPower app. In this study, we first had a lead-in period (some people call it a run-in period) to gauge who would be committed to completing the entire study protocol. It’s like a tryout period. For our lead-in, we asked that participants, for at least 10 days of that first two-week lead-in period, electronically complete two daily single-item measures. We used the pain and fatigue numeric rating scales, along with longer weekly sets of electronic PROs (EPROs). All of this was collected in the ArthritisPower app. Once participants had successfully completed that lead-in period, they were mailed a smartwatch.
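To make that lead-in rule concrete, here is a minimal sketch in Python of how the eligibility check could work. The function name and data shapes are hypothetical illustrations, not the study’s actual implementation inside ArthritisPower.

```python
# Illustrative sketch of the lead-in rule described above (hypothetical,
# not the study's actual code): a participant advances to the main study
# if both daily single-item measures (pain and fatigue numeric rating
# scales) were completed on at least 10 of the 14 lead-in days.

LEAD_IN_DAYS = 14
REQUIRED_COMPLETE_DAYS = 10

def advances_to_main_study(daily_records: list[dict]) -> bool:
    """daily_records holds one dict per lead-in day, e.g.
    {"pain_nrs": 6, "fatigue_nrs": 4}; a value of None means not completed."""
    complete_days = sum(
        1
        for day in daily_records
        if day.get("pain_nrs") is not None and day.get("fatigue_nrs") is not None
    )
    return complete_days >= REQUIRED_COMPLETE_DAYS
```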

We used a commercial-grade smartwatch, in this case a Fitbit Versa, along with additional study materials describing the setup so that participants could then take part in the main study. The main study included automated prompts and reminders that would come via email and via lock-screen notifications on their phones, and also manual prompts or intervention from our case managers and study coordinators, to make sure that people were staying on track to complete their daily and weekly EPROs, wear the smartwatch, and charge and sync it regularly for the next three to six months.

As part of our methods, we had a priori designated what triggers would prompt a call, text, or email from real human beings. We did as much as we could to automate it, but if it looked like a participant in the study was not able to complete their tasks, we would know about it right away and could reach out to them to see what was going on. As the study progressed, our study coordinators were monitoring that patient data [and] contacted patients to resolve adherence issues, which was allowed under our protocol. The way we defined participant adherence during the main study was that we wanted to get 70% or more of the data overall. That meant we wanted the daily EPROs for at least 70% of the 84 days in that main study period, and we wanted smartwatch data for at least 70% of those 84 days. [For] smartwatch data, we considered a day complete if the watch captured 80% or more of the total minutes of activity in that day. And then for the weekly EPRO data, we applied that same 70% cutoff, so nine or more of the 12 weeks of weekly EPRO data. Over that 84-day main study period, we found that the lowest adherence was to providing the daily EPROs, because that was in some ways the most cumbersome task: [participants] had to remember to do it every day.
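As a rough illustration of how those cutoffs combine, here is a short Python sketch of the composite adherence classification. The names, constants, and data shapes are assumptions for clarity, not the study’s actual analysis code.

```python
# Illustrative sketch of the adherence definitions described above
# (hypothetical code, not the study's actual analysis).

MAIN_STUDY_DAYS = 84
MINUTES_PER_DAY = 1440
DAILY_EPRO_CUTOFF = 0.70     # daily EPROs on >= 70% of the 84 days
WATCH_DAY_FRACTION = 0.80    # a watch day is "complete" at >= 80% of minutes
WATCH_CUTOFF = 0.70          # complete watch data on >= 70% of the 84 days
WEEKLY_EPRO_MINIMUM = 9      # >= 9 of 12 weekly EPROs (the ~70% cutoff)

def met_composite_adherence(
    days_with_daily_epro: int,
    watch_minutes_per_day: list[int],  # captured minutes, one entry per day
    weekly_epros_completed: int,
) -> bool:
    daily_ok = days_with_daily_epro >= DAILY_EPRO_CUTOFF * MAIN_STUDY_DAYS
    complete_watch_days = sum(
        1 for minutes in watch_minutes_per_day
        if minutes >= WATCH_DAY_FRACTION * MINUTES_PER_DAY
    )
    watch_ok = complete_watch_days >= WATCH_CUTOFF * MAIN_STUDY_DAYS
    weekly_ok = weekly_epros_completed >= WEEKLY_EPRO_MINIMUM
    # Composite adherence requires meeting all three thresholds.
    return daily_ok and watch_ok and weekly_ok
```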

However, adherence was much better elsewhere: we had greater than 80% adherence to both the smartwatch data and the weekly EPROs. To put some numbers on that, if you lined up all those different types of digital tasks, 53% of the participants met the composite adherence threshold, meaning they provided the daily data plus the smartwatch data plus the weekly EPROs at the required rates. This number was brought down by the fact that only about 57% of the participants completed the daily EPROs on more than 70% of the 84 days. However, 87% of the participants provided their weekly EPRO data at least 70% of the time, and 82% provided smartwatch data at least 70% of the time.

There are other details about whether or not people met the different types of adherence. We looked at meeting smartwatch versus EPRO adherence and compared it against the baseline information we had about the participants. Patients who experienced higher levels of pain or lower levels of physical function at baseline or during the lead-in period completed the EPROs in a greater proportion than those with less pain [and] better physical function, but they may not have always adhered to smartwatch use once they advanced to the main study period.

Compared to other digital health studies in RA, like the PARADE study conducted a few years ago, which used exclusively the iOS platform (iPhones), patients in this completely virtual study were well engaged and adherent, meaning that more than 80% of the data came in over those 12 weeks of the main study period. And this was really good adherence despite the onboarding process and the study itself being entirely remote, or virtual.

Some potential reasons to which we attribute this success in getting people to be adherent: first, we had the lead-in element in the design, that two-week period where, before we shipped them the more expensive smartwatch, we wanted to make sure they were comfortable doing digital tasks on a regular basis using the smartphone they already had. They just needed to download and use the ArthritisPower app. We also had a patient-centered custom app workflow. In ArthritisPower, we employed [a] user-centered design where we worked with patient partners to design the workflow and make sure that it was patient friendly, that it was usable, [and] that the user interface was good, so that patients could easily follow what they were supposed to do. Another reason for the good adherence is that we had both automated reminders and active monitoring by study coordinators, real human beings who could reach out to participants via email, text, and phone calls as needed.

The adherence, of course, varied by data type. The highest adherence was for the weekly EPROs, but the smartwatch data was not far behind. The patients who had more pain and more fatigue in the lead-in period did not advance in as great a proportion. Those who did advance to the main study may have been those with somewhat better, not-as-bad pain and fatigue scores, so there is a potential trade-off in generalizability. A potential lesson for other studies in this area is that a smaller time commitment might be more feasible for some patients. The other exciting thing is that it was possible for us to characterize the profile of RA patients who are more likely to engage with digital data collection, to help inform engagement in future app-based real-world evidence studies like this one.

In conclusion, I think the key takeaway is that we really need to understand how patients, in this day and age, are going to interact with these kinds of smart digital technologies, a smartwatch, a smartphone, and at the same time confirm [and] verify that we’re getting the data we need, to the extent we need it, to answer the primary aims of the study. This also demonstrates that it’s possible to conduct real-world studies involving passive data collection, and completely virtual studies, in a way that minimizes missing data and promotes longitudinal engagement while meeting the needs of the participants.

Thank you very much for your interest. I’m happy to talk more about this if anyone wants to reach out to me directly. And I should say, too, that this study was sponsored by Eli Lilly and Company and their digital real-world evidence unit. Thanks.

For more information on Dr. Nowell’s research, click here to watch another interview, “Ben Nowell, PhD, Discusses How A Financial Incentive May Increase Physical Activity in Inflammatory Arthritis Patients.”
