Legislation Trailing Advances in Health Research in Consumer Tech

Clinical research has been evolving rapidly, and that evolution will only accelerate as consumer technology is introduced to clinical research as a platform for collecting health metrics. Consumer devices that can collect and aggregate user data in a central database allow clinicians to enroll larger numbers of patients or participants in studies and to reach more geographically diverse participants. However, clinicians must tread these waters carefully: in the absence of legislation that accurately accounts for the potential of health research conducted through consumer technology, clinicians carry greater responsibility for ensuring that their primary duty of providing healthcare comes before their duties as scientists and researchers.

Despite the promise and seemingly unlimited potential of these new forms of clinical research, clinicians must remain steadfast in their core values, especially because the law has not caught up to the rapid growth of consumer technology in clinical research. For instance, Leon Medical Centers and Nocona General Hospital were hacked in December 2020, and roughly 84% of patient data was stolen and leaked online. The stolen records included not only medical data such as treatments and diagnostic tests, but also highly personal data such as Social Security numbers. Hospitals conventionally store this data to maintain a comprehensive patient history for future visits, but it also allows clinicians to conduct detailed prospective and retrospective database analyses. The Declaration of Helsinki offers a number of guidelines for clinicians, urging them to be wary of emerging technologies and open data in their practices, and it imposes a duty upon clinicians to “protect the privacy and confidentiality of personal information of research subjects.” This is important because clinical research using consumer technology is particularly vulnerable to data breaches, and such breaches can be catastrophic, with unintended consequences beyond the healthcare space. Therefore, when clinicians enroll patients in research studies, they must be confident beyond any reasonable doubt that the data is secure and can only be accessed by authorized individuals. This guideline should be imposed because, in the event of a data breach, the information leaked is far more personal and consequential than other forms of data.

There are other guidelines and regulations that clinicians supervising a research study with patient data must follow. First, the International Ethical Guidelines for Health-Related Research Involving Humans stipulate that researchers communicate to participants that an opt-out procedure exists: if a person objects to the use of their data for research purposes, the researcher is forbidden from using it. In other words, if a patient refuses consent to include their data in a research study, a clinician cannot invoke any law to claim a right to use that data, even as the patient’s healthcare provider. Another regulation, the Common Rule (the Federal Policy for the Protection of Human Subjects), stipulates that institutional review boards review and approve research involving human subjects and that adequate provisions exist to protect subjects’ privacy and maintain the confidentiality of their data. The rule thus sets an expectation that institutional review boards ensure subjects’ data retain a degree of privacy and confidentiality. However, there is a glaring gap in this policy: individual researchers (including patient or citizen scientists) who do not receive federal funding, or who do not work for an institution that receives federal funding, are not covered by the Common Rule. Lastly, there is HIPAA, the most well-known health information regulation, which aims to protect the privacy and security of patients’ identifiable health information, specifically health information discovered or verified within the formal healthcare system.

One of the largest milestones in ensuring that clinical research is ethical was the cultural shift toward, and legal expectation of, clinicians obtaining informed consent from patients or study participants before enrolling them in a study. The growth of health research in consumer technology subtly bypasses many of the standards for obtaining informed consent, because studies conducted on mobile platforms can fall outside the traditional institutions to which regulatory obligations are conventionally attached. As a result, the legal protections surrounding informed consent do not apply uniformly to consent obtained over virtual platforms. Another potential peril of unregulated health research in consumer technology is that people with rare diseases or mental health conditions are more easily identifiable through digital phenotyping, because granular health information collected both actively and passively on consumer devices is easy to quantify and analyze. If a research study is investigating a rare condition, it is usually more cost-effective to identify patients through a virtual platform than at a given site. Lastly, the granularity of data captured by consumer technology varies from platform to platform, but health research needs a clear maximum threshold for granularity to preserve participants’ privacy and to ensure that, in the event of a security breach, the stolen data is less granular than it would have been without the threshold. For instance, when participants enroll in clinical trials over mobile devices, the device’s GPS location should be only as precise as the study warrants, rather than automatically transmitting the participant’s exact coordinates, as in the sketch below.
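To make the idea concrete, here is a minimal sketch in Python of how a study app might round GPS readings to a study-defined precision before they ever leave the device. The coarsen_coordinates helper and the two-decimal-place default are assumptions for illustration, not part of any real study platform; two decimal places corresponds to roughly one kilometer of resolution.

```python
# A minimal sketch, assuming a study app coarsens GPS readings on the device
# before transmitting them, so the research platform only ever receives the
# precision the protocol calls for. coarsen_coordinates is a hypothetical
# helper, not an existing API.

def coarsen_coordinates(lat: float, lon: float, decimal_places: int = 2) -> tuple[float, float]:
    """Round latitude/longitude to a study-defined precision.

    Two decimal places is roughly 1 km of resolution, which may be enough
    to confirm a participant's general region without revealing a street
    address. The right threshold is whatever the study actually warrants.
    """
    return round(lat, decimal_places), round(lon, decimal_places)


if __name__ == "__main__":
    # Raw device reading (precise to about a meter) vs. what the study receives.
    raw_lat, raw_lon = 41.826771, -71.402550
    print(coarsen_coordinates(raw_lat, raw_lon))  # -> (41.83, -71.4)
```

The design point is that the coarsening happens on the device itself, so the more precise reading is never transmitted to or stored by the research platform, and a breach of the study database exposes only the coarsened values.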

These guidelines and regulations need to be more dynamic and keep pace with consumer technology, which may be difficult for the federal government to achieve; the responsibility should consequently fall to local institutions and state governments. Much of this responsibility can be undertaken by institutional review boards (IRBs), and IRB policies should be consistent and, in certain capacities, codified in law. For instance, if a researcher is affiliated with a particular university, or recruits participants through a platform provided by a particular institution, that institution must provide, and require researchers to adhere to, a set of guidelines that reflect the values of the institution and the communities in which it exists. This would allow internal boards to adapt more readily to advances in technology and would prevent a significant gap from forming between the capabilities of health research in consumer technology and the enforceable regulations governing clinical trials.

 

Ashwin Palaniappan is a senior at Brown, attending medical school in the fall. He can be reached at ashwin_palaniappan@brown.edu.