AI threatens health data privacy: Report

By Melissa Fassbender


(Image: Getty/ipopba)

Related tags: AI, Data, Regulations, Wearables

Artificial intelligence advances threaten data privacy, according to a new study, which found it is possible to re-identify individuals using their physical activity data.

According to the study, conducted by the University of California, Berkeley, current laws and regulations fail to safeguard confidential health information.

The concerns follow several advances in artificial intelligence (AI) since HIPAA was passed more than 20 years ago, noted UC Berkeley engineer Anil Aswani, who led the study, published in JAMA Network Open.

Aswani told us the study was motivated in part by the question of whether AI tools could be used to re-identify specific types of data that are allowed to be shared under current regulations, such as physical activity data.

Researchers mined two years' worth of data covering more than 15,000 Americans and found that a de-identified dataset, consisting of physical activity data plus sociodemographic data, could often be re-identified.

Using AI, the researchers were in many cases able to identify individuals by correlating physical activity data to demographic data.
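To make the attack concrete: this kind of re-identification is a linkage attack, where an adversary who already holds identified records matches them against the "de-identified" release. The sketch below is purely illustrative and uses synthetic step-count data with a simple nearest-neighbor matcher, not the study's actual NHANES data or machine-learning models; every name and number in it is hypothetical.

```python
# Illustrative linkage-attack sketch (hypothetical data, not the study's method):
# an attacker with one labeled week of activity per person tries to match
# records in a shuffled, "de-identified" second week back to identities.
import numpy as np

rng = np.random.default_rng(0)
n_people, n_hours = 200, 24 * 7  # one week of hourly step counts per person

# Each person has a stable personal activity "signature".
signatures = rng.gamma(shape=2.0, scale=50.0, size=(n_people, n_hours))

def observe(sig, rng):
    """One noisy week of activity drawn around a person's signature."""
    return rng.poisson(sig).astype(float)

# Week 1: data the attacker already links to identities (e.g. a labeled leak).
labeled = np.stack([observe(s, rng) for s in signatures])

# Week 2: the "de-identified" release, shuffled to hide who is who.
perm = rng.permutation(n_people)
released = np.stack([observe(signatures[i], rng) for i in perm])

# Linkage attack: match each released record to its nearest labeled record.
dists = ((released[:, None, :] - labeled[None, :, :]) ** 2).sum(axis=2)
guesses = dists.argmin(axis=1)
accuracy = (guesses == perm).mean()
print(f"re-identification accuracy: {accuracy:.0%}")
```

Because each person's activity pattern is far more similar to their own past data than to anyone else's, even this naive matcher re-identifies most records, which is the core vulnerability the study points to.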

“The implications are that physical activity data should be considered as identifying data in specific scenarios, and that more broadly other sets of health data that are currently considered as de-identified may also potentially be identifying in specific scenarios,” Aswani explained.

“The most surprising aspect of the results was the high accuracy in reidentifying data using only detailed sociodemographic data,” he added.

The researchers suggest that policymakers revisit HIPAA to restrict the sharing of activity data by device manufacturers.

Aswani also said the industry could help to address privacy issues “by aggregating the data of many patients before sharing de-identified data, in order to prevent the shared data from being matched to single individuals.”
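One minimal way to realize the aggregation Aswani describes is to release only group-level averages, so that no shared row corresponds to a single individual. The sketch below is a hypothetical illustration of that idea, not a method from the study; the group size and data are assumptions.

```python
# Hypothetical sketch of the aggregation defense: pool k patients' records
# and share only the group averages, so no released row maps to one person.
import numpy as np

def aggregate(records, k):
    """Average consecutive groups of k rows; drop any leftover partial group."""
    n = (len(records) // k) * k
    return records[:n].reshape(-1, k, records.shape[1]).mean(axis=1)

rng = np.random.default_rng(1)
activity = rng.poisson(100.0, size=(200, 24)).astype(float)  # 200 patients, hourly steps

shared = aggregate(activity, k=10)
print(shared.shape)  # → (20, 24): 20 aggregated rows, none tied to a patient
```

Larger group sizes give stronger protection at the cost of analytic detail; in practice a defense would also need to handle repeated releases, which can leak differences between snapshots.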

“Industry could also be more careful by limiting the sharing of de-identified data to situations where there is a clear need for the sharing,” he said. “In general, industry is falling short when de-identified health data is shared without a compelling health care need.”

Source: JAMA
DOI: 10.1001/jamanetworkopen.2018.6040
“Feasibility of Reidentifying Individuals in Large National Physical Activity Data Sets From Which Protected Health Information Has Been Removed With Use of Machine Learning”
Authors: Liangyuan Na, Cong Yang, Chi-Cheng Lo, Fangyuan Zhao, Yoshimi Fukuoka, Anil Aswani
