Is it allowed to use consumer genetic test raw data to train a commercial machine-learning health product in the EU without informed consent?

Last updated on November 3, 2025

No. Using consumer genetic test raw data to train a commercial machine-learning health product in the EU without informed consent is prohibited under the General Data Protection Regulation (GDPR). Genetic data is classified as a special category of personal data, and processing it requires both a lawful basis under Article 6 and a specific Article 9 condition, most commonly explicit consent.

Why the Answer Is No

The European Union treats genetic data as one of the most sensitive forms of personal information. This isn’t just a technicality—it’s a reflection of the profound privacy risks associated with DNA. Genetic data can reveal health predispositions, family relationships, and even ethnic origins. When companies use this data to train machine-learning models without consent, they violate core principles of EU data protection law, exposing themselves to severe penalties and reputational damage.

The Legal Framework Behind the Rule

Under Article 9 of the GDPR, genetic data falls under “special categories of personal data.” Processing such data is generally prohibited unless one of the specific exceptions applies—most notably, explicit consent from the data subject. This consent must be informed, freely given, and specific to the intended purpose. Using genetic data for commercial AI development without meeting these conditions is unlawful.

The European Data Protection Board (EDPB) reinforced this interpretation in its Opinion 28/2024, clarifying that AI models trained on personal data—including genetic data—are subject to GDPR obligations. The opinion emphasizes that legitimate interest, often cited by companies as a fallback legal basis, rarely suffices for high-risk data like genetics. Controllers must demonstrate necessity, proportionality, and implement robust safeguards, but even then, explicit consent remains the gold standard.

Why Genetic Data Is Special

Genetic data is not just another dataset—it’s a blueprint of identity. Unlike passwords or email addresses, DNA cannot be changed. Misuse can lead to irreversible harm, from discrimination in insurance and employment to breaches of family privacy. The GDPR recognizes this permanence and imposes strict conditions to prevent exploitation.

Consumer genetic testing services, such as those offering ancestry or health insights, typically collect raw data from saliva samples. This data often ends up in large repositories, tempting companies to repurpose it for machine-learning health products. However, GDPR’s principle of purpose limitation forbids such repurposing without additional consent. Data collected for personal reports cannot be silently diverted into commercial AI projects.

Cultural and Practical Context

Europe’s approach reflects a broader commitment to human dignity and autonomy. The GDPR was designed not only to regulate data flows but to empower individuals. Informed consent is central to this vision—it ensures people understand how their data will be used and have the power to say no. Ignoring this requirement undermines trust in both technology and healthcare innovation.

What Happens If You Ignore the Rules?

The consequences are severe. GDPR violations can trigger fines of up to EUR 20 million or 4% of global annual turnover, whichever is higher. Beyond financial penalties, companies risk injunctions halting their AI projects and reputational fallout that can cripple investor confidence. Supervisory authorities across the EU have shown increasing willingness to enforce these rules, especially in sectors involving health and biometrics.
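The "whichever is higher" rule means the EUR 20 million figure acts as a floor, not a cap, for large companies. A minimal sketch of that calculation (the function name and example turnover figures are illustrative, not from the regulation text):

```python
def max_gdpr_fine(global_annual_turnover_eur: float) -> float:
    """Upper bound of a GDPR Article 83(5) fine:
    EUR 20 million or 4% of global annual turnover, whichever is higher."""
    return max(20_000_000.0, 0.04 * global_annual_turnover_eur)

# For a company with EUR 2 billion in turnover, 4% (EUR 80 million)
# exceeds the EUR 20 million floor.
print(max_gdpr_fine(2_000_000_000))  # 80000000.0

# For a smaller company with EUR 100 million in turnover, the
# EUR 20 million floor applies instead.
print(max_gdpr_fine(100_000_000))  # 20000000.0
```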

Practical Advice for Compliance

If you plan to develop a machine-learning health product using genetic data:

  • Obtain explicit, informed consent for the specific purpose of AI training.
  • Conduct a Data Protection Impact Assessment (DPIA) to evaluate risks.
  • Apply data minimization and anonymization techniques wherever possible.
  • Implement strong security measures and maintain transparency with data subjects.

These steps are not optional—they are legal obligations under GDPR and essential for ethical innovation.
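In engineering terms, the consent and purpose-limitation requirements above translate into gating a training pipeline on recorded, purpose-specific consent. The sketch below is a minimal illustration under assumed names: the `GeneticRecord` structure, the `"ml_training"` purpose label, and `select_training_records` are all hypothetical, not a prescribed GDPR schema.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class GeneticRecord:
    subject_id: str
    raw_data: bytes
    # Purposes the data subject explicitly consented to, recorded at collection.
    consented_purposes: frozenset

def select_training_records(records, purpose="ml_training"):
    """Keep only records whose subject gave explicit consent for this purpose.

    Mirrors GDPR purpose limitation: data collected for personal reports
    cannot be silently diverted into commercial AI training.
    """
    return [r for r in records if purpose in r.consented_purposes]

records = [
    GeneticRecord("a1", b"ACGT", frozenset({"personal_report"})),
    GeneticRecord("b2", b"ACGT", frozenset({"personal_report", "ml_training"})),
]
print([r.subject_id for r in select_training_records(records)])  # ['b2']
```

A real pipeline would also pseudonymize identifiers and log the consent basis for each record, but the core design choice is the same: consent for the specific purpose is checked before a record ever reaches the training set.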


Sources

General Data Protection Regulation (GDPR), EUR-Lex: https://eur-lex.europa.eu/eli/reg/2016/679/oj

Artificial intelligence, European Data Protection Board, 18 December 2024: https://www.edpb.europa.eu/our-work-tools/our-documents/topic/artificial-intelligence_en
