A direct-to-consumer genetic testing service called 23andMe came under fire recently because of a DNA profiling app. While there are many ways to discriminate using genetic information, that’s not the only dark side to genetic data. Here are three ways genetic data can be abused and steps society, industry, and regulators can take to ensure genetic data is open for good use and closed to abuse.
The line between good and evil
First, to understand why users underestimate the dangers of releasing their genetic information to services such as 23andMe, and why regulators are stumped over how to protect consumers from genetic data abuse, consider what went awry with the 23andMe app. Or rather, an app a third-party developer built that uses 23andMe’s genetic data, connecting to it via the company’s application programming interface (API).
It’s called the Genetic Access Control app, and it enables website managers and others to identify website visitors by ethnicity, gender, and other parameters. App users could then limit or block “undesirables” from accessing online content. It’s not a huge leap to imagine that this and similar apps could one day be used to weed undesirables out of job applicant pools, real estate rentals or purchases, and other life-sustaining needs.
The predictable backlash to this app was immediate and severe. The public verdict was that the app was undeniably being used for evil. 23andMe promptly blocked it. Evil, it appeared, was swiftly conquered.
But the issue of using genetics to identify users cannot be so simply labeled or dismissed.
This very same app could be used for good purposes as well. On GitHub, a Git code repository hosting service, the app developer provides examples of good uses:
1) Creating "safe spaces" online where frequently attacked and trolled victim groups can congregate, such as a female-only community
2) Ethnoreligious sects may wish to limit membership, e.g. Hasidic Jewish groups restricting access to Ashkenazi or Sephardic maternal haplogroups with the "Cohen" gene
3) Safer online dating sites that only match people with a low likelihood of producing offspring carrying two recessive genes for a congenital disease
4) Pharmaceutical applications that check for genetic predisposition to negative drug interactions before dispensing
5) Groups defined by ethnic background, e.g. Black Panthers or NAACP members
In the near future, DNA and genetic profiling will be widely used as an identity authenticator. It will be used to verify the identity of hospital patients, especially those who are unable to speak for themselves. It will also likely be used to verify the identity of bank customers; consumers who use mobile, wearable, or online secure payment systems; voters; employees; convicts and their victims; natural and manmade catastrophe victims; and patients seeking to access their medical information, among other uses.
DNA used as an identifier works for both good and ill; so do the apps that make the genetic identification process possible. It is therefore extremely difficult for regulators and lawmakers to find a definitive line between the good and evil that inherently exists in using genetic information for identification purposes.
Why permission isn’t really permission
In any case, the Genetic Access Control app can’t access a website user’s 23andMe genetic data unless such data already exists in that database and the app has permission from the user to access it.
“The user is presented with a dialog asking them to approve the sharing of certain genetic data with your application,” explains the developer on Github. “If the request is approved a temporary access token is passed to your application which can be used to make API requests to retrieve information, such as ancestry composition and SNP nucleotide sequences. This data can then be used to grant or restrict authorization.”
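The flow the developer describes is a standard OAuth 2.0 authorization-code exchange. The sketch below illustrates that general pattern in Python using only the standard library; every endpoint URL, scope name, and field name here is a hypothetical placeholder for illustration, not 23andMe’s actual API.

```python
# Sketch of an OAuth 2.0 authorization-code flow like the one the
# developer describes. All URLs and scope names are hypothetical.
from urllib.parse import urlencode

AUTH_URL = "https://api.example.com/authorize"   # hypothetical consent endpoint
TOKEN_URL = "https://api.example.com/token"      # hypothetical token endpoint


def build_consent_url(client_id, redirect_uri, scopes):
    """Step 1: send the user to a consent dialog asking them to approve
    sharing the requested genetic data scopes with the application."""
    params = {
        "response_type": "code",
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "scope": " ".join(scopes),
    }
    return AUTH_URL + "?" + urlencode(params)


def exchange_code_for_token(post_json, code, client_id, client_secret, redirect_uri):
    """Step 2: after approval, trade the one-time code for a temporary
    access token usable in API requests. `post_json` is any callable that
    POSTs a form payload and returns the parsed JSON response as a dict
    (injected so this sketch can be exercised without a network)."""
    payload = {
        "grant_type": "authorization_code",
        "code": code,
        "client_id": client_id,
        "client_secret": client_secret,
        "redirect_uri": redirect_uri,
    }
    return post_json(TOKEN_URL, payload)["access_token"]
```

The token returned in step 2 is what the quoted passage calls the “temporary access token”: every subsequent API request carries it, and the user’s only real moment of control is the one consent dialog in step 1.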
Unfortunately, most people pay little attention to app permissions and tend to grant them without much thought to the possible repercussions. Multiple studies show that the majority of consumers do not read app permissions, and of those who do, most do not comprehend the full extent of the personal data use they are granting. An app permission is therefore not really an informed permission. Courts are increasingly concluding that app permissions are more a matter of subterfuge than of consumer intent and consent, which is prompting companies that host apps, such as Google, to crack down on excessive app permissions.
Once any app has access to a user’s data anywhere, it tends to keep that data forever. Revoking the permission later only means the app can’t go back for more data. But if an app already has all your genetic data, what data is left to retrieve at a source like 23andMe anyway?
Clearly, asking the user to grant permissions in effect does not constitute informed permission or consumer control over their data, and is therefore not a competent and trustworthy means to ensure this data is not abused. Hence the need for oversight and possible additional regulation.
In order to rein in abuses, one must first identify what those abuses are or are likely to be.
Here are three of them:
1) Discrimination in all its ugly forms. Discrimination by most legal definitions means adverse treatment of the members of an entire group rather than of an individual. Efforts to prevent discriminatory behavior in mortgage lending, apartment rentals, hiring, government services, and elsewhere are easily circumvented by the use of genetic data from direct-to-consumer genetic testing services, since these services, and the data they collect, often fall outside of healthcare privacy laws. Because consumers willingly submit this data outside of traditional healthcare organizations, and because many such services label themselves as “entertainment” rather than as health organizations, the data can often be easily sold or traded ad infinitum.
Further, such granular data enables much more specific discrimination than was previously possible. For example, an entity could discriminate against a subset group, such as the darker-skinned individuals within the larger African or African-descended group. In this way, an organization can appear inclusive by including some dark-skinned individuals while actually discriminating against many of them. Discrimination can thus become far more insidious and more difficult to police.
2) Identity theft. Yes, identity theft. As a recent rash of big data breaches demonstrates, any data stored anywhere in digital form is data that can be stolen and used by criminals. As a Reuters headline deftly put it, “Your medical record is worth more to hackers than your credit card”; the same holds for genetic data, which can be used to impersonate someone and/or to bypass anything secured by genetic authorization processes.
3) Bioweapons. Genetic data reveals precise disease and physical vulnerabilities. Such data can be used to develop bioweapons to assassinate a specific individual, such as a president or other world leader, or to attack an entire population in either a terrorist attack or an act of war. In my book, Data Divination: Big Data Strategies, I describe the many ways the Veterans Affairs data hacks, which the U.S. government attributes to China, can be used to target U.S. military personnel with both bioweapons and enhanced traditional weapons designed to exploit vehicle and protective-gear vulnerabilities based on the analysis of veteran injuries, disease patterns, and DNA. Such precision attack weapons can be developed by any country and used against any country. Similarly, the general populace of any country can be subjected to bioweapons designed exclusively for them from their genetic data. And it’s not only nation-states that can abuse genetic data; biohackers and bio-hacktivists can too.
Protective steps society, industry, and regulators can take to prevent genetic data abuse
While there is no panacea for this problem, there are steps that can be taken to protect against genetic data abuse. No doubt other steps will be identified and developed as well, but these provide a good start.
Individuals should never share DNA samples with direct-to-consumer genetic testing services unless and until they have verifiable confirmation (meaning not just the service’s word on it) of how the data will be used, by whom, and for what specific purposes (including by the service’s partners, advertisers, or other third parties); how and for how long it will be stored; the specific security and protective measures in place; and whether the company is beholden to and obeys healthcare data privacy laws. Better yet, don’t use these services at all. If you want your genetic information, go through a traditional healthcare provider to be safe(r).
Society should demand stricter privacy protections from their governments, not only on genetic data but on all personal data. Data security practices should be taught in schools and to the public, so that consumers learn how to protect the data on their own computers and devices and what protections to demand from companies seeking to use their data for any purpose. Further, consumers of all ages need to be educated on their rights to data privacy and on the typical traps in service agreements and app permissions, so they can avoid and report them.
Industry should stop gathering all the data it can just in case it might find a use for it one day. Don’t be a data hoarder: all that does is drive up your data storage and management costs and put everyone at unnecessary risk, including your own company. Further, don’t keep data forever. Give it an end-date and destroy it when that date arrives. Again, this will save you money and reduce risks for everyone.
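The end-date advice above amounts to a simple data-lifecycle policy: tag every record with an expiry date at ingestion, and purge anything past that date on a schedule. A minimal sketch, assuming a one-year policy and illustrative field names:

```python
# Minimal sketch of an "end-date" retention policy: tag at ingestion,
# purge on schedule. The one-year window and field names are examples.
from datetime import date, timedelta

RETENTION = timedelta(days=365)  # example policy: keep records one year


def tag_record(record, ingested_on):
    """Stamp the record at ingestion with the date it must be destroyed."""
    record["delete_after"] = (ingested_on + RETENTION).isoformat()
    return record


def purge_expired(records, today):
    """Run periodically: destroy records whose end-date has passed,
    returning only the survivors."""
    return [r for r in records
            if date.fromisoformat(r["delete_after"]) >= today]
```

In a real system the purge would also have to cover backups and any copies shared with third parties, which is exactly why hoarding less in the first place is the cheaper control.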
The security industry, which is currently eyeing genetic data as the foolproof way to identify authorized individuals, should understand now that it isn’t foolproof. Nothing is. If genetic data is used to authenticate a user, it should be combined with additional authenticators so that the genetic data is less subject to theft and abuse.
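One way to treat genetic data as just one factor among several is sketched below: store only a salted hash of the identifier, never the raw data, and require at least one independent factor alongside it. The factor names and record layout are illustrative assumptions, not any vendor’s scheme; note too that, unlike a password, a genetic identifier can never be rotated if it leaks, which is the core reason it must not stand alone.

```python
# Sketch of multi-factor authentication where a genetic identifier is
# only ever stored as a salted hash and never suffices on its own.
# Factor names and record layout are illustrative.
import hashlib
import hmac
import secrets


def enroll(genetic_id: bytes) -> dict:
    """Store a salted hash of the identifier, not the raw genetic data.
    Caveat: genetic data can't be rotated like a password if it leaks."""
    salt = secrets.token_bytes(16)
    digest = hashlib.sha256(salt + genetic_id).hexdigest()
    return {"salt": salt, "genetic_hash": digest}


def verify_genetic_factor(record: dict, presented: bytes) -> bool:
    digest = hashlib.sha256(record["salt"] + presented).hexdigest()
    # Constant-time comparison to avoid timing side channels.
    return hmac.compare_digest(digest, record["genetic_hash"])


def authenticate(record: dict, presented: bytes,
                 passcode_ok: bool, device_ok: bool) -> bool:
    """Require the genetic factor PLUS at least one independent factor."""
    return verify_genetic_factor(record, presented) and (passcode_ok or device_ok)
```

The design choice here is the point of the paragraph: even a stolen genetic profile is useless to an attacker without a second, independent factor.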
Regulators should look at holding researchers legally accountable for the medical and genetic data they generate or use. This would curtail some of the vulnerabilities to hacking by focusing more attention on data security and use. And that’s not the only upside. Dr. Barth-Jones, an HIV and infectious disease epidemiologist on the faculty of the Mailman School of Public Health at Columbia University, eloquently made the argument that realigning researcher responsibilities may actually make data easier to use in research. You can read his thoughts on that in this post in Fierce Big Data.
Direct-to-consumer genetic testing services should be forced to comply with the same privacy and security laws as traditional healthcare providers and medical researchers. Further, the sale or trade of genetic data should be outlawed outright outside of verifiably legitimate medical research.
The security industry should also be heavily regulated to ensure it does not sell or trade genetic data. For security purposes, genetic data should be limited to authentication activities and prohibited from any other use. Bankers, lenders, commerce in general, and all employers should be prohibited from using genetic data for any purpose outside of security measures. Further, such authentication should be handled by a third-party security firm so that genetic data is never stored, accessible, or usable by any commerce-oriented entity.
Last but not least, any entity or person using or storing genetic data should be subject to regular and intense security audits by a third party, preferably a government agency, to ensure the data is adequately protected from internal abuse and from internal and external hacking or breaches.
Criminals and aggressive nation states are clever, determined, and often talented and well-funded too. Securing genetic data will be a constant exercise in diligence just as it is with any other kind of data. However, we must be extra diligent with genetic data as it represents many dangers, some of which are potentially lethal.