You’ve probably heard of the genetic testing site 23andMe. The site lets users send in a saliva swab for genetic decoding; once that code is translated, it’s viewable online as a pie chart of ancestry. 23andMe even offers an API that allows you to share your genetic information with the REST of the world.

Genetic information is some powerful stuff: It can contradict lore that’s been passed down through a family, provide clues to lost relatives, and even offer unexpected insights into one’s origins. But did you ever think that genetic information could be used as an access-control mechanism?

Stumbling around GitHub, I came across a project called Genetic Access Control. Now, budding young racist coders can check your 23andMe results before they allow you into their website!

Seriously, this code uses the 23andMe API to pull a user’s genetic information, then grants or denies access based on the results. Exactly why you decide to turn someone away is up to you, but the decision can hinge on any attribute the 23andMe API exposes.
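To make the mechanics concrete, here’s a minimal sketch of the pattern, not the project’s actual code. The endpoint path and the response shape are my assumptions about the (since retired) 23andMe developer API; only the general flow (OAuth token in, ancestry data out, allow/deny decision) reflects what the project does.

```python
import requests

# Assumptions: the (since retired) 23andMe developer API was OAuth2-protected;
# the endpoint path and response shape below are illustrative, not exact.
API_BASE = "https://api.23andme.com"

def fetch_ancestry(access_token):
    """Pull the user's ancestry composition using their OAuth bearer token."""
    resp = requests.get(
        f"{API_BASE}/1/ancestry/",
        headers={"Authorization": f"Bearer {access_token}"},
        timeout=10,
    )
    resp.raise_for_status()
    # Assumed response shape: {"ancestry": {"European": 0.72, "East Asian": 0.11}}
    return resp.json()

def allow_access(ancestry, required_population, minimum_proportion):
    """Admit the user only if the required population makes up at least
    the given share of their reported ancestry."""
    share = ancestry.get("ancestry", {}).get(required_population, 0.0)
    return share >= minimum_proportion
```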

This is literally code to automate racism. That said, the author offers up a number of possible uses, many of which sound fairly legitimate. Imagine an online women’s support group that restricts access to women only. Or what if JDate didn’t just take your word for it that you’re Jewish, and actually checked your DNA to make sure?

It can go much deeper than that. Reading the project’s source reveals a variable called “allowed_population_threshold,” which suggests this code lets you place population-based limits on who gets into your site. Too many men on the site? Too many Europeans? “Sorry, ma’am, only three women allowed per day.”
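To illustrate what that kind of gate might look like, here’s a hypothetical sketch. Only the name allowed_population_threshold comes from the project’s source; the daily quota, the counters, and everything else are my own invention, written to match the speculation above.

```python
from collections import defaultdict
from datetime import date

# allowed_population_threshold appears in the project's source; the rest of
# this (the quota table, the counters) is a hypothetical illustration.
allowed_population_threshold = 0.50  # minimum ancestry proportion to pass
daily_quota = {"Female": 3}          # e.g. "only three women allowed per day"

admitted_today = defaultdict(int)
current_day = date.today()

def gate(user_population, user_proportion):
    """Admit a user only if they clear the proportion threshold and their
    group's daily quota (if any) is not yet exhausted."""
    global current_day
    if date.today() != current_day:  # reset the counters each day
        current_day = date.today()
        admitted_today.clear()
    if user_proportion < allowed_population_threshold:
        return False
    quota = daily_quota.get(user_population)
    if quota is not None and admitted_today[user_population] >= quota:
        return False
    admitted_today[user_population] += 1
    return True
```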

The mind truly boggles. I’m fairly certain this is a serious piece of work, but the more I think about it, the more this entire software project feels like a work of art, like something by the contemporary artist Kara Walker. In 2014 she created “A Subtlety,” a massive sculpture of a naked, offensively stereotypical African American woman, installed in a former sugar warehouse in New York. The figure was enormous, and drew on racist caricatures of Africans from the 1800s.

Many of the white visitors (particularly younger ones) who came to see the sculpture laughed and took pictures with it in irreverent poses. Later, it came out that Walker had been secretly videotaping visitors’ actions and reactions. The statue itself wasn’t the whole piece: Walker’s real objective was to hold a mirror up to society and capture its reaction to her deliberately disturbing work.

I mention Walker because I think this software is doing something similar. Perhaps some users will implement it in a harmless, even beneficial way, such as creating a safe space for women. But it’s just as likely that, in a few years, Googling a snippet of this code will yield a who’s who of racist and misogynist sites. I can imagine the author handing this code to humanity and saying, “This can be used for evil. Are you going to be evil with it?”

For now, though, this code means very little in practice. Only a sliver of humanity has a 23andMe profile, so using it to control access to your site would simply lock almost everyone out, regardless of anything other than whether they use 23andMe.

I’ve never personally had a reaction like this to software. Entertainment software has evoked emotions in millions, but this is just some Python code for access control. It’s a relic of our time, the kind of thing that could be displayed in the Victoria and Albert Museum in London. The very existence of this code says something about our society.

Does this mean there will be a future where the Web is sectioned off with velvet ropes that admit only purebred Inuit people? Or will there come a time when folks pay people of other races to spit on their 23andMe swabs so they can sneak into some secret forum on the Internet?

I hope this code ends up meaning nothing in the long run. But it’s sobering, and a tad frightening, to see that it is now genuinely possible to automate racism (or sexism).

Perhaps a better use of this code would be to restrict site access to humans. Because, on the Internet, nobody knows you’re a dog.