We’ve all fallen prey to them at one time or another: Design techniques such as the bait-and-switch, disguised ads, faraway billing, friend spam and sneaking items into the checkout cart. These “dark patterns” are interfaces “carefully crafted to trick users into doing things, such as buying insurance with their purchase or signing up for recurring bills,” according to the website Darkpattern.org, which is dedicated to exposing these tricks and “shaming” companies that use them.

Many of these shady practices are classic business scams brought online. Perhaps more worrisome are the new ways mobile apps capture our attention — until we can’t break away.

In Addiction by Design: Machine Gambling in Las Vegas (Princeton University Press, 2013), MIT science, technology, and society professor Natasha Dow Schüll crystallizes her 15 years of field research in Las Vegas in an analysis of how electronic gamblers slip into a twilight called the “machine zone” — and how the industry optimizes for maximum “time on device.” Machine gambling is among the most profitable entertainment industries in the United States, according to Tristan Harris, a former design ethicist at Google.

In a disconcerting essay on Medium, Harris argues that Schüll’s findings don’t only apply to gamblers:

“But here’s the unfortunate truth — several billion people have a slot machine in their pocket: When we pull our phone out of our pocket, we’re playing a slot machine to see what notifications we got. When we pull to refresh our email, we’re playing a slot machine to see what new email we got. When we swipe down our finger to scroll the Instagram feed, we’re playing a slot machine to see what photo comes next. When we swipe faces left/right on dating apps like Tinder, we’re playing a slot machine to see if we got a match. When we tap the # of red notifications, we’re playing a slot machine to see what’s underneath.”

Thanks to intermittent variable rewards, Harris and many others note, mobile apps are easily addictive. But when you design for addiction, you open yourself to ethical questions.

In Hooked: How to Build Habit-Forming Products (Portfolio, 2014), consumer psychology expert Nir Eyal recommends using operant conditioning — intermittent rewards — to create addictive products. But are all products meant to be addictive, or is a “viral” product one that will flame out after the hype is over? Are “habit-forming” apps a sustainable business model? In short, what are the ethics of addictive design?

Interestingly, though Eyal argues that technology cannot be addictive, Schüll’s gambling research indicates otherwise. Eyal’s stance also seems to contradict his own book’s premise.

Technology dependence and distraction are easily solved, Eyal said, so calling them addictive is overkill: “Everything is addictive these days. We’re told our iPhones are addictive, Facebook is addictive, even Slack is addictive.” However, he admitted, one to five percent of technology users do struggle to stop using a product even when they want to.

“What do these companies that have people that they know want to stop, but can’t because of an addiction, do? What’s their ethical obligation? Well, there’s something we can do in our industry that other industries can’t do. If you are a distiller, you could throw up your hands and say ‘I don’t know who the alcoholics are.’ But in our industry, we do know — because we have personally identifiable information that tells us who is using and who is abusing our product. What is that data? It’s time on site. A company like Facebook could, if they so choose, reach out to the small percentage of people that are using that product past a certain threshold — 20 hours a week, 30 hours a week, whatever that threshold may be — and reach out to them with a small message that asks them do they need help?” Eyal said.

He suggests a simple, respectful pop-up message to these users that reads, “Facebook is great but sometimes people find they use it too much. Can we help you cut back?” It remains to be seen if Facebook will implement such a measure, but Harris has come out swinging in the opposite direction from Eyal. He has launched timewellspent.io, a movement to “reclaim our minds from being hijacked by technology,” according to the website.

Harris offers an eight-point ethical design checklist, recommending that technology products:

1. Honor off-screen possibilities such as clicking to other sites
2. Be easy to disconnect
3. Enhance relationships rather than isolate users
4. Respect schedules and boundaries, not encouraging addiction or rewarding oversharing
5. Help “get life well lived” as opposed to “get things done” — in other words, prioritize life-enhancing work over shuffling meaningless tasks
6. Have “net positive” benefits
7. Minimize misunderstandings and “unnecessary conflicts that prolong screen time”
8. Eliminate detours and distractions

About Alexandra Weber Morales

Alexandra Weber Morales is a freelance writer (and singer and songwriter).