Michael Schulson in Aeon writes about designing devices for addiction:
[S]hould individuals be blamed for having poor self-control? To a point, yes. Personal responsibility matters. But it’s important to realise that many websites and other digital tools have been engineered specifically to elicit compulsive behaviour.
A handful of corporations determine the basic shape of the web that most of us use every day. Many of those companies make money by capturing users’ attention, and turning it into pageviews and clicks. They’ve staked their futures on methods to cultivate habits in users, in order to win as much of that attention as possible. Successful companies build specialised teams and collect reams of personalised data, all intended to hook users on their products.
‘Much as a user might need to exercise willpower, responsibility and self-control, and that’s great, we also have to acknowledge the other side of the street,’ said Tristan Harris, an ethical design proponent who works at Google. (He spoke outside his role at the search giant.) Major tech companies, Harris told me, ‘have 100 of the smartest statisticians and computer scientists, who went to top schools, whose job it is to break your willpower.’
I met Harris not long ago, and it seems to me that we’re reaching a turning point in the way we talk about the addictive quality of devices and social media: it’s no longer sufficient to invoke dopamine and intermittent rewards, then shrug and assume either that these are inherent, unavoidable features of our technologies, or that they are addictive because of flaws in our human programming, rather than effects that designers work hard to create. Behind every claim that some technology or technological feature is inevitable is someone working hard to make money off that feature, while also convincing you that it just happened and there’s nothing to be done about it.