
Dark patterns
Anyone who uses the internet has almost certainly encountered dark patterns. The term was coined by Harry Brignull, a user-experience consultant in the United Kingdom, who began compiling examples of problematic design practices in 2010. Brignull’s “Roach Motel” dark pattern specifically describes cases where online service providers make it easy to get into a situation but hard to leave. Difficult-to-cancel subscriptions have drawn heightened attention from regulators, but they hardly represent the only situation where online service providers deliberately deter users from cancelling or leaving their services. In our research in user-experience design, we found that social media sites also routinely make it difficult — or even impossible — for users to disable their accounts.

Uncovering common dark patterns
The Language and Information Technology Research Lab (LiT.RL) at the University of Western Ontario studies deceptive, inaccurate and misleading information practices. We collected data from 25 social media sites, drawn from a list of the 50 most popular ones in May 2020. We then used content analysis to review the account-disabling process for each site screen-by-screen, including the options given to (or hidden from) users, and the exact wording and visuals shown. We wanted to establish which strategies were used to deter users from leaving these sites and how prevalent they were. Our research is currently undergoing peer review in a journal dedicated to social media and societal issues. In total, our study uncovered five major types of dark patterns — Complete Obstruction, Temporary Obstruction, Obfuscation, Inducements to Reconsider and Consequences — and 13 subtypes, specifically associated with disabling social media accounts.

Discouraging tactics
Like the Amazon Prime cancellation process described in the FTC’s complaint, these strategies were rarely deployed in isolation: the sites in our sample used 2.4 dark patterns on average, and five sites contained five or more dark patterns to deter account disabling. One site simply provided no option in the interface for the user to disable their account, and warned that requests for account disabling would not be considered by the site administrators (Complete Obstruction). Nine sites obstructed the path to account disabling by burdening the user with unnecessary work, such as chatting to a company representative in real time or responding to an email to confirm their decision to leave (Temporary Obstruction). Seven sites confused or misled the user by, for instance, hiding the button to initiate the account-disabling process in an unusual location or making the button itself small and faint (Obfuscation). Fifteen sites relied on more transparent efforts to convince the user to reconsider, often by employing language and visuals that induced fear, guilt or doubt — such as sad faces, large red “warning!” labels, and proclamations that “it would be a shame to see you go!” (Inducements to Reconsider). Even if the user was able to successfully disable their account, they were frequently confronted with opportunities or pressure to return (Consequences). Twelve sites continued to communicate with the user via email or offered account reactivation for a fixed period; one site made reactivation possible for the exorbitantly long period of a year. Even worse, four sites offered account reactivation indefinitely, meaning that the account and its associated data could never be permanently deleted.