At this point, everyone has seen the AI Overview section at the top of their Google search results: a panel of instant answers, magically generated from various search results somewhere further down the page. In May 2024, the feature left Google’s experimental Search Labs and began rolling out to all users, intentionally or unintentionally introducing new dark patterns into the user experience.
What is a dark pattern?
The term “dark pattern” was coined by UX designer Dr. Harry Brignull, founder of the Deceptive Patterns Initiative, to describe interface decisions that manipulate, trick, or force users into taking an action. Familiar examples include confirmshaming (putting “No thanks, I like paying full price” on your pop-up ad’s close button), obstruction (making it hard for a user to perform the action they want), and hard-to-cancel aka “roach motel” (making it easy to sign up, but hard to quit).
Dark patterns explicitly benefit the company at the expense of the user, and they’re not exactly a great experience for anyone. Software engineer and responsible tech advocate Selam Moges gave an excellent presentation at the recent International JavaScript Conference on how these patterns show up in common scenarios, and on the ethical alternatives, which can range from simple wording changes to a more empathetic rethinking of how the business interacts with its customers.
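To make the “simple wording changes” end of that spectrum concrete, here’s a hypothetical sketch — the dialog and its copy are invented for illustration, not taken from Moges’s talk — of the same decline button written first as confirmshaming, then as a neutral alternative:

```typescript
// Illustrative only: the same decline action, written as confirmshaming
// versus as a respectful alternative. Component shape and copy are invented.
interface DialogCopy {
  heading: string;
  accept: string;
  decline: string;
}

// Dark pattern: the decline label shames the user for saying no.
const confirmshamingDialog: DialogCopy = {
  heading: "Join our newsletter and save 20%",
  accept: "Sign me up",
  decline: "No thanks, I like paying full price",
};

// Ethical alternative: a plain, judgment-free decline.
const respectfulDialog: DialogCopy = {
  heading: "Join our newsletter and save 20%",
  accept: "Sign me up",
  decline: "No thanks",
};
```

The user’s choice is identical in both versions; the only thing that changes is how much respect that choice is shown.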
Let’s take a look at how this shows up in Google’s AI Overview feature, why it’s a problem, and what Google could do about it.
Issues with Google’s AI Overview
When Google announced they were rolling out AI Overview to all users, their blog post framed it as a clear win for users: get answers more quickly, ask more complex questions, or even search with a photo or video. Google CEO Sundar Pichai speculated that many people may not even realize they’re using AI at all when they see the overview, normalizing the interaction to the point of invisibility.
Not everyone is thrilled with the feature, though. Users are constantly asking how to turn it off, whether because they object to generative AI on principle or because they’ve been burned by the overviews’ misinformation, spoilers, and general unreliability. (Search Engine Land notes that while Google says users love AI Overview, the company has yet to present any data to back that claim.)
That’s a lot of different concerns. The simplest way to address them? If you don’t like it, just don’t use it.
And that’s where the dark patterns come in.
Dark patterns in Google’s AI Overview
Google’s AI Overview is now, according to their own FAQ, a built-in feature that can’t be turned off. Their official advice to users who dislike the feature is to use the Web filter after running a search: click “More” in the filter bar, then click “Web”.
Note that this multi-step workaround doesn’t disable the AI Overview at all: the user still has to perform their search, see the AI Overview of the results, and then switch to the filtered view. So for the ethically opposed segment of unhappy users, this solves nothing, and users who just don’t like the experience still risk misinformation, spoilers, and general unreliability before they can navigate over to the other view.
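For users who want to skip the multi-step dance entirely, there is an unofficial shortcut: the udm=14 URL parameter, which loads the Web filter directly. It isn’t part of Google’s official guidance and could change or disappear at any time, but as a rough sketch, a helper that builds such a URL might look like this:

```typescript
// Unofficial workaround sketch (not from Google's FAQ): the udm=14 query
// parameter loads Google's "Web" filter directly, skipping the AI Overview.
// The parameter is undocumented and could stop working at any time.
function webOnlySearchUrl(query: string): string {
  const params = new URLSearchParams({ q: query, udm: "14" });
  return `https://www.google.com/search?${params.toString()}`;
}

// Example: point a browser bookmark or custom search engine at this URL.
console.log(webOnlySearchUrl("how to turn off ai overview"));
// -> https://www.google.com/search?q=how+to+turn+off+ai+overview&udm=14
```

Even then, this is a user-side patch, not a fix: it does nothing to change the default experience Google ships to everyone else.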
In its current state, AI Overview demonstrates at least two classic dark patterns: forced action (the feature can’t be turned off at all) and obstruction (the only recourse is a buried, multi-step workaround that doesn’t actually remove the overview).
Fixing the dark patterns
Google’s gotten into legal trouble for dark patterns before, and it cost them $95 million. To stay on the safe side of UX law, they have plenty of options: