Many privacy laws now in effect worldwide require businesses to obtain explicit consent before collecting their customers' private, personal information. However, compliance rates are abysmally low, according to research published by Cornell University.
In fact, only about 11 percent of websites display consent notices that meet even the minimum required by law. Because businesses look for loopholes wherever they can, legislators are focusing more on the effect of "dark patterns" in technical design on commerce, privacy, data protection, and user choice.
U.S. Federal Trade Commission Commissioner Rohit Chopra has defined these so-called dark patterns as "design features used to deceive, steer, or manipulate users into behavior that is profitable for an online service, but often harmful to users or contrary to their intent."
For example, app and website design follow conventions with which most people are familiar. Visual cues such as an "X" icon let users know that clicking it will close a program or document; a solid red circle around an "X" often denotes a warning, and so on.
However, what happens when a company purposely manipulates its customers by making the buttons for the options it wants them to choose larger? What if the company forces customers to browse through pages and pages of content before they can reach the information they sought in the first place?
This kind of deceptive, manipulative behavior in web and technical design is what legislators are talking about when addressing unethical "dark patterns." Increasingly, lawmakers are seeking to tackle the problem.
Three laws currently include regulations that attempt to curtail dark patterns.
"Dark patterns" is a broad term, and it encompasses far more than misleading and manipulative visual cues; many dark pattern tactics lie in how an app or website structures the user's interactions rather than in how it looks.
While there is still no broad consensus on a legal definition of "dark patterns," lawmakers are trying to address them in privacy legislation. For example, the CCPA declares that dark patterns are a user interface designed or manipulated with the substantial effect of impairing or subverting "user autonomy, decision-making, or choice."
Human psychology is what makes dark patterns so powerful: they play on cognitive biases most people don't even know they have. A cognitive bias is a mental "weakness" or "flaw" that can lead an individual to make poor, irrational decisions.
CNIL, France's independent administrative and data protection authority, has declared that we are influenced and trained to always share more, without always recognizing, ultimately, "that we are jeopardizing our rights and freedoms."
As previously noted, some companies have no qualms about employing unethical, deceptive design practices to get their customers to buy more and give up more user data.
As you have probably surmised by now, dark patterns take many forms, and not all of them are easy to spot or recognize. Researchers, however, have attempted to categorize and catalog them.
Forbrukerrådet, the Norwegian Consumer Council, identified five categories of dark patterns and compiled them in a notable report called "Deceived by Design." These categories are as follows.
Most digital products come with privacy options built in. That said, most users never even touch them.
Many digital service providers take advantage of this. They know the statistics on how rarely people change settings, so they purposely create defaults that collect as much information as possible. The service provider then profits without the user being any the wiser.
Moreover, even when the service provider is transparent about its practices, it will often use a large "accept" button and similar cues to get users to agree and move on without pausing to read what they're actually agreeing to.
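The defaults problem described above can be made concrete in code. The following is a minimal sketch, with hypothetical setting names invented purely for illustration, contrasting data-maximizing defaults with a privacy-by-default posture:

```python
from dataclasses import dataclass

# Hypothetical settings objects; the field names are invented for illustration.

@dataclass
class DarkPatternDefaults:
    """Data-maximizing defaults: everything is opted in until the user digs
    into the settings -- which, statistically, most users never do."""
    share_usage_data: bool = True
    personalized_ads: bool = True
    third_party_sharing: bool = True

@dataclass
class PrivacyByDefaultSettings:
    """Privacy-by-default: collection stays off until the user affirmatively
    opts in, which is the posture privacy regulators expect."""
    share_usage_data: bool = False
    personalized_ads: bool = False
    third_party_sharing: bool = False
```

Because most users keep whatever defaults they are given, the first class quietly maximizes collection, while the second collects nothing until an explicit opt-in.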
Real choices can be difficult to come by. Some companies create the illusion that the customer has a choice, but it's not a real one.
For instance, companies can push or incentivize their customers toward specific actions by making certain options more visually appealing or more visible, making particular buttons or links more prominent, and giving users only one real choice when accepting Terms and Conditions agreements or Privacy Policies.
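Such visual and procedural nudges can even be screened for mechanically. The following is a rough, hypothetical audit heuristic, assuming we can measure each button's on-screen area and count the clicks each path requires; the function name and the 1.5x threshold are arbitrary choices for the example:

```python
def flags_asymmetric_choice(accept_area_px: int, reject_area_px: int,
                            accept_clicks: int, reject_clicks: int,
                            max_ratio: float = 1.5) -> bool:
    """Return True if the dialog visually or procedurally favors 'accept'.

    A dialog is flagged when the accept button is much larger than the
    reject option, or when rejecting takes more clicks than accepting.
    The 1.5x size threshold is an arbitrary value chosen for illustration.
    """
    visually_skewed = accept_area_px > max_ratio * reject_area_px
    procedurally_skewed = reject_clicks > accept_clicks
    return visually_skewed or procedurally_skewed

# A dialog with a large "Accept" button and a reject path buried two
# screens deep would be flagged:
print(flags_asymmetric_choice(accept_area_px=9000, reject_area_px=2000,
                              accept_clicks=1, reject_clicks=3))  # True
```

A symmetric dialog, with similarly sized buttons and a one-click path for either choice, would pass this check.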
If a company can determine a customer's intent, it can present a choice in a way that manipulates that customer. For instance, it can craft wording that sounds great and aligns with the customer's intent while purposely leaving out vital information.
Facebook is a notorious example of this type of behavior. When the company launched its facial recognition feature, many people worried about its potential for serious privacy violations, and Facebook turned the feature off for EU citizens.
Later, the company turned it back on, accompanied by GDPR consent popups. Facebook claimed that its purpose in implementing facial recognition was to help the visually impaired, all while it was busy collecting users' biometric data.
Facebook tried to reframe the launch of its facial recognition features as something positive while leaving out crucial information. By glossing over the negatives, the company was essentially attempting to monetize user data without explicit permission.
The company's actions led to a class-action lawsuit in Illinois; in 2021, a judge approved a $650 million settlement against Facebook.
Some companies attempt to get customers to take certain actions by limiting the time they have to complete them, an inherently coercive practice. Another way companies compel customers toward a particular choice is to force them through a series of actions before they can receive something specific.
That series of actions could be allowing the company to share private, personal data or it could be agreeing to the company's Terms and Conditions, etc.
The GDPR is widely considered the gold standard for privacy laws worldwide, and dark patterns directly oppose the law's basic concept of consent.
For example, the GDPR's definition of consent is:
"any freely given, specific, informed and unambiguous indication of the data subject's wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her."
The law also makes clear that "Silence, pre-ticked boxes or inactivity should not, therefore, constitute consent."
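Translated into code, the GDPR's consent criteria read almost like a validation rule. The following is a simplified sketch, assuming a hypothetical `ConsentRecord` whose fields are invented for illustration; a real consent record would carry far more context:

```python
from dataclasses import dataclass

@dataclass
class ConsentRecord:
    # Hypothetical fields modeling the GDPR's consent criteria.
    affirmative_action: bool   # user actively clicked or ticked something
    pre_ticked: bool           # box was already checked before the user acted
    purpose_specified: bool    # a specific processing purpose was named
    informed: bool             # user was told who processes what, and why

def is_valid_gdpr_consent(record: ConsentRecord) -> bool:
    """Consent must be a clear affirmative action for a specific, informed
    purpose; pre-ticked boxes and mere inactivity do not count."""
    return (record.affirmative_action
            and not record.pre_ticked
            and record.purpose_specified
            and record.informed)

# A pre-ticked box fails, even though the user clicked "Continue":
print(is_valid_gdpr_consent(ConsentRecord(True, True, True, True)))  # False
```

Inactivity fails the same check, because no affirmative action was ever recorded.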
Yet, as previously stated, studies have shown that the vast majority of companies essentially thumb their noses at the law by engaging in practices that give customers no real choice at all.
Practices such as pre-ticked boxes and default opt-ins are among the most typical dark patterns that violate the GDPR's consent rules.
California residents have the right to prohibit companies from selling their data. Moreover, companies that do business in the state must be transparent about what personal data is processed, why, and how.
However, in 2020, a group of researchers from Stanford University analyzed the CCPA Do Not Sell notices that companies were using and discovered many cases in which companies employed dark patterns.
Often, there were no instructions letting customers know exactly what they needed to do to exercise rights over their own data. Worse, many of these companies asked irrelevant, intrusive questions on the Do Not Sell form itself.
In other words, these companies were still attempting to collect user data even when the customer was trying to prohibit them from doing so.
While the world attempts to implement more robust data privacy laws, many companies are trying to circumvent them.
Many companies employ wildly deceptive, unethical methods to manipulate customers into taking specific actions. Many of these actions lead to the capture and processing of an individual's private, personal information, which the company then seeks to monetize.
With the above in mind, ethical business owners can seek to build trust with their customer base by adhering as closely as possible to data privacy laws such as the GDPR and the CCPA.