One of the stated goals of the DSA is to ensure a “transparent and safe online environment” across the EU. New rules for online platforms, including tech giants such as Facebook, Google, and TikTok, are intended to help achieve this. For the first time, the focus is not just on setting guidelines for moderating and deleting individual pieces of content. Rather, the DSA is a kind of mandatory handbook with rules of conduct for tech companies, such as how they must report on their business practices. What was missing from the Commission’s draft, however, was any consideration of platform design.
If the DSA is to be the forward-looking set of rules that EU lawmakers have been talking about for years and that many people are hoping for, it must contain a separate article on platform design with clear definitions, transparency requirements, and outright prohibitions. Proposals for such a design article now exist, but the Commission, the member states, and the European Parliament (EP) have not yet agreed on a compromise. The member states proposed a ban on misleading design, but only for online marketplaces.
The EP goes further and wants to ban such practices on all online platforms. This is the right approach: misleading design appears not only in online shopping but also in social networks and video apps, where people get their information and form their opinions.
However, the EP proposals also need improvement. Sensible regulation of platform design should not rely on prohibitions alone. It should also create insight into design processes, for example through mandatory design reports. After all, design can do more than mislead people.
Many researchers and practitioners in user interface and user experience (UI/UX) design are working on ethical or “prosocial” platform design. Researchers have found, for example, that pop-ups with verified facts can help people deal with disinformation online. The DSA should encourage platforms to test such design measures and to disclose new approaches and their results. But neither the EP nor the Council has made proposals of this kind, so at the moment it does not look as though such requirements will make it into the DSA.
It is all the more important, then, that whatever rules end up in the final DSA compromise are also consistently enforced. This requires well-designed supervisory structures. The responsible authorities must not only have sufficient expertise and resources of their own but must also exchange information with external experts, such as UI/UX specialists. The draft needs improvement here too: so far, involving external expertise is optional rather than mandatory. This must change urgently.
The member states want to make developing expertise an obligation for the Commission, which, under their proposal, would play an important role in platform supervision. This makes sense but should also explicitly include the involvement of external experts. The Council proposal, which would make the Commission centrally responsible for the largest players, is the most likely to ensure strong supervision. In the long term, a separate EU agency for platform supervision should be established; it could specialize further than the Commission and would also be independent.

The term ‘dark patterns’ has served its time
The EU could use the DSA to underline the importance of tackling annoying pop-ups and misleading buttons. Such design practices have often been referred to as “dark patterns”, including in the DSA. It may seem minor, but it would be an important signal for the EU to stop using this term. It has served its purpose in drawing attention to the issue of platform design; what is needed now is a more precise term such as “misleading design practices”. Moreover, as design expert Kat Zhou points out, “dark patterns” perpetuates a problematic dualism between light/good and dark/evil.