“By design” principles considered harmful
Introduction
The Dutch computer scientist Edsger Dijkstra (1968) popularised the famous phrase about something being “considered harmful”. The phrase was subsequently adapted widely for essay titles in computer science and beyond. When using it, an author seeks to criticise or express concerns about something. For the author of this opinion, that something is the so-called “by design” principles, including but not limited to the famous notion of “privacy by design”. These “by design” principles have recently proliferated to numerous other policy domains and contexts. However, none of them seems to carry much substance; they are mostly used as slogans, just like the “considered harmful” phrase. They also entail some fundamental theoretical contradictions. This opinion argues that such contradictions should also be taken into account already during policy-making.
Origins
Probably the best-known “by design” principle, “privacy by design”, was pioneered by Ann Cavoukian and associates in the 1990s. Later on, Cavoukian (2011) summarised it into seven foundational principles: privacy should be considered when designing information technology products, people’s privacy should be honoured, the processing of people’s personal data should be transparent, and so forth. Many of these principles also underpin the General Data Protection Regulation (GDPR) of the European Union (EU). The GDPR further introduced the notion of “data protection by design”. After the GDPR’s enactment, different “by design” principles have proliferated further in the EU’s jurisprudence.
Contradictions
Cavoukian’s (2011) seven foundational principles contain a statement that privacy should be a default setting. The same applies to the GDPR’s Article 25, entitled “data protection by design and by default”. These principles have their counterpart in security engineering. Therein, the notion of “secure by default” has multiple meanings, but, in general, at least all default settings should be secure (Ruohonen, 2025). For instance, a firewall should deny everything by default and then allow only what is needed. This security engineering principle was recently, in 2024, codified into law with the EU’s Cyber Resilience Act (CRA). It contains a legal requirement that a product should always be shipped with a “secure by default” configuration.
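The default-deny firewall logic can be sketched in a few lines of code. The following is a minimal, purely illustrative sketch in Python; the class and method names are hypothetical and do not correspond to any real firewall API. The point is that the initial state permits nothing, and openness must be granted explicitly.

```python
# Illustrative sketch of the "secure by default" principle: a hypothetical
# packet filter whose default answer is "deny"; only explicitly allowed
# ports are permitted. Names are invented for illustration only.

class DefaultDenyFirewall:
    """Deny all traffic by default; allow only what is explicitly listed."""

    def __init__(self):
        # Secure by default: the allow-list starts empty, so nothing passes.
        self.allowed_ports = set()

    def allow(self, port: int) -> None:
        """Open a single port, i.e. allow only what is needed."""
        self.allowed_ports.add(port)

    def permits(self, port: int) -> bool:
        """Unknown traffic is denied; only listed ports are permitted."""
        return port in self.allowed_ports


fw = DefaultDenyFirewall()
fw.allow(443)           # e.g. permit only HTTPS
print(fw.permits(443))  # True: explicitly allowed
print(fw.permits(23))   # False: everything else stays denied
```

Note how the contrast with a hypothetical “open by design” configuration is stark: maximal openness would amount to `permits` returning `True` for everything, which is exactly the opposite default.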
A further “open by design” principle appears in the EU, including in the Data Governance Act (DGA), which was enacted to promote the data economy in Europe (Ruohonen and Mickelsson, 2023). Theoretically, this principle may contradict the “secure by default” principle. If a system and a network had the maximum theoretical openness, there would be neither access controls nor firewalls, and thus the “secure by default” principle would not be satisfied. A more practical and more serious example comes from Finland, where the “open by design” principle was followed to openly distribute geolocation data. Later on, the open distribution was recognised as a bad idea because the data allegedly also contained sensitive details about critical infrastructures (Hakahuhta, 2023). The CRA can be used to elaborate another contradiction.
The GDPR introduced the legal requirement to minimise personal data collection and processing to what is required for a given task. With the CRA, this data minimisation requirement was extended to most network-connected information technology products. In other words, products should only process data, whether personal or otherwise, that is relevant, correct, and limited to what is necessary. As was pointed out during the CRA’s policy consultation (SMEunited, 2022), this data minimisation requirement seems to contradict the “access by design” principle implicitly present in the EU’s Data Act (DA). The DA encourages giving people access to data generated by the products they use, but if the data minimisation requirement is pushed to the extreme, there is not much useful data left for them. As with the “open by design” principle, the implicit “access by design” principle may also enlarge attack surfaces, thus contradicting the “secure by default” principle.
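The tension between data minimisation and data access can be made concrete with a small sketch. The following Python fragment is illustrative only; the field names and the `minimise` helper are hypothetical, not drawn from any regulation or product. A record is stripped down to a task-specific allow-list of fields, and if that allow-list shrinks towards the extreme (an empty set), nothing remains for a user to access under an “access by design” regime.

```python
# Illustrative sketch of data minimisation: before processing, a record is
# stripped down to the fields strictly necessary for the task at hand.
# Field names and the allow-list are hypothetical, for illustration only.

NECESSARY_FIELDS = {"device_id", "timestamp", "temperature"}

def minimise(record: dict) -> dict:
    """Keep only the fields required for the task; drop everything else."""
    return {k: v for k, v in record.items() if k in NECESSARY_FIELDS}


raw = {
    "device_id": "sensor-42",
    "timestamp": "2024-01-01T00:00:00Z",
    "temperature": 21.5,
    "owner_name": "Alice",        # personal data not needed for the task
    "home_address": "Somewhere",  # personal data not needed for the task
}

print(sorted(minimise(raw)))  # ['device_id', 'temperature', 'timestamp']
```

Pushed to the extreme, an empty `NECESSARY_FIELDS` would make `minimise` return an empty record for every input, which is the contradiction noted above: perfect minimisation leaves no data to which access could be granted.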
The lesson learned
Recently, Kostina Prifti (2024) discussed and elaborated a further principle of “regulation by design”, emphasising a need to consider both the noun design and the verb designing. To simplify: a given policy can be seen as a design that is being designed by designers, including but not limited to politicians, civil servants, and stakeholders. In a similar vein, Mireille Hildebrandt (2020) has advocated the notions of “legal by design” and “legal protection by design”. By the former, she emphasises a need to ensure compliance with legal requirements through deliberate design choices, including a consideration of unintended and unforeseen consequences. By the latter, she underlines people’s right to contest decisions in court and their right to participate in democratic policy-making. These are sensible guidelines also when adopting “by design” principles into laws.
In particular, the drafting of laws and prior impact assessments should involve contemplation about whether a given “by design” principle entails contradictions from which potential unintended consequences may emerge. When evaluating different “by design” principles against each other, a proportionality test (Hildebrandt, 2020) could be used with slight adjustments. The “open by design” principle and its relation to the “secure by default” principle serve well as a brief example. When considering adding the former to a law, it should be contemplated whether the issue at hand should be “fully open” or “fully closed”. The latter would correspond to the theoretical maximum of the “secure by default” principle. If the contemplation indicates that a balance is needed between the two endpoints, some exemptions or clarifications should arguably be added to the given law.






