
Regulating Gatekeeper Artificial Intelligence and Data: Transparency, Access and Fairness under the Digital Markets Act, the General Data Protection Regulation and Beyond

Published online by Cambridge University Press:  13 December 2023

Philipp Hacker*
Affiliation:
Chair for Law and Ethics of the Digital Society, European New School of Digital Studies, European University Viadrina, Frankfurt, Germany
Johann Cordes
Affiliation:
Chair for Law and Ethics of the Digital Society, European New School of Digital Studies, European University Viadrina, Frankfurt, Germany
Janina Rochon
Affiliation:
Chair for Law and Ethics of the Digital Society, European New School of Digital Studies, European University Viadrina, Frankfurt, Germany
*
Corresponding author: Philipp Hacker; Email: hacker@europa-uni.de

Abstract

Artificial intelligence (AI) is not only increasingly being used in business and administration contexts, but a race for its regulation is also underway, with the European Union (EU) spearheading the efforts. Contrary to existing literature, this article suggests that the most far-reaching and effective EU rules for AI applications in the digital economy will not be contained in the proposed AI Act, but in the Digital Markets Act (DMA). We analyse the impact of the DMA and related EU acts on AI models and underlying data across four key areas: disclosure requirements; the regulation of AI training data; access rules; and the regime for fair rankings. We demonstrate that fairness, under the DMA, goes beyond the traditionally protected categories of non-discrimination law on which scholarship at the intersection of AI and law has focused. Rather, we draw on competition law and the FRAND criteria known from intellectual property law to interpret and refine the DMA provisions on fair rankings. Moreover, we show how, based on Court of Justice of the European Union jurisprudence, a coherent interpretation of the concept of non-discrimination in both traditional non-discrimination and competition law may be found. The final section sketches out proposals for a comprehensive framework of transparency, access and fairness under the DMA and beyond.


Type
Articles
Copyright
© The Author(s), 2023. Published by Cambridge University Press


References

1 In this paper, AI is understood based on the definition in Art 3(1) and Recitals 6a and 6b AI Act. This definition is far from perfect, see SJ Russell and P Norvig, Artificial Intelligence: A Modern Approach (3rd global edition, London, Pearson Education 2016) p 5; DL Poole and AK Mackworth, Artificial Intelligence: Foundations of Computational Agents (2nd edition, Cambridge, Cambridge University Press 2017) pp 3–7; M O’Shaughnessy, “One of the Biggest Problems in Regulating AI Is Agreeing on a Definition” (Carnegie Endowment, 2022) <https://carnegieendowment.org/2022/10/06/one-of-biggest-problems-in-regulating-ai-is-agreeing-on-definition-pub-88100> (last accessed 6 December 2022); P Hacker, “The European AI Liability Directives – Critique of a Half-Hearted Approach and Lessons for the Future” (2022) arXiv preprint arXiv:221113960, 11–12. However, it constitutes a workable definition for the purposes of this paper.

2 For a definition of ML, see TM Mitchell, Machine Learning (1st edition, New York, McGraw Hill 1997) p 2: “A computer program is said to learn from experience E with respect to some class of tasks T and performance measure P, if its performance at tasks in T, as measured by P, improves with experience E.”
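Purely as an illustration (ours, not drawn from the cited work), Mitchell's E/T/P schema can be rendered in a few lines of Python: the task T is estimating a mean, experience E is the number of samples observed, and performance P is squared error, which shrinks as experience grows. All values below are invented toy data.

```python
# Toy rendering of Mitchell's definition of learning (our example):
# task T       -- estimate the mean of a data stream
# experience E -- the number of samples observed so far
# performance P -- squared error of the estimate (lower is better)

samples = [13.0] + [10.0] * 99   # first observation is an outlier
true_mean = sum(samples) / len(samples)

def squared_error_after(n):
    """Performance P after experience E = n samples."""
    estimate = sum(samples[:n]) / n
    return (estimate - true_mean) ** 2

# Performance at task T, measured by P, improves with experience E,
# which is exactly what Mitchell's definition requires.
assert squared_error_after(100) < squared_error_after(1)
```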

3 I Goodfellow, Y Bengio and A Courville, Deep Learning (Cambridge, MA, MIT Press 2016) p 23 et seq.; M Javaid et al, “Artificial Intelligence Applications for Industry 4.0: A Literature-Based Study” (2022) 7 Journal of Industrial Integration and Management 83.

4 ZC Lipton, “The mythos of model interpretability: In machine learning, the concept of interpretability is both important and slippery” (2018) 16 Queue 31.

5 X Ferrer et al, “Bias and Discrimination in AI: a cross-disciplinary perspective” (2021) 40 IEEE Technology and Society Magazine 72.

6 See, eg, C Reed, “How should we regulate artificial intelligence?” (2018) 376(2128) Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 20170360; M Buiten, “Towards intelligent regulation of artificial intelligence” (2019) 10 European Journal of Risk Regulation 41; H Zech, Entscheidungen digitaler autonomer Systeme: Empfehlen sich Regelungen zu Verantwortung und Haftung? (Bonn, Deutscher Juristentag 2020); P Hacker, “Europäische und nationale Regulierung von Künstlicher Intelligenz” (2020) Neue Juristische Wochenzeitschrift 2142; N Smuha, “From a ‘race to AI’ to a ‘race to AI regulation’: regulatory competition for artificial intelligence” (2021) 13 Law, Innovation and Technology 57; M Ebers et al, “The European Commission’s proposal for an artificial intelligence act – a critical assessment by members of the robotics and AI law society (RAILS)” (2021) 4 J 589; C Wendehorst and J Hirtenlehner, “Outlook on the Future Regulatory Requirements for AI in Europe” (2022) <https://ssrn.com/abstract=4093016> (last accessed 22 November 2023).

7 For analysis and critique, see, eg, M Veale and FZ Borgesius, “Demystifying the Draft EU Artificial Intelligence Act – Analysing the good, the bad, and the unclear elements of the proposed approach” (2021) 22 Computer Law Review International 97, as well as some of the references in note 6, supra.

8 European Commission, Proposal for a Regulation of the European Parliament and of the Council laying down harmonised rules on Artificial Intelligence, COM(2021) 206 final; all references to the AI Act are to the following document: Council of the EU, Interinstitutional File: 2021/0106(COD), General Approach (= final version of the Council compromise text) of 25 November 2022, Doc. No. 14954/22.

9 L Bertuzzi, “Artificial Intelligence definition, governance on MEPs’ menu” (EURACTIV, 10 November 2022) <https://www.euractiv.com/section/digital/news/artificial-intelligence-definition-governance-on-meps-menu> (last accessed 10 November 2022).

10 See, eg, the remarks by MEP Axel Voss in L Bertuzzi, “The new liability rules for AI” (EURACTIV, 30 September 2022) <https://www.euractiv.com/section/digital/podcast/the-new-liability-rules-for-ai/> (last accessed 9 November 2022).

11 Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market for Digital Services, OJ L277/1 (DSA); Regulation (EU) 2022/1925 of the European Parliament and of the Council of 14 September 2022 on contestable and fair markets in the digital sector, OJ L265/1 (DMA).

12 European Commission, Proposal for a Directive of the European Parliament and of the Council on Liability for Defective Products, COM(2022) 495 final (PLD Proposal); European Commission, Proposal for a Directive of the European Parliament and of the Council on adapting non-contractual civil liability rules to artificial intelligence, COM(2022) 496 final (AILD Proposal).

13 See, eg, Hacker, supra, note 1; G Spindler, “Die Vorschläge der EU-Kommission zu einer neuen Produkthaftung und zur Haftung von Herstellern und Betreibern Künstlicher Intelligenz” (2022) Computer und Recht 689.

14 See, eg, Veale and Borgesius, supra, note 7; N Helberger and N Diakopoulos, “The European AI Act and how it matters for research into AI in media and journalism” (2023) 11 Digital Journalism 1751; F Sovrano et al, “Metrics, Explainability and the European AI Act Proposal” (2022) 5 J 126; N Smuha et al, “How the EU can achieve legally trustworthy AI: a response to the European Commission’s proposal for an artificial intelligence act” (2021) <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3899991> (last accessed 22 November 2023); P Hacker and J-H Passoth, “Varieties of AI Explanations Under the Law. From the GDPR to the AIA, and Beyond” in A Holzinger et al (eds), xxAI – Beyond Explainable AI, International Workshop on Extending Explainable AI Beyond Deep Models and Classifiers (Berlin, Springer 2022) p 343; ME Kaminski, “Regulating the Risks of AI” (2023) 103 Boston University Law Review (forthcoming); P Hacker, “A legal framework for AI training data – from first principles to the Artificial Intelligence Act” (2021) 13 Law, Innovation and Technology 257.

15 See only M Eifert et al, “Taming the giants: The DMA/DSA package” (2021) 58 Common Market Law Review 987; J Laux, S Wachter and B Mittelstadt, “Taming the Few: Platform Regulation, Independent Audits, and the Risks of Capture Created by the DMA and DSA” (2021) 43 Computer Law & Security Review 105613; R Podszun, P Bongartz and S Langenstein, “The Digital Markets Act: Moving from Competition Law to Regulation for Large Gatekeepers” (2021) 11 EuCML: Journal of European Consumer and Market Law 60; A Davola and G Malgieri, “Data, Power and Competition Law: The (Im)Possible Mission of the DMA?” (2022) Research in Law and Economics (forthcoming).

16 See, eg, M Zehlike et al, “Beyond Incompatibility: Interpolation between Mutually Exclusive Fairness Criteria in Classification Problems” (2022) arXiv:2212.00469; P Hacker, “KI und DMA – Zugang, Transparenz und Fairness für KI-Modelle in der digitalen Wirtschaft” (2022) 75 Gewerblicher Rechtsschutz und Urheberrecht 1278.

17 They are limited to medical AI, credit scoring, life and health insurance and employment; see Annexes II and III of the AI Act Proposal.

18 For the “A9” algorithm, see V Sandeep and B Pohutezhini, “The e-commerce revolution of Amazon.com” (2019) 6 Splint Internationals Journal of Professionals 33, 37; N Maio and B Re, “How Amazon’s e-commerce works” (2020) 2 International Journal of Technology for Business 8, 10; T Fries, “Amazon A9 – Amazon’s ranking algorithm explained” (Amalytix, 10 December 2020) <https://www.amalytix.com/en/knowledge/seo/amazon-alogrithm-a9/> (last accessed 4 December 2022).

19 A-J Su et al, “How to improve your Google ranking: Myths and reality” (2010) 2010 IEEE/WIC/ACM International Conference on Web Intelligence and Intelligent Agent Technology 50; MG Southern, “Is RankBrain a Ranking Factor in Google Search?” (SEJ, 5 October 2022) <https://www.searchenginejournal.com/ranking-factors/rankbrain-ranking-factor> (last accessed 4 December 2022).

20 See, eg, T-H Teh and J Wright, “Intermediation and steering: competition in prices and commissions” (2022) 14 AEJ: Microeconomics 281.

21 See supra, notes 18 and 19.

22 See, eg, J Padilla, J Perkins and S Piccolo, “Self-Preferencing in Markets with Vertically Integrated Gatekeeper Platforms” (2022) 70 The Journal of Industrial Economics 371; A Hagiu, T-H Teh and J Wright, “Should platforms be allowed to sell on their own marketplaces?” (2022) 53 The RAND Journal of Economics 297.

23 R Ursu, “The Power of Rankings: Quantifying the Effect of Rankings on Online Consumer Search and Purchase Decisions” (2018) 37 Marketing Science 530; M Mohri, A Rostamizadeh and A Talwalkar, Foundations of Machine Learning (2nd edition, Cambridge, MA, MIT Press 2018) p 3.

24 Hacker, supra, note 16, 1281.

25 The definition essentially corresponds to that of Art 2(8) of Regulation (EU) 2019/1150 on fairness and transparency for business users of online intermediary services (P2B Regulation); on the P2B Regulation, see Section II.2.a infra.

26 Ursu, supra, note 23.

27 See, eg, AD Mishra and D Garg, “Selection of best sorting algorithm” (2008) 2 International Journal of Intelligent Information Processing 363; however, the sorting algorithms that can be used differ, sometimes considerably, in their efficiency (eg in terms of their runtime and the memory they require). For an overview, see, eg, R Sedgewick and K Wayne, Algorithms (4th edition, Boston, MA, Addison-Wesley 2011) ch 2.
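To make the efficiency point in the preceding footnote concrete, here is a minimal sketch (ours, not taken from the cited works): two classic sorting algorithms that produce identical output but differ in asymptotic cost, insertion sort at O(n²) versus merge sort at O(n log n).

```python
# Two correct sorting algorithms with very different runtime behaviour
# (our illustration): insertion sort is O(n^2), merge sort O(n log n).

def insertion_sort(xs):
    out = list(xs)
    for i in range(1, len(out)):
        j = i
        while j > 0 and out[j - 1] > out[j]:
            out[j - 1], out[j] = out[j], out[j - 1]
            j -= 1
    return out

def merge_sort(xs):
    if len(xs) <= 1:
        return list(xs)
    mid = len(xs) // 2
    left, right = merge_sort(xs[:mid]), merge_sort(xs[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

# Both produce the same result; only their efficiency differs.
data = [5, 2, 9, 1, 5, 6]
assert insertion_sort(data) == merge_sort(data) == sorted(data)
```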

28 Google, for example, uses ML techniques to better understand search queries and provide the most relevant answers; see B Schwartz, “How Google uses artificial intelligence in Google Search. From RankBrain, Neural Matching, BERT and MUM – here is how Google uses AI for understanding language for query, content and ranking purposes” (Search Engine Land, 3 February 2022) <https://searchengineland.com/how-google-uses-artificial-intelligence-in-google-search-379746> (last accessed 6 December 2022); see also supra, note 19. Amazon uses such techniques to, among other things, suggest ads based on the previous search behaviour of potential buyers, see supra, note 18.

29 OpenAI, “Introducing ChatGPT” (Official OpenAI Blog, 30 November 2022) <https://openai.com/blog/chatgpt> (last accessed 5 May 2023).

30 Y Mehdi, “Reinventing search with a new AI-powered Microsoft Bing and Edge, your copilot for the web” (Official Microsoft Blog, 7 February 2023) <https://blogs.microsoft.com/blog/2023/02/07/reinventing-search-with-a-new-ai-powered-microsoft-bing-and-edge-your-copilot-for-the-web/> (last accessed 27 March 2023).

31 B Martens, “An Economic Policy Perspective on Online Platforms” (2016) Institute for Prospective Technological Studies Digital Economy Working Paper 2016/05, 4, 20 et seqq.

32 See supra, note 30.

33 See supra, note 28, and also supra, note 19.

34 Ursu, supra, note 23; M Derakhshan et al, “Product Ranking on Online Platforms” (2022) 68 Management Science 4024, 4028.

35 T-Y Liu, “Learning to Rank for Information Retrieval” (2009) 3 Foundations and Trends in Information Retrieval 225.

36 See Annexes II and III AI Act.

37 P Hacker, A Engel and M Mauer, “Regulating ChatGPT and Other Large Generative AI Models”, Proceedings of the 2023 ACM Conference on Fairness, Accountability, and Transparency (Association for Computing Machinery 2023) <https://dl.acm.org/doi/10.1145/3593013.3594067> (last accessed 17 July 2023).

38 Podszun et al, supra, note 15, 61.

39 Regulation (EU) 2019/1150 of the European Parliament and of the Council of 20 June 2019 on promoting fairness and transparency for business users of online intermediation services, OJ L 186, 11.7.2019, p 57.

40 Directive 2011/83/EU of the European Parliament and of the Council of 25 October 2011 on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council, OJ L 304, 22.11.2011, p 64.

41 Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council, OJ L 149, 11.6.2005, p 22.

42 Directive (EU) 2019/2161 of the European Parliament and of the Council of 27 November 2019 on better enforcement and modernisation of Union consumer protection laws (Omnibus Directive), OJ L 328, 18.12.2019, p 7.

43 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC, OJ L 119, 4.5.2016, p 1.

44 The term “parameter” is poorly chosen: in technical usage it denotes the internal coefficients of the model, not the decision-relevant factors such as price or availability. The latter meaning, however, appears to be intended in the P2B Regulation and the CRD; see Recital 24 P2B Regulation and Recital 22 Omnibus Directive; see also C Alexander, “Neue Transparenzanforderungen im Internet – Ergänzungen der UGP-RL durch den ‘New Deal for Consumers’” (2019) WRP 1235, marginal no. 30. Technically, the decision factors are called “features” (Goodfellow et al, supra, note 3, 3, 292 et seq.).

45 See, eg, LK Kumkar and D Roth-Isigkeit, “Erklärungspflichten bei automatisierten Datenverarbeitungen nach der DSGVO” (2020) JZ 277.

46 Lipton, supra, note 4.

47 SM Lundberg and S-I Lee, “A unified approach to interpreting model predictions” (2017) 30 Advances in Neural Information Processing Systems 4765; AB Arrieta et al, “Explainable Artificial Intelligence (XAI): Concepts, taxonomies, opportunities and challenges toward responsible AI” (2020) 58 Information Fusion 82, 92 et seqq.

48 Arrieta et al, supra, note 47, 90; for approaches, see, eg, S Lapuschkin et al, “Unmasking Clever Hans predictors and assessing what machines really learn” (2019) 10 Nature Communications 1; similarly, averaging so-called Shapley values, which capture local feature relevance, is possible, cf. Lundberg and Lee, supra, note 47.
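As a rough illustration of the averaging idea mentioned above (our sketch, not the cited implementation): for a linear model, the exact Shapley value of feature i on instance x is wᵢ(xᵢ − E[xᵢ]), and averaging the absolute local values over a dataset yields a global feature-relevance score. The weights and data below are invented.

```python
# Hedged sketch (ours): for a linear model f(x) = sum_i w_i * x_i + b,
# the exact Shapley value of feature i on instance x is
#   phi_i(x) = w_i * (x_i - mean_i),
# where mean_i is the dataset mean of feature i.  Averaging |phi_i|
# over all instances yields a global feature-relevance score.

weights = [2.0, -0.5, 0.0]                  # third feature is irrelevant
data = [[1.0, 4.0, 7.0],
        [3.0, 0.0, 2.0],
        [2.0, 2.0, 9.0]]                    # toy instances, 3 features

n, m = len(data), len(weights)
means = [sum(row[i] for row in data) / n for i in range(m)]

def local_shapley(x):
    """Exact Shapley values of one instance under the linear model."""
    return [weights[i] * (x[i] - means[i]) for i in range(m)]

global_relevance = [
    sum(abs(local_shapley(row)[i]) for row in data) / n for i in range(m)
]

# The zero-weight feature receives zero global relevance.
assert global_relevance[2] == 0.0
```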

49 Recital 24 P2B Regulation; M Grochowski et al, “Algorithmic Transparency and Explainability for EU Consumer Protection: Unwrapping the Regulatory Premises” (2021) 8 Critical Analysis of Law 43, 52; A Bibal et al, “Legal requirements on explainability in machine learning” (2021) 29 Artificial Intelligence & Law 149, 161; Hacker and Passoth, supra, note 14, 343, 364.

50 According to the new Art 2(1)(n) of the UCP Directive, an online marketplace is a “service enabling consumers to conclude distance contracts with other traders or consumers through the use of software, including a website, part of a website or an application operated by or on behalf of the trader”.

51 See also Alexander, supra, note 44, para 34.

52 ibid, para 29.

53 In addition, the terms and conditions for sponsored ranking must be disclosed to business clients, Art 5(3) P2B Regulation.

54 See, eg, FZ Borgesius, “Personal data processing for behavioural targeting: which legal basis?” (2015) 5 International Data Privacy Law 163.

55 S Wachter, B Mittelstadt and L Floridi, “Why a Right to Explanation of Automated Decision-Making Does Not Exist in the General Data Protection Regulation” (2017) 7 International Data Privacy Law 76, 88.

56 Article 29 Data Protection Working Party, “Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679” (2017) WP 251, 21.

57 Ursu, supra, note 23; Derakhshan et al, supra, note 34.

58 Article 29 Data Protection Working Party, “Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679” (2017) WP 251, 25; AD Selbst and J Powles, “Meaningful information and the right to explanation” (2017) 7 International Data Privacy Law 233, 236; B Custers and A-S Heijne, “The right of access in automated decision-making: The scope of Article 15(1)(h) GDPR in theory and practice” (2022) 46 Computer Law & Security Review 105727.

59 G Zanfir-Fortuna, “Article 13: Information to be provided where personal data are collected from the data subject”, in C Kuner et al (eds), The EU General Data Protection Regulation (GDPR): A Commentary (online edition, Oxford, Oxford Academic 2020) p 430.

60 Article 29 Data Protection Working Party, “Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679” (2017) WP 251, 25; Hacker and Passoth, supra, note 14, 343, 349; Selbst and Powles, supra, note 58, 236; Custers and Heijne, supra, note 58, 5.

61 M Bäcker, “Article 15” in J Kühling and B Buchner (eds), Datenschutzgrundverordnung BDSG (3rd edition, Munich, CH Beck 2020) para 27; P Bräutigam and F Schmidt-Wudy, “Das geplante Auskunft- und Herausgaberecht des Betroffenen nach Article 15 der EU-Datenschutzgrundverordnung” (2015) CR 56, 62. The opposite is argued in LK Kumkar and D Roth-Isigkeit, “A Criterion-Based Approach to GDPR’s Explanation Requirements for Automated Individual Decision-Making” (2021) 12 JIPITEC, the Journal of Intellectual Property, Information Technology and Electronic Commerce Law 289, 296.

62 G Malgieri and G Comandé, “Why a Right to Legibility of Automated Decision-Making Exists in the General Data Protection Regulation” (2017) 7 International Data Privacy Law 243; Wachter et al, supra, note 55, 76–99; Selbst and Powles, supra, note 58, 233–42. M Brkan, “Do algorithms rule the world? Algorithmic decision-making and data protection in the framework of the GDPR and beyond” (2019) 27(2) International Journal of Law and Information Technology 91, 110–19.

63 Article 29 Data Protection Working Party, supra, note 56, 25.

64 See also sources cited supra, note 60.

65 See, eg, (French) Conseil Constitutionnel, Décision n° 2020-834 QPC du 3 avril 2020, Parcoursup; (Dutch) Rechtbank Den Haag, Case C-09-550982-HA ZA 18-388, SyrRI, ECLI:NL:RBDHA:2020:1878; (Italian) Corte Suprema di Cassazione, Judgment of 25 May 2021, Case 14381/2021.

66 District Court of Amsterdam, Case C/13/689705/HA RK 20-258, Ola, ECLI:NL:RBAMS:2021:1019 (Ola Judgment); see also R Gellert, M van Bekkum and FZ Borgesius, “The Ola & Uber judgments: for the first time a court recognises a GDPR right to an explanation for algorithmic decision-making” (EU Law Analysis, 28 April 2021) <https://eulawanalysis.blogspot.com/2021/04/the-ola-uber-judgments-for-first-time.html/> (last accessed 12 December 2022).

67 Article 29 Data Protection Working Party, “Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679” (2017) WP 251, 25.

68 Ola Judgment, para 4.52; translation according to Anton Ekker, “Dutch court rules on data transparency for Uber and Ola drivers” (Ekker Blog) <https://ekker.legal/en/2021/03/13/dutch-Court-rules-on-data-transparency-for-uber-and-ola-drivers/> (last accessed 12 December 2022).

69 See supra, note 45.

70 The judgment can be found here: <https://uitspraken.rechtspraak.nl/#!/details?id=ECLI:NL:GHAMS:2023:804>; an unofficial translation can be found here: <https://5b88ae42-7f11-4060-85ff-4724bbfed648.usrfiles.com/ugd/5b88ae_de414334d89844bea61deaaebedfbbfe.pdf>; see also J Turner, “Amsterdam Court Upholds Appeal in Algorithmic Decision-Making Test Case: Drivers v Uber and Ola” (Fountain Court Blog, 4 June 2023) <https://www.fountaincourt.co.uk/2023/04/amsterdam-court-upholds-appeal-in-algorithmic-decision-making-test-case-drivers-v-uber-and-ola/> (last accessed 22 November 2023).

71 See: Garante per la protezione dei dati personali, “Intelligenza artificiale: il Garante blocca ChatGPT. Raccolta illecita di dati personali. Assenza di sistemi per la verifica dell’età dei minori” <https://www.garanteprivacy.it:443/home/docweb/-/docweb-display/docweb/9870847> (last accessed 17 July 2023).

73 Hacker et al, supra, note 37; J Möller-Klapperich, “ChatGPT und Co. – aus der Perspektive der Rechtswissenschaft” (2023) 4 Neue Justiz 144; I Goodfellow et al, “Generative adversarial networks” (2020) 63(11) Communications of the ACM 139; LfDI Baden-Württemberg, “Press Release: LfDI informiert sich bei OpenAI, wie ChatGPT datenschutzrechtlich funktioniert” <https://www.baden-wuerttemberg.datenschutz.de/lfdi-informiert-sich-bei-openai-wie-chatgpt-datenschutzrechtlich-funktioniert/> (last accessed 11 May 2023).

74 See, eg, Hacker et al, supra, note 37.

75 ibid.

76 Art 29 WP, Guidelines on transparency under Regulation 2016/679, 17/EN WP260 rev.01, para 64.

77 Goodfellow et al, supra, note 73; N Carlini et al, “Extracting Training Data from Diffusion Models” (arXiv, 30 January 2023) <http://arxiv.org/abs/2301.13188> (last accessed 17 July 2023); R Plant et al, “You Are What You Write: Preserving Privacy in the Era of Large Language Models” (April 2022) <https://www.researchgate.net/publication/360079388_You_Are_What_You_Write_Preserving_Privacy_in_the_Era_of_Large_Language_Models> (last accessed 17 July 2023).

78 See also D Brouwer, “Towards a ban of discriminatory rankings by digital gatekeepers? Reflections on the proposal for a Digital Markets Act” (Internet Policy Review, 11 January 2021) <https://policyreview.info/articles/news/towards-ban-discriminatory-rankings-digital-gatekeepers-reflections-proposal-digital> (last accessed 7 December 2022), arguing that, already under the Commission draft, fairness was to be understood as transparency in the sense of the P2B Regulation.

79 See, eg, the overview in A Holzinger et al, “xxAI – Beyond Explainable Artificial Intelligence” in A Holzinger et al (eds), xxAI – Beyond Explainable AI, International Workshop on Extending Explainable AI Beyond Deep Models and Classifiers (Berlin, Springer 2022) p 13; Arrieta et al, supra, note 47; for an account of their accuracy, see A Alonso and JM Carbó, “Accuracy of Explanations of Machine Learning Models for Credit Decision” (2022) Banco de España Working Paper 2222.

80 Hacker and Passoth, supra, note 14, 343, 358 et seqq.

81 See only Wachter et al, supra, note 55; Selbst and Powles, supra, note 58; Kumkar and Roth-Isigkeit, supra, note 45.

82 Recitals 5 and 62 AI Act.

83 W Pieters, “Explanation and trust: what to tell the user in security and AI?” (2011) 13 Ethics and Information Technology 53.

84 See, eg, Hacker and Passoth, supra, note 14, 343, 357 et seqq.

85 See Arts 13(3)(b)(ii) and (iv)–(vi) AI Act.

86 See, eg, Lapuschkin et al, supra, note 48.

87 Liu, supra, note 35; T Joachims et al, “Accurately interpreting clickthrough data as implicit feedback” in Proceedings of the 28th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval (New York, Association for Computing Machinery 2005) p 154.

88 G Adomavicius and A Tuzhilin, “Toward the next generation of recommender systems: a survey of the state-of-the-art and possible extensions” (2005) 17 IEEE Transactions on Knowledge and Data Engineering 734; G Takács and D Tikk, “Alternating least squares for personalized ranking” in RecSys ‘12: Proceedings of the Sixth ACM Conference on Recommender Systems (New York, Association for Computing Machinery 2012) p 83; T Zhao, J McAuley and I King, “Leveraging Social Connections to Improve Personalized Ranking for Collaborative Filtering” in CIKM’14: Proceedings of the 23rd ACM International Conference on Conference on Information and Knowledge Management (New York, Association for Computing Machinery 2014) p 261.

89 The fact that personal data can be incorrect is explicitly recognised by the GDPR through the introduction of the right to rectification (see: Art 29 Working Party, “Opinion 4/2007 on the concept of personal data”, WP 136 (01248/07/EN) 6 <https://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/files/2007/wp136_en.pdf> (last accessed 17 November 2022)). Also, arguably opinions do not need to be held to the standards of accuracy at all, see: D Hallinan and FZ Borgesius, “Opinions can be incorrect (in our opinion)! On data protection law’s accuracy principle” (2020) 10 International Data Privacy Law 1.

90 S Wachter and B Mittelstadt, “A Right to Reasonable Inferences: Re-thinking Data Protection Law in the Age of Big Data and AI” (2019) 2 Columbia Business Law Review 494.

91 Art 29 Working Party, “Opinion 4/2007 on the concept of personal data”, WP 136 (01248/07/EN) 10 <https://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/files/2007/wp136_en.pdf> (last accessed 17 November 2022).

92 Case C-434/16 Nowak v Data Protection Commissioner, ECLI:EU:C:2017:994, para 35.

93 ibid, 10–11.

94 Article 29 Data Protection Working Party, supra, note 56, 9.

95 ibid, 18.

96 Case C-141/12 YS v Minister voor Immigratie, ECLI:EU:C:2014:2081.

97 ibid, paras 45–46.

98 Technically speaking, the correct term would be “performance”, with accuracy being only one of several relevant performance measures, see A Lindholm et al, Machine Learning – A First Course for Engineers and Scientists (Cambridge, Cambridge University Press 2022) p 88.
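The distinction drawn in the preceding footnote can be made concrete with a short sketch (toy numbers of our own): on imbalanced data, a high accuracy can coexist with zero recall, which is precisely why "performance" is the broader and more appropriate notion.

```python
# Toy illustration (ours): accuracy is only one performance measure.
# A degenerate classifier that always predicts the majority class
# scores high accuracy but zero recall on the minority class.
y_true = [0] * 95 + [1] * 5      # 95% negatives, 5% positives
y_pred = [0] * 100               # always predicts the majority class

accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
recall = tp / sum(y_true)        # recall on the positive class

assert accuracy == 0.95          # looks impressive in isolation...
assert recall == 0.0             # ...yet every positive case is missed
```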

99 See, eg, Hacker, supra, note 1, 5 et seqq.

100 Cf. Hacker, supra, note 1, 34, 37.

101 See Art 6 PLD Proposal; Hacker, supra, note 1, 22, 53.

102 European Commission, “Questions and answers on the revision of the Product Liability Directive” (QANDA/22/5791, 2022), under 9.

103 For an argument in this vein, see Hacker, supra, note 1, 44.

104 General Court, Case T-612/17, Google Shopping, ECLI:EU:T:2021:763; on the procedure, see, eg, A Lohse, “Marktmissbrauch durch Internetplattformen?” (2018) 182 ZHR 321, 348–53.

105 European Commission, Procedure AT.40703 (Amazon Buy Box).

106 Hagiu et al, supra, note 22; Podszun et al, supra, note 15; Padilla et al, supra, note 22.

107 I Graef, “Differentiated Treatment in Platform-to-Business Relations: EU Competition Law and Economic Dependence” (2019) 38 Yearbook of European Law 453.

108 ibid.

109 Hagiu et al, supra, note 22, 300.

110 Podszun et al, supra, note 15, 62.

111 This can lead to a problematic market definition, particularly in the case of digital companies; see, eg, Bundeskartellamt, decision of 6 February 2019, B6-22/16, para 166 et seq.

112 P Marsden, “Google Shopping for the Empress’s New Clothes – When a Remedy Isn’t a Remedy (and How to Fix it)” (2020) 11 Journal of European Competition Law & Practice 553; Digital Competition Expert Panel, “Unlocking Digital Competition” (Report, 2019) para 2.46.

113 Cf. TJ Gerpott, “Das Gesetz über digitale Märkte nach den Trilog-Verhandlungen” (2022) CR 409, according to which Google’s share value has increased by a factor of five during the almost eleven-year duration of the proceedings (citing Macrotrends, “Alphabet Market Cap 2010–2021” <https://www.macrotrends.net/stocks/charts/GOOGL/alphabet/market-cap> (last accessed 12 September 2022), for stock market value data).

114 See Section II.4.a.

115 Eifert et al, supra, note 15, 1003 f.

116 Brouwer, supra, note 78.

117 Graef, supra, note 107, 463 f.

118 But see Brouwer, supra, note 78.

119 See only J Block and B Rätz, “The FRAND offer – attempt at an international definition” (2019) GRUR 797; J-S Borghetti, I Nikolic and N Petit, “FRAND licensing levels under EU law” (2021) 17 European Competition Journal 205; JG Sidak, “The Meaning of FRAND, Part I: Royalties” (2013) 9 Journal of Competition Law and Economics 931; M Heim and I Nikolic, “A FRAND Regime for Dominant Digital Platforms” (2019) 10 Journal of Intellectual Property, Information Technology and Electronic Commerce Law 38.

120 ibid.

121 See, eg, C Ann, Patentrecht (8th edition, Munich, CH Beck 2022) § 43 marginal no. 35 et seqq.

122 PG Picht and H Richter, “EU Digital Regulation 2022: Data Desiderata” (2022) GRUR International 395.

123 Sidak, supra, note 119.

124 Ann, supra, note 121, § 43 marginal no. 36.

125 C Busch, “Mehr Fairness und Transparenz in der Plattformökonomie? Die neue P2B-Verordnung im Überblick” (2019) GRUR 788.

126 See, eg, Ann, supra, note 121, § 43 marginal no. 39 ff on the various calculation approaches.

127 Sidak, supra, note 119, 968. Brouwer (supra, note 78), albeit still on the basis of the original Commission draft, would read fairness in the ranking context as incorporating the transparency rules of the P2B Regulation. This is now precluded, as transparency constitutes an independent criterion.

128 This concerns access to essential input data and equivalence violations in the contractual structure.

129 Art 12(5)(b)(i) DMA.

130 L Cabral et al, “The EU Digital Markets Act” (Luxembourg, Joint Research Centre, JRC122910, Publications Office of the European Union 2021) p 13.

131 See, eg, Council Directive 2000/43/EC of 29 June 2000 implementing the principle of equal treatment between persons irrespective of racial or ethnic origin (Race Equality Directive); Council Directive 2000/78/EC of 27 November 2000 establishing a general framework for equal treatment in employment and occupation (Framework Directive); Council Directive 2004/113/EC of 13 December 2004 implementing the principle of equal treatment between men and women in the access to and supply of goods and services.

132 See, eg, the overview in A Asudeh et al, “Designing Fair Ranking Schemes” in SIGMOD ‘19: Proceedings of the 2019 International Conference on Management of Data (New York, Association for Computing Machinery 2019) p 1259; A Singh and T Joachims, “Fairness of Exposure in Rankings” in Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining (New York, Association for Computing Machinery 2018) p 2219; M Zehlike, K Yang and J Stoyanovich, “Fairness in Ranking, Part II: Learning-to-Rank and Recommender Systems” (2022) 55 ACM Computing Surveys 117.

133 See, eg, S Wachter, “The Theory of Artificial Immutability: Protecting Algorithmic Groups Under Anti-Discrimination Law” (2022) arXiv preprint arXiv:2205.01166; S Wachter, B Mittelstadt and C Russell, “Why fairness cannot be automated: Bridging the gap between EU non-discrimination law and AI” (2021) 41 Computer Law & Security Review 105567; FJZ Borgesius, “Strengthening legal protection against discrimination by algorithms and artificial intelligence” (2020) 24 The International Journal of Human Rights 1572; M Zehlike, P Hacker and E Wiedemann, “Matching code and law: achieving algorithmic fairness with optimal transport” (2020) 34 Data Mining and Knowledge Discovery 163; FZ Borgesius, Discrimination, Artificial Intelligence, and Algorithmic Decision-Making (Strasbourg, Council of Europe, Directorate General of Democracy 2018); P Hacker, “Teaching fairness to artificial intelligence: existing and novel strategies against algorithmic discrimination under EU law” (2018) 55 Common Market Law Review 1143.

134 CJEU, Case C-453/10, Pereničová and Perenič, ECLI:EU:C:2012:144.

135 CJEU, Case C-109/17, Bankia, ECLI:EU:C:2018:201.

136 CJEU, Case C-694/17, Pillar Securitisation, ECLI:EU:C:2019:44, para 35.

137 AG Trstenjak, Case C-453/10, Pereničová and Perenič, ECLI:EU:C:2011:788, para 90 (discussing legal acts relating to EU consumer law).

138 See also P Hacker, Datenprivatrecht (Tubingen, Mohr Siebeck 2020) p 335 et seqq.

139 See, eg, J Adams-Prassl, R Binns and A Kelly-Lyth, “Directly Discriminatory Algorithms” (2023) 86(1) The Modern Law Review 144.

140 See, eg, Art 4(1) of the Race Equality Directive; Art 4(1) of the Framework Directive; Art 14(2) of the recast Gender Equality Directive 2006/54/EC; CJEU, Case C-229/08, Wolf, ECLI:EU:C:2010:3, para 35.

141 Hacker, supra, note 133, 1166.

142 CJEU, Case C-188/15, Bougnaoui, ECLI:EU:C:2017:204, para 40; Joined Cases C-804/18 and C-341/19, Wabe and Müller, ECLI:EU:C:2021:594, para 65; see also E Howard, “Headscarves and the CJEU: Protecting fundamental rights or pandering to prejudice” (2021) 28 Maastricht Journal of European and Comparative Law 648, 255 et seqq.; E Cloots, “Safe Harbour or Open Sea for Corporate Headscarf bans? Achbita and Bougnaoui” (2018) 55 Common Market Law Review 589, 613.

143 Cabral et al, supra, note 130, 13.

144 In particular the contestability of ranked products and services, cf. Recitals 7, 11 and 51 DMA.

145 See, eg, District Court of Düsseldorf, [2018] GRUR-RS 37930, ECLI:DE:LGD:2018:1212.4B.O4.17.00, para 202; CJEU, Case C-313/04, Egenberger, ECLI:EU:C:2008:728, para 33; Unwired Planet v Huawei [2018] EWCA Civ 2344, para 162; European Commission, “Setting Out the EU Approach to Standard Essential Patents” (Communication) COM(2017) 712 final, 7; see also Brouwer, supra, note 78; Art 102(c) TFEU.

146 Unwired Planet v Huawei [2018] EWCA Civ 2344, para 162 et seq.

147 See also Brouwer, supra, note 78; Unwired Planet v Huawei [2018] EWCA Civ 2344, paras 169–70.

148 The fact that a company has entered into an exclusivity agreement with Amazon is not relevant to the accuracy of the prediction, which is oriented towards the needs of consumers.

149 D Pessach and E Shmueli, “Algorithmic Fairness” (2020) arXiv preprint arXiv:2001.09784.

150 M Zehlike et al, “FA*IR: A Fair Top-k Ranking Algorithm” in Proceedings of the 2017 ACM Conference on Information and Knowledge Management (New York, Association for Computing Machinery 2017) p 1569; Zehlike et al, supra, note 133.

151 Cf. Hacker, supra, note 16, 1284.

152 See also Cabral et al, supra, note 130, 13.

153 See also Brouwer, supra, note 78.

154 See, eg, S Bar-Ziv and N Elkin-Koren, “Behind the scenes of online copyright enforcement: empirical evidence on notice & takedown” (2018) 50 Connecticut Law Review 339; J Cobia, “The digital millennium copyright act takedown notice procedure: Misuses, abuses, and shortcomings of the process” (2008) 10 Minnesota Journal of Law, Science & Technology 387.

155 JM Urban and L Quilter, “Efficient process or chilling effects – takedown notices under Section 512 of the Digital Millennium Copyright Act” (2005) 22 Santa Clara Computer & High Tech Law Journal 621, 622.

156 See, eg, Arts 5 and 14 DSA and Art 17 C-DSM Directive.

157 See also, for a practical proposal in this vein, M Veale and R Binns, “Fairer machine learning in the real world: mitigating discrimination without collecting sensitive data” (2017) 4 Big Data & Society 2053951717743530.

158 Cabral et al, supra, note 130, 13.

159 Lindholm et al, supra, note 98, p 67 et seqq., 299 et seqq.; Hacker, supra, note 14, 259.

160 Q Yang et al, “Federated Machine Learning: Concept and Applications” (2019) 10 ACM Transactions on Intelligent Systems and Technology 12.

161 B Güler and A Yener, “Sustainable federated learning” (2021) arXiv preprint arXiv:2102.11274.

162 See already R Podszun, “Should Gatekeepers Be Allowed to Combine Data? Ideas for Article 5(a) of the Draft Digital Markets Act” (2022) 71 GRUR International 197, 199.

163 See, eg, N Richards and W Hartzog, “The pathologies of digital consent” (2018) 96 Washington University Law Review 1461; P Blume, “The inherent contradictions in data protection law” (2012) 2 International Data Privacy Law 26, 29 et seqq.

164 SY Soh, “Privacy Nudges: An Alternative Regulatory Mechanism to ‘Informed Consent’ for Online Data Protection Behaviour” (2019) 5 European Data Protection Law Review 65; Y Hermstrüwer, “Contracting around privacy: the (behavioral) law and economics of consent and big data” (2017) 8 JIPITEC, the Journal of Intellectual Property, Information Technology and Electronic Commerce Law 9.

165 C Utz et al, “(Un)informed Consent: Studying GDPR Consent Notices in the Field” in Proceedings of the 2019 ACM SIGSAC Conference on Computer and Communications Security (CCS ‘19) (New York, Association for Computing Machinery) p 973.

166 D Machuletz and R Böhme, “Multiple Purposes, Multiple Problems: A User Study of Consent Dialogs after GDPR” in Proceedings on Privacy Enhancing Technologies (2020) pp 481–98; B Kostic and EV Penagos, “The freely given consent and the ‘bundling’ provision under the GDPR” (2017) 153 Computerrecht 217.

167 C Van Slyke et al, “Rational ignorance: a privacy pre-calculus” (2021) WISP 2021 Proceedings 12.

168 M Nouwens et al, “Dark Patterns after the GDPR: Scraping Consent Pop-Ups and Demonstrating Their Influence” in Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (New York, Association for Computing Machinery 2020) p 1.

169 S Barth and MDT de Jong, “The privacy paradox – investigating discrepancies between expressed privacy concerns and actual online behavior – a systematic literature review” (2017) 34 Telematics and Informatics 1038–58.

170 D Geradin, K Bania and T Karanikioti, “The interplay between the Digital Markets Act and the General Data Protection Regulation” (29 August 2022) <http://dx.doi.org/10.2139/ssrn.4203907> (last accessed 6 November 2022).

171 Kostic and Penagos, supra, note 166; FZ Borgesius et al, “Tracking walls, take-it-or-leave-it choices, the GDPR, and the ePrivacy regulation” (2017) 3 European Data Protection Law Review 353, 361.

172 Cf. Utz et al, supra, note 165, 973.

173 See, eg, J Luguri and L Strahilevitz, “Shining a light on dark patterns” (2021) 13 Journal of Legal Analysis 43; M Martini et al, “Dark patterns” (2021) 1 Zeitschrift für Digitalisierung und Recht 47.

174 See the analysis in P Hacker, “Manipulation by algorithms. Exploring the triangle of unfair commercial practice, data protection, and privacy law” (2022) European Law Journal.

175 noyb, “noyb win: Personalized Ads on Facebook, Instagram and WhatsApp declared illegal” (noyb, 6 December 2022) <https://noyb.eu/en/noyb-win-personalized-ads-facebook-instagram-and-whatsapp-declared-illegal> (last accessed 8 December 2022).

176 J Bryant, “Belgian DPA fines IAB Europe 250K euros over consent framework GDPR violations” (iapp, 2 February 2022) <https://iapp.org/news/a/belgian-dpa-fines-iab-europe-250k-euros-over-consent-framework-gdpr-violations/> (last accessed 2 September 2022); noyb, “226 Complaints Lodged Against Deceptive Cookie Banners” (noyb, 9 August 2022) <https://noyb.eu/en/226-complaints-lodged-against-deceptive-cookie-banners> (last accessed 21 September 2022).

177 See also Podszun, supra, note 162, 201 et seq.; Hacker, supra, note 138, 627 et seqq. on privacy scores.

178 See, eg, A Goldfarb, S Greenstein and C Tucker, “Introduction to Economic Analysis of the Digital Economy” in A Goldfarb, S Greenstein and C Tucker (eds), Economic Analysis of the Digital Economy (Chicago, IL, University of Chicago Press 2015) p 1.

179 See only Wachter and Mittelstadt, supra, note 90.

180 See, eg, Hacker, supra, note 14, 265–68; see also, more generally, M Finck and F Pallas, “They who must not be identified – distinguishing personal from non-personal data under the GDPR” (2020) 10 International Data Privacy Law 11.

181 For a critical analysis of Art 10 AI Act, see Hacker, supra, note 14, 296–300; M van Bekkum and FZ Borgesius, “Using sensitive data to prevent discrimination by artificial intelligence: does the GDPR need a new exception?” (2023) 48 Computer Law & Security Review 105770, 11–12.

182 The right of access in Art 6, para 12 DMA, on the other hand, has no specific reference to AI.

183 DL Rubinfeld and MS Gal, “Access Barriers to Big Data” (2017) 59 Arizona Law Review 339; W Kerber, “Governance of Data: Exclusive Property vs. Access” (2016) 47 IIC – International Review of Intellectual Property and Competition Law 759; see also H Schweitzer, “Datenzugang in der Datenökonomie: Eckpfeiler einer neuen Informationsordnung” (2019) 121 Gewerblicher Rechtsschutz und Urheberrecht 569.

184 See, eg, Finck and Pallas, supra, note 180; N Purtova, “The law of everything. Broad concept of personal data and future of EU data protection law” (2018) 10 Law, Innovation and Technology 40.

185 Machuletz and Böhme, supra, note 166; Kostic and Penagos, supra, note 166; Utz et al, supra, note 165, 973–90; Nouwens et al, supra, note 168, 1–13.

186 Rubinfeld and Gal, supra, note 183, 353; Martens, supra, note 31, 4, 24 et seq.; see also supra, note 19.

187 Finck and Pallas, supra, note 180, 15.

188 See, eg, L Rocher, JM Hendrickx and Y-A de Montjoye, “Estimating the success of re-identifications in incomplete datasets using generative models” (2019) 10 Nature Communications 3069.

189 See, eg, Irish Data Protection Commission, “Guidance on Anonymisation and Pseudonymisation” (June 2019) <https://www.dataprotection.ie/sites/default/files/uploads/2019-06/190614%20Anonymisation%20and%20Pseudonymisation.pdf> (last accessed 7 December 2022); Article 29 Data Protection Working Party, Opinion 05/2014 on Anonymisation Techniques, WP 216, 2014.

190 EF Villaronga, P Kieseberg and T Li, “Humans forget, machines remember: artificial intelligence and the right to be forgotten” (2018) 34 Computer Law & Security Review 304, 310.

191 See, eg, P Mohassel and Y Zhang, “SecureML: A System for Scalable Privacy-Preserving Machine Learning” (2017) IEEE Symposium on Security and Privacy (SP) 1.

192 European Commission, Proposal for a Regulation of the European Parliament and of the Council on harmonised rules on fair access to and use of data (Data Act) COM(2022) 68 final.

193 ibid, 2 et seq.

194 Regarding the relationship of the DA to other legal acts, including the DMA, cf. also L Specht-Riemenschneider, “Der Entwurf des Data Act – Eine Analyse der vorgesehenen Datenzugangsansprüche im Verhältnis B2B, B2C und B2G” (2022) 25 Zeitschrift für IT-Recht und Recht der Digitalisierung 809, 810 et seq.; critical with regard to the achievement of consumer empowerment is W Kerber, “Governance of IoT Data: Why the EU Data Act Will not Fulfill Its Objectives” (2022) GRUR International 1.

195 See Art 5(1) DA.

196 See, for a critique regarding the effectiveness of the DA’s guarantee of self-determination, Specht-Riemenschneider, supra, note 194, 816 et seqq.

197 On the B2G (“business to government”) relationship, see ibid, 824 et seqq.

198 For further critique on the dependency on the consumer’s initiative, see, eg, R Podszun and P Offergeld, “The EU Data Act and the Access to Secondary Markets” (24 October 2022) <https://ssrn.com/abstract=4256882> (last accessed 22 November 2023) 45 et seq.

199 Data Act, supra, note 192, 5.

200 See also I Graef and M Husovec, “Seven Things to Improve in the Data Act” (2022) <https://ssrn.com/abstract=4051793> (last accessed 22 November 2023) 2 et seq., who mention the possibility of bypassing the exclusion by relying on Art 20 GDPR instead.

201 See, eg, D Geradin and D Katsifis, “An EU competition law analysis of online display advertising in the programmatic age” (2019) 15 European Competition Journal 55, 62; Recital 45 DMA.

202 Geradin and Katsifis, supra, note 201, 55 et seq.

203 L Bertuzzi, “Dark patterns, online ads will be potential targets for the next Commission, Reynders says” (EURACTIV, 9 December 2022), <https://www.euractiv.com/section/digital/interview/dark-patterns-online-ads-will-be-potential-targets-for-the-next-commission-reynders-says> (last accessed 9 December 2022).

204 X Liu et al, “Neural Auction: End-to-End Learning of Auction Mechanisms for E-Commerce Advertising” (2021) Proceedings of the 27th ACM SIGKDD Conference on Knowledge Discovery and Data Mining 3354–64 <https://doi.org/10.1145/3447548.3467103> (last accessed 9 December 2022); Z Zhang et al, “Optimizing Multiple Performance Metrics with Deep GSP Auctions for E-Commerce Advertising” (2021) Proceedings of the Fourteenth ACM International Conference on Web Search and Data Mining (WSDM ’21) 993–1001 <https://doi.org/10.1145/3437963.3441771> (last accessed 9 December 2022); SC Geyik et al, “Joint Optimization of Multiple Performance Metrics in Online Video Advertising” (2016) Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD ’16) 471–80 <https://doi.org/10.1145/2939672.2939724> (last accessed 9 December 2022).

205 Cf. ibid.

206 Geradin and Katsifis, supra, note 201, 61.

207 See, on their significance, D Geradin, T Karanikioti and D Katsifis, “GDPR myopia: how a well-intended regulation ended up favouring large online platforms – the case of ad tech” (2021) 17 European Competition Journal 47, 50 et seqq.

208 See the overview in note 79, supra.

209 See supra, note 46, and accompanying text.

210 ibid.

211 Cf. Geradin et al, supra, note 207, 48–49.

212 Liu et al, supra, note 204; Zhang et al, supra, note 204.

213 For details on the click-through rate and other procedures, see Zehlike et al, supra, note 16.

214 Cf., eg, Annex IV, para 2, lit b AI Act: meaning of the various parameters; see also Hacker and Passoth, supra, note 14, 343, 357 et seqq.

215 Cf. The Royal Society, “Explainable AI: The Basics – Policy Briefing” (November 2019) <https://royalsociety.org/topics-policy/projects/explainable-ai/> (last accessed 2 October 2022) 14.

216 Cf. also D Citron and F Pasquale, “The scored society: due process for automated predictions” (2014) 89 Washington University Law Review 1; D Citron, “Technological due process” (2007) 85 Washington University Law Review 1249.

217 See supra, note 1.

218 For an overview, see references in note 79, supra.

219 Hacker and Passoth, supra, note 14, 343, 362 et seqq.

220 Cf. O Ben-Shahar and A Chilton, “Simplification of privacy disclosures: an experimental test” (2016) 45 The Journal of Legal Studies S41; JA Obar and A Oeldorf-Hirsch, “The biggest lie on the Internet: ignoring the privacy policies and terms of service policies of social networking services” (2020) 23 Information, Communication & Society 128.

221 See, eg, P Bischoff, “Comparing the privacy policy of Internet giants side-by-side” (comparitech, March 2017) <https://www.comparitech.com/blog/vpn-privacy/we-compared-the-privacy-policies-of-internet-giants-side-by-side/> (last accessed 6 September 2022); Forbrukerrådet, Deceived by Design, Report, 2018, <https://fil.forbrukerradet.no/wp-content/uploads/2018/06/2018-06-27-deceived-by-design-final.pdf> (last accessed 6 September 2022).

222 See, eg, S Zimmeck and SM Bellovin, “Privee: an architecture for automatically analyzing web privacy policies” (2014) 23rd USENIX Security Symposium 1; L Austin et al, “Towards Dynamic Transparency: The AppTrans (Transparency for Android Applications) Project” (Working Paper, 2018) <https://ssrn.com/abstract=3203601> (last accessed 22 November 2023).

223 S Wachter, B Mittelstadt and C Russell, “Counterfactual explanations without opening the black box: Automated decisions and the GDPR” (2017) 31 Harvard Journal of Law & Technology 841; RK Mothilal, A Sharma and C Tan, “Explaining machine learning classifiers through diverse counterfactual explanations” (2020) Proceedings of the 2020 conference on Fairness, Accountability, and Transparency 607.

224 DD Friedman, WM Landes and RA Posner, “Some economics of trade secret law” (1991) 5 Journal of Economic Perspectives 61.

225 I Png, “Law and innovation: evidence from state trade secrets laws” (2017) 99 Review of Economics and Statistics 167.

226 A Contigiani, DH Hsu and I Barankay, “Trade secrets and innovation: evidence from the ‘inevitable disclosure’ doctrine” (2018) 39 Strategic Management Journal 2921.

227 L Pedraza-Fariña, “Spill your (trade) secrets: knowledge networks as innovation drivers” (2016) 92 Notre Dame Law Review 1561.

228 J Bambauer and T Zarsky, “The Algorithm Game” (2018) 94 Notre Dame Law Review 1.

229 See, eg, J Pearl, “The seven tools of causal inference, with reflections on machine learning” (2019) 62 Communications of the ACM 54; J Richens et al, “Improving the accuracy of medical diagnosis with causal machine learning” (2020) 11 Nature Communications 1.

230 J Kaddour et al, “Causal machine learning: a survey and open problems” (2022) arXiv preprint arXiv:2206.15475.

231 See also C Busch, “Mehr Fairness und Transparenz in der Plattformökonomie? Die neue P2B-Verordnung im Überblick” (2019) 121 Gewerblicher Rechtsschutz und Urheberrecht 788, 793.

232 See also Bäcker, supra, note 61, Art 13, para 54; Hacker and Passoth, supra, note 14, 343, 350.

233 Article 29 Data Protection Working Party, “Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679” (2017) WP 251, 17.

234 Cf. also ibid.

235 See, eg, J Sakshaug et al, “The effect of framing and placement on linkage consent” (2019) 83 Public Opinion Quarterly 289.

236 See supra, note 184 and accompanying text.

237 Cf. Art 11 AI Act.

238 Ursu, supra, note 23, 530.

239 ibid, 549.

240 Hacker, supra, note 133, 1159.

241 On the concept of fairness in the DMA, see H Schweitzer, “The Art to Make Gatekeeper Positions Contestable and the Challenge to Know What Is Fair: A Discussion of the Digital Markets Act Proposal” (2021) 29 Zeitschrift für Europäisches Privatrecht 503; Bongartz et al, supra, note 15, 62; see also W Fikentscher, P Hacker and R Podszun, FairEconomy (Berlin, Springer 2013).

242 See, eg, Y Shmargad and S Klar, “Sorting the news: how ranking by popularity polarizes our politics” (2020) 37 Political Communication 423; F Germano et al, “The few-get-richer: a surprising consequence of popularity-based rankings?” (2019) The World Wide Web Conference 2764.

243 S de Brouwer, “Privacy self-management and the issue of privacy externalities: of thwarted expectations, and harmful exploitation” (2020) 9 Internet Policy Review 1, 17.

244 S Pandey et al, “Shuffling a Stacked Deck: The Case for Partially Randomized Ranking of Search Engine Results” (2005) Proceedings of the 31st VLDB Conference, DOI:10.48550/arXiv.cs/0503011.

245 A Ezrachi and ME Stucke, “Is your digital assistant devious?” Oxford Legal Studies Research Paper 52/2016.

246 A Mari, “Voice Commerce: Understanding Shopping-Related Voice Assistants and their Effect on Brands” (IMMAA Annual Conference, 2019) 4.

247 Busch, supra, note 231, 792.

248 Cf. B Kuchinke and M Vidal, “Exclusionary strategies and the rise of winner-takes-it-all markets on the Internet” (2016) 40 Telecommunications Policy 582.