On Monday 14 May 2018, the biennial ACA colloquium was held in The Hague. ACA members discussed developments on the theme of technology and law. This summary report provides an overview of the main topics that were discussed.

 

After a short speech by the ACA president, Mr Piet Hein Donner, the colloquium commenced with an address by Professor Corien Prins, chairperson of the Scientific Council for Government Policy (WRR) and Professor of Law and Information Technology at Tilburg University. In her speech, Professor Prins drew attention to the fact that our modes of transportation are becoming more and more dependent on digital and robot technology, such as automatic pilot systems for airplanes and self-driving cars, raising new legal questions, among others in the field of liability. Who should be held liable in the case of an accident with a self-driving car: the passenger, the car manufacturer, or the system manufacturer who programmed how the car reacts in an accident? Professor Prins touched upon the different legislative approaches that could be taken to address these and other legal issues and spoke of the risk of legal uncertainty if such legislation is not coordinated among European countries. Another point that was addressed related to the position of tech companies that provide platforms for accessible contact between organizations and society.

By way of illustration, Professor Prins mentioned the Facebook page of the Dutch Council of State. Such platforms contribute to a flourishing debate within civil society and, alongside the formal structures, are vital to the functioning of a democratic society. However, by using these platforms courts contribute to the gathering of personal data and the creation of profiles, for social engineering purposes, of both active users and non-users. It is therefore important to keep in mind that digital tools play a crucial role in shaping our society. Technology and society mutually influence each other, and once a technology starts to play a certain part in society it is difficult to change it. This creates challenges for existing legislation and associated regulations. Control in the early stages of technological developments is consequently crucial. Professor Prins concluded her speech by saying that the approach taken by legislators and judges in this respect is of key importance for these developments.

Mr Piet Hein Donner continued with a summary of the outcome of the questionnaire that was distributed in preparation for the colloquium. No fewer than 29 ACA members responded to the questionnaire, and the outcome led to the selection of the two topics for the day:

  1. The possibilities and limits of technology-proof legislation
  2. Digital decision-making and digital enforcement.

In the short debate that followed these two papers, Judge Von Danwitz of the Court of Justice of the European Union (hereafter: the Court of Justice) shed light on two interesting cases concerning the day's topics: one pending before the European Commission concerning Google Shopping and the other before the German Bundeskartellamt concerning Facebook. Both cases are fundamental in nature as they deal with the question of market dominance and the abuse of market power. Once adjudicated, and depending on the criteria applied, they will certainly be referenced in later litigation and will set rules for the future in an area that has so far not been explored. Concerning the cases before the Court of Justice, Judge Von Danwitz drew attention to a number of follow-up cases in the case law. These include pending cases in response to the Digital Rights Ireland case (CJEU, 8 April 2014, Joined Cases C-293/12 and C-594/12), the Tele2 Sweden case (CJEU, 21 December 2016, Joined Cases C-203/15 and C-698/15), and the currently pending case of Privacy International (Case C-623/17).

Judge Von Danwitz also explained that, in law, we are always late. This was illustrated by the fact that it was only in November last year that the Court of Justice was given the opportunity to specify the circumstances which qualify as ‘valid consent’. This is twenty years too late, since two decades of discussion in academic circles and elsewhere have already passed on the conditions for consent. The whole industry is based on consent, yet nobody has known whether that consent was valid. The Court of Justice is due to give an answer on the matter in the Wirtschaftsakademie case in June 2018. The case will allow the Court of Justice to consider who is responsible for compliance with data protection law on Facebook fan pages. In the coming two years the Court of Justice is expected to clarify a number of fundamental aspects of data protection that will allow us to move forward.

The Vice-president of the French Council of State, Mr Sauvé, continued with a strong call for European cooperation on the day's topics, both from the angle of legislation and from the perspective of the judge. It is essential to set up mechanisms that allow the exchange of information on the questions that judges and legislators in Europe face and the answers they provide within that context, such as ACA's own Jurifast platform, which was designed for this purpose. Vice-president Sauvé stressed the importance of cooperation across national borders, since legal questions in the field of information technology should be considered at the European level. Within this context, Vice-president Sauvé referred to a preliminary request from the French Council of State on the territorial reach of the new privacy regulation, which brings together two conflicting principles: the right to be forgotten and the right to freedom of information. These are issues that, although they fall within national jurisdiction, need to be answered at the European level.

In response, Professor Prins fully agreed with the importance of exchanging insight and knowledge about what is happening at the level of national courts and national legislators, not only through these conferences but also through other means. However, the consequences of certain European initiatives at the national level are not always fully thought through. This was illustrated by the example of the status of data. For two years, discussions have been held at the European level concerning the status of data, in particular the idea that consumers can pay for online services with their data. That entails that data is treated as money, and the proposal on online services contains a provision that sets out this possibility. Naturally, such a provision has consequences at the national level. Interaction between courts and legislators at the European and the national level is therefore crucial.

 

The first topic of the seminar was technology-proof legislation. The author of the Dutch paper on this subject, Councillor of State Daalder, viewed technology-proof legislation from two perspectives: that of the legislator and that of the courts.

A legislator can take different approaches in responding to rapidly changing technologies, such as using technology-neutral standards or delegating legislative powers. The perspective of the legislator resulted in three questions for the discussion on the topic of technology-proof legislation:

  1. Should legislation be result-orientated rather than instrumental?
  2. Will delegated legislation comply with the principle of legal certainty and will such delegated legislation possess sufficient democratic legitimacy?
  3. Building on the topic of delegation: at EU level, what powers do national legislators have when directives no longer reflect the present reality due to technical developments?

Courts may have to use various interpretative methods when existing legislation falls short in the face of new technological advances. The perspective of the courts resulted in three further questions for the discussion:

  1. How should courts deal with these situations, and what are the limitations?
  2. Should courts give a wider interpretation of legislation in order to guarantee the protection of fundamental rights?
  3. In matters involving sanctions imposed by the government, should courts make a distinction between punitive and reparative sanctions, or adopt a different approach, in order to avoid conflict with Article 7 of the ECHR?

Judge Dunne of the Supreme Court of Ireland was asked to give the first presentation on this matter and started by explaining that the explosion in technological developments has had a significant impact on our social, economic and political life. This observation raised the question of how (information) technology could effectively be regulated even if it develops more rapidly than the law. To make legislation more future-proof, Ireland has taken a technology-neutral approach in some of its legislation. For the meaning of technology-neutral legislation, Judge Dunne referred to the Dutch Professor Koops and explained that the aim of technology-neutral legislation is to regulate the effects of ICT, but not the ICT itself. This implies that regulation should not have a negative effect on the development of technology and should not unduly discriminate between technologies. On this view, the most important purpose of technology-neutral legislation is to ensure the sustainability of legislation while maintaining legal certainty. In Ireland technology-neutrality has been used in a number of legislative measures, including the Electronic Commerce Act and the Copyright Act. Difficulties do arise for judges, however, when it comes to interpreting technology-neutral and existing legislation.

To underpin the role of the judge in interpreting the law, the Irish legislator has enacted the Interpretation Act, which allows the judge, under certain circumstances, to take a more purposive or dynamic approach in order to uphold the original intention of the legislator. This involves ascertaining the intention behind the provisions of an act in the context of the circumstances prevailing at the time of enactment and then applying that general objective to the legislation in the context of the changed technology. The Interpretation Act makes provision in particular for a court, in construing a provision of any act or statutory instrument, to make allowances for any changes in the law, social conditions, technology, the meaning of words used in that act or statutory instrument and other relevant matters which have occurred since the date of the passing of that act or the making of that statutory instrument, but only in so far as its text, purpose and context permit.

After the presentation of the Irish delegation, the Supreme Court of the United Kingdom was the first to react. Lord Carnwath expressed his interest in the Irish interpretation clause, as it seems to give the Irish courts a great deal of potential power but does not give much guidance as to what its limits are. Judge Dunne responded that the difficulty of judicial interpretation always lies in striking a balance between looking at the words used in the legislation and not straining them to carry a meaning that they could never have, while at the same time accepting that law is a living instrument which grows and develops along with technological developments.

Subsequently, Judge Parrest of the Supreme Court of Estonia spoke on the topic of technology-proof legislation. According to Judge Parrest, the concept of technology-neutral legislation can be understood as the possibility for a user of public services to choose the channel through which he or she wants to communicate with the state. The legislator does not differentiate between different systems or IT service providers.

The Estonian legislator used to be very enthusiastic about introducing electronic ways of communicating, even making them obligatory. However, some difficulties have occurred in relation to the use of new technological advances. The availability of public services may not always be the same, since the executive branch is unable to force big IT companies to cooperate in order to fully implement the law. There has also been an ID-card crisis in Estonia: all ID cards had to be renewed quickly because the chips used proved vulnerable. Most public services were exposed to a huge risk, since public officials were unable to work without functioning ID cards for authentication or digital signatures. Estonia learned from this that, despite the technology-neutral position of the Estonian legislator, it was in practice relying too much on one method of authentication, even though other possibilities were available.

Judge Parrest drew a number of general conclusions concerning the legal practice of the Supreme Court of Estonia with regard to applying old regulations to new economic activities, such as trading bitcoins. According to Judge Parrest, the starting point should be the principle that everyone has the freedom to conduct business or to choose the form of activity he or she wants. A person may subsequently be penalised only if a clear norm has been violated; there is no room for wide interpretation. Public authorities must guarantee public order in a wide sense in order to protect the rights and freedoms of others. In this regard, old legislation can be applied to new forms of activity if this is proportionate.

After the presentation, the Vice-president of the French Council of State, Mr Sauvé, expressed doubts about the legitimacy of the judge applying existing legislation to entirely new matters. President Rennert of the German Bundesverwaltungsgericht had a different view, saying that even when it comes to entirely new questions a court has the legitimacy to judge, as a judge is not allowed to refuse to give a ruling in any given case. He mentioned that even if legislation is not applicable and only general principles can guide a ruling, the judge must still reach a decision.

The delegation of the Czech Supreme Administrative Court intervened and added that in the long run it is about balancing the principles of Article 6 and Article 7 of the European Convention on Human Rights. On the one hand, if a court does not answer the question put to it, it will be in conflict with its duty to respond, which is part of the human right of access to the law. On the other hand, if a court responds too quickly and interprets the law too broadly, this may violate the right to no punishment without a prior and sufficiently precise law. Accordingly, the discussion is really about how to navigate between these two rights.

Judge Patroni Griffi of the Italian Council of State noted that the views from France and Germany were not so far apart. Courts will have to face new issues, sometimes without applicable law. With regard to the role of the legislator, French Vice-president Sauvé remarked that existing laws and general principles should be used as much as possible and should be preferred to specific legislation, as the latter may increasingly lead to more detailed regulations, while general principles create unity. Judge Patroni Griffi added that it is almost impossible to regulate all technological developments. Flexibility is crucial, and a large part of the regulation of technological developments is therefore being delegated to independent administrative bodies.

Vice-president Sauvé concluded by saying that the discussion on this topic had brought out its different aspects and clarified the open questions. The discussion should be continued and the delegates should keep exchanging decisions and approaches, since finding a certain convergence in case law is preferable to having divergent interpretations.

 

The afternoon session of the seminar was reserved for the topic of digital decision-making and digital enforcement, as administrative bodies are increasingly using new digital technologies in the decision-making process. By way of introduction, State Councillor Verhey of the Dutch Council of State highlighted the issues and consequences that arise in this field, both from the perspective of the administrative body making these decisions and from that of the judge who is tasked with reviewing such decisions.

A growing number of obligations are being imposed on administrative bodies in connection with the use of automated decision-making. To facilitate the discussions on this topic from the perspective of administrative bodies, the following questions were distributed:

  1. How do legislators in the various European countries deal with the General Data Protection Regulation and the requirements imposed on administrative bodies by the Regulation, particularly Article 22(1)?
  2. Are there general limits that need to be observed when deciding whether to take or enforce a decision digitally?
  3. To what extent does government liability play a significant role in the use of automated decision-making (via particular software programs or algorithms)?

Automated decision-making and enforcement have legal consequences for those concerned. Judges must also be able to review the legitimacy of a decision on the basis of all the relevant information. Questions for discussion that were on the agenda on this subject were:

  1. What does the increased use of automated decision-making mean for the assessment framework of administrative judges?
  2. What consequences does the increase in automated decision-making have for administrative procedural law?
  3. What does the increased use of automated decision-making mean for the demands made on the function of the judge?

Following this general address, remarks were made by representatives from France and Estonia. Vice-president Sauvé stressed the importance of the afternoon's topic, as the number of decisions that are based on algorithms is growing. It is important that judges realize how other judges react to this and that they consider fundamental requirements and checks and balances. The Estonian representatives explained that Estonian law does not regulate the use of algorithms in decisions and questioned whether a general transparency requirement should be promoted and developed, instead of reviewing transparency separately in every decision.

These short remarks were followed by a presentation by Judge Nicolatos of the Supreme Court of Cyprus. Judge Nicolatos explained that in today’s era of e-government it comes as no surprise that the language of government has quickly changed, as automated systems are used daily. Many tax departments around the world have introduced online ‘e-tax’ systems to help taxpayers complete their annual tax returns. Automated systems undoubtedly come with benefits. However, their growth and application by administrative bodies entered the public law sphere well before anyone could properly reflect on how they would interrelate with administrative law principles. The use of these systems by the government has become well entrenched by now, but questions may be raised as to the measures needed to ensure their compatibility with the core administrative law principles that underpin a democratic society governed by the rule of law, such as transparency, good faith, legality, reasoned decision-making, equal treatment, proportionality, impartiality, the prohibition of abuse of power and good administration. Judge Nicolatos then focused on a number of key principles in the remainder of his presentation, including the principle of legality and the requirement of transparency in the light of the new General Data Protection Regulation, which contains provisions on automated individual decision-making and on profiling as part of an automated decision-making process.

Judge Nicolatos further stressed the need for due reasoning underlying administrative decisions. For a court to effectively review a decision made by an administrative body, its reasoning must be clear and unambiguous, not general or vague. Moreover, automated computer-generated decisions have transformed the discretion-based decision-making process into one of data exchange and big data analytics. Judge Nicolatos expressed his scepticism about this and furthermore drew attention to the weaknesses of such automated systems. One of the greatest challenges is to ensure accuracy, since the potential for coding errors is real. Errors in computer programming can result in wrong decisions, potentially on a great scale, if undetected. To illustrate this point, it was explained that in asylum and tax cases the Cyprus Administrative Court has jurisdiction to review both the legality and the correctness of the decision, and can substitute the administration’s decision with its own. If an automated, pre-programmed system was used, wholly or partly, by a public authority in reaching a decision, the administrative judge will require some knowledge of the underlying decision-making system. This requires the judiciary to adapt to the new demands and challenges of a modern digital landscape.
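
To make the risk of undetected coding errors more concrete, the following is a purely hypothetical sketch in Python: the allowance rule, the income ceiling and the function name are invented and do not describe any system mentioned at the colloquium. A single reversed comparison operator is enough to make an automated system refuse exactly the applicants the rule was meant to help, in every case it processes.

    INCOME_CEILING = 30_000  # intended rule: income BELOW the ceiling qualifies

    def decide_allowance(income: float) -> bool:
        # Intended rule: grant the allowance if income < INCOME_CEILING.
        # The comparison operator is accidentally reversed, so the system
        # systematically refuses exactly the people the rule meant to help.
        return income > INCOME_CEILING  # bug: should be income < INCOME_CEILING

    applications = [12_000, 18_500, 29_999, 45_000]  # thousands of cases in a real system
    decisions = {income: decide_allowance(income) for income in applications}
    print(decisions)  # every low-income applicant is wrongly refused

Until someone inspects the code or audits the outcomes, nothing in the individual decisions signals that anything is wrong, which is precisely why judicial review of such systems requires some insight into the underlying program.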

In conclusion, Judge Nicolatos explained that existing systems, with their built-in checks and balances, need to be integrated or even replicated in any new system before the new world of automation is fully embraced.

This intervention was followed by a presentation by State Councillor Paris of the French Council of State, who began by showing the source code of an algorithm and asking how many of the representatives present were able to understand it. None could. This code was used in the French education system for the orientation – and effectively the enrolment – of high school students in their first year of university studies. The French Council of State did not rule on the source code itself but annulled a circular drawn up by the Minister of Education which formed the basis for the application of the algorithm. This was possible because an association of jurists and programmers intervened in the court proceedings and requested the source code in order to determine how it functioned. The example shows just one important element in the discussions surrounding the use of algorithms. If citizens or judges are unable to read and comprehend complex algorithms, this causes problems in the light of the principle of legality.
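
By way of illustration only, the sketch below is not the French source code discussed by State Councillor Paris but an invented, much-simplified example of what an admission rule might look like; the criteria, weights and gap-year penalty are hypothetical. Even a rule this short embeds policy choices that a citizen or judge cannot verify without reading, and understanding, the code.

    # Hypothetical admission rule: NOT the code shown at the colloquium.
    def rank_applicants(applicants):
        # Each applicant is a dict with 'grade', 'distance_km' and 'gap_year'.
        # The weights and the gap-year penalty are policy choices hidden in code.
        def score(a):
            return a["grade"] * 0.7 - a["distance_km"] * 0.2 - (5 if a["gap_year"] else 0)
        return sorted(applicants, key=score, reverse=True)

    applicants = [
        {"name": "A", "grade": 14.0, "distance_km": 3, "gap_year": False},
        {"name": "B", "grade": 15.5, "distance_km": 40, "gap_year": True},
    ]
    for applicant in rank_applicants(applicants):
        print(applicant["name"])  # the order in which places would be offered

In this invented example the applicant with the higher grade is ranked second because of the distance weighting and the gap-year penalty: choices of exactly the kind that the principle of legality requires to be visible and contestable.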

Of course, judges can always refer to an expert; however, two issues must be taken into account. First, algorithms are nowadays of such complexity that, generally speaking, only their creators are capable of identifying what they contain. Second, it is impossible for humans to comprehend the reasoning of algorithms that reason autonomously, such as artificial intelligence. We can identify the principles that form the basis for the algorithms used by public administration, but the real question is how citizens and judges can ensure that these algorithms respect the predesigned principles and rules if we cannot understand their reasoning. State Councillor Paris presented a few answers to this dilemma.

The first answer lies in the training and education of citizens and judges. A second point is the re-appropriation by public bodies of the creation of the algorithms used by public administration: public bodies should create their own algorithms. As we are not capable of directly controlling the content of an algorithm, it is essential that we are given the opportunity to ensure the inclusion of all the legal and ethical elements during the process of creating it. This has at least three implications. First, the creator of an algorithm used by public administration should be subject to ethical codes that follow from hard or soft law. Second, it is necessary to anticipate compliance mechanisms that allow judges to ensure compatibility with these rules and principles of professional ethics. Third, it is necessary to ensure complete transparency of the algorithms used by public administration. This means that all elements concerning the functioning of such algorithms, including the source code, should be made public. Transparency will allow civil society to verify the legality of algorithms and, where necessary, assist the judge. Members of civil society can serve as experts and monitor the use of algorithms.

State Councillor Paris concluded his intervention by emphasizing that the issues concerning automated decision-making are global. This was illustrated by referring to Local Law No. 49 of 2018 of the City of New York, which relates to automated decision systems used by agencies. That local law provided for the creation of a working group (task force) tasked with making recommendations on the ethical use of automated decision systems, including algorithms.

Due to time constraints, only a short but lively discussion followed the two interventions. The discussion mainly focused on the training requirements of judges in the field of information technology. Are judges required to be ‘literate’ in these fields? And what about the role of experts, who commonly intervene in proceedings? State Councillor Carbone of the Italian Council of State explained that algorithms are fixed – although always regulated by humans – and that this raises questions about the power of judges. To illustrate this point, an Italian case about the allocation of school teachers was presented. After their hiring, school teachers were ranked on the basis of certain criteria and then allocated to schools around Italy. The allocation was decided on the basis of an algorithm and meant that school teachers were in some cases placed in schools in different parts of the country. A number of decisions were contested and brought before administrative tribunals. These tribunals ruled that the presence of an automated procedure does not limit the competence of judges, either on procedure or on the merits. This means that the underlying algorithm should comply with the principle of transparency and that the decision should be reasonable. Judges have a duty to understand the algorithm and its consequences.
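
As a purely hypothetical illustration of the kind of procedure at issue in the Italian case, the sketch below ranks teachers on invented criteria and assigns them greedily to remaining vacancies; the names, criteria, weights and the greedy assignment rule are assumptions made for illustration, not the actual Italian algorithm. It shows why understanding the algorithm matters: the chosen weights and the order of assignment, rather than any individual assessment, determine who ends up far from home.

    # Hypothetical ranking-and-allocation sketch (illustrative data and weights only).
    def allocate_teachers(teachers, vacancies):
        # Rank teachers by a composite score (seniority plus double-weighted qualifications).
        ranked = sorted(teachers,
                        key=lambda t: t["seniority"] + 2 * t["qualifications"],
                        reverse=True)
        allocation = {}
        for teacher in ranked:
            # Each teacher gets the first remaining vacancy on their preference list;
            # if none is left, they are sent to any school that still has a vacancy.
            for school in teacher["preferences"]:
                if vacancies.get(school, 0) > 0:
                    allocation[teacher["name"]] = school
                    vacancies[school] -= 1
                    break
            else:
                fallback = next((s for s, n in vacancies.items() if n > 0), None)
                if fallback:
                    allocation[teacher["name"]] = fallback
                    vacancies[fallback] -= 1
        return allocation

    teachers = [
        {"name": "Rossi", "seniority": 10, "qualifications": 3, "preferences": ["Milan", "Turin"]},
        {"name": "Bianchi", "seniority": 4, "qualifications": 5, "preferences": ["Milan"]},
    ]
    print(allocate_teachers(teachers, {"Milan": 1, "Turin": 1, "Palermo": 2}))

In this toy example the second teacher is sent to a school not on her preference list simply because of how the score is weighted and the order in which vacancies are filled, which is the kind of consequence the Italian tribunals held judges must be able to review on both procedure and merits.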
 

The President of the ACA, Mr Piet Hein Donner, made some concluding remarks at the start of the General Assembly the next day, Tuesday 15 May 2018. Following a brief summary of the discussions during the colloquium, Mr Donner explained that regrettably there had been limited time to finish the previous day's debate. However, since the colloquium had raised general interest in discussing some ideas and solutions further, Mr Donner expressed his wish to continue exchanging views on the fundamental questions raised during the colloquium in the years to come. Consideration had therefore been given, first of all, to establishing a system for the exchange of information on the common issues that ACA members face when confronted with rapid technological developments. A second point would be to identify in more depth the developments (both common and different) that ACA members face; members could use the national contributions for this. Finally, the ACA should focus on a selection of common points, for example the use of algorithms, for the purpose of drawing up a report with recommendations on ideas and criteria that could be useful when members are faced with common issues. Mr Donner then proposed that the ACA Board should consider these issues during its next meeting in September 2018 and decide how to proceed.