
Are platforms too big? Should they be responsible for the content their users post? Should they publicly release the algorithms they deploy? The debate is strongly polarised between the advocates of heavy versus light regulation. But everyone agrees that it is difficult for anyone to appropriately understand (not to mention regulate) such fast-evolving markets, technology and trends. The question then is, how can government acquire the capabilities and advanced expertise necessary for good, effective regulation?

The problem became evident in the debate over several recent proposals on both sides of the Atlantic. The European Commission, for one, announced its intention of continuing to set ambitious global regulatory standards, following the precedent set by the General Data Protection Regulation, by proposing landmark measures to regulate the liability of online intermediaries (the Digital Services Act, or DSA), the market power of platforms (the Digital Markets Act, or DMA) and, finally, artificial intelligence applications (the Artificial Intelligence Act). However, many analysts have pointed to major weaknesses in the problem analysis behind both the DSA and the DMA, indicating a fundamental lack of understanding of how digital markets work. Even some of the loudest critics of technology companies denounced the poor quality of the DMA impact assessment. Similar criticisms were levelled at a U.S. Congress report that tried to demonstrate a decline in startup creation and reduced innovation in the U.S. due to market concentration, relying on data from 2012. As technology expert Benedict Evans put it: “This is what happens when a handful of staffers are asked to boil the ocean overnight.”

Indeed, the scale and complexity of the challenge is disproportionate to the institutional capacity of government to fully understand new trends, new technologies and new business models. This capacity gap paves the way for possible regulatory failures such as abuse of the precautionary principle, excessive regulatory burden and perverse effects that arise when trade-offs between policy goals, for example between privacy and competition, are poorly understood.

But recognising the challenge should not be read as a laissez-faire argument against government regulation per se. First, while it is easy to criticise poor analysis in the impact assessments and reports cited here, it is genuinely difficult to propose alternative measures backed by fully robust data and evidence. And it is precisely thanks to the transparency of better-regulation rules and procedures, which force governments to produce and publish the evidence behind such policy proposals, that the public is able to detect weaknesses and provide meaningful criticism.

Secondly, this incapacity to govern new trends is physiological, not pathological: it is a normal feature of disruptive change, not a sign that something has gone wrong. It happened with electricity, cars and trains. Sixty years passed between the first operating train (1827) and the creation of the first regulatory commission (the Interstate Commerce Commission, established in 1887). We are living through these “gap years” between the emergence of a disruptive technology and the capacity to govern it effectively, with the added concern that technology now evolves much faster. In a recent survey by the Organisation for Economic Co-operation and Development (OECD), 63% of national infrastructure regulators indicated that their role had changed in the last five years, notably because of technological change; in the communications sector, the share rose to 88%.

So the question is, how can Europe build the institutional capacity for a stronger, evidence-based technology policy? Put differently, how do we make regulation fully fit for the very real challenges of the 21st century?

Many have argued for developing stronger in-house regulatory competences based on new methods and tools. More specifically, several landmark papers have recommended building institutional capacity in running regulatory sandboxes, applying behavioural insights, and managing and analysing data. A recent joint World Economic Forum/OECD initiative on agile regulation set out six main areas of development, ranging from data-driven regulation to anticipatory regulation. Another OECD report on the future of regulation explored best practices in using innovative data-driven technologies for better regulation, such as drones to support risk-based inspections.

But others go a few steps further, suggesting that the problem is not about a lack of specific skills or methods but about a need for entirely new structures filled with highly qualified staff with deep sectoral knowledge, possibly including private-sector experience or background – people, in other words, who “think digital.”

In Unlocking Digital Competition: Report of the Digital Competition Expert Panel, written for the United Kingdom government, Jason Furman, who formerly served as head of U.S. President Barack Obama’s Council of Economic Advisers, and a team of experts called for the establishment of a dedicated “Digital Markets Unit (DMU)” with “a remit to use tools and frameworks that will support greater competition and consumer choice in digital markets, and backed by new powers in legislation to ensure they are effective.” Recommendation No. 6 of the report proposes that “government should ensure the unit has the specialist skills, capabilities and funding needed to deliver its functions successfully.”

The DMU has since been launched within the UK’s Competition and Markets Authority, and the UK government is busy implementing the Furman Review’s recommendations, including its stress on “building skills” among the regulatory cadre that will write and enforce the rules. On a similar note, the OECD recently advised that, because of rapid market evolution and uncertainty, there is “a need for flexible and autonomous operating models. This includes funding and human resource strategies that respond to needs, perhaps going beyond regular government schemes.” This is diplomatic jargon for new recruitment practices and structures in the staffing of regulatory agencies.

In a recent paper, A Focused Federal Agency is Necessary to Oversee Big Tech, Brookings Institution analyst Tom Wheeler recommended the creation of a new U.S. agency because the Federal Trade Commission is too imbued with industrial-era culture. “The digital economy requires departure from such a hidebound precedent to create an all-digital-all-the-time agency staffed by specialists with digital DNA,” he wrote.

These recommendations build on pioneering experiences that already exist. Specialist agencies have been created to regulate highly technical issues related to data portability and access. Over the last two years, the Open Banking Implementation Entity in the UK and the Consumer Data Standards Board in Australia have brought new people and new thinking to regulation. These “hybrid” organisations are staffed by highly skilled individuals who are knowledgeable about and respected in the private sector, particularly its technology-driven wing, but who also have a deep public-service ethos, as seen in their commitment to the open-source movement and their reliance on collaborative platforms like GitHub, where they share and develop new ideas and thinking.

As far as Europe is concerned, the solution may or may not be the creation of an entirely new regulatory agency. A lot depends on the task involved and the specific context. Either way, the “agencification” of problems, the reflex of creating a new body for every new issue, is a persistent risk. As the OECD puts it, “forms should follow functions,” not the other way round.

What we do know is that to design effective, evidence-based regulation, we need to bring new people into government: people with deep knowledge of technology, markets and data. Innovating on recruitment is not enough: we also need to retain talented public servants by empowering them and creating a suitable working environment.

But we should also refrain from the illusion that employing the most highly qualified staff is a sufficient condition for good policy. The reality is that no one has a ready answer on tech regulation: it can be reached only through continuous, evidence-based and open debate, precisely what the European Union’s celebrated “better-regulation” procedures promise. The solution will not be found easily or rapidly, but in a democracy, transparent due process is not only a prerequisite for good regulation: it is the main source of legitimacy and trust.

David Osimo is director of research at the Lisbon Council.

 
