Marietje Schaake advocates a global effort by democracies to reclaim power in the digital world. As she rightly observes
decisions that companies make about digital systems may not adhere to essential democratic principles such as freedom of choice, fair competition, nondiscrimination, justice, and accountability. Unintended consequences of technological processes, wrong decisions, or business-driven designs could create serious risks for public safety and national security. And power that is not subject to systematic checks and balances is at odds with the founding principles of most democracies.
The recent move by Google and Apple to make their exposure-notification platform the only reliable way to use smartphones for Covid-19 contact tracing is one more example of the kind of power grab that is taking place. And while I strongly agree with Schaake on the importance of democratic oversight and the significant role international institutions (the UN, the WTO, etc.) could and should play here, I am afraid the interventions she envisions are not radical enough.
For example, she asks whether
democratic governments should build their own social-media platforms, data centers, and mobile phones instead?
Probably not. But more competition and viable alternatives in the U.S.- and China-dominated market for devices and services are sorely needed. This means a more 'level playing field' (the European market doctrine) should be created. This could perhaps be achieved by mandating open APIs (alongside open hardware and software, as well as open document formats) to finally break down the walled gardens of the Internet. Why is SMS still around? Because it is the only messaging system that works on all mobile phones. It is ridiculous that the most modern communication systems (WhatsApp, iMessage) are deliberately designed not to interoperate, while the older ones (SMS, email) have no such restrictions.
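To make the idea of a mandated open API concrete, here is a minimal, purely hypothetical sketch in Python. The names (MessagingProvider, ToySmsGateway) are invented for illustration, not drawn from any real specification: the point is that if every service had to expose the same small interface, any client could talk to any provider, which is exactly the interoperability SMS and email already offer.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass


@dataclass
class Message:
    sender: str      # e.g. a phone number or a federated user ID
    recipient: str
    body: str


class MessagingProvider(ABC):
    """Hypothetical provider-neutral interface a regulator could mandate.

    Any service (an SMS gateway, WhatsApp, iMessage, ...) would expose the
    same minimal operations, so third-party clients could interoperate
    instead of being locked into one vendor's walled garden.
    """

    @abstractmethod
    def send(self, message: Message) -> None: ...

    @abstractmethod
    def fetch_inbox(self, user: str) -> list[Message]: ...


class ToySmsGateway(MessagingProvider):
    """In-memory stand-in for one provider implementing the open API."""

    def __init__(self) -> None:
        self._inboxes: dict[str, list[Message]] = {}

    def send(self, message: Message) -> None:
        self._inboxes.setdefault(message.recipient, []).append(message)

    def fetch_inbox(self, user: str) -> list[Message]:
        return self._inboxes.get(user, [])


# Usage: a client written against the interface works with any provider.
gateway = ToySmsGateway()
gateway.send(Message(sender="+31600000001", recipient="+31600000002", body="hello"))
print(gateway.fetch_inbox("+31600000002"))
```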
Secondly, a shared definition of freedom of expression might be a good idea, but it may not be enough to curb the spread of fake news (which does not necessarily incite violence). If the algorithms (driven by business-model incentives) consistently prioritise extreme content, then perhaps the algorithms themselves, or even the underlying business models, need to be regulated.
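A toy illustration of the dynamic, not any platform's actual algorithm (all field names and numbers below are invented): when a feed is ranked purely on predicted engagement, the most provocative item floats to the top, and only an intervention on the ranking rule itself, or on the incentive behind it, changes that.

```python
# Toy illustration: ranking purely by predicted engagement rewards outrage.
posts = [
    {"title": "Local council meeting notes", "predicted_clicks": 40,  "outrage_score": 0.1},
    {"title": "Nuanced policy analysis",      "predicted_clicks": 60,  "outrage_score": 0.2},
    {"title": "Conspiracy: THEY are lying!",  "predicted_clicks": 300, "outrage_score": 0.9},
]

def engagement_ranking(feed):
    # The business-model incentive: maximise clicks and time spent.
    return sorted(feed, key=lambda p: p["predicted_clicks"], reverse=True)

def regulated_ranking(feed, outrage_cap=0.5):
    # One conceivable intervention: heavily down-weight content above an
    # "extremeness" threshold, i.e. regulate the ranking rule itself.
    return sorted(
        feed,
        key=lambda p: p["predicted_clicks"] * (1.0 if p["outrage_score"] <= outrage_cap else 0.1),
        reverse=True,
    )

print([p["title"] for p in engagement_ranking(posts)])  # conspiracy post first
print([p["title"] for p in regulated_ranking(posts)])   # conspiracy post last
```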
Similarly, requiring transparency regarding the use of microtargeting in political campaigns, and forbidding the use of sensitive personal attributes (like ethnicity and religion), are but small first steps in the right direction. The business of profiling in general, and the use of automated decision making based on such very detailed personal profiles by governments and businesses alike, are extremely worrying. Transparency is a woefully inadequate defence mechanism here.
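A small hypothetical example of why banning explicit attributes falls short: a profiling pipeline that never stores "religion" can still reconstruct it from correlated proxies such as liked pages or installed apps (the profile and rule below are invented for illustration), so regulating the listed attributes leaves the underlying practice untouched.

```python
# Hypothetical sketch: the profile contains no banned attributes, yet
# trivially correlated proxies let a targeting system infer one anyway.
BANNED_ATTRIBUTES = {"ethnicity", "religion"}

profile = {
    "postcode": "1012 AB",
    "liked_pages": ["parish newsletter", "gospel choir"],
    "app_usage": ["bible-study planner"],
}

def allowed_features(p):
    # The campaign is "compliant": banned attributes are never used directly.
    return {k: v for k, v in p.items() if k not in BANNED_ATTRIBUTES}

def inferred_segment(features):
    # ...but a one-line rule over the remaining proxies reconstructs them.
    religious_signals = {"parish newsletter", "gospel choir", "bible-study planner"}
    signals = set(features["liked_pages"]) | set(features["app_usage"])
    return "religious household" if signals & religious_signals else "general audience"

print(inferred_segment(allowed_features(profile)))  # -> "religious household"
```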
I was slightly puzzled by Schaake's plea
never to demand that companies hand over the source code of software to state authorities
I would actually argue the opposite: source code should be open by default, certainly the source code underlying the systems, applications, and services used by governments themselves. Openness makes them more secure ("given enough eyeballs, all bugs are shallow"), allows them to be inspected for privacy infringements (and other risks), and allows any issues to be fixed without a critical dependence on the original supplier.
So yes, let's reclaim our digital sovereignty and hold the technology companies accountable to the standards of a democratic society. And indeed, let's do that together, at the international level. But let's be much more ambitious in establishing a proper baseline to which we hold these companies accountable: one that also reckons with the underlying systemic economic principles that feed them and have so far allowed them to grow without any regard for the societal consequences.