
Digital economy: Prevent digital feudalism

The use and abuse of data by Facebook and other technology companies is finally receiving the official attention it deserves. Now that personal data is becoming the world's most valuable commodity, will consumers be the owners of the platform economy, or its slaves?

The prospects for democratizing this sector are poor

Algorithms are designed to let companies profit from our past, present, and future behavior, or what Shoshana Zuboff of Harvard Business School calls our "behavioral surplus." In many cases, digital platforms already know our preferences better than we know them ourselves, and can nudge us into behavior that produces even more value. Do we really want to live in a society where our innermost desires and expressions of personal agency are up for sale?

Capitalism has always excelled at creating new desires and cravings. But with big data and algorithms, technology companies have both accelerated and inverted that process. Rather than creating new products and services by imagining what people might want, they already know what we will want, and sell it to our future selves. Worse still, the algorithmic processes being used often perpetuate gender and racial biases, and can be manipulated for profit or political gain. While we have all benefited enormously from digital services such as Google search, we did not sign up to have our behavior cataloged, shaped, and sold.

What about changing the business model?

To change the situation, we will need to target the dominant business model directly, and specifically its source of economic rents. Just as seventeenth-century landlords extracted rents from land-price inflation, and just as the robber barons profited from the scarcity of oil, today's technology companies derive their value from monopolizing search and e-commerce services.

Admittedly, sectors with high network externalities – where the benefit to each user increases with the total number of users – naturally produce large companies. That is why telephone companies grew so large in the past. The problem is not size as such, but how network-based companies exercise their market power.
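To see why network externalities favor a few very large firms, consider a stylized model that is not from the article itself: Metcalfe's law, which values a network in proportion to the number of possible user pairs, n(n − 1)/2. The function name and the value-per-link parameter below are illustrative assumptions, not anything the author specifies.

```python
# Stylized sketch (illustrative only): Metcalfe's law models a network's
# value as proportional to the number of possible user pairs, n*(n-1)/2,
# so value grows roughly with the square of the user base.

def network_value(users: int, value_per_link: float = 1.0) -> float:
    """Value of a network under Metcalfe's law: one unit per user pair."""
    return value_per_link * users * (users - 1) / 2

# Doubling the user base roughly quadruples the value, which is why
# sectors with strong network externalities tend toward a few large firms.
for n in (10, 20, 40):
    print(n, network_value(n))
```

Under this simple assumption, a platform with twice the users is worth about four times as much, so competition tends to tip toward a single dominant player.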

Then and now

Initially, today's technology companies used their broad networks to bring in diverse suppliers, much to the benefit of consumers. Amazon allowed small publishers to sell titles (including my first book) that would otherwise never have made it onto a neighborhood bookstore's shelves. Google's search engine used to return a diverse array of suppliers, goods, and services.

Now, however, both companies use their dominant positions to stifle competition, by controlling which products users see and by giving priority to their own brands (many of which bear seemingly independent names). Meanwhile, companies that do not advertise on these platforms find themselves at a severe disadvantage. As Tim O'Reilly has argued, over time, such rent-seeking weakens the ecosystem of suppliers that the platforms were originally created to serve.

Instead of simply assuming that all rents are alike, economic policymakers should try to understand how platform algorithms allocate value among consumers, suppliers, and the platform itself. Some allocations may reflect genuine competition, while others are based on extracting value rather than creating it.

We therefore need a new governance structure, which should start with the creation of a new vocabulary. For example, calling platform companies "technology giants" implies that they invested in the technologies they exploit, when in reality it was taxpayers who funded the key underlying technologies, from the Internet to GPS.

Moreover, the widespread use of tax arbitrage and of self-employed contractors (to avoid the costs of health insurance and other benefits) is eroding the very markets and institutions on which the platform economy relies. Hence, rather than talking merely about regulation, we must go further and embrace concepts such as co-creation. Governments can and should shape markets to ensure that collectively created value serves the collective good.

Competition policy should not focus only on size

Breaking up large companies would not solve the problems of value extraction or abuses of individual rights. There is no reason to assume that many smaller Googles or Facebooks would operate differently, or would develop new, less exploitative algorithms.

Creating an environment that rewards genuine value creation and punishes value extraction is the fundamental economic challenge of our time. Fortunately, governments, too, are now creating platforms to identify citizens, collect taxes, and provide public services. Owing to concerns about official misuse of data in the Internet's early days, much of the current data architecture was developed by private companies. But government platforms now hold enormous potential to improve public-sector efficiency and to democratize the platform economy.

To realize that potential, we will need to rethink the governance of data, develop new institutions and, given the dynamics of the platform economy, experiment with alternative forms of ownership. To take just one of many examples, the data we generate when using Google Maps or Citymapper – or any other platform that relies on taxpayer-funded technologies – should be used to improve public transportation and other services, rather than merely to generate private profits.

Of course, some will argue that regulating the platform economy would impede market-driven value creation. But they would do well to reread Adam Smith, whose ideal of a "free market" was one free of rents, not free of the state.

Algorithms and big data could be used to improve public services, working conditions, and the well-being of all. But today, these technologies are instead being used to undermine public services, promote zero-hour contracts, violate individual privacy, and destabilize the world's democracies, all in the interest of private gain.

Innovation does not just have a rate of progress

It also has a direction. The threat posed by artificial intelligence and other technologies lies not in their pace of development, but in how they are designed and deployed. Our challenge is to set a new direction.


Also published on Medium.

Published in Fintech