US businessman Roger McNamee, an early investor in Google, Amazon and Facebook, has published critical articles about Facebook in the US press; his pieces have appeared in The Washington Post, The Guardian and Time. McNamee considers the social network a threat to democracy, privacy and personal data, and deeply damaging to society as a whole. On February 5, he released his book, "Zucked: Waking Up to the Facebook Catastrophe," in which McNamee says he was a mentor to Mark Zuckerberg, whom success has blinded, and that Facebook has manipulated public opinion.
More than 10 years ago I got involved with Facebook, and I was proud of the company's success … until the last few months. Now I am embarrassed and ashamed. With its 1.7 billion users, Facebook is one of the most influential companies in the world. Whether we like it or not, Facebook is a high-tech media company that has a huge impact on people, politics and social well-being. Any decision taken by its management can significantly affect the lives of real people. The company's leadership must answer for every action, accepting both the rewards of success and responsibility for failure. Facebook has recently done some truly horrible things, and I can no longer justify this behavior.
Nine days before the November 2016 elections, I sent an email to Facebook founder Mark Zuckerberg and Chief Operating Officer Sheryl Sandberg. In it I described problems I had noticed on Facebook and said I intended to publish the material in the media. A few months earlier, I had noticed a wave of shared posts coming from Facebook groups that at first glance seemed related to the Bernie Sanders campaign, but which no one could believe had come from his team. I wanted to share with Sandberg and Zuckerberg my fear that some people were using Facebook's architecture and business model to harm unsuspecting people.
I am a long-time investor and technology evangelist. Technology is my passion and my career. I was an early adviser to Zuckerberg – Zuck, as colleagues and friends call him – as well as an early investor in Facebook. I truly believed in the company. My early meetings with Zuck were almost always in his office, and while I did not have a complete picture of him, he was always frank with me. Zuck liked me. I liked him and his team. I was a real fan of Facebook. I was one of the people Zuck could call when faced with new challenges. Mentoring him was fun, and Zuck could not have been a better student. We talked about the things that mattered to Zuck, in areas where I had more experience, and he often followed my advice.
When I sent that email to Zuck and Sheryl, I thought Facebook was the victim of a misunderstanding. But what I learned in the months that followed – the 2016 elections, the spread of disinformation around Brexit, the sale of user data – shocked and disappointed me. It took me a long time to understand and accept that success had blinded Zuck and Sheryl. I have never had a personal grievance against Facebook, and I still hold shares in the company. My criticism of Facebook is a matter of principle, and holding the stock keeps me accountable to those principles. I was among the first to invest in Facebook, and now among the first to see the coming crash.
This is a story of power, privilege, trust and their abuse.
Facebook's huge success eventually led to a catastrophe. The company's business model is based on advertising, whose success depends on manipulating users' attention so that they see as many ads as possible. One of the best ways to manipulate attention is to provoke outrage and fear – emotions that capture attention. And the more emotional a post is, the more prominently it appears in the news feed. Facebook spies on its users, gathering data from wherever it can, and its algorithms give users exactly what they want. Ultimately, everyone ends up in his own information bubble and believes that other users think as he does. But a number of studies have shown that displaying only the opinions a user agrees with increases polarization and, as it turns out, damages democracy.
To feed its artificial-intelligence algorithms, Facebook began collecting data even on people who did not use the social network. Sadly, Facebook proved unable to protect the data it stored. The company began selling and trading it simply to strike more profitable deals. The user base kept growing, but another innovation contributed even more to the success of Facebook's advertising business.
From the end of 2012 through 2017, Facebook gradually perfected a new idea. The company experimented with new types of algorithms, new kinds of data, and small design changes, measuring and learning from everything. Gradually, users became metrics rather than people. Facebook monetised its oceans of data so efficiently that all other considerations were forgotten. Every action a user took allowed Facebook to better understand that user and that user's friends. The algorithms enabled the company to make small daily "improvements," which in practice meant that any advertiser could buy access to any user's attention. The Russians took full advantage of this.
People on Facebook live in something like their own bubble. Zuck has always believed that connecting everyone on the planet is a mission so important that it justifies any action needed to achieve it. Convinced of the nobility of their mission, Zuck and his employees listen to criticism but do not hear it, and do not change their behavior. They solve almost every problem with the same approach: more artificial intelligence, more code, faster upgrades. They do this not because they are bad people, but because success has distorted their perception of reality. They cannot imagine that the problems that have arisen might be connected to their own products or business decisions, and they have never wanted to listen to the critics. When confronted with evidence that fake news and misinformation spread through Facebook had influenced the British referendum and the US elections, they dissembled and denied. And then plan B was applied: excuses and promises that everything would be fixed.
Thanks to Facebook's exceptional success, Zuck is something like a rock star and cult leader in the world of technology. He is deeply involved in the algorithms and products, and leaves the rest of the business to Sandberg. At Facebook, Zuck is the undisputed boss. His subordinates have studied him carefully and developed special techniques for influencing him. Sheryl Sandberg is brilliant, ambitious and highly organized. Given Zuck's status as founder, the Facebook team almost never pushes back against him or warns him of trouble ahead. (As one Facebook employee puts it, "People never disagree with Mark.")
You might expect Facebook users to be outraged by the way the platform has been used to undermine democracy, human rights and privacy. Some genuinely are, but about 1.5 billion people use it to stay in touch with relatives abroad, distant friends and acquaintances. People like sharing their photos, thoughts and emotions. They cannot and do not want to believe that this platform could be responsible for so much damage and harm. Facebook exploits our trust in family and friends to build one of the largest businesses in the world. At the same time, it is utterly negligent with user data and amplifies the weaknesses of democracy, leaving people less and less sure of themselves and no longer knowing whom to trust. Facebook uses people's trust to spread misinformation, influence voters and polarize public opinion in a number of countries. The company will continue to do so until we, as citizens, reclaim our right to self-determination.
We need to start reforming Facebook and other major Internet companies in the following areas:
Threat to democracy
Democracy depends directly on shared facts and values. It depends on the rule of law. It also depends on a free press and other opposing forces that hold power accountable for its actions. Facebook undermines the free press from two sides: the social network erodes journalism and overwhelms it with disinformation. On Facebook, information and disinformation look the same; the only difference is that disinformation generates more revenue. On Facebook, facts are not absolute – the choice is left in the hands of users and their friends, continually assisted by algorithms designed to increase engagement. Against this background, Facebook's algorithms promote extreme messages over neutral ones. In this way the level of disinformation rises, users fixate on various conspiracy theories, and facts receive little attention.
At the scale of Facebook, and to some extent Google, there is no way to avoid affecting the lives of users and hence the future of nations. Recent history shows that the threat to democracy is real. The ongoing efforts of Facebook, Google and Twitter to protect future elections seem sincere, but there is no reason to believe they will stop everyone who decides to interfere. Only fundamental changes to their business models can reduce the risk to democracy. Facebook remains a threat to the world's most vulnerable. The Free Basics service gave poor people in around 60 countries access to the social network, but at the cost of huge social upheaval. To this day, Facebook is blind to the ways its platform can be used to harm defenseless minorities. Indeed, this has already happened – recall the deaths in Sri Lanka and Myanmar.
Threat to privacy
Facebook is a major threat to privacy. The way the social network watches people would make an intelligence agency proud. And data collection is not the whole story: there is also the Facebook News Feed itself, in which what users see is manipulated. Users must have control over their data and absolute control over how it is used. Users have the right to know the name of any organization or individual that accesses and uses their personal data. Another important point: users should have the right to download all their data and transfer it to another social network or similar service. Finally, platforms must be transparent to users, advertisers and regulators alike.
Control of personal data
Users must always own all their data and metadata, and if anyone uses them, the user should be fully informed. No one should be able to use a user's data without his explicit prior consent. Limits must be set on exactly what data can be collected, and users should be able at any time to restrict the collection of even that data. And this must happen now, before products such as Alexa and Google Home become ubiquitous. Finally, I would not want to see artificial intelligence and modern robots introduced without proof that they serve people rather than exploit them.
Control and regulation
Viewed purely as investments, Google, Amazon and Facebook are hard not to admire for the brilliance of their business plans. The problem is the unintended consequences, which are more numerous and more severe than we can afford. In truth, Google and Facebook are artificially profitable because they do not pay for the damage they cause.
The US economy has historically depended on start-ups, especially in technology. If my hypothesis is correct, the country has begun a very risky experiment by allowing monopolies over innovation, economic growth and job creation.
Google, Amazon and Facebook have become monopolies and have built moats around their core businesses. Their success has raised the bar for start-ups enormously, limiting their chances of success. The IT giants constantly buy up start-ups that could become competitive threats. These technological colossi stifle start-ups – the classic behavior of monopolists.
From an economic policy standpoint, limits should be placed on monopoly giants such as Facebook, Google and Amazon. The economy would benefit greatly from breaking up their monopolies. The first step is to bar the collection of user data and the sharing of that data across different platforms. Personally, I consider oversight and regulation the most effective way to curb harmful behavior.
These technologies have the power to persuade, and the financial resources of advertisers ensure that persuasion will always be the platforms' main purpose. Every pixel on the screen and every web application affects user behavior. Not every user can be influenced at any given moment, but almost every user can be influenced some of the time. Many people develop behavioral addictions that degrade the quality of their lives and the lives of their families, colleagues and friends. Millions of people check their phones the moment they wake up. It may sound like a joke, but for many of them the real question is whether to look at the phone before or after going to the toilet. This breeds dependence. Too many people report difficulty sleeping because they cannot stop using their tablet or phone. Remember that a drug addict understands very well that he is being harmed, yet cannot stop. Change will require more than regulation: investment in research and public health services will be needed to counteract addiction to Facebook, and even to the Internet itself.
More and more children prefer the virtual world to the real one. Numerous medical studies indicate that, in effect, uncontrolled psychological experiments are being carried out on millions of people, with children most affected. The medical community supports regulation of the Internet.
The damage to public health, democracy, privacy and competition caused by Facebook and the other major platforms is the result of business models that must change. As users, we have far more power to force change than we realize: we can change our behavior, we can build a political movement, we can insist on government intervention. At the same time, it is the government that must take the first steps to repair the damage caused by the Internet platforms. The political and social power of Facebook and the other major Internet platforms is unhealthy and incompatible with democracy. We must force them to make real changes, not just write new source code for even smarter algorithms.
Roger McNamee: I Mentored Mark Zuckerberg. I Loved Facebook. But I Can't Stay Silent About What's Happening.