Big Tech Controversy Part 2: Beyond the Economic Scope

Big Tech companies, particularly Google, Facebook, Amazon, and Apple, have become a central part of society, shaping the everyday lives of most Americans and of people around the globe.  Recently, however, Big Tech has drawn criticism and controversy over its outsized market power and its infringement of civil liberties.  A previous BULR article addressed the economic consequences of Big Tech and presented how antitrust laws should be used to mitigate these issues.  That article identified the Consumer Welfare Standard as the ideal principle for implementing these laws: it is measurable, empirical, and implementable, and it protects consumers against monopolistic market power, which arises when a market lacks competing firms and which ultimately reduces economic efficiency.

While the Consumer Welfare Standard is well suited to addressing Big Tech's many economic harms, it is powerless against the noneconomic consequences these companies have created.  Most notably, Big Tech has infringed civil liberties in the areas of privacy and free speech.  Legal mechanisms beyond antitrust are needed to alleviate these looming issues.

Privacy and the GDPR

Big Tech companies are eroding personal privacy through various forms of data collection.  Google is estimated to collect 11.6 MB of data per day from a typical user's electronic activity, and many tech companies sell consumer data to third parties [1].  Data collection by these companies is rampant, and the United States has no federal law that restricts it [2].

The European Union has addressed this issue with a piece of landmark legislation, the General Data Protection Regulation (GDPR).  It establishes basic rights that people hold over their online privacy, formally known as the “fundamental freedoms and rights of the data subject” [3].  The law enumerates the following rights:

  1. The right to access data, including knowing if and how it will be processed.

  2. The right to correct inaccurate data.

  3. The right to erase data.

  4. The right to restrict the use of data.

  5. The right to know who received a person’s data [4].

  6. The right to take personal data from one service and reuse it with another (data portability) [5].

  7. The right to object to the use of one’s personal data.

  8. The right not to be subject to a decision based solely on the automated processing of data [6].

The legislation also establishes the circumstances under which a company may use a person’s data, including:

  1. If a person agrees to it.  This agreement must be specific, clear, and unambiguous.

  2. If the data is needed to enter into or carry out a contract with the person.

  3. If the company is subject to a legal obligation, like a court order, that requires the use of the data.

  4. If it is needed to save a person’s life.

  5. If it is needed to perform a task carried out in the public interest.

  6. If the company has a legitimate interest in using the data.  This basis is limited by the data subject’s “fundamental rights and freedoms” [7].

The GDPR contains other mechanisms for protecting data and regulating tech companies.  Anyone who works with personal data must demonstrate familiarity with these guidelines, and any tech behemoth, like Google, must designate a “Data Protection Officer” to ensure that the guidelines are followed.  The regulation also strengthens areas such as data security and fines companies for violations, up to either 20 million euros or 4% of their global revenue, whichever is higher [8].  The two most important aspects of this legislation are its protection of consumers’ data privacy and the pressure it puts on Big Tech companies to operate with integrity and accountability.  In the United States, people’s privacy is currently governed by Big Tech companies, not by the people themselves.  With no legislation restricting the reach of Big Tech, the United States needs to pass its own landmark law through Congress to protect consumers’ online privacy and to hold these corporations accountable.
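To make the scale of these penalties concrete, the short sketch below (written in Python, with hypothetical revenue figures chosen purely for illustration) computes the GDPR’s maximum fine as the greater of 20 million euros or 4% of a company’s global revenue.

```python
def max_gdpr_fine_eur(global_revenue_eur: float) -> float:
    """Upper bound of a GDPR fine for the most serious violations:
    the greater of EUR 20 million or 4% of global annual revenue."""
    return max(20_000_000, 0.04 * global_revenue_eur)

# Hypothetical revenue figures, for illustration only.
for revenue in (100_000_000, 1_000_000_000, 250_000_000_000):
    print(f"Revenue of EUR {revenue:,}: maximum fine of EUR {max_gdpr_fine_eur(revenue):,.0f}")
```

For a firm with hundreds of billions of euros in revenue, the 4% prong dominates, so the flat 20-million-euro floor matters mainly for smaller companies.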

Censorship and Section 230

There is also substantial evidence of Big Tech restricting free speech through the censorship of information.  Although the First Amendment of the U.S. Constitution protects free speech, Twitter was recently criticized for censoring a New York Post article containing information about Hunter Biden’s emails [9], and Google was accused of suppressing search results for the World Socialist Web Site’s coverage of “The 1619 Project” [10].  Big Tech companies, particularly web and social media platforms, are beginning to act as gatekeepers for the information we see online, which is, in essence, censorship.

Section 230 of the Communications Decency Act has been hotly debated by lawmakers on both sides of the political spectrum.  The statute treats people and companies who host information online as platforms, not publishers, which shields them from legal responsibility for content posted by others [11].  This protection is important for bloggers, creators, and others who publish online.  At the same time, however, it shields Big Tech companies from facing consequences for actions that suppress free speech [12].  Politicians on both sides of the aisle are calling to remove this legal protection across the board, but doing so would strip protection not only from large corporations but also from individuals’ posts.

The best way to address this is to amend Section 230 to provide different protections for individuals and for the companies that control the flow of information.  The Section 230 protections in effect today should continue to apply to people who post online, like bloggers, YouTubers, and even people who comment on social media.  The separate protections for companies should be less expansive.  Big Tech should remain legally protected from liability arising from content published by users, and it should be allowed to flag posts that are offensive or violent in nature, but it should not be legally protected when it suppresses political material or material created by its competitors.  The corporate side of a revised Section 230 should be based on the proposed “Limiting Section 230 Immunity to Good Samaritans Act,” which would allow people to sue Big Tech companies over censorship or suppression of political material and competitors’ material, but would impose liability only where the companies acted with malice or in bad faith [13].  This distinction between individuals and corporations would allow Big Tech to be held accountable for its censorship while preserving the rights of individuals online.

Conclusion

Big Tech companies have created numerous problems, both economic and related to civil liberties.  Antitrust laws are equipped only to mitigate the economic harms Big Tech has imposed; they do not address data collection or censorship.  To strengthen consumers’ privacy rights online, Congress must pass legislation like the GDPR that limits Big Tech’s use of data, holds companies accountable through government oversight, and fines them heavily for violations.  To reduce censorship and suppression online, Section 230 must be amended to differentiate between the protections afforded to individuals and those afforded to corporations.  The corporate protections should be amended in the fashion of the proposed “Limiting Section 230 Immunity to Good Samaritans Act,” allowing people to sue Big Tech for blocking competitor publications or political speech.  A landmark privacy bill and a revised Section 230, paired with antitrust laws and the Consumer Welfare Standard, would go a long way toward mitigating the many problems Big Tech has inflicted.

 

David Vojtaskovic is a current first-year who plans on studying Economics. He is a Staff Writer for the Law Review's Blog and can be reached at david_vojtaskovic@brown.edu.