Regulators step into the ring with BigTech
With governments around the world making tough noises, Bill Mew considers how they can bring the tech giants to heel on freedom of speech, content moderation and antitrust measures
Should Alexei Navalny be banned from social media for threatening the Russian constitution in the same way that Donald Trump was banned for doing so in the US?
Should large tech companies be seen as publishers or moderators? And who is responsible for the content: the person publishing it, the person sharing it or the platform it is shared on? And how should we regulate the tech giants?
Answering these questions is going to be one of the great challenges we face in 2021 and beyond. In just the same way that protecting citizens from cybercrime is going to require global collaboration, the authorities will need to come together to improve protections for privacy, intellectual property and taxable revenue.
Reaching international agreement on almost anything is hard, as any accountant who has followed global tax reform efforts will know.
Big tech will also challenge regulators seeking to apply antitrust and content controls. Different cultures have very different views of the balance between freedom of speech and the need to address misinformation and abuse. In some cultures, any criticism of religion or monarchy can be unlawful or even be seen as treason.
Publishers, editors or moderators?
Most of the big tech firms are based in the USA, where their roles and responsibilities are defined in section 230 of the 1996 Communications Decency Act. The legislation protects “interactive computer services” from lawsuits if a user posts something illegal, although there are exceptions for copyright violations, sex work-related material and violations of federal criminal law. The Electronic Frontier Foundation (EFF) backs section 230, calling it “the most important law protecting internet speech”.
Following criticism that big tech firms were using their status as publishers under section 230 to turn a blind eye to misinformation and abusive content, social media operators started moderating content more actively. The moment they did so, they were criticised by freedom of speech advocates and were accused of bias by many on both the right and the left whose content was impacted.
The social platforms enjoyed a particularly antagonistic relationship with President Trump, who used Twitter as his primary channel for communication and propaganda. When Twitter and Facebook started to tag his content as misleading or to pull down his posts, he threatened to remove their section 230 protections. And when he was seen to incite violence, they banned him entirely.
For years, despite employing thousands of content moderators, the social media firms have claimed that stamping out misinformation is almost impossible. However, following Trump’s social media ban the volume of election misinformation dropped by over 70% overnight, showing that some measures can be effective.
Moderating text-based posts can be automated to a great extent. Improvements in speech recognition technology mean that audio posts can be converted relatively effectively to text and moderated in much the same way.
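As a toy illustration of what such rule-based automation looks like (not any platform's actual system; the patterns and function names here are invented for the example), a first automated pass over text can be as simple as pattern matching:

```python
import re

# Hypothetical example patterns; real moderation pipelines combine rules
# like these with machine-learning classifiers and human review.
BANNED_PATTERNS = [re.compile(p, re.IGNORECASE) for p in (
    r"\bfake cure\b",
    r"\bvote twice\b",
)]

def flag_post(text: str) -> bool:
    """Return True if the post matches any banned pattern."""
    return any(p.search(text) for p in BANNED_PATTERNS)

print(flag_post("This FAKE cure works!"))    # True
print(flag_post("Election day is Tuesday"))  # False
```

Transcribed audio can be fed through exactly the same check, which is why better speech-to-text effectively extends text moderation to audio posts.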
The real challenge is video. Current video analysis technology cannot distinguish between a man, a cat and a terrorist, so human moderators have to view thousands of hours of footage to decide whether it needs removing. They can then create a video fingerprint so that the footage is immediately recognised and blocked if anyone tries to repost it. Once tagged in this way, attempts to repost video of the Christchurch shootings were blocked by Facebook over a million times.
However, if the frame rate is changed or a colour filter is used then a fresh video fingerprint is required. The same fingerprint approach is used to block known child abuse images from being posted online. A library of fingerprints of such images is maintained for this purpose.
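To see why a fingerprint can be both effective and fragile, here is a minimal sketch that assumes a frame is just a grid of greyscale pixel values. Real systems use far more robust perceptual hashes than this toy average hash; the function names and sample values are invented for the example:

```python
def average_hash(frame):
    """Fingerprint a frame: one bit per pixel, set if the pixel is above the mean."""
    pixels = [p for row in frame for p in row]
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Count differing bits between two fingerprints; 0 means a match."""
    return bin(a ^ b).count("1")

original = [[10, 200], [30, 220]]          # toy 2x2 greyscale frame
brightened = [[p + 20 for p in row] for row in original]   # mild filter
inverted = [[255 - p for p in row] for row in original]    # colour inversion

print(hamming(average_hash(original), average_hash(brightened)))  # 0 (still matches)
print(hamming(average_hash(original), average_hash(inverted)))    # 4 (no longer matches)
```

A uniform brightness shift leaves this hash unchanged because the mean shifts with it, but inverting the colours flips every bit, so the doctored copy no longer matches and a fresh fingerprint is needed, which is exactly the weakness described above.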
Regulation and antitrust
While some in the US are calling for reform of the 25-year-old section 230, others are seeking a full antitrust hearing against the US tech giants, starting with Facebook and looking at others like Google.
Meanwhile, the EU has followed up GDPR, the regulations on privacy, with the draft Digital Markets Act (DMA). Along with the Digital Services Act (DSA), the DMA represents the first major overhaul of EU internet legislation in the 21st century. Together the DMA and DSA seek to address the monopoly power of the tech giants by proposing sweeping pro-competition regulations and applying serious penalties for noncompliance.
The EFF has also given the DMA a favourable initial review. As it explains: “The Commission’s draft is just a starting point: it will go through many iterations and amendments before it is put to votes at the European Parliament and the Council of the EU (which represents the governments of EU member states). As starting points go, there’s a lot to like in this document, as well as room for improvement.”
In parallel, the UK has created the Digital Markets Unit to oversee similar regulatory and antitrust action and to “introduce and enforce a new code to govern the behaviour of platforms that currently dominate the market, such as Google and Facebook”.
Even while the draft DMA and DSA are being discussed, Google and regulators in both France and Australia have clashed over payments for news content.
In the UK, national newspaper publishers and a number of other publications have formed a consortium that uses a licensing company, Newspaper Licensing Agency Limited (NLA), to enforce copyright protection on their literary works. If your company is featured in an article in the press and you want to include details of the article on your website, then you need an NLA licence to do so. This is where things start to get complicated.
The internet was set up to allow people to link to each other for free. However, it has been established that copy from an article or publication falls under copyright protection. This includes snippets (the excerpts of up to 256 characters that you see in news searches) as well as the title of the article. So you can use the name of the publication and a link on your website for free, but if you include the title or any copy from the article, then you need to pay.
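The rule above can be summarised in a short, hypothetical helper. The function name and the treatment of the 256-character limit are this sketch's assumptions, not the NLA's actual licensing logic:

```python
SNIPPET_LIMIT = 256  # maximum snippet length cited in the article

def needs_licence(link: str, title: str = "", snippet: str = "") -> bool:
    """Return True if this press reference includes licensable copy.

    A bare publication name plus link is free; including the article
    title or any snippet of the text requires an NLA licence.
    """
    if len(snippet) > SNIPPET_LIMIT:
        raise ValueError("snippet exceeds the 256-character limit")
    return bool(title or snippet)

print(needs_licence("https://example.com/article"))                             # False
print(needs_licence("https://example.com/article", title="Company wins award")) # True
```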
National regulators have seen quality journalism and a vibrant news industry as critical to a functioning democratic society. Unfortunately, newspapers and magazines have been in financial decline for some time, a trend that predated the internet. The position of the news organisations has been further undermined by what the Australian government has called “bargaining power imbalances”, by which they mean publishers need Google and Facebook far more than Google and Facebook need the publishers.
Google News has become the dominant search engine for news. Historically, instead of paying publishers, Google has argued that they benefit enough from the traffic it directs to their news sites. Sensing mounting pressure to do more, it announced a $1bn investment in a partnership with news publishers in October 2020, termed the News Showcase.
Local regulators, however, have demanded more. In France, Google has agreed a deal to pay local publishers, but in Australia it has refused to do so, threatening instead to shut down the Google News service there. The difference between the two decisions can be explained by the context.
In France, publishers are being paid for the snippets of news featured in the Google News Showcase platform. This is in line with the principle that the article titles and snippets are subject to copyright. The proposed new law in Australia would not only require Google to pay publishers for links featured on the search engine but would also require it to notify them of impending changes to its algorithms. As Tim Berners-Lee, the inventor of the world wide web, put it, seeking to charge for links would be at odds with the basic tenets of the internet.
In addition, Google sees its algorithms as some of its most valuable intellectual property, so sharing changes with publishers in advance or being in any way restricted by them is seen by the search giant as unacceptable.
How these initial skirmishes play out will help shape parts of the EU’s DMA and DSA as well as the expected code from the UK’s DMU. The full scope of these regulations, however, is far broader than this. We will be following developments with interest, as these regulations will help define the digital landscape for a generation to come, in much the same way that GDPR has become a global standard for privacy regulation.
Founder and CEO of CrisisTeam.co.uk (SiliconANGLE global Startup of the Week – May 2019), an elite team of experts in incident response, cyber law, reputation management and social influence that help clients minimize the impact of cyber incidents. Previous cloud strategist at UKCloud (the...