Democracy under threat from ‘pandemic of misinformation’ online – Lords Democracy and Digital Technologies Committee


The UK Government should act immediately to deal with a ‘pandemic of misinformation’ that poses an existential threat to our democracy and way of life. The stark warning comes in a report published today by the Lords Committee on Democracy and Digital Technologies.

The report says the Government must take action ‘without delay’ to ensure tech giants are held responsible for the harm done to individuals, wider society and our democratic processes through misinformation widely spread on their platforms.
 
The Committee says online platforms are not ‘inherently ungovernable’, but power has been ceded to a “few unelected and unaccountable digital corporations”, including Facebook and Google, and politicians must act now to hold those corporations to account when they are shown to negatively influence public debate and undermine democracy.
 
The Committee sets out a package of reforms which, if implemented, could help restore public trust and ensure democracy does not ‘decline into irrelevance’.
 
Publish draft Online Harms Bill now
 
The Government has failed to get to grips with the urgency of the challenges of the digital age and should immediately publish a draft Online Harms Bill that covers the impact of disinformation. This should give Ofcom, as the proposed Online Harms regulator, the power to hold digital platforms legally responsible for content they recommend to large audiences or that is produced by users with a large following on the platform.
 
The Committee points out that many content providers are in effect in business relationships with the platforms that host their content, and that the platforms have a duty of care to ensure the content is not harmful, either to individuals or to our shared democratic principles. This should be backed up by the power for Ofcom to fine digital companies up to four percent of their global turnover, or to require ISPs to block serial offenders.
 
Ofcom should also be given the power to ensure online platforms are transparent about how their algorithms work, so that they are not operating in ways that discriminate against minorities. To achieve this, Ofcom should publish a code of practice on algorithms, including internal and external audits of their effects on users with characteristics protected under the Equality Act 2010.
 
Regulate political advertising
 
The report calls for political advertising to be brought into line with other advertising in its requirement for truth and accuracy. It says the political parties should work with the Advertising Standards Authority and other regulators to develop a code of practice that would ban “fundamentally inaccurate advertising during a parliamentary or mayoral election or referendum”. This code would be overseen by a committee including the ASA, the Electoral Commission, Ofcom and the UK Statistics Authority, which would have the power to remove political advertising that breached the code.
 
This new regulation would be supported by a significant toughening of electoral law, including a requirement for online political material to carry an imprint indicating who has paid for it, real-time databases of all political advertising on online platforms, and an increase in the fines the Electoral Commission can impose on campaigners to £500,000 or four percent of total campaign spend, whichever is greater.
 
Introduce a digital ombudsman
 
The Committee calls on the Government to establish an independent ombudsman for content moderation, who would provide a point of appeal for people who have been let down by digital platforms. This would give the public a representative who could both compel the tech giants to take down inappropriate content and protect individuals from having their content unfairly taken down by platforms.
 
The Committee also makes recommendations for increasing digital media literacy and developing active digital citizens through changes to the school curriculum and adult digital literacy initiatives.
 
Commenting, Lord Puttnam, Chair of the Committee, said:
 
“We are living through a time in which trust is collapsing. People no longer have faith that they can rely on the information they receive or believe what they are told. That is absolutely corrosive for democracy.

“Part of the reason for the decline in trust is the unchecked power of digital platforms. These international behemoths exercise great power without any matching accountability, often denying responsibility for the harm some of the content they host can cause, while continuing to profit from it.
 
“We've seen clear evidence of this in recent months through a dangerous rise of misinformation about Covid-19. We have become aware of the ways in which misinformation can damage an individual's health, along with a growing number of instances where it is our collective democratic health that's under threat. That must stop – it is time for the Government to get a grip of this issue. They should start by taking steps to immediately bring forward a draft Online Harms Bill. We heard that on the current schedule the legislation may not be in place until 2024. That is clearly unacceptable.
 
“We have set out a programme for change that, taken as a whole, can allow our democratic institutions to wrestle power back from unaccountable corporations and begin the slow process of restoring trust. Technology is not a force of nature and can be harnessed for the public good. The time to do so is now.”
