House of Lords - Digital Technology and the Resurrection of Trust - Select Committee on Democracy and Digital Technologies

amarashar's bookmarks 2020-07-06

Summary:

The Government’s Online Harms programme presents a significant first step towards restoring trust. It needs to happen; it needs to happen fast; and the necessary draft legislation must be laid before Parliament for scrutiny without delay. The Government must not flinch in the face of the inevitable and powerful lobbying of Big Tech and others that benefit from the current situation.

Well-drafted Online Harms legislation can do much to protect our democracy. Issues such as misinformation and disinformation must be included in the Bill. The Government must make sure that online platforms bear ultimate responsibility for the content that their algorithms promote. Where harmful content spreads virally on their service or where it is posted by users with a large audience, they should face sanctions over their output, as other broadcasters do.

Individual users need greater protection. They must have redress against large platforms through an ombudsman tasked with safeguarding the rights of citizens.

Transparency of online platforms is essential if democracy is to flourish. Platforms like Facebook and Google seek to hide behind ‘black box’ algorithms which choose what content users are shown. They take the position that they are not responsible for harms that may result from online activity. This is plain wrong. The decisions platforms make in designing and training these algorithmic systems shape the conversations that happen online. For this reason, we recommend that platforms be mandated to conduct audits to show how, in creating these algorithms, they have ensured, for example, that they are not discriminating against certain groups. Regulators must have the powers to oversee these decisions, with the right to acquire from platforms the information they need to exercise those powers.

Platforms’ decisions about what content they remove or stop promoting through their algorithms set the de facto limits of free expression online. As it currently stands, the rules behind these decisions are poorly defined. Their practical operation should reflect what the public needs. In order to protect free and open debate online, platforms should be obliged to publish their content decisions, making clear what the actual rules of online debate are.

Alongside establishing rules in the online world, we must also empower citizens, young and old, to take part as critical users of information. We need to create a programme of lifelong education that will equip people with the skills they need to be active citizens. People need to be taught from a very young age about the ways in which platforms shape their online experience.

The public need to have access to high-quality public interest journalism to help inform them about current events. This requires fair funding to support such journalism.

Platforms must also be forced to ensure that their services empower users to exercise their rights online. The public need to understand how their data is being used. We propose that this obligation of fairness by design should be a core element in ensuring platforms meet their duty of care to their users.

Parliament and government at all levels need to invest in technology to engage better with the public.

Electoral law must be completely updated for an online age. There have been no major changes to electoral law since the invention of social media and the rise of online political advertising. As the Law Commission recently pointed out, a wholesale revision of the relevant law is now needed. This should include rules that set standards for online imprints on political advertisements, so that people can see who is behind them, and advert libraries that enable researchers and the public to see what campaigns are saying. The Electoral Commission needs the powers to obtain the information necessary to understand when individuals are breaking the rules, and to be able to set fines that act as a real deterrent against flagrant breaches. We also need to ensure that there is greater clarity around the use of personal data in political campaigns; the Information Commissioner’s guidance should be put on a statutory footing.

Link:

https://publications.parliament.uk/pa/ld5801/ldselect/lddemdigi/77/7702.htm

From feeds:

Ethics/Gov of AI » amarashar's bookmarks

Tags:

Date tagged:

07/06/2020, 11:07

Date published:

07/06/2020, 07:07