THE UK government is trying to crack down on social media giant Facebook after it was deemed ineffective in stopping harmful content and disinformation on the platform.
But is Facebook regulated, and what is the company doing to protect personal data and tackle fake news? Here’s everything you need to know.
Is Facebook regulated?
The social network is not currently regulated, but recently MPs have called for Facebook and other companies like Google to face a crackdown from a new watchdog.
Powerful firms should help users spot fake news and “nudge people towards reading news of high quality”, according to a new government report.
It would mean Facebook and Google would have to “give more prominence to public interest news”, to boost trust in the media and public engagement in politics.
The Cairncross review, commissioned by Theresa May, is a wide-ranging report into the future of news in the UK.
It says tech companies should be overseen by a news watchdog to create a “level playing field”.
This body could work like the Arts Council, channelling public and private funds to “those parts of the industry it deemed most worthy of support”.
The Society of Editors, which campaigns for media freedom, welcomed the findings but warned of the dangers of government-imposed regulation of the media.
In November, experts warned that Facebook needed regulation in relation to the way it uses personal data too.
Data tsar Elizabeth Denham said Facebook still had “a long way to go” to “change practices so that people have trust.”
She called for politicians to move quickly to regulate the tech giant, telling MPs: “I think we have seen some evidence of Facebook being more transparent, but I think they need to do more, and I think they should be subject to greater oversight.”
How has Facebook violated privacy laws?
Last year Facebook was caught up in a data breach controversy that saw info on 50 million users exposed and harvested by Cambridge Analytica.
This information was allegedly used to map out voter behaviour in 2016 for both the Brexit campaign and the US presidential election.
Cambridge Analytica is a British company that helps businesses “change audience behaviour”, and supposedly helped get US President Donald Trump elected and aided the Brexit Leave campaign.
The company used the data to build psychological profiles of Facebook users, to create better political campaigns that could sway their views.
Back in 2015, a Cambridge psychology professor called Aleksandr Kogan built an app called “thisisyourdigitallife”.
The app was a personality quiz that asked Facebook users for information about themselves.
Kogan’s company Global Science Research had a deal to share info from the app with Cambridge Analytica.
Roughly 270,000 Facebook users signed up and took personality tests.
But the app also collected the information of each user’s Facebook friends, who couldn’t possibly have provided consent.
Facebook has flatly denied that the fiasco was even a data breach.
They say Kogan’s app picked up information in “a legitimate way”.
However, they admit that their rules were violated when the data was sold on to Cambridge Analytica.
In a damning report, British lawmakers said internal documents showed Facebook “violated” laws by selling people’s private data without their permission and by crushing rivals through anti-competitive behaviour.
“Companies like Facebook should not be allowed to behave like ‘digital gangsters’ in the online world... considering themselves to be ahead of and beyond the law,” the report states.
The UK parliament’s Digital, Culture, Media and Sport Committee said that for years, Facebook was willing to “override its users’ privacy settings” as part of a campaign to maximise revenue derived from such sensitive information.
Facebook Data Policy – what do they know?
Facebook admits collecting the following data…
- Things you do when you use Facebook
- The information you provide to Facebook
- The information other people submit about you, including info, photos, and messages sent to you
- Your networks and connections
- Information about payments made through Facebook
- Device information about the gadgets you use to access Facebook
- Location information, uncovered through your device
- Information from websites and apps that use Facebook services
- Information from third-party partners, like advertisers
- Information from Facebook-owned companies, like WhatsApp and Instagram
How is Facebook tackling fake news?
The Digital, Culture, Media and Sport Committee has said untrue stories spread by foreign powers were putting the UK’s democracy at risk.
“The guiding principle of the ‘move fast and break things’ culture often seems to be that it is better to apologise than ask permission,” committee chairman Damian Collins said.
“We need a radical shift in the balance of power between the platforms and the people.”
Collins said the age of inadequate self-regulation must come to an end.
“The rights of the citizen need to be established in statute, by requiring the tech companies to adhere to a code of conduct written into law by Parliament, and overseen by an independent regulator,” he said.
The committee also said democracy was at risk from the malicious and relentless targeting of citizens with disinformation and personalised adverts from unidentifiable sources.
And social media platforms were failing to act against harmful content and respect the privacy of users.
Companies like Facebook were also using their size to bully smaller firms that relied on social media platforms to reach customers, it added.
BUCK UP ZUCK: What did the report call for?
- A compulsory code of ethics for tech companies, overseen by an independent regulator.
- The regulator to be given powers to launch legal action if companies breach the code.
- The government to reform current electoral laws and rules on overseas involvement in UK elections.
- Social media companies to be forced to take down known sources of harmful content, including proven sources of disinformation.
- Tech companies operating in the UK to be taxed to help fund the work for the Information Commissioner’s Office and any new regulator set up to oversee them.
In response, Facebook said: “We share the Committee’s concerns about false news and election integrity and are pleased to have made a significant contribution to their investigation over the past 18 months, answering more than 700 questions and with four of our most senior executives giving evidence.
“We are open to meaningful regulation and support the committee’s recommendation for electoral law reform. But we’re not waiting.
“We have already made substantial changes so that every political ad on Facebook has to be authorised, state who is paying for it and then is stored in a searchable archive for seven years.
“No other channel for political advertising is as transparent and offers the tools that we do.”
Facebook has also said it will work with independent fact-checkers to verify articles about health news, an area which has been hit hard by the spread of fake news and misinformation on social media.
A spokesperson said: “When reviewing content we always try to strike a balance between allowing free speech and tackling misinformation – which is why we don’t prevent people from saying something that is factually incorrect, particularly if they aren’t doing so deliberately.
“However, we do take steps to ensure this kind of misleading content is demoted in people’s News Feeds to give it less chance of being seen and spread and – ultimately – to discourage those posting it.”