Amid growing concerns that Facebook is attempting – and succeeding in its attempts – to influence the government, it has been revealed that the tech giant has recruited ten former policy officials over the past 18 months. With each new hire, Facebook has acquired more insider knowledge of the regulatory process. However, Facebook certainly isn’t the only culprit here. The Times discovered that, in the past five years, at least 14 special advisers had moved to tech companies including Uber, Google and Facebook after working in ministerial offices.
Back in April, while the UK was preparing to introduce legislation to crack down on offensive online content, it was announced that Tony Close (previously Director of Content Standards at Ofcom) would become Director of Content Regulation at Facebook in August. As the UK’s media watchdog, Ofcom is of course responsible for keeping users safe on social media, and it faced backlash for allowing such an influential figure to make this move at such a pivotal point in the process.
Numerous Conservative MPs have voiced concerns about Facebook’s intent to hire insiders who have a direct, full understanding of upcoming developments in policy, as well as extensive networks and connections to those who will be turned to for guidance. Damian Collins, Conservative MP and former Chairman of the Digital, Culture, Media and Sport Committee, even boldly stated that ‘[Facebook] are doing this to try and change the direction of policy before it is even launched’ – a remark that only confirmed mounting suspicions.
Formerly, the banking, oil and pharmaceutical industries held the greatest lobbying power – yet now, with their vast stores of data, it is Big Tech companies that wield the most influence. This raises the question: are tech giants recruiting ex-policymakers to genuinely ensure their policies are as effective as possible (as they claim), or to continue gaining influence and political momentum?
In the US, there’s been much debate about Facebook’s impact on the upcoming 2020 election. In response, Facebook has teamed up with 17 academics to launch a new research project which will determine 'once and for all' whether Facebook is influencing political polarization, voter participation, trust in democracy and the spread of misinformation. Up to 400,000 US citizens are expected to take part in the study and must opt-in to allow Facebook to monitor their online behaviour.
However, there’s one major flaw in the study: the results won’t be available until after the 2020 election has happened, and they’re not expected to reach the general public until mid-2021. Experts have been sceptical about the research, and understandably so. It’s heavily focused on how users process and interact with the content available to them, but doesn’t necessarily account for changing algorithms, fake accounts and private groups.
So, in 2021, Facebook may come forward with further evidence of its ‘innocent’ behaviour – but can it be trusted?
If you’re looking to fill a vacant role within policy, public affairs or communications, we’re here to help with all of your recruitment needs. Simply get in touch with us to find out more.