There’s no hiding the power that has accrued to tech companies. As they take action to temper the social problems arising from their platforms, is self-regulation enough?
In a 2017 interview with Google CEO Sundar Pichai, journalist Jemima Kiss notes that Google has consistently sought to underplay its geopolitical influence by sidestepping controversies. But, as Kiss adds, running a company with more customers than the population of the earth is a challenging business, and politics demands attention.
As the list of concerns and accusations about tech companies has grown, from the viral spread of rumours and lies masquerading as news to racist search results, worries over data collection, and collusion with the NSA, Silicon Valley has steadily lost its earnest charm.
Academic Geert Lovink, in his 2016 book Social Media Abyss, reminds us of the old Google slogan “Don’t be evil”, arguing that it had to be dropped when the company realised that sometimes it’s necessary to ‘think with Evil’, because it is economic, not social, metrics that incentivise tech companies.
Areeq Choudhary is Chief Executive and Founder of WebRoots Democracy, a thinktank that examines the intersection of technology and the democratic process. Seeking to tackle issues such as low political participation amongst young people and minorities, the campaigner sees huge potential in online tools. Referring to a report published in June 2017, for example, Choudhary describes how people with disabilities and visual impairments are amongst the major beneficiaries of tech advances.
Even as a self-described ‘techno-optimist’, Choudhary is wary of some of the problems raised by technology’s rise in social and political life. Yet he questions to what extent these are new issues, as opposed to familiar problems in another setting. ‘Fake news’ can be likened to a rumour mill, and ‘filter bubbles’ (the idea that online we interact only with people who agree with us) resemble our self-selected offline social circles. The only difference, Choudhary suggests, is the scale.
But when the scale is this significant, can we argue that we are still talking about the same beast? Or does the scale mean that these issues present novel concerns that need to be treated as such?
The WebRoots founder argues that we need to see a combination of regulation and digital literacy education to fully harness the socially positive potential of tech tools.
Legal frameworks are far behind, resulting in a lack of clarity about what is permissible online. ‘Dark’ political advertising through social media, for example, which unlike a billboard is only ever seen by its targeted recipient, means that regulators cannot monitor compliance in terms of content or spending. The Cambridge Analytica revelations, that user data was being harvested without permission for political microtargeting, are testament that these innovations require new laws to ensure transparency and accountability. As things stand, current laws would only penalise a data leak, yet what is really at stake is how data is collected without reasonable informed consent and put towards ends that the user will likely never know.
Regulation alone isn’t sufficient, though. Issues such as the circulation of false information, and even targeted political messaging, also require better education of young people and adults alike. WebRoots Democracy’s June 2017 report, titled Fake News, recommends a combination of teaching people how to critically analyse the media they consume and educating them in how to make better use of online tools, which can themselves support critical analysis.
The growing controversies are driving tech firms to act, eager to avoid the spectre of legislation that has loomed large since Davos 2018, where the World Economic Forum meeting saw George Soros warn against advances towards a “web of totalitarian control”.
“They claim they are merely distributing information,” he charged. “But the fact that they are near-monopoly distributors makes them public utilities and should subject them to more stringent regulations, aimed at preserving competition, innovation, and fair and open universal access.”
But is the unilateral action being taken by internet firms to evade external involvement even more concerning?
In early 2018, Facebook funded a project by the Institute for Strategic Dialogue that involved identifying and contacting people at risk of political extremism through Facebook Messenger. In another initiative, seeking to ‘crowdsource’ its norms, the company was criticised for a poll asking users whether Facebook should decide if adult men could use the site to solicit sexual images from children.
These instances are jarring in that they show Silicon Valley’s willingness to act for the purported common good, but guided by a moral compass of the companies’ own making. If social media platforms are going to play such a significant role in social and political life, and if they are actually going to enhance the democratic process, then we need more than self-regulation: we need transparency and accountability.
Image: ‘Facebook testify Zuckerberg’, Flickr/Stock Catalog (CC BY 2.0), www.thoughtcatalog.com