It’s pretty clear that social media hasn’t been in a good place for a while.
Then again society hasn’t been in a good place either.
One isn’t entirely to blame for the other’s headaches.
But both nag at each other like an old couple on a never-ending circular coach journey who’ve used their last paracetamol and can’t remember who is to blame.
On Paul Sutton’s excellent Digital Download podcast this week, he chatted to Euan Semple about Euan’s earlier comment that social media isn’t broken, people are.
What they had to say was fascinating. Yes, there needs to be some online regulation, Euan said. But the problems in the world are caused by people, not technology:
In some ways the internet is a mirror and it’s showing our true natures, and the bits we don’t like have been more obviously evident than they have been in the past perhaps. That’s an opportunity for us to deal with it… and to accept that we need to do better.
Just to blame someone else or to expect social media companies to tidy it up and make it go away seems to me to be a missed opportunity. It’s our human characteristics that we are seeing and should be dealing with rather than just technology.
But what legislation?
Euan makes the point that there’s enough legislation already. He may be right, but there’s certainly not enough enforcement of it. The largely unchecked online abuse of MPs shows this to be true. Besides, he says, those demanding more legislation are often the ones who understand the internet the least. He’s broadly right on that. The US Congress’s grilling of Facebook’s Mark Zuckerberg is a case in point. So is the terrifying lack of understanding and regulation of AI.
What I think they’re talking about is a kind of change dilemma where change outstrips common understanding and effective regulation lags far behind.
A bit like this…
Paul and Euan say that if you’re using technology, there’s a moral obligation to do better.
But I’m not remotely convinced that people will pick up that obligation.
One fringe benefit of carrying out Facebook group research over the past three years has been joining scores of Facebook groups myself. I stick around to see how they behave. Like an online petri dish, conversations drop into my timeline. Often, unless there’s a good Facebook group admin, the group can quickly degenerate into abuse. Why? Because there are often no consequences, and someone decides they can’t be arsed to be nice. It comes back to frameworks and laws, no matter how gentle.
I don’t entirely buy the idea that there’s a volume control on mob rule, especially if you’re in the public sector. An arm of government can’t block out views it finds problematic. Echo chambers are not a great idea. If you live in one, you could wake up puzzled that the country has just voted to leave. Or be surprised that those people in the wilds of Montana you’ve ignored have armed themselves with guns and some pretty wild ideas.
For me, the future is in the past. I’d love to know how the advent of the printing press went down and how long it took to be seen as a force for good.
William Tyndale, who revolutionised the printed Bible but was burnt at the stake for it, may have a view on this.