algorithm

Algorithms destroy reputations. Ban them.

The mass scale of social media made its creators think first about investor returns and profit margins, and only then about usability and features. Everyone could post whatever they wanted, as long as some basic rules were followed: sex and violence, treated as equally dangerous “violations”, could not be depicted. On violence, everyone agrees. We’re not here to defend mass shootings being broadcast on Instagram Live; this blog couldn’t be more opposed to that. But when it comes to sex, I’m not watching it live on camera either. I’ve written before about the world of cam models, which is a different thing, but nobody seemed to listen. It still counts as a violation, and yet these same models keep profiles on Instagram and Twitter.

Violence is never the answer, and moderation needs to exist. But when someone fails to convince you that they’re right about something, and you realize they’re trying to control you, you might change your behavior. That’s how most teenagers end up using drugs: the effects on their bodies produce a desired sense of otherness and present an alternative. When it comes to sex, everyone is online but nobody does it there anymore, since we found out how easily we could be identified.

But that isn’t really the question. The main problem is that everything we do, whether the cam is on or not, is being tracked. The time spent reading a news story is weighed against the time spent watching cam models, and out of that comes a reputational system. What they don’t realize is that the reasons one might have for seeking pleasure, setting addiction aside, never get mapped out. The algorithm doesn’t say: “this user is lonely”. The algorithm says: “this user spent 4 hours watching cams”. And that repeats itself every single day.
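To make the point concrete, here is a minimal sketch of the kind of scoring described above. It is entirely hypothetical, not any platform’s actual code: all names (`ReputationTracker`, `log_session`, `label`) are invented for illustration. All it knows is hours per category; it has no concept of why the person was there.

```python
from collections import defaultdict

# Hypothetical sketch of a naive reputational scorer: it only ever sees
# how long a user spent on each content category, never the reason why.
class ReputationTracker:
    def __init__(self):
        # user_id -> category -> total hours logged
        self.hours = defaultdict(lambda: defaultdict(float))

    def log_session(self, user_id: str, category: str, hours: float) -> None:
        """Record time spent; this is all the system remembers."""
        self.hours[user_id][category] += hours

    def label(self, user_id: str) -> str:
        """Reduce a person to their single most-watched category."""
        categories = self.hours[user_id]
        if not categories:
            return "no data"
        top, spent = max(categories.items(), key=lambda kv: kv[1])
        return f"this user spent {spent:.0f} hours on {top}"


tracker = ReputationTracker()
tracker.log_session("u42", "news", 0.5)
tracker.log_session("u42", "cams", 4.0)   # the only thing that gets remembered
print(tracker.label("u42"))               # -> "this user spent 4 hours on cams"
```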

If we don’t change the way we think about how social media is run, we won’t have a say when it comes to our jobs. Imagine trying to convince people that what you really care about is gender equality when all you do is watch Pornhub. That doesn’t hold up anymore. And the platforms think they’re really clever: they don’t say who’s paying for ads on the site, and they don’t say what’s being done with our data. But they will, inevitably, judge us based on that same data. Who convinced people this was fair?

Algorithms might make our experience better, but in the end, even when they’re excelling, we’re left with the feeling of being shown “more of the same”. The need for new experiences is what drives creativity and discovery; we’ll always have that. The recent developments in AI technology invading conversations, lesson plans, customer service and other areas are moving in an astonishingly wrong direction. Algorithms are a form of control. But we control what we do and choose what we say, because that’s the stuff we want. Don’t be convinced of the opposite.
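As a rough illustration of why “more of the same” is baked in, here is a hypothetical toy recommender (names and data invented for this sketch) that ranks candidates purely by how often the user already engaged with that category, so the unfamiliar can never win:

```python
from collections import Counter

# Toy illustration: a recommender that scores candidates only by past
# engagement with their category, so novelty always loses the ranking.
def recommend(history: list[str], candidates: list[tuple[str, str]], k: int = 3) -> list[str]:
    """history: categories the user already watched.
    candidates: (item_id, category) pairs to choose from."""
    seen = Counter(history)
    ranked = sorted(candidates, key=lambda item: seen[item[1]], reverse=True)
    return [item_id for item_id, _ in ranked[:k]]


history = ["cams", "cams", "cams", "news"]
candidates = [("a", "cams"), ("b", "poetry"), ("c", "cams"), ("d", "news")]
print(recommend(history, candidates))  # ['a', 'c', 'd'] -- the familiar always outranks the new
```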
