Everyone’s heard of Omegle. When the internet still relied on Microsoft’s MSN Messenger and webcams had about half a pixel of definition, we didn’t really think that talking to a person on video was going to be massively adopted, but it was fun, for those who had a webcam, to have a video call. Of course, hardly anybody had a laptop, and many barely had desktops; besides, the laptops of 2006 didn’t come with a built-in webcam as standard. You had to invest in that, and many of us focused on other stuff. There were, of course, people who thought this was the secret of success, and they were provided with the best internet speeds and the best image quality. We didn’t even know what cameras were, and used to call them “digital cameras”. Smartphones didn’t exist. If you wanted to send a dick pic on the blue brick Nokia, you’d have to type the number eight for the balls, followed by many “equals” signs, ending with a D, and perhaps a tilde if you were creative. Today, of course, there’s the eggplant emoji, but there’s also a culture of FaceTime from Apple and countless video-calling services; a culture of image and a more advanced culture of video, exemplified by TikTok. The image quality on TikTok is not as important as your content, though one could argue the opposite. The point is: these things didn’t exist when millennials were growing up, and because that generation had to negotiate its own moral standards and others’ perceptions of how to build credibility on social networks, which came right after messaging platforms, the things that exist today are discredited for not being representative of the struggle of being deprived of mass communication.
An example of mass communication was torrent-based websites. These were very strongly condemned by the music industry, and maybe less so by the film industry. The end of CDs completely destroyed what artists, and record companies, knew how to do. Nobody really talks about it. But film production worked with a different concept. It wasn’t audio, it was video. This was more expensive, so the argument would be that their profit was going to be made in movie theaters and VHS rental stores, plus the TV contracts to exhibit movies nationwide and other sorts of advertising, generically speaking (a business analyst will always prove you wrong on Twitter). This was an example of mass communication. Not people talking, but people watching the same thing, and, in theory, a message being propagated and later debated for a long time. Of course, today messages are exchanged at a rate so fast that notifications need to be turned off just to make the phone usable. And the messages are going to be ignored. People selectively choose who to talk to and who to give attention to, while working on the assumption that most people who come to them are probably going to waste their time, and that whatever the platform has to offer is a chance to gain better ground on the digital landscape, obeying the algorithm in order to get what they want. If the algorithm said explicitly, “next, we need a picture of you on the beach”, you’d willingly do it, be very happy with the results, and moreover trust the platform’s ability to choose what’s best for you, praising the revolution technology has brought us, in one age group, and actually demanding better outcomes, in another. The problem today is that not enough likes leads to depression, and there’s very much a consensus on how to deal with whatever’s unwanted, but eventually someone stands out and needs to be either blocked or exposed, maybe even reported to the cops. As adults, we understand that this is a long and exhausting process; but also as adults, we understand that technology didn’t keep pace when it comes to monitoring and reporting. If you analyze reporting flows and problem types across platforms, you’ll find options ranging from “I’m not interested in this content” to “it’s posting content that shouldn’t be here”. That is wildly, irresponsibly generic, and there are not yet specific mechanisms to say “I’ve been harassed by this user” and how, with a description of the problem. The platforms probably think that would lead to a lot of false reporting, or maybe a lot of work, and prefer the generic terms and the unhinged internet operating in the shadows, while they successfully set up the Official Bullshit Policy. Few people understand that it’s everybody’s role to hold them accountable for bad user experiences. And back to torrents: they were stolen content, but they delivered an amazing, exclusive user experience.
The problem with video is that it’s far too fertile a territory for semiotics. File format, account type, duration, definition, size: that’s one aspect of it. Discursive category, general category, elements in the video, descriptions of those elements, captions, sound and content appropriateness: that’s another challenge entirely, probably a point for AI specialists to look at. Who’s in the video, doing what? That’s what everyone wants to know. But that’s not much debated; we either like it or we don’t. The number of people who just want to see someone doing something and don’t even care who it is (again, look at TikTok) has been scaling up for a while, and the maximum expected engagement seems to be the like. That’s why people can’t stop talking about companies investing in AI: because that’s the blueprint for developing their systems. When someone likes a video, the most attractive form of engagement on the web, the algorithms improve. Few people understand what’s at stake, and the ethical problems (like, say, the role of humor in influencing people, and humor’s relationship with scorn) are ignored. Needless to say, academics have studied that, but maybe those same academics have seen funding at the institutions they work for cut for some reason (in this economy? Nobody’s interested in Philosophy).
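To make that first distinction concrete, here is a minimal sketch, in Python, of the two layers of description as a data structure; every field name is my own illustration, not any platform’s actual schema. The first layer is cheap to extract automatically; the second is where the semiotic trouble, and the AI work, lives.

    # Hypothetical sketch: two layers of video description.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class TechnicalLayer:
        # The "file" side: cheap to extract, easy to index.
        file_format: str        # e.g. "mp4"
        account_type: str       # e.g. "verified", "anonymous"
        duration_seconds: float
        definition: str         # e.g. "1080p"
        size_bytes: int

    @dataclass
    class SemioticLayer:
        # The "meaning" side: expensive, ambiguous, the part handed to AI.
        discursive_category: str                 # e.g. "humor", "tutorial"
        general_category: str                    # e.g. "entertainment"
        elements: List[str] = field(default_factory=list)              # who or what appears
        element_descriptions: List[str] = field(default_factory=list)  # what they are doing
        captions: str = ""
        sound_description: str = ""
        content_appropriate: bool = True         # the judgment call everything hinges on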
But a curious case stands out. Among mass communication planning attempts, there is Omegle, a website whose monitoring, algorithms and security are all very questionable. The point was to let real people chat instantly, one stranger after another. The role that image plays in this kind of proposal makes us reassess things like beauty standards, gender roles and age groups. It also makes us look at how much of the time we spend looking for quality interactions is diverted and plainly stolen from us. Everyone knows, once they reach a certain age, that Omegle is a pure waste of time; but not everyone is willing to ask the tougher questions: are there robots collecting data about kids and teens? Are hackers collecting IP addresses and invading accounts in order to steal documents, gain remote access to devices and plan organized attacks on groups of people, getting paid on demand? Is my kid safe making friends using the only thing available, the smartphone or laptop? Are they old enough to speak for themselves, make the right assessments of people and maybe foster relationships from the beginning in the online environment? And does it have to happen there? How often? The possibility of talking to anybody on the planet with an internet connection might seem like the dream of the makers of the internet, but we’ve come to realize there are bad actors (and actresses, if we’re applying gender parity, confused or not about whether we should). How bad are they? That seems to be the million-dollar question (and for the better informed, billion with a B). As it turns out, most people are not interested in talking. But why is that? Is it trauma? Is it context? Is it lack of training and education, or a politeness problem? Culture? Personality? Personal history? What word would you pick to describe Omegle’s main problem? Reliability, perhaps? Or do you keep the mainstream platforms’ concepts in mind and question reach and reputation? What criteria are you going to use to classify what’s appropriate or not, for whom, according to whom, and further, what is being constructed when a user enters the website?
Of course, sometimes we look at our lives and end up talking to a close contact, who will say: “you just need some new friends”. I personally had a very close person tell me this in a different way: “you need to find your hangout”. That is, of course, a translation; but it’s surprisingly suggestive, isn’t it? Actually, nobody says “hangout” as a noun. People say “hang” as a verb, and maybe what he said was more like “you need to hang with the right people”. Another guy told me: “if you wanna hang with us, you gotta be on our side for real”. Too gangster for you? Well, that happened. And it wasn’t on Omegle, you see. But how many people are we actually “hanging” with on Omegle? Nobody. Maybe, if we’re lucky, one person every month, and that’s spending 20 hours a week on the platform. Isn’t that proof that something’s essentially wrong there? The specialists would call this data analysis. And in fact, go to Coursera and search for Google’s Data Analytics course, and they’ll suggest you do your own data analysis with your own information. Things like how many cups of coffee you have every day are supposed to be written down and tallied at the end of a certain period, and then you’d have your data. Later, you’d compare it with glasses of water, juice, milk and beer, for example. You’d make a graph and you’d be your own nutritionist (something like the sketch after this paragraph). Interesting suggestion? Apple has devices that measure our bodily functions; Amazon collects information about our homes; Peloton focuses on health, allegedly; Uber makes mobility possible to map. But so do GPS, the doctor and the neighborhood. Who are you on Omegle: the GPS, the doctor or the neighborhood? Maybe none of these: you’re just an anonymous user.
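For what it’s worth, here is a minimal sketch of that self-tracking exercise in Python, with a made-up few days of counts standing in for real logs; the drink names and numbers are illustrative only, not data from the course.

    # A made-up log: one entry per day, counting cups or glasses of each drink.
    from collections import defaultdict
    from statistics import mean

    log = [
        {"coffee": 3, "water": 5, "beer": 0},
        {"coffee": 4, "water": 4, "juice": 1},
        {"coffee": 2, "water": 6, "milk": 1, "beer": 2},
    ]

    drinks = ["coffee", "water", "juice", "milk", "beer"]
    totals = defaultdict(list)
    for day in log:
        for drink in drinks:
            totals[drink].append(day.get(drink, 0))

    for drink in drinks:
        counts = totals[drink]
        print(f"{drink}: total {sum(counts)}, daily average {mean(counts):.1f}")

    # A bar chart of those averages is the "graph" that makes you your own
    # nutritionist; the comparison between drinks is the whole analysis.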
Now, anonymity has been discussed with a lot of skepticism, even by specialists and activists. It may be an attempt to hide the tracks of something you’re not supposed to be doing. And we could talk about many things here, but I’ll let the reader decide what his or her biggest concern is when it comes to anonymous interaction and anonymous tracking. Recently, all websites were forced to display data collection notifications on their homepages. That’s anonymous tracking: you know the website is going to collect information about you, but you have no idea about the third parties who analyze your data and allegedly decide what the best ads for you are. Shall we go back to Omegle? You have no idea who you’re talking to. A civilized society would say that’s something to consider. And yet, the most common interactions involve people calling you a “nigger”, saying “you’re gay”, that they would “fuck your mom”, asking if you have “a sister”, or simply showing you the middle finger; some of them are rawer and tell you to “kill yourself”. What are we supposed to do with this data? Let’s not forget terms like “nonce” and “wanker”, and remember that people exchange information even when they’re expressly recommended not to: “what’s your Instagram?” or “are you on snap?” are very common. If Facebook had our “metadata” and was able to cross-analyze two people’s profiles and say how they met, and supposedly they had “Omegle” as a data point, what would that do to your reputation, with Facebook, with your friends, with the rest of the internet?
The challenges are many. Breaking stereotypes is one of them. “The girl you called a slut, she’s a virgin. The girl you pushed down the stairs, she already suffers abuse at home”, the internet goes. Is that the “woke” internet? Absolutely not. That’s just a thing people wrote to protect others and themselves against bullying, and it’s a great sort of campaign. But it wasn’t advertised on Instagram or Twitter. It was a copy-paste that circulated in even less populated places than Omegle. But if people apply this to Tinder, for example, they’ll see that expectations are what motivate us, and what stop us in our tracks. If we realize that the majority of people seem to be ill-motivated, we’re not going to be willing to participate. This is something we learn in school. But social media, with an immense lack of consideration for cross-disciplinary education, forgot to teach people that talking ill of others behind their backs, with no proof of what you’re saying, can constitute defamation; that indiscriminate screenshotting can constitute intellectual property theft; that sharing intimate photos of others can constitute a sex crime. And so far, we’re just worried that teenagers are going to see genitals. Don’t worry, they heard about Pornhub when their cousin mentioned it to them, at 8 years old, or maybe 13, or, in the minds of conservatives who don’t even believe themselves, 18, because we don’t just assume things, do we? And although the terms of use make it explicit that you should be in a certain age range to access these websites (Omegle recently changed its rules, but Pornhub has countless legal issues, most of them unspoken for the tired sake of convenience), the conscious choice of accessing them brings with it an understanding of risk and a strong sense of will and desire. If teenagers are taught that they can’t watch porn, they’ll want it more, because people are taking away their freedoms. Curiously, Omegle works with different assumptions. But here’s something that’s also relevant: if adults say they do not want any relationship or connection with the porn industry, they’ll be hunted down, and worse if they speak against it. While we spend time talking about the internet, sexual harassment in the workplace seems to be a less clickable story, simply because people need jobs, and a lot of people are denied them based on questionable criteria. With that in mind, it’s time we address the real problems, before “skepticism” becomes the new standard for a self-medicated internet on the verge of a psychotic episode, or worse: a place run by people with self-proclaimed ill intentions.