Everyone’s heard of Omegle. When the internet still relied on Microsoft’s MSN Messenger and webcams had half a pixel of definition, we didn’t really think that talking to a person on video was going to be massively adopted. But it was fun, for those who had a webcam, to make a video call. Of course, almost nobody had a laptop, and barely anyone had a desktop; besides, the laptops of 2006 didn’t come with a built-in webcam as standard. You had to invest in that, and many of us focused on other things. There were, of course, people who thought this was the secret of success, and they were provided with the best internet speeds and the best image quality. We didn’t even know what cameras were, and used to call them “digital cameras”. Smartphones didn’t exist. If you wanted to send a dick pic on the blue brick Nokia, you’d have to type the number eight for the balls, followed by many equals signs, ending with a D, and perhaps a tilde if you were creative. Today, of course, there’s the eggplant emoji, but there’s also a culture of FaceTime from Apple and countless video-calling services; a culture of image, and a more advanced culture of video, exemplified by TikTok. The image quality on TikTok is not as important as your content, though one could argue the opposite. The point is: these things didn’t exist when the millennials were growing up. They all had to deal with their own moral standards and with others’ perceptions of how to build credibility on social networks, which came right after messaging platforms; so the things that exist today are discredited because they’re not representative of the struggle of being deprived of mass communication.
An example of mass communication was torrent-based websites. These were very strongly condemned by the music industry, and maybe less so by the film industry. The end of CDs completely destroyed what artists knew how to do, and what companies knew how to do as well. Nobody really talks about it. But film production worked on a different concept. It wasn’t audio, it was video. That was more expensive, so the argument would be that their profit was going to be made in movie theaters and VHS rental stores, plus the TV contracts to exhibit movies nationwide and other sorts of advertising, generically speaking (a business analyst will always prove you wrong on Twitter). This was an example of mass communication: not people talking, but people watching the same thing, and, in theory, a message being propagated and then debated for a long time. Of course, today messages are exchanged at a rate so fast that notifications need to be turned off just to make the phone usable. And the messages are going to be ignored. People selectively choose who to talk to and who to give attention to, working on the assumption that most people who come to them will probably waste their time, and that whatever the platform has to offer is a chance to gain better ground on the digital landscape, obeying the algorithm in order to get what they want. If the algorithm said explicitly, “next, we need a picture of you on the beach”, you’d willingly do it, be very happy with the results, trust the platform’s ability to choose what’s best for you, praise the revolution technology has brought us, in a certain age group, and actually demand better outcomes, in another. The problem today is that not enough likes leads to depression. There’s very much a consensus on how to deal with whatever’s unwanted, but eventually someone stands out and needs to be either blocked or exposed, maybe even reported to the cops.
As adults, we understand that this is a long and enduring process; but also as adults, we understand that technology hasn’t kept pace when it comes to monitoring and reporting. If you analyze the reporting flows and problem types across platforms, you’ll find options ranging from “I’m not interested in this content” to “it’s posting content that shouldn’t be here”. That is wildly, irresponsibly generic, and there are still no specific mechanisms to say “I’ve been harassed by this user”, and how, with a description of the problem. The platforms probably think that would lead to a lot of false reporting, or maybe a lot of work, so they prefer the generic terms and the unhinged internet operating in the shadows, while they successfully set up the Official Bullshit Policy. Few people understand that it’s everybody’s role to hold them accountable for bad user experiences. And back to torrents: they were stolen content, but they delivered an amazing, exclusive user experience.
The problem with video is that it’s far too fertile a territory for semiotics. File format, account type, duration, definition, size: that’s one aspect of it. Discursive category, general category, elements in the video, descriptions of those elements, captions, sound and content appropriateness: that’s another challenge, probably a point for AI specialists to look at. Who’s in the video, doing what? That’s what everyone wants to know. But that’s not much debated; we either like it or we don’t. The number of people who just want to see someone doing something and don’t even care who it is (again, look at TikTok) has been scaling up for a while, and the maximum expected engagement seems to be the like. That’s why people can’t stop talking about companies investing in AI: that signal is the blueprint for the development of their systems. When someone likes a video, the most attractive form of engagement on the web, the algorithms can be improved. Few people understand what’s at stake, and the ethical problems (say, the role of humor in influencing people, and humor’s relationship with scorn) are ignored. Needless to say, academics have studied that, but maybe those same academics have seen funding at the institutions they worked for cut for some reason (in this economy? Nobody’s interested in Philosophy).
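The two layers of description above, the technical facts a server can read from a file versus the interpretive labels that need humans or models, can be sketched as a data structure. This is a hypothetical illustration; the field names are mine, not any platform’s actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class TechnicalMetadata:
    # The "easy" layer: properties readable from the file and account.
    file_format: str       # e.g. "mp4"
    duration_s: float      # length in seconds
    definition: str        # e.g. "1080p"
    size_bytes: int
    account_type: str      # e.g. "personal", "business"

@dataclass
class SemanticMetadata:
    # The hard layer: interpretive labels, where the semiotics lives.
    discursive_category: str              # e.g. "humor", "tutorial"
    elements: list = field(default_factory=list)  # who/what appears, doing what
    captions: str = ""
    sound_appropriate: bool = True
    content_appropriate: bool = True

@dataclass
class Video:
    technical: TechnicalMetadata
    semantic: SemanticMetadata
```

The point of splitting the two classes is exactly the essay’s point: the first can be filled automatically and reliably; the second is contested territory, and any value placed there is already an act of interpretation.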
But a curious case stands out. Among mass communication experiments, there is Omegle, a website whose monitoring, algorithms and security are all very questionable. The point was to let real people chat instantly, one stranger after another. The role that image plays in this kind of proposal makes us reassess things like beauty standards, gender roles and age groups. It also makes us look at how much of the time we spend looking for quality interactions is diverted and plainly stolen from us. Everyone knows Omegle is a pure waste of time, once they reach a certain age; but not everyone is willing to ask the tougher questions. Are there robots collecting data about kids and teens? Are hackers collecting IP addresses and invading accounts in order to steal documents, gain remote access to devices and plan organized attacks on groups of people, getting paid on demand? Is my kid safe making friends using the only thing available, the smartphone or laptop? Are they old enough to speak for themselves, make the right assessments of people and maybe foster relationships from the beginning in the online environment? And does it have to be there? How often? The possibility of talking to anybody on the planet with an internet connection might seem like the dream of the makers of the internet, but we’ve come to realize there are bad actors (and actresses, if we’re applying gender parity, confused or not about whether we should). How bad are they? That seems to be a million-dollar question, and for the better-informed, billion with a B. As it turns out, most people are not interested in talking. But why is that? Is it trauma? Is it context? Is it lack of training and education, or a politeness problem? Culture? Personality? Personal history? What word would you pick to describe Omegle’s main problem? Reliability, perhaps? Or do you keep the mainstream platforms’ concepts in mind and question reach and reputation?
What criteria are you going to use to classify what’s appropriate or not, for whom, according to whom, and further, what is being constructed when a user enters the website?
Of course, sometimes we look at our lives and turn to a close contact, who will say: “you just need some new friends”. I personally had a very close person tell me this in a different way: “you need to find your hangout”. That is, of course, a translation; but it’s surprisingly suggestive, isn’t it? Actually, nobody says “hangout” as a noun. People say “hang” as a verb, and maybe what he said was more like “you need to hang with the right people”. Another guy told me: “if you wanna hang with us, you gotta be on our side for real”. Too gangster for you? Well, that happened. And it wasn’t on Omegle, you see. But how many people are we actually “hanging” with on Omegle? Nobody. Maybe, if we’re lucky, one person a month, and that’s if we spend 20 hours a week on the platform. Isn’t that proof that something’s essentially wrong there? The specialists would call this data analysis. In fact, go to Coursera and search for Google’s Data Analytics certificate, and they’ll suggest you do your own data analysis with your own information. Things like how many cups of coffee you have every day are supposed to be written down and totaled at the end of a certain period, and then you’d have your data. Later, you’d compare it with glasses of water, juice, milk and beer, for example. You’d make a graph and be your own nutritionist. Interesting suggestion? Apple has devices that measure our bodily functions; Amazon collects information about our homes; Peloton focuses on health, allegedly; Uber makes mobility possible to map. But so do GPS, the doctor and the neighborhood. Who are you on Omegle: the GPS, the doctor or the neighborhood? Maybe none of these: you’re just an anonymous user.
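The be-your-own-nutritionist exercise described above takes only a few lines. A minimal sketch, with an invented week of numbers standing in for the log you’d actually keep:

```python
from statistics import mean

# Hypothetical week of self-tracked drinks (servings per day).
# The numbers are invented for illustration; your own log goes here.
log = {
    "coffee": [3, 2, 4, 3, 3, 1, 2],
    "water":  [5, 6, 4, 5, 7, 6, 5],
    "beer":   [0, 0, 1, 0, 0, 2, 1],
}

# Reduce each series to a daily average: this is "your data".
summary = {drink: mean(days) for drink, days in log.items()}

# Print a ranked comparison, the text version of the graph you'd make.
for drink, avg in sorted(summary.items(), key=lambda kv: -kv[1]):
    print(f"{drink}: {avg:.2f} per day")
```

With the sample numbers this ranks water first, then coffee, then beer; swap in real entries and a longer period and the same three lines of analysis still apply.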
Now, anonymity has been discussed with a lot of skepticism, even by specialists and activists. It may be an attempt to hide the tracks of something you’re not supposed to be doing. We could talk about many things here, but I’ll let readers decide what their biggest concern is when it comes to anonymous interaction and anonymous tracking. Recently, all websites were forced to display data-collection notifications on their homepages. That’s anonymous tracking: you know the website is going to collect information about you, but you have no idea who the third party is that analyzes your data and allegedly decides what the best ads for you are. Shall we go back to Omegle? You have no idea who you’re talking to. A civilized society would say that’s something to consider. And yet, the most common interactions involve people calling you a “nigger”, saying “you’re gay” or that they would “fuck your mom”, asking if you have “a sister”, or simply showing you the middle finger; some of them are raw and ask you to “kill yourself”. What are we supposed to do with this data? Let’s not forget terms like “nonce” and “wanker”, and remember that people exchange information even when expressly recommended not to: “what’s your Instagram?” and “are you on snap?” are very common. If Facebook had our metadata and was able to cross-analyze two people’s profiles and say how they met, and those profiles had “Omegle” as a data point, what would that do to your reputation, with Facebook, with our friends, with the rest of the internet?