Cats are as closely connected with memes in the internet world as they are with meows, and their memes appear to spread like wildfire with the same carelessness and nonchalance that cats bring to their other favorite activities, like chewing on catnip or napping. But dogs aren't as common online as they are offline. Why cats, then? Well, autonomy might play a role. Cats, as opposed to dogs, came to us on their own and, at least to some extent, trained themselves, whereas dogs were domesticated by humans thousands of years ago. Even now, especially online, this feline propensity for independence is still noticeable. Dog memes frequently show off accomplishments of training and discipline; cat memes, by contrast, are adorable and entertaining because they depict strange and unexpected behaviors.

So why do we find this online uncontrollability so alluring? In today's society, it is entirely legitimate to worry that we are losing our autonomy, that we are constantly being influenced - or, to put it another way, that we are turning into dogs instead of cats. These chapters discuss how to foster feline independence and prevent canine reliance in the internet environment. They explain how powerful corporations like YouTube and Twitter are constantly exploiting you, and they serve as a passionate argument for deleting your social media accounts. These chapters will also teach you why social networking is a trap, why unpredictability is more compelling than reliability, and why Silicon Valley parents send their children to Waldorf schools.

Chapter 1 - Argument 1: You can be manipulated by social media, endangering your ability to exercise your autonomy.

You might not be aware of it, yet you are confined. The cage is tiny - it easily fits in your purse - but you still live inside it. Furthermore, while you are in this cage, you are being observed, controlled, and examined like a lab animal.
If the above sounds a little paranoid, just consider the facts. The cage is a smartphone, which almost everyone owns. Naturally, you aren't physically confined within it, but every time you use it to access social media, algorithms - not experts in lab coats - are monitoring and controlling you. The information these algorithms gather about you - such as when you log in, how long you stay signed in, and what you purchase - is then correlated with the same information from thousands of other individuals. This makes it possible for the systems to forecast your behavior. How? Suppose an algorithm discovered, after analyzing a ton of data, that individuals who eat the same foods as you find a certain presidential candidate less appealing when her image is framed in yellow rather than blue. That might not sound like a shocking or evil finding, but imagine the campaign staff for this politician getting hold of it. Statistically, you are more inclined to support her when advertisers show you political advertising with her blue-bordered image.

Social media corporations are also free to sell your information without any remorse. You are their product, not their client. Their customers are the advertisers and businesses that purchase your personal information and use it to influence your purchasing decisions or political votes. In the author's view, this amounts to outright behavior modification. Marketing has always been manipulative, but only recently has it become possible for ads to be customized to your specific tastes and internet habits. This customization is not 100% precise; it works only in aggregate. You might, unlike the majority of individuals whose diet resembles yours, dislike green, and so an ad framed in green wouldn't sway your vote as predicted. However, statistical effects are trustworthy when applied to the whole population.
Therefore, the likelihood that you have been manipulated is higher than not.

Chapter 2 - Addiction is built into social media networks.

Imagine that, as a kid, you got a chocolate bar right away every time you said "please." Naturally, this would lead you to say it very frequently. Now imagine that asking for the sweet you want only works occasionally. Would you expect to use the word "please" more or less frequently in light of this sporadic success? Although it might seem contradictory - after all, why engage in an action if it doesn't reliably produce the intended outcome? - studies show you would likely begin saying "please" even more frequently. This phenomenon, which holds for both people and animals, was first identified by behaviorists decades ago: a reward that arrives only sometimes is frequently more compelling than one that is 100% dependable.

Social media, as we are all aware, strives to keep our attention, and it accomplishes this by exploiting this behaviorist insight. Sean Parker, the founding president of Facebook, called it a "social-validation feedback loop." People may like your post or image occasionally, but not always - and it is precisely this unpredictability that makes people dependent. Additionally, social networking algorithms frequently build in some randomness of their own. These algorithms, known as adaptive algorithms, are constantly changing to become more engaging. How adaptable are they? Let's imagine an algorithm that displays an advertisement to you two and a half seconds after you watch a cute cat video. This algorithm will occasionally run a quick test - for example, to see whether showing the advertisement slightly later increases your likelihood of purchasing the advertised product. If the two-and-a-half-second delay wasn't successful in getting you to make a purchase, the ad might be shown three and a half seconds after the video instead. But what if neither of these options works?
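A timing experiment of this kind can be sketched as an epsilon-greedy search over ad delays. Everything concrete below - the candidate delays, the purchase probabilities, and the epsilon value - is an illustrative assumption for the sketch, not a detail from the book:

```python
import random

# Candidate delays (seconds after the video) and a made-up "true"
# purchase probability for each one -- pure assumptions for this sketch.
DELAYS = [1.0, 2.5, 3.5, 5.0]
TRUE_PURCHASE_RATE = {1.0: 0.02, 2.5: 0.03, 3.5: 0.05, 5.0: 0.01}

def choose_delay(estimates, epsilon):
    """Usually exploit the best-looking delay; occasionally jump at random."""
    if random.random() < epsilon:
        return random.choice(DELAYS)  # the occasional random "jump"
    return max(DELAYS, key=lambda d: estimates[d])

def run(trials=50_000, epsilon=0.1, seed=42):
    random.seed(seed)
    estimates = {d: 0.0 for d in DELAYS}  # running estimate of purchase rate
    counts = {d: 0 for d in DELAYS}       # how often each delay was tried
    for _ in range(trials):
        d = choose_delay(estimates, epsilon)
        purchased = random.random() < TRUE_PURCHASE_RATE[d]
        counts[d] += 1
        # Incremental mean: nudge the estimate toward the observed outcome.
        estimates[d] += ((1.0 if purchased else 0.0) - estimates[d]) / counts[d]
    return estimates, counts

estimates, counts = run()
```

Without the epsilon branch, the loop could settle permanently on whichever delay happened to pay off first; the occasional random choice is what keeps the search exploring.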
To avoid getting stuck testing delays around three seconds, the algorithm occasionally makes a random jump: it will try waiting, say, one second or five. Thanks to this randomness, the program never stops evolving. And much like sporadic social validation, this algorithmic unpredictability adds to the addictive nature of social media. Social media is so seductive that many Silicon Valley parents send their kids to Waldorf schools, where gadgets are typically not allowed. Addiction can lead to a certain type of madness, one that may disconnect you from the people and environment around you. And social media is making addicts of us all.

Chapter 3 - Argument 2: The digital marketing model is overly intrusive and risky.

Lead was once an ingredient in house paint, but over time the proof of its dangers became overwhelming. Even so, there was no widespread protest against people painting their homes. Rather, after a period of opposition and agitation, lead-based paint simply stopped being the norm. We ought to use a similar strategy with social media. We don't need to ban the internet, cell phones, or digital socializing; doing so would be like outlawing house painting instead of outlawing a particular kind of house paint. What we need to eliminate is the dominant economic model of social media, which the author calls BUMMER - an acronym for "Behaviors of Users Modified, and Made into an Empire for Rent." Think of it as a machine that modifies our behavior, gathers information about us, and then makes money by selling that information to advertisers. Since the information gathered by BUMMER's methods is statistical, it cannot predict with precision what a specific individual will do or like. However, it can predict, almost with certainty, what the majority of individuals will do or prefer. Six parts make up the BUMMER machine:

A - Attention Acquisition leading to Asshole supremacy.
In essence, this signifies that social networking is set up so that the loudest, most obnoxious users attract the most attention.
B - Butting into everyone's lives. As previously noted, BUMMER businesses pry into people's lives by monitoring their online behavior.
C - Cramming content down everyone's throats. Users of social media are inundated with tailored content all the time - consider your Facebook news feed, for example.
D - Directing people's behavior in subtle ways. On social media, algorithms influence your conduct and nudge you toward actions like making purchases.
E - Earning money by letting the most reprehensible assholes discreetly manipulate everyone else. Companies sell user data to advertisers and other third parties to generate revenue. They occasionally share information with shady actors, such as Russian state services, who use it to manipulate individuals.
F - Fake mobs and faux society. Online, a huge portion of "people" are actually bots, which adds to the general artificiality of society.

Facebook and Google are the only two American businesses that fully employ the BUMMER business model and hence exhibit all six of these features. Other businesses, including Wikipedia and eBay, may show some of these elements, but not all of them. It's critical to keep in mind that no single technology is solely responsible for the problems facing society today. The problem is the BUMMER model's reliance on manipulating the people who use the technology. Therefore, just as people didn't have to stop painting their homes, you don't need to give up your smartphone or your favorite websites. But please, quit using BUMMER services!