You’ve probably heard that algorithms control everything you hear, read, and watch. They pick the next song on your Spotify playlist and decide what YouTube recommends after you finish a video. An algorithm might be the reason you can’t escape Sabrina Carpenter’s hit song “Espresso,” or why you’re suddenly overwhelmed by the urge to buy one of those pastel Stanley cups. Algorithms shape which TV shows get made and which books get published: a sweeping shift that has become ingrained in art and media, and one that isn’t going away any time soon.
In 2024, the complaint goes, culture is boring and stale because algorithms dictate what gets produced and praised. Or so the critics say: The New Yorker staff writer Kyle Chayka has written a whole book about how Big Tech has succeeded in “flattening culture” into facsimile coffee shops and midcentury-modern furniture. In The New York Times Magazine, the critic Jason Farago argued that the work “jumping across our screens,” delivered by algorithmic recommendation engines, has sapped culture of its momentum. Condemning new inventions is not a new argument, either. In a 1923 essay, Aldous Huxley pointed to the ease of cultural production, driven by a growing middle-class appetite for entertainment, as a key reason mass-market books, movies, and music had become so dull. “These simple pleasures, a ready-made distraction that is the same for everyone across the Western world,” he wrote, “certainly pose a worse threat to our civilization than the Germans ever did.”
Yet cultural algorithms are only downstream of larger, less controllable forces that shape how art is created and supported. It’s not that Spotify makes music more boring or that Instagram makes art more stale; it’s that soaring rents and yawning inequality have destroyed many of the ingredients a culture needs to take root and thrive. Art galleries and theaters have closed. Music venues have been turned into luxury condos and bank branches. Some of those outcomes are themselves the result of algorithms, just ones that get far less attention than the algorithms powering TikTok or YouTube.
Part of the fixation on cultural algorithms is a result of the precarious position cultural gatekeepers find themselves in. Traditionally, critics have played the dual role of gatekeeper and amplifier, deciding which literature, music, or movies are worthwhile and then deepening the experience by giving their audience more context. To a certain extent, they’ve been crowded out by user-driven communities like BookTok and by algorithmically generated playlists that make recommendations without the complication of critical thought. Not long ago, you might have browsed magazine reviews for new songs or asked a record store owner for advice. Now you just press play on your Spotify daylist and let the algorithm take control.
In this way, some consumers succumb to technological fatalism: they know that algorithms exist and often dictate how culture reaches them, and that there isn’t much they can do about it short of abandoning the platforms entirely and embracing a retro fetishism in their consumption choices (not a bad outcome, actually). But algorithms aren’t just used to feed you content. They’re also employed to set real estate prices, screen asylum claims, shape Uber’s surge pricing during hurricanes, determine whether elderly patients receive medical care, assess the risk posed by an abusive partner, and decide who gets targeted in a war zone.
In the United States, algorithms are now embedded in companies and government agencies at every level, speeding up all kinds of processes. In the private sector, algorithms are attractive because they can automate tasks like adjusting prices in real time. Consider Walmart, which is replacing paper price tags with electronic shelf labels that can be changed remotely, ostensibly to save on labor costs. But in 2021, researchers found that algorithmic pricing doesn’t pass those savings on to customers. After looking at how prices changed on ecommerce sites like Amazon, Target, and Walmart, they found that “high-frequency pricing algorithms can reduce competition and increase profits,” helping wealthy companies get richer.
These kinds of algorithms are far more troubling, and far harder to disentangle from our daily lives, than the ones making every coffee shop look the same. And there is little legal framework for challenging their effects. Take redlining, the long-standing practice of denying government-backed home loans to Black Americans, which was outlawed in 1968 by the Fair Housing Act. The reasoning was simple: people should not be denied the chance to buy a home because of the color of their skin. That’s a more straightforward principle to enforce when mortgages are a face-to-face business. But most of the process now starts online, where applications flow through software built to improve risk management for lenders.
The mortgage industry’s digital transformation has been praised for its potential to level the playing field for prospective homeowners; automated systems, the thinking goes, are not as biased as human loan officers. But because online mortgage lenders rely on algorithms that weigh things like an applicant’s assets and credit score, both of which carry their own histories of discrimination, they remain vulnerable to structural inequality and bias. And those dangers are not theoretical. A 2021 investigation by The Markup found that Black applicants were 80 percent more likely to be rejected by algorithmic underwriting systems than white applicants from similar financial backgrounds.
Tenants in Massachusetts recently won a settlement in a class action lawsuit against a tenant-screening company called SafeRent, which uses an algorithm to give landlords a “score” meant to capture a prospective renter’s risk. SafeRent’s algorithm leans on flawed factors like credit scores, and the plaintiffs accused the company of discriminating against Black and Hispanic applicants who used federal housing vouchers, without accounting for how those vouchers would help them pay rent. Few laws in the United States prevent this kind of biased treatment by technology, which makes cases like the one in Massachusetts an uphill battle.
There are some signs that the government is at least trying to limit these practices. Early last year, the Department of Justice warned landlords and tenant-screening companies that they do not “escape liability” if the algorithms they use turn out to violate fair housing laws by disproportionately denying housing to people of color. The DOJ statement carried little threat of immediate prosecution, but it was a sign that the government is paying attention to cases like the SafeRent lawsuit.
Tech executives also have a real adversary in Federal Trade Commission chair Lina Khan, who has sued giants like Amazon and Meta for antitrust violations. In July, the agency said it would study “surveillance pricing,” in which each person is shown a different price based on their browsing history. Last year, President Joe Biden issued an executive order requiring many federal agencies to create regulatory frameworks for guarding against discrimination when using algorithm-driven systems, such as in hiring contractors or awarding housing. That said, as with many attempts to rein in technology, the executive order has so far produced more committees discussing plans than actual enforcement. And the incoming Trump administration is stuffed with right-wing technologists, including JD Vance, a Peter Thiel disciple, and the bizarre pairing of Elon Musk and Vivek Ramaswamy, recently tapped to run the so-called Department of Government Efficiency. (They may find themselves at odds with prospective FCC chair Brendan Carr, who, despite his selection, has been broadly critical of mergers and acquisitions in the tech industry.)
Perhaps the most worrying thing about algorithms is that they give institutional actors a degree of plausible deniability. If the decisions that dictate our lives are made by equations rather than by people, the blame for those decisions shifts from the concrete to the abstract. As consumers, we’re encouraged to think of technology as a company’s product rather than as the work of a group of individuals deciding what that company does. And because these equations tend to be hidden behind logos and brands, we overlook the fact that algorithms absorb the biases of the people who create them.
That makes it easier to ignore these powerful forces and focus instead on the algorithms that determine which song or video comes next. When artists argue that their songs aren’t getting the right keyword attention, or that their books don’t sell because they never reached the right influencers, what they’re really expressing is the fear that there’s no room left for artists to make a living. Musicians, writers, and painters have not become less interesting in the span of a single generation; the ground beneath them has shifted. Rents are too high, wages are too low, wealth is too concentrated. Artists are forced to focus on survival rather than their work, leaving little time or space to cultivate their craft. Changing that infrastructure can feel impossible, so when the work finally reaches consumers, the criticism lands instead at the feet of the delivery mechanisms, like Spotify and Instagram.
This isn’t to say we should all learn to stop worrying and love the algorithmic recommendation engine. But it’s a consistent reminder that the supposed boringness of everything around us distracts from the bigger problem: certain technologies are eroding our ability to genuinely pursue the public good. Perhaps enjoying or criticizing cultural works on their merits might help us regain a little focus of our own. Asking yourself whether you like something because you actually like it can help dispel the sense of dread, or disillusionment, that some critics have dubbed “algorithmic anxiety.”
I often find that term too small for what people are feeling right now. It suggests that the source of our neuroses lies in the soft glow of a smartphone screen rather than in the superstructure of power and influence that surrounds us. What drives many of us crazy is the apparent inability to change that power dynamic. Good art, even great art, is still being made. But without major changes, that might not be true for much longer. Many of us are stuck on one station, and turning the dial keeps getting harder.