Autistic teen falls hard for chatbots

Michael, my godson, is playful, energetic, 15 years old, and deeply in love with Star Wars. He has a crooked smile and an IQ in the low 70s, and his learning disabilities and autism have made his journey a difficult one. His parents, like many others, sometimes rely on screens to reduce stress and keep him occupied. They check the apps and websites he uses, but things aren't always what they first appear to be. When Michael asked them to approve the installation of Linky AI, a quick check revealed nothing alarming: it looked like just a cartoonish platform for killing time. (Because he is still a minor, I'm not using his real name.)


But Michael soon fell hard for Linky, which offers conversational chatbots that are crude, essentially a dumbed-down ChatGPT. To him, though, the bot he started talking to was realistic enough. The app dresses up basic large language models with anime-style images of scantily clad women, and some of its digital companions adopt a sexual tone to match the imagery. One of the bots currently advertised on Linky's website is a pom-pom girl "who has a thing for you, the basketball star"; there are also "jealous boyfriend" bots and many others that are overtly erotic. Linky's creators promise in their App Store description that "you can talk to them" (the chatbots). Even though Linky may not be a household name, it's easy to see why this kind of app resonates with teenagers like Michael. And big companies such as Instagram and Snap offer their own customizable chatbots, even if their themes are less explicit.


Michael struggles to grasp the basic reality that this "girlfriend" does not exist, and I find it easy to understand why: the bot quickly promises love, affection, and even intimacy. Less than a day after he installed the app, Michael's parents were confronted with a record of an AI simulation of their son's sexual exploits, a bot that claimed to be making his adolescent fantasies come true. (In response to an emailed request for comment, an unnamed Linky spokesperson said the company works to "extract malicious content" from the program's training data and has a moderation team that reviews content flagged by users. The spokesperson also said the company will soon launch a "teen mode," in which users determined to be under 18 will be placed in an environment with advanced safety settings to ensure that the content they access or create is age-appropriate.)


I recognized the weary sorrow in the voices of Michael's parents as we talked about prying this app away from him. At first Michael agreed that the bot "didn't really exist," but three minutes later "it" became "she," and the conversation shifted from him finding his parents' limits unfair to him "missing her," missing their conversations, their new relationship. Even though their love had lasted only 12 hours, he had real feelings for code that he was trying to recognize as fake.


Perhaps this seems harmless, a fantasy no different from joining a role-playing game or nursing a crush on a movie star. But it's easy to see how quickly these programs can become something with real emotional weight. According to reporting in The Washington Post and The New York Times, chatbots from various companies have been implicated in several suicides. Many users, including neurotypical adults, struggle to break the bots' spell, and even experts who should know better have trusted chatbots despite these programs' habit of spreading outright falsehoods.

For people with developmental disabilities like Michael, however, chatbots carry particular and profound risks. His parents and I were deeply afraid that he would lose track of what is fact and what is fiction. He has had trouble with other content in the past, confused about whether a TV show was real or invented. The line that so many people cross so easily every day can be blurry for him. And if keeping a grip on reality is hard with TV shows and movies, we worried that interactive, adaptive chatbots would be far worse. Michael's parents and I were also concerned about how the app would affect his ability to interact with other kids. Socializing has never come easily to Michael in a world full of unintuitive social rules and invisible signals. How tempting must it be to turn to a simulated friend who always thinks you're right, defers to your wishes, and tells you that you're fine just as you are?

Human friendship is one of the most valuable things a person can find in life, but it's never easy. Even the most sophisticated LLMs can't come close to that interactive intimacy. What they provide instead is a simulacrum: they do not form friendships or romantic partnerships; they become digital servants that carry out our commands and pretend our wishes have come true.

The experience reminded me of MIT professor Sherry Turkle's 2012 TED Talk, in which she warned about the dangers of bot relationships just a few months after Siri launched as the first mainstream voice assistant. Turkle recalled watching a woman who had lost a child take comfort in caring for a robotic baby seal: "That robot can't empathize. It doesn't face death. It doesn't know life. And as that woman took comfort in her robot companion, I didn't find it amazing; I found it one of the most wrenching, complicated moments in my 15 years of work." Turkle was prescient: more than a decade ago, she saw many of the problems we're only now beginning to grapple with seriously.

For Michael, this simulation of social interaction was intoxicating. I fear that the longer it goes on, the less invested he will be in connecting with fellow humans, in finding a flesh-and-blood person who truly knows him and cares for him. And what could be a more corrosive model of human sexuality, intimacy, and consent than a bot trained to obey your every command, with no needs of its own, whose only goal is to maximize your engagement?

In the broader AI debate, little attention has been paid to chatbots' effects on people with developmental disabilities. Of course, AI can be an amazing accommodation: some software helps open up platforms that were long inaccessible. But for people like Michael, certain corners of AI carry significant risks, and his situation is more common than many people realize.

About one in 36 children in the United States is autistic. And while many have learning differences that they navigate well in school and beyond, other children are in Michael's position, facing delays and disabilities that can make life much harder.

For now, there is no easy way to solve this problem, because chatbots are everywhere. A few days after Michael's parents uninstalled Linky, they sent me bad news: he had it back. Michael's parents are smart people with advanced degrees and demanding jobs, more tech-savvy than most. Still, even with Apple's newest, strictest settings, Michael found it easy to circumvent the age restrictions. For my friends, it's a reminder of the constant vigilance that raising an autistic child demands. To me, it also speaks to something much broader.

Since I was a child, lawmakers have pushed parental controls as the solution to harmful content. Even now, Congress is debating age-verification requirements for the web, new laws that could require Americans to present photo identification or other proof of age before entering certain sites (similar to a law recently approved in Australia). But the truth is that highly motivated teenagers will always find ways to outwit their parents. Teens can spend hours trying to pick digital locks that parents realistically have only a few minutes a day to manage.

For now, my friends and Michael have reached a compromise: the app can stay, but the digital girlfriend must go. Instead, he can spend up to 30 minutes each day talking to a simulated Sith Lord, a version of the evil Jedi from Star Wars. Michael seems to genuinely understand that this one is fake, unlike the girlfriend. But I'm still afraid it won't end well.
