Todd Mitchem made every effort to have honest and productive conversations with his son. “He’s 15,” said Mr. Mitchem, 52, with a laugh. “It’s so difficult to connect with teenagers.”
Every time he tried to raise a sensitive topic, his son would give vague answers or run away, preferring to avoid serious conversations altogether.
In the past, when Mr. Mitchem needed parenting help, he would read a book or ask a question at the men’s support group he meets with weekly.
But recently he turned to ChatGPT. And he’s not alone: Others are turning to artificial intelligence chatbots to figure out what to say in situations where the stakes are high. They use the tool to talk to or read to their children, approach supervisors, give difficult feedback, write vows or compose love letters.
Rather than turning to friends or even professionals for help, Mr. Mitchem said, he prefers the bot’s seemingly objective advice. “The bot gives me answers based on analytics and data, not human emotions,” he said.
ChatGPT, the new tool powered by OpenAI, pulls its information from a variety of online materials, including books, news articles, scientific journals, websites and even message boards, allowing users to hold humanlike conversations with a chatbot.
“It gives you what the collective mush on the internet would say,” said Irina Raicu, who directs the internet ethics program at Santa Clara University. (Other companies, including Google and Microsoft, have their own versions of this technology; Microsoft’s, called Bing AI, was recently made famous for aggressively declaring love to the New York Times journalist Kevin Roose.)
Mr. Mitchem, who lives in Denver and is executive vice president of learning and product at an executive training company, opened his chat with a brief summary: “I need some kind of advice.”
“OK, no problem,” ChatGPT replied, according to Mr. Mitchem. “What’s your name?”
During their conversation, ChatGPT told Mr. Mitchem that he was a good father simply for wondering how to approach a conversation with his son about whether to join a basketball team. “It said something like, ‘It’s cool if you can’t get it right, but it’s great that you’re trying.’”
Mr. Mitchem said the bot then continued: “Teenagers try to assert their independence as they grow up. Remember, when you talk to him, he needs to know that you trust his decisions.”
The next day, Mr. Mitchem approached his son and tried the advice. “I said to him, ‘You have to make that decision, you’re 14 and I have faith that you’re going to make a good one,'” Mr. Mitchem said. “My son is like, ‘Wow, that’s great. I’ll let you know what I decide.’”
“We ended on a positive note,” said Mr. Mitchem. “It totally worked.”
Once Upon a Time …
For Naif Alanazi, a 35-year-old Ph.D. student at Kent State University, bedtime is a sacred ritual with his 4-year-old daughter, Yasmeen. “I have to work all day,” he said. “This is our special time.”
His Saudi Arabian family has a long tradition of oral storytelling. To continue it, he tried to invent new, exciting stories every night. “Do you know how difficult it is to come up with something new every day?” he asked, laughing.
Now, however, he lets the bot do the work.
Every night he asks ChatGPT to create a story that includes people (e.g., his daughter’s teacher) and places (the school, the park) from their day, along with a cliffhanger at the end so he can continue the story the next night. “Sometimes I ask it to add a value that she needs to learn, like honesty or kindness to others,” he said.
“Being able to give her something that’s more than a generic story, something that can strengthen our bond and show her that I’m interested in her day-to-day life,” he said, “makes me feel so much closer to her.”
Love Languages
Anifa Musengimana, 25, a graduate student in international marketing in London, is sure that chatbots can help relieve the boredom of online dating. “I have a lot of repetitive conversations on these apps,” she said. “The app can give me fun ideas to talk about, and maybe I’ll find better people to date.”
“If I get interesting answers, I get drawn in,” she said.
She said she would tell her match she was using the tool. “I would like a guy who finds this funny,” she said. “I wouldn’t want a guy who’s so serious he gets mad at me for doing it.”
Some use chatbots to improve the relationships they already have.
James Gregson, 40, a creative director living in Avon, Connecticut, uses ChatGPT to compose love letters to his wife.
“I’m not a poet, I’m not a songwriter, but I can take themes from things my wife might enjoy and put them into a song or a poem,” he said.
He also believes in full disclosure: “I’ll give her one, but I’ll tell her who wrote it,” he said. “I’m not trying to trick her.”
Office Applications
Jessica Massey, 29, a financial analyst at Cisco Systems who lives in Buffalo, wrote draft emails to her boss using ChatGPT. “I wanted to test its abilities to see if there was another way AI would formulate what I’m thinking in my head,” she wrote in an email. (One interviewee admitted to consulting ChatGPT to prepare for their interview for this story. Another admitted to using it for employee reviews.)
Ms. Massey used the bot to email her boss explaining why the company should pay for a particular job certification. The bot gave her pretty generic language, she said. She hasn’t sent it yet, but plans to do so once she “changes the wording a bit to make it sound more like me.”
However, Ms. Massey has one rule when it comes to relying on a chatbot: “Disclose it at the end of your work, or don’t use it at all.”
Academics who study technology and ethics, though, have mixed feelings about using ChatGPT for highly personal communication.
“We shouldn’t automatically dismiss tools that might help people navigate a difficult conversation,” said Michael Zimmer, director of the Center for Data, Ethics and Society at Marquette University. He compared it to buying a Hallmark card for a birthday or anniversary. “We’ve all accepted that, because the words on the card match something I believe,” he said.
However, Santa Clara University’s Ms. Raicu is concerned about people using ChatGPT for interpersonal communication. She doesn’t like the idea that there is a “right” and a “wrong” way to communicate. “I think the right words depend on who the people communicating are, and the context,” she said. “There is no formula for many of these things.”
Ms. Raicu said that using ChatGPT for interpersonal communication could also erode trust: “People might ask, ‘Do I really know who I’m talking to?’”