Hey Alexa, what should students learn about AI?


Rohit Prasad, a senior Amazon executive, had an urgent message for the ninth and tenth graders at Dearborn STEM Academy, a public school in the Roxbury neighborhood of Boston.

He had come to the school one morning to attend an Amazon-sponsored artificial intelligence lesson in which students learn to program simple tasks for Alexa, Amazon’s voice-activated virtual assistant. And he assured Dearborn students that there would soon be millions of new jobs in AI.

“We need to create the talent for the next generation,” Mr. Prasad, chief scientist for Alexa, told the class. “That’s why we are teaching about AI early, at the grassroots level.”

A few miles away, Sally Kornbluth, the president of the Massachusetts Institute of Technology, delivered a more sobering message about AI to students from local schools gathered at the Kennedy Library complex in Boston for a workshop on AI risks and regulation.

“Because AI is such a powerful new technology, it really needs some rules to make it work well in society,” said Dr. Kornbluth. “We have to make sure it doesn’t do any harm.”

The same day’s events, one encouraging work in artificial intelligence and the other warning against rushing to deploy the technology, reflected the broader debate currently raging in the United States about the promise and potential danger of AI.

Both student workshops were organized by an MIT Responsible AI initiative whose backers include Amazon, Google, and Microsoft. And they underscored a question on the minds of school districts across the country this year: How should schools prepare their students to navigate a world where, according to some prominent AI developers, the rise of AI-powered tools seems all but inevitable?

Teaching AI in schools is nothing new. Courses such as computer science and civics now regularly include exercises on the societal impact of facial recognition and other automated systems.

But the push for AI education gained urgency this year after news circulated in schools about ChatGPT, a novel chatbot that can produce human-like writing for homework assignments but sometimes generates misinformation.

“AI literacy” is the new buzzword in education. Schools are looking for resources to teach it, and some universities, technology companies, and nonprofit organizations are responding with prepackaged curricula.

The teaching push is accelerating even as schools grapple with a fundamental question: Should they teach students to program and use AI tools, giving them training in the technical skills employers are looking for? Or should students learn to anticipate and mitigate AI harms?

Cynthia Breazeal, a professor at MIT who leads the university’s initiative on Responsible AI for Social Empowerment and Education, said her program aims to help schools do both.

“We want students to be informed, responsible users and informed, responsible designers of these technologies,” said Dr. Breazeal, whose group organized the AI workshops for schools. “We want to educate informed, responsible citizens about these rapid developments in AI and the multiple impacts they are having on our personal and professional lives.”

(Disclosure: I was recently a fellow of the Knight Science Journalism program at MIT.)

Other education experts say schools should also encourage students to consider the broader ecosystems in which AI systems work. This could include students exploring the business models behind new technologies or examining how AI tools use user data.

“When we encourage students to learn about these new systems, we really need to think about the context that surrounds these new systems,” said Jennifer Higgs, an assistant professor of learning and humanities at the University of California, Davis. But often, she noted, “that piece is still missing.”

The workshops in Boston were part of an AI Day event organized by Dr. Breazeal’s program, in which several thousand students from around the world participated. It offered a glimpse of the different approaches schools are taking to AI education.

At Dearborn STEM, Hilah Barbot, a senior product manager at Amazon Future Engineer, the company’s computer science education program, led a lesson on voice AI for high school students. The lessons were developed by MIT with the Amazon program, which provides coding curricula and other programs for K-12 schools. The company provided more than $2 million in grants to MIT for the project.

First, Ms. Barbot explained some voice AI jargon. She taught students about “utterances,” the phrases consumers might say to prompt Alexa to respond.

The students then programmed simple tasks for Alexa, such as telling jokes. Jada Reed, a ninth grader, programmed Alexa to respond to questions about Japanese manga characters. “I think it’s really cool that you can teach him to do different things,” she said.
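The core idea the students were practicing, mapping utterances to responses, can be sketched in a few lines of plain Python. This is only a toy illustration of the concept: the real Alexa Skills Kit handles speech recognition and intent matching through Amazon’s own SDK, and the phrases and function names here are invented for the example.

```python
# Toy sketch of the "utterance -> response" idea behind the lesson.
# In a real Alexa skill, the Skills Kit matches spoken utterances to
# intents; here a simple dictionary stands in for that machinery.

RESPONSES = {
    "tell me a joke": "Why did the robot go back to school? To sharpen its skills.",
    "say hello": "Hello! Nice to hear from you.",
}

def handle_utterance(utterance: str) -> str:
    """Return a canned response for a known utterance, or a fallback."""
    key = utterance.strip().lower()
    return RESPONSES.get(key, "Sorry, I don't know that one yet.")
```

Students’ projects, such as Jada Reed’s manga Q&A skill, amount to filling in this kind of mapping with their own utterances and answers, while the platform takes care of turning speech into text.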

Dr. Breazeal said it was important for students to have access to professional software tools from leading technology companies. “We’re giving them future-proof skills and perspectives on how to collaborate with AI to do things they care about,” she said.

Some Dearborn students who had built and programmed robots at school said they appreciated learning how to program a different technology: voice-activated assistants. Alexa uses a range of AI techniques, including automatic speech recognition.

At least some students also indicated that they had privacy and other concerns about AI-powered tools.

Amazon records consumers’ conversations with its Echo speakers after a person says a “wake word” like “Alexa.” Unless users opt out, Amazon may use their interactions with Alexa to target them with ads or use their voice recordings to train its AI models. Last week, Amazon agreed to pay $25 million to settle federal charges that it had retained children’s voice recordings indefinitely, in violation of the federal Children’s Online Privacy Protection Act. The company denied the allegations and denied breaking the law, and pointed out that customers can review and delete their Alexa voice recordings.

However, the hour-long workshop led by Amazon did not address the company’s data practices.

Dearborn STEM students regularly engage with the technology. A few years ago, the school introduced a course in which students used AI tools to create deepfake videos of themselves and investigate the consequences. And the students had thoughts about the virtual assistant they were learning to program that morning.

“Did you know there’s a conspiracy theory that Alexa is listening in on your conversations to show you ads?” asked a ninth grader named Eboni Maxwell.

“I’m not afraid of it listening,” replied Laniya Sanders, another ninth grader. Still, Ms. Sanders said she avoided using voice assistants because “I just want to do it myself.”

A few miles away, at the Edward M. Kennedy Institute for the United States Senate, an educational center that houses a full-scale replica of the U.S. Senate chamber, dozens of students from the Warren Prescott School in Charlestown, Massachusetts, explored something else: AI policy and safety regulation.

Playing the role of senators from different states, the middle school students participated in a mock hearing in which they debated provisions of a hypothetical AI safety bill.

Some students wanted to ban companies and law enforcement agencies from using AI to target people based on data such as their race or ethnicity. Others wanted to require schools and hospitals to assess the fairness of AI systems before deploying them.

For the middle school students, the exercise was not unfamiliar. Nancy Arsenault, an English and civics teacher at Warren Prescott, said she often asked her students to think about the impact digital tools have on them and the people they care about.

“As much as students love technology, they are very aware that they don’t want unrestricted AI,” she said. “They want to see boundaries.”
