Jennifer Goodnow, who teaches English as a second language in New York, feels the same way. She now feeds complex readings, such as essays or book excerpts, into ChatGPT and asks it to create separate versions for advanced and beginner students, with corresponding depth-of-knowledge questions.
Amanda Bickerstaff, a former teacher and the CEO of AI for Education, an organization that offers training and resources to help educators integrate artificial intelligence into their classrooms, puts it in no uncertain terms: “Teachers are embracing AI because they have always needed better planning tools. Now they finally have them.”
The same goes for students with individualized education plans, commonly known as IEPs, particularly those with reading or processing disabilities. If a student struggles with reading comprehension, for example, a teacher could use generative AI to simplify sentence structures, highlight key vocabulary, or break dense passages into more digestible chunks. Some tools can even reformat materials to include visual or audio elements, helping students access the same content in a different way.
Chamberlain, Johnson, and Goodnow all teach language arts, a subject where AI can bring both advantages and setbacks to the classroom. Math teachers, however, tend to be more skeptical.
“Large language models are really bad at calculation,” says Bickerstaff. Her team explicitly recommends against using tools like ChatGPT to teach math. Instead, some teachers use AI for adjacent tasks: generating slides, reinforcing math vocabulary, or walking students through the steps of a problem without solving it for them.
But there is something else teachers can use AI for: staying ahead of AI itself. Almost three years after ChatGPT became available to the public, teachers can no longer ignore that their kids are using it. Johnson remembers a student who was asked to analyze the song “America” from West Side Story, only to turn in a thesis on the Simon & Garfunkel song of the same name. “I was like, ‘Dude, did you even read the answer?’” he says.
Instead of banning the tools, many teachers are planning around them. Johnson has students draft essays in a Google Doc with version history enabled, which lets him track a student’s writing as it appears on the page. Chamberlain requires students to submit their planning documents along with their final work. Goodnow is toying with the idea of having students generate essays with AI and then critique the results.
“Three years ago, I would have thrown the book at them,” says Chamberlain. “Now it’s more like, ‘Show me your process. Where did AI play a part in this?’”
Even so, detecting AI use remains a game of vibes. Plagiarism checkers are notoriously unreliable. Districts have been reluctant to draw hard lines, in part because the tools move faster than the rules. But if there is one thing nearly everyone agrees on, it is this: students need AI literacy, and they are not getting it.
“We need to create courses for high schoolers on how to use AI, and I don’t know that anybody has the answer to that,” says Goodnow. “Some sort of ongoing dialogue between students and teachers about how to use these tools ethically.”
Organizations such as AI for Education aim to provide that literacy. Founded in 2023, it works with school districts across the United States to create AI guidance and training. But even in the most proactive schools, the focus is still on using the tools, not on understanding them critically. Students know how to generate answers. They don’t know how to tell whether those answers are inaccurate, biased, or invented. Johnson has started building lessons around hallucinations, like asking ChatGPT how many r’s are in the word “strawberry.” (Spoiler: it often gets it wrong.) “They need to see that you can’t always trust it,” he says.
As the tools improve, they are also reaching younger students, raising new concerns about how children interact with LLMs. Bickerstaff warns that younger children, still learning to distinguish fact from fiction, may be particularly vulnerable to placing too much trust in generative tools. That trust, she says, could have real consequences for their development and sense of reality. Already, some students use AI not just to complete assignments but to do their thinking for them, blurring the line between tool and tutor.
Across the board, educators say this fall feels like a turning point. Districts are rolling out new products, students are getting savvier, and teachers are scrambling to set the rules before the technology sets them on its own.
“If we know that we are preparing students for the future workforce, and we are hearing from leaders at many different companies that AI is going to be very important, then we have to start now,” says Bickerstaff.
That is what teachers like Johnson and Goodnow are doing, one prompt, one student, one weird apocalypse scenario at a time.