Local schools adapt to a world with AI

In summer 2022, a Stafford School Board member went to the Virginia School Board Association convention where she attended a short session on GPT artificial intelligence. When she returned to Stafford County, she told Chief Technology Officer Jay Cooke that it was something the district should be looking into.

“I’m like, ‘What? How do you even know about that?’” Cooke said.

The School Board member’s interest proved to be prescient. ChatGPT launched in November 2022, and the whole world started talking about what the new technology would mean for the future. AI is now a topic of discussion for every educational institution. Teachers and administrators are concerned with how AI could be used by students to cheat on assignments, but they’re also looking into how the technology could improve lessons and prepare students to be productive citizens in a future that certainly will include AI.

AI has been a major focus of Stafford County Public Schools’ Technological Advisory Committee. Last December, the School Board adopted a regulation guiding the use of AI in the county’s schools, making Stafford one of the first school districts in Virginia to do so.

The regulation sets out seven guiding principles for AI use in schools, including “avoiding misrepresentation and plagiarism,” keeping AI use relevant to the curriculum, ensuring it is used ethically and appropriately, and promoting training for staff and teachers through professional development.

In keeping with the guiding principle of professional development, the county is offering guidance to teachers on how to use AI through online resources, in-person training sessions, and a semester-long professional development course. Cooke described the class as “very extensive” and “like a college course.” Last semester 60 teachers participated. This year another 80 have signed up.

Fredericksburg City Schools has been investigating AI since the launch of ChatGPT. Instructional Technology Supervisor Emily Horne said the district is still in the “fact-finding phase” and plans to convene a workgroup soon to develop a framework for how AI should be used in schools. Fredericksburg has also held informational sessions for teachers on AI and its appropriate uses in the classroom.

Cooke and Horne both said that AI opens up many possibilities in the classroom. Teachers can use AI to generate lesson plans and instructional materials.

Cooke said this is a big help to teachers with busy schedules.

“If it does nothing else, it’s helping teachers with their time management,” he said.

Cooke emphasized that AI should be used as an aid to the “teacher-student relationship.” He sees AI as a tool that teachers can use to improve their lessons and help students who are struggling.

Cooke mentioned his own son, who struggled with writing in school, and said he feels AI could help children like his son learn to write better.

“He hated writing because he could never get started,” Cooke said. “So, using AI, in his situation, to help generate ideas or even to write a starting sentence … would’ve been so helpful for him. So, in that sense it could have been a great assistive technology.”

What’s considered cheating?

School districts are looking to shape policies that ensure AI is not used for plagiarism. But what is considered plagiarism may become murky when dealing with unfamiliar technologies.

“If a student used AI to help them get ideas for a paper they had to write, is that cheating?” Cooke asked.

To clarify issues like this, Stafford County has created a contract that teachers can share with their students to spell out expectations around AI. Teachers can choose whether and how they want students to use AI on assignments. The form outlines a wide range of expectations teachers can opt for. On one end of the spectrum, they can require students to do all the work on their own without AI. On the other end, they can allow students to use AI to write entire essays, provided students edit them afterward. In between are numerous specified uses of AI that can be permitted.

One problem is that AI-written text is becoming increasingly hard to distinguish from original work. AI detection software does exist, but it is not always accurate.

Horne said that Fredericksburg schools are encouraging teachers to create assignments that cannot easily be completed with AI.

Cooke said this is a good idea, but acknowledged that it is difficult.

He said the most important thing is that students understand why they are being asked to do assignments. At Colonial Forge High School, he said, the policy is that a student suspected of using AI inappropriately is not given a failing grade but is instead required to have a conversation with the teacher about expectations and is often asked to redo the assignment.

“It’s an eye-opener for the students that all they’re doing is cheating themselves,” he said.

Fredericksburg has an existing cheating policy, Horne said, but the Virginia Association for Supervision and Curriculum Development recommends that school districts review such policies in light of AI. This will be a focus for Fredericksburg’s upcoming workgroup.

‘We don’t want to prepare students for 10 years ago’

“We want students who are not intimidated by that idea [of AI], that can see it as a partner or resource of sorts they can really use in their job, no matter what that may be,” Horne said.

She believes it is important for students to be familiar with AI, no matter what career they end up in.

The Stafford County Schools website page on AI includes a quote from the title of a Harvard Business Review article expressing a similar idea.

“AI will not replace humans,” it reads. “Humans who use AI will replace humans who don’t.”

However, both Horne and Cooke emphasized that students should learn about AI in an age-appropriate manner. Horne said Fredericksburg students would not use AI tools until secondary school but would learn about AI conceptually in younger grades. Cooke said elementary students in Stafford would not be using generative AI, though they might be able to use educational chatbots, such as Khan Academy’s Khanmigo.

“[Young children] don’t have the development to be able to tell that it is not a real person on the other end,” he said.

Another reason not to allow young children to use AI is to protect them from data harvesting, since many AI companies collect and monetize their users’ data. The Children’s Online Privacy Protection Act limits how companies can collect personal information from minors, so OpenAI and many other AI companies require users to be at least 13 and to have a parent or guardian’s permission if they are under 18. One of the guiding principles adopted by Stafford County schools is data privacy and security.

Still, educators agree that they would be failing their students if they did not give them some exposure to AI.

“We want students to begin using generative AI and learn how to use it because they have to be ready when they graduate,” Cooke said. “Because we don’t want to prepare them for 10 years ago. We want to prepare them for the future.”

Horne emphasized equity as a concern. She said that if schools do not allow the use of AI, students who can access these tools on personal devices will have an advantage over those who cannot, creating an equity gap.

“Teaching students about AI is important because it is a part of our lives now and going forward,” she said. “Methods that use AI, are AI aligned, or AI informed will be part of their work, from what was once considered blue collar … to decision making at the top levels of government. If they know AI, our students will be better equipped to take on a job.”

Spotsylvania County Schools did not respond to inquiries about how the district is handling AI use.
