How teachers can implement AI in their classrooms responsibly

The Southern Regional Education Board (SREB) recently published a new report to help educators and policymakers adopt artificial intelligence (AI) tools responsibly in schools.

The report, Guidance for the Use of AI in the K-12 Classroom, offers four pillars that educators can work from as they introduce AI into the classroom: design cognitively demanding, AI-supported tasks; streamline teacher planning and reduce administrative burden; personalize student learning; and foster ethical and informed use of AI among students.

“Teaching students to use AI ethically is crucial for shaping a future where technology serves humanity’s best interest,” said Leslie Eaves, the report’s author and SREB program director for project-based learning. “As educators, we can help students master technical skills and cultivate a sense of responsibility and critical thinking about the consequences and complexities of AI.” 

SREB is an interstate compact that conducts research and policy work to improve public education. North Carolina is a member, along with 15 other states. The work of SREB's recently formed Commission on AI in Education is available on SREB's website.

More from the report

The report's first pillar explains that educators can use AI to create more cognitively challenging assignments for students. It also notes that factual and conceptual tasks requiring little critical thinking are the easiest for AI to reproduce.

Educators can use AI tools to assess how intellectually demanding their assignments are, the report says. It also suggests that educators use AI to get ideas for assignments and scenarios students can work through that align with state standards.

SREB also says students can use AI to create videos, presentations, or interactive stories that deepen their understanding of course content.

“Now more than ever, students need to be creators rather than mere purveyors of knowledge. Students should be taught the ethics of these tools as part of their creation and problem-solving process, allowing them to focus class time on polishing and adding their unique perspective to their work,” the report says. “While AI should not be the final step in the creative process, it can effectively serve in the early stages.”

In this first pillar, SREB says educators should be aware of AI’s biases and limits.

AI should be only a starting point when teachers or students create content, the report says, and its output requires critical thinking to assess for accuracy and applicability. The information it produces comes without awareness of state standards or a school's specific needs.

SREB recommends that teachers introduce AI tools only after students have a full grasp of the concept being taught.

With the second pillar, the report advises using AI to streamline teachers’ administrative tasks.

AI can help teachers with lesson planning and with grading multiple-choice tests or assignments, the report says. It can also help translate assignments for English-language learners.

However, SREB recommends that educators not overly rely on AI-generated content for student work. 

The report cautions that schools also must protect student privacy. Companies that own AI systems may be storing data to sell to third parties, for instance.

Therefore, the report says districts should carefully review the terms and conditions of these products. SREB also advises that schools and districts never enter personally identifiable information into AI systems. To streamline this process, SREB recommends developing checklists or other guides to evaluate whether an AI system is safe to use with students.

The third pillar emphasizes AI's ability to personalize each student's learning experience.

In the report, SREB explains that AI systems can analyze student data (performance, engagement levels, and learning behaviors) in real time. The systems can then provide additional resources to students who need them and give advanced students more challenging material.

AI can also give teachers more time for critical thinking and for building connections with students, the report says.

To take advantage of AI tools, students need consistent access to technology and a reliable internet connection, the report notes, adding that failing to provide these tools to disadvantaged students could further widen achievement gaps.

SREB also cautions teachers to manage their students' screen time and to make sure they do not become overreliant on the technology.

Lastly, the fourth pillar discusses media literacy. The report recommends that educators teach and discuss bias, cheating, deepfakes, and hallucinations with students. 

AI can manipulate photos and videos to produce realistic-looking but false representations, known as deepfakes. Hallucinations occur when AI systems generate content that looks and sounds accurate but is entirely made up. Both can spread misinformation, the report says.

To combat this, the report recommends that schools form student-led AI ethics committees. These committees would deliberate on how to complete AI projects and use applications ethically, and could provide a platform for ongoing dialogue and learning, the report says.

“Teaching students to use AI ethically is crucial for shaping a future where technology serves humanity’s best interests,” the report says. “As AI becomes increasingly integrated into daily life, students must not only master the technical skills to use these tools but also understand the ethical implications of their use.”

You can read the full report on SREB’s website. 


