
Artificial intelligence could enable machines to attack people, as in “The Terminator,” but could also dumb down humanity, as in another movie, “Idiocracy,” said the administrator who drafted a policy for using AI in Hazleton Area Schools.

“This is why schools should move very slowly in adopting AI for students,” Kenneth Briggs, Hazleton Area’s chief information officer, said in an email.

The policy that Briggs drafted and the school board approved on Oct. 23 places AI use within guardrails to protect students but also requires them to think critically.

Teachers, students and even computer technicians in the Technology Department that Briggs directs cannot use AI until they’ve had training to protect themselves from disclosing personal information and to consider the ethics, capabilities and limits of artificial intelligence.

The policy also says students and faculty only can use AI tools that Hazleton Area has reviewed for bias, age levels, data security and privacy protection.

“The primary thing we look for is safety for our students and staff. Not only about having their personal information leaked, but also not allowing an AI to give counseling or comforting advice to students or staff in regard to their emotions or mental health,” said Briggs, who wrote about online learning in his doctoral thesis and has led Hazleton Area’s Technology Department for 16 years. “We also make sure that the developer of the product has integrated safeguards into the AI to prevent learned bias.”

Tools that Hazleton Area selects will focus AI’s conversational abilities on specific tasks and not allow the AI to generate personal advice, he said.

For example, the first AI tool that Hazleton Area approved, called Goblins, tutors students in math.

“We saw it in action. It’s phenomenal,” Superintendent Brian Uplinger said during the Oct. 23 meeting before the board voted to start using Goblins. “It doesn’t give answers. It helps point toward an answer.”

As teachers broaden their use of AI, they will set rules for each assignment.

A chart within the policy lists five categories ranging from no use of AI to full use with human oversight and explains when students have to disclose how they used AI.

If allowed to use AI to edit a paper, for example, students can refine their work with AI but cannot generate content with it, and they must disclose AI’s role, the chart says.

Uplinger said giving teachers flexibility about how to use AI will let them shape assignments to their subject matter, grade level and learning goals.

“A creative writing teacher might allow generative brainstorming, while a math teacher could restrict AI to problem explanations only,” Uplinger said in an email.

Briggs said students might use AI to visualize experiments in science, get tips on readability in creative writing and, for those learning a second language, practice speaking or translate assignments.

“Teachers will need to ensure students are using AI as a research tool or in a limited capacity as a tutor,” Briggs said, “and not to complete an assignment.”

His department isn’t using AI yet.

After technicians have had training, however, they might use AI to review logs for potential issues in computer hardware and software, detect misuse, research new products and gain suggestions for configuring the computer system.

Hazleton Area School District Technology Director Ken Briggs at his office on Thursday, Oct. 30, 2025.(John Haeger / Staff Photographer)

Before using AI tools for preparing budgets, payrolls, enrollment forecasts and other central office tasks, Hazleton Area administrators will verify the accuracy of the tools, the policy says.

While Briggs has heard educators compare the rollout of AI now to that of calculators decades ago, he notes a difference. With calculators, students still needed to know steps for solving problems. With AI, students can surrender the whole process to the machine.

“While it’s true that we’ll all have AI at our fingertips in the future,” he said, “we still need critical thinking skills to know what questions to ask to get the desired results.”

In Briggs’ view, AI can be wrong, biased and, at least for now, too immature to offer counseling.

ChatGPT, which uses AI to answer questions, told a 16-year-old Colorado boy not to disclose his suicidal thoughts, his parents told a Senate panel in September after their son died by suicide.

Hazleton Area’s policy says district workers will report problems that they identify with IT tools and establish ways to verify their accuracy and reliability.

Briggs said Hazleton Area will look for tools that detect plagiarism but realizes that those tools can falsely flag content, especially if created by someone using a second language.

AI might make plagiarism so easy that schools will reverse the roles of classroom instruction and homework, Ethan Mollick, associate professor of management at University of Pennsylvania’s Wharton School, suggested. Writing in “Co-Intelligence: Living and Working with AI,” Mollick said students might do group work and hands-on learning in class, where teachers can monitor AI use; but for homework, teachers might tell students to learn new material by watching video lectures or reading.

Patrick Patte, Hazleton Area’s director of curriculum, doesn’t foresee a reversal like that at least in the near future.

He was interested in Goblins because he wants to take “every opportunity we can to help our math” instruction. Even with Goblins, the district will roll out the tool as a pilot program.

“It’s a great asset,” Patte said about AI, “but people are our best resources, our teachers.”

The policy encourages teachers to use AI to “discover lesson plan ideas, create assignments and to generate ideas for the personalization of student learning.”

But AI tools won’t make final decisions on student grades, academic integrity or discipline, the policy says.

Likewise, administrators can use AI to help in human resources processes as long as people make final decisions about hiring, promoting, evaluating and dismissing employees.

“AI cannot be trusted with solely making a decision that affects a human life,” Briggs said. “We should never take the ‘human’ out of HR.”

Highlights of the AI policy

AI-assisted idea generation: AI is used for brainstorming and generating ideas only. No disclosure required.

AI-assisted editing: AI is used to edit or refine student work, but not to generate content. Student must disclose how AI was used.

AI for specified task completion: AI is used to complete certain elements of a task or part of a project with human oversight and evaluation of AI-generated content. Student must disclose how AI was used.

Full AI use with human oversight: AI may be used throughout the assignment. The student is responsible for providing human oversight and evaluating the AI-generated content. Student must disclose how AI was used.

Source: Hazleton Area School District
