Man, we were out here worried about teenagers using AI to cheat at school...
But what we really need to worry about is bureaucrats using AI to cheat at their jobs.
Alaska's Education Commissioner Deena Bishop drafted a policy on cellphone use in schools, posted the draft online, and presented it to the Board of Education. When people read the proposal, some noticed that it cited studies that didn't exist.
The state's top education official relied on generative artificial intelligence to draft a proposed policy on cellphone use in Alaska schools, which resulted in a state document citing supposed academic studies that don't exist.
The document did not disclose that AI had been used in its conception. At least some of that AI-generated false information ended up in front of state Board of Education and Early Development members.
If you're going to cheat at your job and use AI to do your work for you, at least double-check all the info!
This lady didn't even have enough sense to ask her staff to look through the draft before she presented it to the Board of Education!
A department spokesperson first called the false sources 'placeholders.' They were cited throughout the body of a resolution posted on the department's website in advance of a state board of education meeting, which was held in the Matanuska-Susitna Borough this month.
Later, state Education Commissioner Deena Bishop said they were part of a first draft, and that she used generative AI to create the citations. She said she realized her error before the meeting and sent correct citations to board members. The board adopted the resolution.
The whole report was peppered with fake citations, made up by artificial intelligence. And, of course, this didn't stop the board from accepting and adopting the resolution.
I guess the report was just a formality?
Four of the six studies cited were attributed to real online scientific journals, but the web links didn't work, and searches revealed that the articles didn't exist. The AI program simply made them up to support the conclusion.
You know, kind of like the mainstream media.
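For what it's worth, catching this kind of thing before a board meeting doesn't even require expertise. Here's a minimal Python sketch of the idea (the `Citation` fields and the `url_resolves` checker are hypothetical illustrations, not anything the department actually used): flag every cited source whose link fails to resolve, then verify those by hand against the journal's own index.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Citation:
    title: str
    journal: str
    url: str

def flag_suspect_citations(
    citations: List[Citation],
    url_resolves: Callable[[str], bool],
) -> List[Citation]:
    """Return citations whose links are missing or fail to resolve.

    These aren't proven fake -- links rot for legitimate papers too --
    but they are the ones a human should verify before publishing.
    """
    return [c for c in citations if not c.url or not url_resolves(c.url)]

# Example with a stub resolver standing in for a real HTTP check:
resolver = lambda u: u == "https://example.org/real-study"
cites = [
    Citation("Real study", "J. Educ.", "https://example.org/real-study"),
    Citation("Fabricated study", "J. Educ.", "https://example.org/nope"),
]
suspects = flag_suspect_citations(cites, resolver)  # → [the fabricated one]
```

In practice `url_resolves` would wrap an HTTP request (for example, `urllib.request.urlopen` inside a try/except), and a DOI lookup against a registry like Crossref would be a stronger check than a raw link fetch.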
Ellie Pavlick, an assistant professor of computer science and linguistics at Brown University and a research scientist at Google DeepMind, reviewed the citations and said they look like other fake citations she has seen AI generate.
'That is exactly the type of pattern that one sees with AI-hallucinated citations,' she said.
A hallucination is the term used when an AI system generates misleading or false information, usually because the model doesn't have enough data or makes incorrect assumptions.
Even when they were caught, they gave the lame "placeholder" excuse before updating the document. Then, even after they updated the document, they didn't remove all of the references to fabricated studies!
If you thought bureaucracy was bad already, wait until the robots are the ones making up all the rules out of thin air!
P.S. Now check out our latest video!