Sunday, 15 September 2069.
The long-awaited report from the Eastwood inquiry, set up by the previous government two years ago, has finally been released. That it has taken a change of government to bring the findings into the light of day has been a source of much speculation about a cover-up. Entitled ‘Growth in Student Assignment Fraud and AI Personation’, it makes for uncomfortable reading and exposes major flaws in the current regulatory system. The findings are explosive and indicate that cheating is widespread, with those involved devising ever more ingenious ways to bypass the checks and regulations.
It appears that the inquiry has uncovered a massive criminal conspiracy involving students, university managements and as yet undisclosed criminal gangs. The gangs appear to operate in much the same manner as the ruthless drugs gangs prevalent in the late 20th century. The universities in the Alliance of Independent Providers (AIP) group of institutions come in for particular criticism. There is a suspicion that they specifically targeted students who were performing poorly, offering help with their assignments to boost their rankings, and that staff turned a ‘blind eye’ to what was going on. Perversely, the AIPs had developed in-house the very latest anti-plagiarism technology, which the essays were now bypassing. The tougher external examination process mooted over the last few years would have had little effect.
The inquiry into the role of Artificial Intelligence (AI) in
‘unregulated student assessed assignments’ has its roots in the controversial collapse
of the trial of a student over 20 months ago.
Student X, whose name is still the subject of a court reporting embargo, was
suspended from the University of Middle England (a member of the AIP group of
universities) on suspicion that he was
supplying hundreds of plagiarised essays to his fellow students for thousands
of bitcoins each. He was charged under the 2026 Act and accused in court of
making over ₿6,200 per essay.
The prosecution case hinged on the method used to manipulate
the essays prior to submission. The accused
claimed that the essays genuinely arose from the final efforts of the candidates
themselves before making formal submissions to the university. His role had
been to provide the means and information to assist the students. His counsel
noted that it was the equivalent of supplying a computer 50 years ago or a pen
100 years ago. The accused, he argued, could not be held liable for how the technology was used. In effect, Student X was purchasing top-ranked assignments from earlier cohorts of students and manipulating them for distribution to students in later years. His defence was that he was only a go-between, supplying examples of sound work, a practice that had been legal since 2026. However, and crucially, he also provided access to an AI robot, TurnitOut, which regenerated essays by analysing their lexicon and grammatical structure and replacing key words and sentences to get past the latest anti-plagiarism technology. The prosecution case foundered on two points.
Firstly, using the robot’s efforts to rewrite well-known paragraphs from classic literature as an example did not impress the jury. The opening lines of Herman Melville’s ‘Moby Dick’,
“Call me Ishmael. Some years ago - never mind how long precisely - having little or no money in my purse, and nothing particular to interest me on shore, I thought I would sail about a little and see the watery part of the world.”
became:
“You may refer to me as Ishmael. A few years back, don’t bother counting, I was skint and out of bitcoins and had little interest in what was on dry land. I thought I would set sail for a bit and look at the aqueous sections of this planet.”
Secondly, Student X dramatically revealed
that he was “82% robot himself”, having been largely reconstructed after a boating accident. This raised a tricky question: when does a human become a robot, and can they legitimately be charged with an offence?
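For the curious, the short sketch below gives a deliberately crude flavour of the kind of word-substitution rewriting the court heard attributed to TurnitOut. The synonym table and the rewrite function are invented purely for illustration, and are not TurnitOut’s actual workings; any real rewriting tool would rely on a full language model rather than a lookup table.

import re

# Hypothetical hand-written synonym table, invented for this illustration only.
SYNONYMS = {
    "call": "refer to",
    "money": "bitcoins",
    "shore": "dry land",
    "watery": "aqueous",
    "world": "planet",
}

def rewrite(text: str) -> str:
    """Swap each listed word for its alternative, preserving an initial capital."""
    def swap(match: re.Match) -> str:
        word = match.group(0)
        replacement = SYNONYMS.get(word.lower())
        if replacement is None:
            return word
        return replacement.capitalize() if word[0].isupper() else replacement
    return re.sub(r"[A-Za-z']+", swap, text)

print(rewrite("Call me Ishmael."))  # prints: Refer to me Ishmael.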
Using the work of others in course assessments and assignments in universities and colleges was made a criminal offence in England in 2020, soon after the ‘incident’ of 2019. Then, the defence of the so-called ‘Essay Mills’ was that they were merely supplying examples of good work and were not committing any offence. They “supplied the gun and training but did not shoot it”. Needless to say, the law proved unworkable as more and more ‘Essay Mills’ emerged, selling their products as “bespoke literature searches and technical reports” and marketing them with slogans such as “Let us do the searching for you so you can relax and shop”. A similar law passed across the newly formed Western European Confederation of States (WECs) in 2026 also failed to stem the tide of “cheating by people of means”. This resulted in a shift towards licensing and regulation of the practice in the WECs and, by then, the Former UK States (FUKS). In England the Office for Learner Provision (OfLP) took over licensing of the more official-looking ‘Literature Assisting Technology Enterprises’ (LATE). These were strictly forbidden from altering any documents or assisting with a final assignment, as that remained a criminal offence.
We interviewed the elusive chair of the inquiry, Professor Sir
Harry Eastwood, who is no stranger to government inquiries and is known for his
uncompromising stance on difficult issues. He offered considerable praise for
our investigative reporter, Travis Bickle, who uncovered the initial story. He acknowledged that Travis had taken a great
risk by working as the official driver for the VC of the University of Middle England, the much-feared Virginia Fox. Travis overheard and recorded crucial evidence from conversations inside the car. When challenged, he pretended he was not listening, looking dumb and saying no more than “Are you looking at me?”. However, unbeknownst to his employers, he also enrolled as a part-time student of Applied Criminology and Politics at the university and befriended many of the students who were illegally using AI-generated assignments. They revealed a deep-rooted problem. Professor Eastwood noted that every student should be careful about what they are being offered: “When a student is offering essays at a price, whichever way you look at it, I figure he isn’t out collecting for the Red Cross”.
He was also keen to point out that the system is very unfair to those with little means: “It seems that in this world there’s two kinds of people: those with AI and those who fail”. He warned that the changes needed would be uncompromising, and urged all students considering this route to think hard about what they are really getting in terms of personal development and education: “If you want a guarantee, buy a toaster”.
Travis Bickle,
Investigative Reporter