To AI or not to AI: CMU says 'we do'
Artificial intelligence has been a polarizing topic since its inception, and institutions have been debating whether to embrace or ditch it. Central Michigan University seems to be doing the former.
CMU is learning more about the world of AI and how it can be better used on campus by students, faculty and staff.
University Provost Paula Lancaster said the university is examining ways to help faculty use AI tools to enhance teaching and learning, as well as ways to equip students to become proficient, ethical users of the technology.
“As a university, we are exploring all the opportunities we have to leverage AI while also mitigating risks that come along with the adoption of new technology,” Lancaster wrote in an email statement.
She also said that CMU faculty and staff, including the Office of Curriculum and Instructional Support, have been experimenting with AI for several years.
One of the AI seeds CMU planted was adding its first special advisor for artificial intelligence, Ben Andera, to its workforce.
“The purpose of establishing this role was to provide the bandwidth for a senior leader to focus on strategic exploration, governance and university-wide integration of artificial intelligence,” Jim Bujaki, vice president of the Office of Information Technology at CMU, said in a statement.
The position was announced on Feb. 2 by Central Michigan University President Neil MacKinnon and other leaders across campus. Andera said his position will move CMU forward in the field of artificial intelligence.
“We’ve been talking a lot about it at a leadership level of just how this impacts CMU,” Andera said. “Just really excited to work with leaders all across campus to find ways to be more successful, to utilize it in good ways and to be doing it in a safe and thoughtful way.”
His role is to learn the challenges students, faculty and staff face regarding AI and to guide a coordinated approach to adoption, focused on getting the most out of the technology for the campus.
“We already are embracing AI. Each one of the colleges (across campus) has different groups that are meeting, and faculty are trying to understand it,” Andera said. “We have different RSOs that students are pushing AI on.”
He said CMU has agreements with Firefly by Adobe and Copilot by Microsoft. CMU is also testing its own AI.
“We’re actually building an environment, our own technology and our data center that can run, so we can bring down open-source models and run them,” Andera said. “The data never leaves our data center.”
CMU's 2023-2028 Strategic Plan guides the university's direction, and Andera's new position will develop an AI-focused strategic roadmap within it.
He said that, as of right now, the roadmap could take the form of a webpage or hub where students can connect with AI resources.
He said the university currently needs to update its policies, bring clarity, raise awareness of AI and focus on training efforts.
“We really need to continue to expand on the infrastructure capabilities that we have,” Andera said. “There are faculty that really want to do exciting things with restricted data.”
He said there have been listening sessions to hear feedback on AI from faculty, staff and students.
He said the main feedback he received was that students wanted more clarity about whether they could use AI in their academics.
Andera also said university leaders don’t want to “impede on faculty’s autonomy” when making decisions about AI in the classroom.
“They know best of how their students and their classes should work,” he said. “Faculty get to decide what tools get used or don’t use.
“They get to decide whether it’s an open-book or closed-book test, right?”
AI regulation
Despite administrators embracing AI, many course syllabi still include a section on limiting usage in the classroom. Professors are given a sample statement that many include in their syllabi.
The statement says the work students submit must be the product of their own efforts and that submitting AI-generated products as their own original work is prohibited.
It states behaviors that constitute academic dishonesty are noted in the CMU Bulletin or in the university’s Academic Integrity Policy.
However, the Academic Integrity Policy does not mention AI or state what will happen if students use it for their coursework.
“The inclusion of AI usage when it comes to academic integrity is ultimately up to the faculty member,” Douglas Kendrick, assistant director of student conduct, wrote in an email statement. “Typically, the faculty member will note in their syllabi if the use (of) AI is permitted or not in their course.”
Stephen Juris is the director of Institutional Assessment and Curriculum and the chair of the university’s Academic Senate. He said if a student is not allowed to use AI in the classroom but does so anyway, they are subject to disciplinary action under the Code of Student Conduct.
“The problem associated with it, it’s a lot harder to detect AI use, right?” Juris said. “There are faculty members who want an AI-detection tool and say, ‘I need this to my advantage to be able to know if a student is using it unethically, if they’re using it improperly.’”
He said if a faculty member accuses a student of using AI for their coursework, the faculty member would need substantial evidence to factually support the decision.
In general, Andera said, rules have been in place to protect both students’ and faculty’s private data.
“There’s really two policies that we’ve had for decades that tried to keep people from taking a bunch of data, exfiltrating it and putting it into a Google Drive, or giving that data away,” he said.
One of those policies is the Responsible Use of Computing Policy, Andera said. It requires students and faculty to maintain regularly used technologies and networks while preserving the privacy and integrity of those resources.
Juris said that decisions on the future regulation of AI in academia would fall to the Academic Senate.

