‘Really up to them’: Faculty members take lead on AI syllabus … – Duke Chronicle

Duke Learning Innovation responded to concerns about the use of artificial intelligence to commit academic dishonesty by creating a set of guidelines for faculty to consult as they design their courses.

As large language models such as OpenAI's ChatGPT become more accessible, professors worry that students may use these programs to draft papers, solve math equations and complete other assignments. Last year, before the University released official guidance on the use of AI in the classroom, some faculty members opted to change their courses in response, while others didn't believe it was necessary to make changes just yet.

Now, universities across the country, including many of Duke's peers, have released guidance on how professors can address AI's use in the classroom. Duke's own guidelines recognize that it is up to each professor to determine whether they will allow AI to be used in their courses.

Some faculty members, like Professor of Statistical Science Jerry Reiter, have made changes to their syllabi for the fall semester to address the use of AI. Reiter does not allow students to use AI to complete assignments in his course, Statistical Science 322/522, Design of Surveys and Causal Studies.

Students need to sit and struggle with the problems in order to get the fullest conceptual understanding, something that cannot be achieved by simply plugging an equation into AI, he said.

"I try to provide a lot of office hours and TA office hours and help for students who struggle so that they can get those questions answered and hopefully not have to turn to the AI for help," Reiter said. "For me, it's really about, how can I set up my course so that students get the most out of it?"

Students cannot use AI in a manner that violates the Duke Community Standard, which considers "using, consulting and/or maintaining unauthorized shared resources including, but not limited to, test banks, solutions materials and/or artificial intelligence" as a form of cheating.

Denise Comer, professor of the practice and director of the Thompson Writing Program, also stressed the importance of providing students with additional resources for classes where the use of AI is prohibited. She highlighted the Thompson Writing Studio as a useful resource for writers at any stage of their work.

"You might be shortchanging your own education and development and growth by taking unauthorized shortcuts or by engaging in questionable ethical decisions," she said. "If students are thinking of making an unethical choice that's against the policy on the syllabus, [the next step] might be to recognize that writing is thinking, and when we engage ourselves as humans in the writing process, we're actually thinking through ideas and developing perspectives."

Comer also said she appreciates that Duke Learning Innovation acknowledges the benefits of AI in academia, alongside its drawbacks.

Her colleague, Xiao Tan, a lecturer in the Thompson Writing Program, received funding from the Pellets Foundation to license generative AI that allows her students to create photographic essays with AI-generated images.

"Some of my colleagues in the writing program are also using generative AI to offer opportunities for students to think really deeply about various aspects of writing, such as revision," Comer said.

Both Reiter and Comer said they believe that the guidelines have validated the perspectives of individual professors and encouraged them to take the lead on how AI should be used.

Reiter said he appreciates that Duke is giving faculty both freedom and guidance to make their own decisions about the role of AI in their courses.

"Faculty should have ownership of their course design and how the learning outcomes are addressed throughout the course," Comer said. "It's really up to them."
