Exploring Dimensions of Faculty Use of AI in a Liberal Arts Context (Part 1: Syllabus Design)

Students and faculty during the CLT symposium

This article is part one of a three-part series that recaps the process and findings from a session CLT ran during the November 2025 symposium, where we brought together faculty members, students, and staff to explore a new angle in the AI discussion: faculty use of AI. Each table had at least one student along with some faculty members and staff, and was given scenarios to discuss.

The Process

To warm up, we asked participants to share how they feel about AI these days, and to share ways they have heard of faculty members using AI in their courses. 

In order to frame the conversation, we first discussed a variety of well known ethical frameworks, phrased in simple language in case some of the students had not been previously exposed to these approaches. Participants were encouraged to approach each upcoming scenario by thinking about what good and poor consequences were possible (what is called utilitarianism in philosophy). They were also asked to analyze whose rights are affected and what the duties of each stakeholder are. Finally, they were invited to consider equity issues, such as how particular actions might harm some groups while benefiting others.

One concrete example was shared by CLT so participants could see how these ethical frameworks could be applied (see Table 1 below).

Ethical lens | Example of AI Detector Use
Consequences | Possible good: may be a deterrent. Possible bad: may falsely accuse an innocent student.
Rights | Students' rights to privacy and security of their own data and intellectual property
Duties | Transparency of students and teachers about AI use and AI detector use
Equity considerations | More likely to flag non-native speakers' writing as AI-generated (a bias)

Table 1: The Example of AI Detector Use, from Multiple Ethical Lenses

Each table was given one of three scenarios, each describing a situation where a faculty member used AI in 1) creating their syllabus, 2) giving feedback, or 3) designing assessments.

To ensure that conversations at each table were not dominated by one or two people, we used a timed, turn-taking Liberating Structure called Conversation Cafe (watch a demo video). After discussing the scenario through each of the ethical lenses above (over several rounds, one per lens), participants were invited to suggest potential guidelines that AUC could develop to promote a more ethical and equitable way forward for faculty use of AI. CLT facilitators (the co-authors of this article) helped with time-keeping and note-taking, and in some cases provided input. At the end of the process, a participant from each table shared their table's key takeaways with everyone present.

We will publish the outcomes of this workshop as a three-part series. Each part will present one of the fictional scenarios and a summary of the findings from the tables that discussed it, followed by some additional resources for faculty on the topic. Part 1 is on using AI for syllabus design, part 2 is on using AI for giving feedback, and part 3 is on using AI to create assessments.

Scenario: Syllabus Design

Description

A faculty member is told in the first week of the semester that the class they thought they would be teaching has been canceled, and that they must instead teach a course they have never taught before. The course falls outside their specialization, though they are still able to teach it. The department chair sent only the course description, with no previous syllabus to start from. With the first day of class tomorrow, the professor gives an AI tool the course description and prompts it to create a syllabus.

Findings

Consequences

  • Positive consequences for faculty included saving time and effort, generating ideas, designing layouts for scaffolding, and offering a synopsis and some reading recommendations. However, this holds only if the AI is used the "correct way," and it is always necessary to check the output.
  • Negative consequences: the course description might be insufficient for AI to create a suitable syllabus or propose appropriate assessments; an AI-generated syllabus might be too general, appearing good on the surface but lacking depth; the AI may not know the students' context and may write a syllabus suited to, say, a US university but not an Egyptian one. AI is not an expert on the subject matter.

Rights

  • Faculty member's right to have enough time to learn the material they are teaching, to prepare, and to receive a sample syllabus to adapt.
  • Students' right to a professor who is an expert in the field and subject matter and is well prepared.
  • Students have a right to see an AI disclaimer by the professor on the syllabus.

Duties

  • Administrators' duty to give faculty members sufficient notice and support/guidance when assigning a new course, and to assign courses to someone knowledgeable about the topic
  • Department’s duty to keep and provide sample syllabi
  • Faculty members' duty to revise the output of AI for accuracy and accessibility, and to disclose their use of AI in the syllabus

Equity considerations

  • This situation is unfair to both students and faculty: the professor is asked to create a syllabus from scratch without being given a sample or guidance.
  • Inequities exist between those who have access to paid AI versions and those who use the free versions, which produce lower-quality outputs.

Possible guidelines

  • Alignment between institutional/departmental AI policies and faculty AI policies
  • Faculty members to acknowledge the use of AI in the syllabus so that students are aware of faculty use
  • Faculty members to check all content generated by AI and change it or add their own ideas
  • Students felt this could be a chance to co-create with the professor. They suggested the professor ask students what they would like to learn and use that input to guide the structure of the course or the learning experience. This may be more appropriate for higher-level courses than for freshman courses.

Recommended Resources

CLT recommends that faculty members read:

  1. Laila ElSerty and Rania Jabr's article Building Teacher Confidence by Making Responsible Choices to Teach Using AI, which includes a section on using AI for designing lesson plans.
  2. Engageli's blog post How to Use AI for eLearning Course Creation: Tips & Tools, which offers insights into leveraging AI for curriculum development and content generation.
  3. This "prompt library" for using AI for educational purposes such as lesson plan and rubric creation.

Stay tuned for part 2 on using AI for giving feedback.

Associate Professor of Practice at AUC
