As AI and machine-learning tools such as Microsoft Copilot and Google Gemini
continue to evolve and expand their capabilities, faculty are asked to consider the
role these tools will play in their courses and how best to manage their use in ways
that support teaching and learning.
Since policies will vary across courses and instructors, it is important that faculty
communicate with students clearly and early in the semester. It is also important
to keep in mind that students will be navigating multiple courses with varying AI policies,
which can be confusing if guidance is inconsistent or unclear.
To help students understand what AI use will look like in a course, faculty should
develop a clear and consistent AI policy that students can easily reference throughout
the semester and revisit as needed. In addition to a syllabus statement, we also recommend
incorporating specific AI guidance within individual assignment guidelines.
Prior to creating your AI syllabus statement, take some time to reflect on your course
goals and learning outcomes.
The following questions can be helpful:
Is the use of generative AI going to be permitted in my course?
What defines appropriate and ethical usage of generative AI? What parameters will I set?
For what purpose will students be permitted to use these tools? Can they use them
for brainstorming? Proofreading? Composing text?
What will the expectations be for students who use AI?
What do I consider to be academic dishonesty within my course with respect to generative
AI usage?
How can I ensure that students are informed of and respect any confidentiality and/or
privacy policies?
How will I ensure that students understand their responsibilities when working with
AI-generated content?
What are the indicators of student misuse of AI tools? How will I review assignments
to ensure adherence to our AI policies and guidelines?
Will there be any assignments where my expectations deviate from the guidelines discussed
at the beginning of the semester? How can I best communicate this to students?
The AI Policy Table from the American Historical Association can be a great resource for considering
what constitutes acceptable use of AI tools as you develop your own policy for
your courses.
Once you have developed your course AI policy, it is important to think about where,
when, and how you will communicate this information to your students and share this
policy with them. Some recommendations for this include:
Post your policy in an easy-to-access location. We recommend including this information
in your Getting Started module with other important course materials such as the course
syllabus (and including it in your syllabus as well).
Communicate your expectations and policies on AI usage to students at the beginning
of the semester.
Be open to answering questions about your policy, and use the opportunity to address
any reservations or doubts students may have.
Remind students of your policy before major assignments are due.
If instructions or guidance change at any point, be sure to let students know.
Common AI Policy Frameworks
Most often, AI usage in a course falls into one of the three categories below. When
developing your own policy, it can be helpful to review these to determine which
one aligns most closely with the expectations for your course.
If the usage of AI tools is permitted in your course, we recommend including the following
information in your syllabus:
Clearly identify which assignments allow the use of AI tools
Indicate where students can find guidance on appropriate and acceptable AI use
The following additions can also be helpful:
Discuss or provide best practices on creating effective prompts and generating high-quality
AI outputs
Provide detailed instructions for how students can cite and disclose their use of
AI tools
If the usage of AI tools is limited in your course, we recommend including the following
information in your syllabus:
Clearly identify which assignments allow the use of AI tools
Indicate where students can find guidance on appropriate and acceptable AI use
The following additions can also be helpful:
Discuss or provide best practices on creating effective prompts and generating high-quality
AI outputs
Highlight and explain your policy in areas that may cause confusion
Provide detailed instructions for how students can cite and disclose their use of
AI tools
If you decide not to allow the use of AI tools in your course, we recommend including the following
information in your syllabus:
A clear statement prohibiting the use of AI tools
Clear and explicit language indicating that AI tools may not be used for any part
of the course
Considerations for following up on this policy include:
Reiterating the policy in class and through class announcements when appropriate
Clearly explaining the consequences for violating the policy
Establishing a plan for how unauthorized AI usage will be addressed
Another option is to adopt a tiered framework for AI use across course assignments,
with expectations adjusted based on the purpose and learning goals of each assignment.
The AI Assessment Scale, developed by Mike Perkins, Leon Furze, Jasper Roe, and Jason MacVaugh, offers a
structured model that helps both instructors and students understand appropriate levels
of AI involvement in assessment design. Similarly, the Stoplight Model uses a stoplight metaphor to illustrate varying degrees of permitted AI use across
assignments.
Sample AI Syllabus Statements
Review the sample AI syllabus statements below for guidance. These examples are intended
to serve as starting points and may be adapted to align with your course goals, teaching
approach, and expectations for student use of AI. Faculty are encouraged to revise
the language as needed before incorporating it into their syllabus.
In [Course Name / Course Number], the use of artificial intelligence (AI) tools is permitted and encouraged as a learning aid when used responsibly and transparently. AI tools include, but
are not limited to, generative text tools (e.g., Microsoft Copilot, Google Gemini,
etc.), AI-enhanced grammar and writing tools, data analysis tools, image generators,
and discipline-specific AI applications relevant to [your field/discipline].
You may use AI tools to support learning tasks such as [brainstorming ideas, exploring concepts, checking clarity or grammar, practicing
problem-solving, generating feedback on drafts, or other learning tasks here]. However, AI tools should not replace your own thinking, analysis, or decision-making. All submitted work must reflect your understanding of the course material and achievement
of the stated learning outcomes.
Please be aware that AI tools can produce biased, misleading, or inaccurate results.
You are responsible for critically evaluating and confirming the accuracy of any AI-generated
content used in your coursework. You should also consider data privacy implications,
as many AI platforms may retain or reuse user-provided content. Do not upload copyrighted
materials, original work, or personal information unless explicitly permitted.
Student Expectations and Disclosure
You are expected to critically evaluate and verify any AI-generated content before
using it.
You must clearly disclose AI use when submitting work, including which tool was used, how it was used, and at what stage of the assignment (e.g., idea generation, drafting, revision).
AI-assisted work must comply with citation and attribution guidelines outlined in
[citation style].
Academic Integrity
Failure to disclose AI use, misrepresenting AI-generated work as entirely your own,
or using AI in ways that undermine the learning objectives of the course may constitute
a violation of the University Academic Integrity Policy and can result in a referral to the academic judiciary.
In [Course Name / Course Number], the use of artificial intelligence (AI) tools is permitted in limited and clearly defined ways. AI tools include, but are not limited to, generative AI platforms (e.g., Microsoft
Copilot, Google Gemini, etc.), AI-powered grammar and editing tools, and discipline-specific
AI applications.
AI may be used only for the following purposes or stages of work:
[e.g., brainstorming topics, outlining, grammar checks, coding support, data visualization]
These limitations are designed to support learning while ensuring that core intellectual
work remains your own. If you are unsure about whether a particular use of AI is appropriate,
please check with the instructor before proceeding.
Please be aware that AI tools can produce biased, misleading, or inaccurate results.
You are responsible for critically evaluating and confirming the accuracy of any AI-generated
content used in your coursework. You should also consider data privacy implications,
as many AI platforms may retain or reuse user-provided content. Do not upload copyrighted
materials, original work, or personal information unless explicitly permitted.
Student Expectations and Disclosure
Students must follow the AI-use boundaries established for each assignment.
Any use of AI tools must be disclosed according to [instructor's disclosure requirement].
AI-generated content must be reviewed for accuracy, bias, and relevance and must not
replace original thought or analysis.
Where allowed, AI contributions must be cited or acknowledged following [citation guidance].
Academic Integrity
Failure to disclose AI use, misrepresenting AI-generated work as entirely your own,
or using AI in ways that undermine the learning objectives of the course may constitute
a violation of the University Academic Integrity Policy and can result in a referral to the academic judiciary.
In [Course Name / Course Number], the use of artificial intelligence (AI) tools is prohibited unless explicitly authorized by the instructor. AI tools include, but are not limited to, generative AI platforms (e.g., Microsoft
Copilot, Google Gemini, etc.), AI-enabled writing or editing tools, and discipline-specific
AI applications.
This restriction applies to [all coursework OR specific assignments such as exams, reflections, clinical scenarios,
research analyses] and is intended to ensure that all submitted work directly reflects your independent
knowledge, skills, and learning.
If you are unsure about whether a particular use of AI is appropriate, please check
with the instructor before proceeding.
Student Expectations and Disclosure
Students may not use AI tools for coursework unless written permission is provided
by the instructor.
If AI use is approved for a specific task, students must disclose the tool used, how
it was used, and the extent of its contribution.
Unauthorized AI use and failure to disclose approved AI assistance are both violations of this policy.
Academic Integrity
Unauthorized use of AI tools or misrepresentation of AI-generated work as your own
constitutes a violation of the University Academic Integrity Policy and can result in a referral to the academic judiciary.
The following document, created by Lance Eaton, Senior Associate Director of AI in Teaching and Learning at Northeastern University,
offers examples of AI syllabus statements tailored to different levels of allowance
across a variety of academic disciplines. This resource can serve as a valuable guide
when crafting your own AI syllabus statements for your courses. When establishing
your policy, it’s crucial to communicate this decision to your students and, as
mentioned earlier, include reminders about AI usage in your assignment instructions
as well.
Important Note on AI Detection Tools
At Stony Brook, Turnitin’s AI detection is available and can be enabled for assignments in Brightspace. That said,
AI detection tools are not fully reliable and can produce false positives and
false negatives. We do not recommend using AI detection tools as the sole means of
determining academic dishonesty. To ensure fairness and accuracy, we recommend following
our AI-Flagged Paper Evaluation Process. We do not recommend the use of AI detection tools that have not been approved by
the university.