
Generative AI in the Classroom & Research: Classroom Best Practices

Policies in your classroom

Under the principle of academic freedom, faculty determine the extent to which artificial intelligence may be used in their classrooms. Course policies should be consistent with the University's Academic Integrity policies.

Syllabus language

As generative artificial intelligence (AI) and machine learning tools such as ChatGPT continue to evolve over the next few years, faculty face the challenge of managing how these tools are used in their courses. With permission from the University of Delaware's Center for Teaching & Assessment of Learning, we offer three scenarios and examples of syllabus language that faculty can choose from when considering the role of AI in their courses. Note that a few small wording changes have been made.

It is important to note that, according to the current Academic Integrity policy, using an AI tool when the instructor has prohibited it is considered an academic integrity violation. Additionally, adopting the language, structure, ideas, and/or thoughts of another as one's own original work without appropriate citation is considered plagiarism.

Thus, even if an instructor permits the use of an AI tool, using it without properly citing that use is also plagiarism. We strongly recommend that faculty remind students to always cite any AI tool used to assist with an assignment. The Office of Student Rights and Responsibilities has created several Academic Syllabus Statements that you can add to your syllabi, along with faculty resources and academic integrity activities for inclusion in your course.

 
Please modify these statements to suit the needs of your course.  

USE PROHIBITED

Students are not allowed to use advanced automated tools (artificial intelligence or machine learning tools such as ChatGPT or DALL-E 2) on assignments in this course. Each student is expected to complete each assignment without substantive assistance from others, including automated tools.

USE ONLY WITH PRIOR PERMISSION

Students are allowed to use advanced automated tools (artificial intelligence or machine learning tools such as ChatGPT or DALL-E 2) on assignments in this course if instructor permission is obtained in advance. Unless given permission to use those tools, each student is expected to complete each assignment without substantive assistance from others, including automated tools.

If permission is granted to use advanced automated tools (artificial intelligence or machine learning tools such as ChatGPT or DALL-E 2), their use must be properly documented and credited. Students should consult the appropriate style guide (e.g., MLA, APA) and cite the tool in the correct format; note that online style guides are continually being updated to cover new sources such as ChatGPT.

If a tool is used in an assignment, students must also include a brief (2-3 sentence) description of how they used the tool, in addition to citing its use.

USE WITH ACKNOWLEDGEMENT

Students are allowed to use advanced automated tools (artificial intelligence or machine learning tools such as ChatGPT or DALL-E 2) on assignments in this course if that use is properly documented and credited. Students should consult the appropriate style guide (e.g., MLA, APA) and cite the tool in the correct format; note that online style guides are continually being updated to cover new sources such as ChatGPT.

If a tool is used in an assignment, students must also include a brief (2-3 sentence) description of how they used the tool, in addition to citing its use.

Text adapted from the University of Delaware (n.d.). Considerations for using and addressing advanced automated tools in coursework and assignments. Center for Teaching & Assessment of Learning. Retrieved April 24, 2023, from https://ctal.udel.edu/advanced-automated-tools/

Ideas for assignments

We asked ChatGPT to come up with a list of ideas for undergraduate assignments.

  1. Image Recognition: Have students use an AI-powered image recognition tool to classify images based on certain features or characteristics.

  2. Natural Language Processing: Have students use an AI-powered tool to analyze a large corpus of text data and identify patterns, themes, or sentiment.

  3. Chatbot Development: Have students develop a chatbot using an AI-powered natural language processing tool, which can answer common questions about a topic or subject.

  4. Predictive Modeling: Have students use an AI-powered predictive modeling tool to make predictions about future events or trends based on historical data.

  5. Automated Grading: Have students use an AI-powered tool to grade assignments or essays and compare the results to human grading.

  6. Personalized Learning: Have students use an AI-powered adaptive learning tool to develop personalized learning experiences for themselves or for others.

  7. Recommender Systems: Have students develop a recommender system using AI that suggests relevant resources or materials based on user preferences and behaviors.

  8. Speech Recognition: Have students use an AI-powered speech recognition tool to transcribe and analyze spoken language data.

  9. Image Generation: Have students use an AI-powered image generation tool to create new images based on specific characteristics or features.

  10. Intelligent Tutoring: Have students develop an AI-powered virtual tutor that can provide personalized feedback and guidance to learners as they work through problems or assignments.

These assignments can help students develop a better understanding of how AI works and how it can be applied in different contexts, as well as improve their skills in areas such as data analysis, programming, and critical thinking.
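
To make these ideas more concrete, idea 2 (natural language processing) could be scaffolded with a very small starting script. The sketch below is only an illustration, not a prescribed implementation: it assumes the open-source Hugging Face transformers library (and its default pretrained sentiment model) is installed, and the sample comments are invented for demonstration.

    # Minimal sentiment-analysis sketch for assignment idea 2 (Python).
    # Assumes: pip install transformers torch  (a default pretrained model
    # is downloaded on first use).
    from collections import Counter
    from transformers import pipeline

    # Illustrative sample corpus; in a real assignment students might gather
    # survey responses, reviews, or another small text collection.
    documents = [
        "The lab sessions were engaging and well organized.",
        "I found the pacing of the lectures far too fast.",
        "Office hours were genuinely helpful this semester.",
    ]

    classifier = pipeline("sentiment-analysis")  # default pretrained model
    results = classifier(documents)              # one {'label', 'score'} dict per text

    # Tally the predicted labels to summarize overall sentiment.
    label_counts = Counter(result["label"] for result in results)

    for doc, result in zip(documents, results):
        print(f"{result['label']:>8}  ({result['score']:.2f})  {doc}")
    print("Summary:", dict(label_counts))

Students could then compare the model's labels with their own judgments and discuss where the tool misclassifies text, which connects the exercise to the bias and transparency concerns discussed below.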

What does ChatGPT say?

We asked ChatGPT to answer the following questions.

Q: How can faculty in higher education use artificial intelligence?

A: Higher education faculty can use artificial intelligence (AI) in a variety of ways to enhance teaching and learning. Here are a few examples:

  1. Personalized Learning: AI-powered adaptive learning systems can be used to create personalized learning experiences for students, by analyzing their learning patterns and adapting the content and pace of instruction to their individual needs.

  2. Automated Grading: AI-powered tools can be used to grade assignments and provide feedback to students, freeing up faculty time for other tasks such as research, course development, or student mentoring.

  3. Intelligent Tutoring: AI-powered virtual tutors can be used to provide students with immediate feedback and guidance as they work through problems or assignments, helping them to stay on track and master the material more quickly.

  4. Chatbots: AI-powered chatbots can be used to provide students with answers to frequently asked questions, such as course schedules, office hours, or assignment deadlines. This can help reduce the workload on faculty members who might otherwise spend significant time answering the same questions repeatedly.

  5. Research: AI can be used by faculty to conduct research, by analyzing data, identifying patterns, and making predictions. This can help faculty members to develop new insights and solutions in their field, and can also lead to new opportunities for collaboration and funding.

Overall, the use of AI in higher education has the potential to greatly enhance teaching and learning, by providing personalized learning experiences, automating repetitive tasks, and supporting faculty research. However, it is important to carefully consider the potential ethical implications of using AI in education and ensure that AI-powered tools are being used in a responsible and transparent way.

Q: What are the ethical implications of using artificial intelligence in assignments for higher education?

A: There are several ethical implications of using artificial intelligence (AI) in assignments for higher education. Here are a few examples:

  1. Unfair Advantage: AI-powered tools like automated essay grading software or plagiarism detection systems can provide an unfair advantage to students who use them. This could lead to academic dishonesty, and ultimately undermine the integrity of the educational system.

  2. Privacy Concerns: AI tools may require access to student data, which raises privacy concerns. In addition, if student data is used to train AI models, there is a risk that it could be used to create profiles or make predictions about students that could be used in ways that are harmful to them.

  3. Bias: AI models are only as good as the data they are trained on, which means that they can perpetuate biases that exist in the data. This could lead to unfair treatment of certain students or groups of students, particularly those who come from underrepresented backgrounds.

  4. Lack of Transparency: Some AI models, particularly those that use deep learning techniques, can be difficult to interpret. This lack of transparency could make it difficult to understand how decisions are being made, which could be problematic in situations where decisions have significant consequences.

Overall, while there are potential benefits to using AI in higher education assignments, it is important to carefully consider the ethical implications and take steps to mitigate any potential harm. This may involve being transparent about the use of AI, ensuring that AI-powered tools are fair and unbiased, and protecting student privacy.

Should I cite ChatGPT?

Citing ChatGPT and other generative AI tools keeps your work transparent and demonstrates ethical practice. If generative AI is used in research, best practice dictates mentioning the use of the tool in the narrative.

Suggested citation formats are available from the major style guides (e.g., MLA, APA).

Copyright and AI-generated content

The US Copyright Office “has launched an initiative to examine the copyright law and policy issues raised by artificial intelligence (AI) technology, including the scope of copyright in works generated using AI tools and the use of copyrighted materials in AI training. After convening public listening sessions in the first half of 2023 to gather information about current technologies and their impact, the Office will publish a notice of inquiry in the Federal Register.” Additional announcements and links to documents are available on the US Copyright Office's website.

Their basic position as of mid-March 2023 is that texts and images solely or primarily generated by AI are not eligible for copyright protection; texts and images that originated in AI but have had significant authorial contributions may be eligible on a case-by-case basis. Guidance published in the Federal Register on March 16, 2023, outlines this position and gives instructions for completing copyright registration forms to acknowledge the role of AI in any work being submitted.

A word about hallucination

Hallucination refers to the outright fabrication of information by generative artificial intelligence. Because these tools emulate human writing by drawing statistical connections from large datasets rather than thinking for themselves, their output is not consistently factually accurate.

Instructors should exercise caution when permitting students to rely on AI for evidence-based assignments, particularly because AI may present false information with a confidence that leads students to believe the generator is credible. For example, ChatGPT has been known to fabricate references for sources, as well as to provide fake evidence for health sciences inquiries (Alkaissi & McFarlane, 2023).

 

Reference

Alkaissi, H., & McFarlane, S. (2023). Artificial hallucinations in ChatGPT: Implications in scientific writing. Cureus. https://doi.org/10.7759/cureus.35179