AI and Teaching: Policies and Guidelines

Information to students

Generative Artificial Intelligence (henceforth GAI) tools have the ability to generate new text, images, data, or other material, rather than just processing existing data. Many such tools are rapidly becoming widely available and user-friendly. Examples include ChatGPT, Google Bard, Claude, Bing Chat, the DALL-E image generator, and others. Given the rapid pace of development, it is currently difficult to define exactly what these tools can and cannot do.

This has crucial implications for academic teaching and learning. GAI tools can provide students with important resources that have the potential to support, improve, and accelerate learning. At the same time, they raise questions about fair and secure assessment (i.e. the risks of cheating and plagiarism), as well as the risk that these tools replace tasks we consider core to academic learning.

Uppsala University is currently preparing centralised guidelines. While we await further instructions, the following draft policy is to be shared with all students enrolled in Department courses. We wish to ensure that GAI tools are used responsibly, ethically, and in a way that promotes and accelerates learning, while we also safeguard fair assessment and discourage cheating. To do so, our current guidelines rest on four pillars:

First, we allow the use of GAI tools for learning, but not instead of learning. GAI tools should neither replace activities that are currently considered core to learning about our subject, nor activities that we currently consider core to academic work. Examples of using GAI for learning include asking ChatGPT to explain concepts, brainstorm ideas, suggest counter-arguments, improve writing, detect errors, or provide feedback. An example of using GAI tools instead of learning is giving ChatGPT the assignment instructions and quoting the generated text directly or with little further engagement. As learning outcomes and forms of assessment vary, your course convener will provide course-specific guidelines on the extent to which GAI tools are suitable for learning in their course.

Second, the learning outcomes always take precedence. Central to all courses are the learning outcomes, as specified in each course plan. These learning outcomes have not become less relevant with the emergence of GAI tools. When grading assignments, teachers can withhold a passing grade if they deem that a submission does not sufficiently reflect independent work (as a learning outcome). Additionally, there is a risk of disciplinary action if an examining teacher, when grading an assignment, concludes that a student has used GAI instead of learning rather than for learning.

Third, as always, students are responsible for all work submitted in their name. When you as a student submit work in your name, it means that you are responsible for the words, images, and data included in the submission. You should be able to explain and justify the material used, describe how the final assignment was completed, and develop your thinking around key ideas or choices made in your submission. If you use a GAI tool to complete an assignment or examination in a way other than the teacher intended, you may mislead the examiner about, for example, your own knowledge and skills or how you carried out the assignment. Such actions can be considered a disciplinary offence according to the Higher Education Ordinance. Further, copy-pasting text, images, or other output directly from a GAI tool could constitute plagiarism.

Fourth, students should be aware of the limitations and risks involved in relying on text generated by GAI tools. These risks include, for example:

  • They often produce incorrect or false information;
  • They may provide nonsensical information, as they cannot recognise whether a prompted question is absurd or has no answer;
  • They may provide you with copyrighted information or personal data that you do not have the right to use;
  • They may not give a balanced view of the subject; for example, they may reproduce myths, biases, and stereotypes as truths;
  • They may plagiarise someone else’s text or ideas without giving proper credit;
  • They may fabricate citations, i.e., they cite sources that do not exist.

Students of peace and conflict studies should familiarise themselves with these risks and remain cognisant of them, as overlooking the risks of GAI use may limit their learning, misrepresent their thinking, and (in the worst case) lead a student to fail an assignment or face disciplinary action. Carefully weighing the risks and benefits of GAI use is also good practice for their future careers, where GAI use may raise similar concerns.

For now, given that we know neither what the central UU guidelines will stipulate nor how the Vice Chancellor’s Disciplinary Board will rule on these matters, we recommend that students exercise caution when using GAI tools. Also note that if you take a course outside the Department, it may take a different approach.

We urge you to follow these guiding principles:

  1. Use GAI tools for learning, not instead of learning.
  2. The learning outcomes of a course take precedence.
  3. Remember that you are always responsible for work submitted in your name.
  4. Be aware of the current limitations and risks of using GAI tools, and remember to check the terms and conditions as with any software.

For questions on how the policy applies to a specific course, contact the course convener.

For questions about the policy, contact the Director of Studies (erika.forsberg@pcr.uu.se).
