This document offers guidance as well as ideas for how teaching staff and programme teams can engage with AI in their learning, teaching and assessment practice.
What is AI?
AI is a collective term for a set of technologies that use large data sets to perform tasks such as creating an image, writing text, or responding to questions. It is closely related to fields such as machine learning and data processing. An example currently prominent in education is ChatGPT, but there are many other similar technologies. An AI system is initially ‘trained’ on a large data set and then uses input data to refine its approaches, so the more work it conducts in a particular area, the more refined its outputs become.
AI is here to stay: It's worth pointing out that AI is already widely used in the form of spell-checking, predictive text, translation and speech-to-text. Hence, while recent developments might be seen to pose a greater threat to academic integrity, our response should be to adapt learning, teaching and assessment practices to work with emerging technology and support our students to learn to use it ethically and responsibly. Of course, any technology can be disruptive and problematic if inappropriately used. However, AI offers a wide range of benefits and opportunities for education and employment. It has the potential to reduce barriers to learning by helping students, who may otherwise struggle, to understand and reflect on the information provided in class or on the VLE (Virtual Learning Environment). Therefore, we need to consider a balance of actions to encourage the informed and responsible use of artificial intelligence. In addition to having clear strategies to discourage misuse, we need to prepare our students for a rapidly changing employment market where effective use of AI is likely to be an important skill.
Using AI for learning
Below are some suggestions to promote the responsible use of this technology. Now is the time to try something innovative that may also prove to be more effective and motivating for students.
- Get inquisitive and share your understanding: Try out some of these emerging AI tools to understand how they are changing common tasks. Consider how the technology might help students learn. Bear in mind that this is a rapidly developing field, so the nature and capacity of tools will change. Try to keep up to date and share your understanding with colleagues.
- Create clear expectations: Discuss with students the concept of academic authorship, its relevance to the use of AI, and the extent to which AI can be used to support the development of academic writing skills. Use this as an opportunity to reinforce debate around academic honesty and integrity, intellectual property, and the best use of technology. Explain and show how these systems make serious mistakes and exhibit biases that require critical thinking, subject knowledge and understanding to identify and correct. Repeat these messages in lectures and tutorials; don’t just rely on a one-off talk.
- Review assessment strategies: Outlined below are suggestions for how assessment can be managed to reduce the likelihood of cheating (through misuse of AI and other means). Some of these can be implemented immediately; others may require approval through institutional quality processes. Individual strategies implemented at module level can have some impact, but the most effective way to manage this issue is through coordinated, collective, programme-level action.
The University of Greenwich has produced a rubric, the AI Risk Measure Scale (ARMS), which may help structure the review of assessment strategies (see below). Questions to ask the team: Are the examples below valid representations of each description? Which of your assessments fall into the higher and lower categories? How can assessment be developed to use AI purposefully?
| AI Risk Measure Scale | Description | Examples |
| --- | --- | --- |
| 1 Very low | It is highly unlikely that students can use AI to produce this type of assignment. | Assignments that embed authenticity in the design (e.g. field trip + reflective report) and assignments that allow establishing the identity of the person (e.g. presentations, in-person exams). Subjective assignments that require personal reflection or creative thinking, such as personal narratives or artistic projects; these are typically based on the student's own opinions and insights, which are difficult to replicate using AI. |
| 2 Low | Students could potentially use AI to produce the assignment, but it is very unlikely to have a significant impact on the assignment's quality and/or originality. | Assignments that draw on unique teaching material (e.g. novel cases produced by the tutor). Assignments with clear guidelines, such as solving maths problems or coding exercises, where AI could assist but the student's approach or solution is the main focus of what is being evaluated. |
| 3 Moderate | There is a moderate likelihood that students can use AI to produce the assignment, and it could have a moderate impact on the assignment's quality and/or originality. | Assignments where AI could assist students, but the final work still requires the student's critical thinking, analysis and interpretation. Assignments that require a more complex analysis of a topic, e.g. a critical analysis essay or a scientific report. Students may use AI tools to help with data analysis, visualisation or interpretation in some areas, but the writing and argumentation are largely based on the student's understanding and critical thinking. |
| 4 High | It is easy for students to use AI to produce the assignment, and it could significantly impact the assignment's quality and/or originality. | Assignments that focus on well-published company case studies (e.g. Innocent, Apple, Boohoo, Starbucks) and rather generic topics (e.g. advantages and disadvantages of FDI), which students can easily obtain through AI bots. Assignments that involve sophisticated algorithms or complex modelling, such as financial forecasting, predictive analytics or image recognition, where students could use AI to generate both results and insights/commentary. |
| 5 Very high | It is very easy for students to use AI to produce the assignment, and it will have a significant impact on the assignment's quality and/or originality. | Assignments that require students to produce summaries or abstracts of published articles, reports or research papers (including research proposals); these may require no input or modification from students and can be produced entirely by AI. Assignments that involve large-scale data processing, such as machine learning projects or artificial intelligence simulations, where students could rely entirely on AI to generate both results and analysis. |
What can you do now?
- Use authentic assessments: AI tools can struggle to connect theoretical knowledge to real-world applications. Challenging students with scenarios that require creativity and problem solving, and that relate to their own lives, should limit the scope for cheating and increase engagement.
- Get specific: AI works best when topics are generic and the salient information is widely available. So, ask students to work with specific data sets, refer to course content, or use specified references.
- Think about referencing: AI-generated texts often include fictional references. Asking students to supply hyperlinks to their reference materials will enable you to quickly check whether those references are real or not (see the first sketch after this list).
- Be clear in your assignment brief about what is needed: If students are confused about what is required, they may be more inclined to rely on AI to complete the assignment.
- Promote the wider purpose of the assignment: Make it clear to students how the skills and knowledge gained from the assignment relate to other assignments, their future career, or their life goals.
- Reduce assessment bunching: Assessment bunching across a programme may prompt students to rely on AI as a shortcut to meet deadlines. Block teaching or staggered module end-dates can help reduce end-point assessment overload.
- Check your assignments: Test your own assignment titles using generative AI tools to understand the type of output they can create and how exposed your assessment design is to AI (see the second sketch after this list).
- Consider subject discipline: AI is strong when it comes to structured and predictable content; it presently excels at generating answers for STEM subjects and at writing programming code. Consider this when setting work in these areas.
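As a quick way to act on the referencing point above, student-supplied links can be checked in bulk. The following is a minimal sketch in Python, assuming references have been collected one URL per line in a plain-text file; the filename and the use of the requests library are illustrative, not a prescribed workflow:

```python
# A minimal sketch: batch-check student-supplied reference links.
# Assumes references are collected one URL per line in a plain-text
# file; "references.txt" is a hypothetical filename.
import requests

def check_links(path: str) -> None:
    with open(path) as f:
        urls = [line.strip() for line in f if line.strip()]
    for url in urls:
        try:
            # A HEAD request keeps things lightweight; redirects are
            # followed so DOI-style links resolve to the publisher page.
            response = requests.head(url, allow_redirects=True, timeout=10)
            status = "OK" if response.status_code < 400 else f"BROKEN ({response.status_code})"
        except requests.RequestException as exc:
            status = f"UNREACHABLE ({type(exc).__name__})"
        print(f"{status}: {url}")

check_links("references.txt")
```

Bear in mind that a working link only shows the source exists; whether it actually supports the student's claim still needs a human reader.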
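Similarly, testing an assignment brief against a generative model can be scripted rather than pasted in by hand. The sketch below assumes access to the OpenAI Python client with an API key in the OPENAI_API_KEY environment variable; the model name and the sample brief are illustrative only, and any comparable tool your institution sanctions would serve the same purpose:

```python
# A minimal sketch: submit an assignment brief to a generative model
# to gauge how exposed the assessment design is (cf. the ARMS scale).
# Assumes the OpenAI Python client (pip install openai) and an API key
# in the OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()

# An illustrative, deliberately generic brief of the kind ARMS would
# rate as high risk.
brief = (
    "Write a 500-word critical analysis of the advantages and "
    "disadvantages of foreign direct investment."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name; substitute your own
    messages=[{"role": "user", "content": brief}],
)

# The output is roughly what an unassisted student could obtain.
print(response.choices[0].message.content)
```

If the model produces a passable answer with no further input, the brief probably sits towards the high end of the ARMS scale and is a candidate for redesign.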
What may need more time and programme-level action
- Increase the variety of assessment types: Using a range of assessment formats has many benefits. It helps students to express their knowledge in diverse ways and demonstrate different skills. Variety also means that students cannot rely on a specific technology to "do the work for them".
- Focus on process, as well as outcome: Design assessments that capture the development process where the learning is occurring, not just the result. Include elements such as formative feedback, peer review, planning, supervision, or personal reflection on learning.
- Encourage the use of AI: Tell students how they can use AI in your assignment. Design tasks and assignments that use the beneficial elements of AI to develop students' awareness and critical faculties. Asking students to critically analyse AI-generated text, or to use AI explicitly in the preliminary stages of a literature review, can be helpful. State in your assessment brief how students should or could use AI. The following statements could guide you:
- “In this assignment, you will critically analyse examples of AI-generated text and explore the errors, bias, and ideas as part of your learning. Please clearly state in your submission where you have used AI.”
- “Use AI in the development of this work to help you structure your ideas and provide stimulus for the final work. Please clearly indicate where you have used AI.”
- Encourage attribution: If suggesting students use AI as part of their learning, require that they show where it has been used to generate content and for what purpose.
- Guidance on referencing materials generated by AI is available. In effect, AI output is treated as ‘personal communication’ and should be addressed in-text along the lines of ‘When asked to explain why AI is a challenge for educators, ChatGPT’s (OpenAI ChatGPT, 2023) response included ....’ The full reference follows the pattern: Name of AI tool (Year of communication) Medium of communication Receiver of communication, Day/month of communication.
- Use in-person assessments: Invigilated examinations can seem like an obvious solution to concerns about AI, and they certainly have their place in any assessment repertoire. However, whilst they are fine for testing recall, they have less value when it comes to assessing deeper understanding. In addition, consider opportunities for students to demonstrate their knowledge in real time: presentations, vivas and coursework discussions can help students evidence their learning in a meaningful sense.
- Use tutorial discussions: Take the opportunity in tutorials for students to talk about their work and explain the processes they went through in developing the assignment. This could be developed into a programme assessment strategy in the form of an overarching assessment across the full academic year.
AI and Academic Misconduct
If you suspect a student has inappropriately used AI in their work, ask them to explain how they developed their assignment and where they got their ideas from. If they used a tool such as ChatGPT, it is unlikely they would be able to provide a compelling answer. This is the sort of approach that we would take if we suspected a student of contract cheating.
If you believe that a student has used AI with the intent to gain unfair advantage, please use the normal academic misconduct procedures.
- Don’t rely on detection systems to save the day. There are significant questions over the reliability of technology for identifying AI generated work. Detection tools are available but, in light of concerns over their efficacy, at this point the University does not endorse any specific product. If you choose to use any system, please do so with caution and bear in mind that, in addition to potential inaccuracies, there may be data protection issues.
There are some common features of AI-generated text that might help in detecting where it has been used:
- AI-generated text tends to be very regular and structured in its style. It is clear, concise and to the point, and generally follows standard writing conventions. Typically, for example, there will be one point per paragraph, a presentation of arguments for, followed by opposing arguments and then a conclusion.
- Words and phrases are often repeated in AI-generated text. In effect, the technologies are good at writing coherent sentences, but less effective when it comes to long-form content.
- AI-generated text can often refer to factually incorrect material that looks plausible but does not stand up to scrutiny. It is more important to the AI that it is ‘believed’ than that it is correct.
- References in AI-generated text can be fabricated. AI may refer to an existing, well-known author or journal in a particular field and then invent a non-existent title, for example.
- ChatGPT is trained on data that is not wholly current, so it will struggle to handle prompts that refer to or rely on contemporary information.
We need your help
Everyone is on a steep learning curve in terms of how to manage assessment in the face of increasingly sophisticated artificial intelligence software. Please contact the Teaching and Learning Academy if you have ideas or techniques that you are happy to share. Alternatively, let us know what you think about this guidance, and whether the suggestions have been useful or not.
Acknowledgements
- QAA (2023) The rise of artificial intelligence software and potential risks for academic integrity
- USEME-AI (2023) Adapting to use AI in schools
- Monash (2023) Policy and practice guidance around acceptable and responsible use of AI technologies
- Greenwich (2023) Using the AI Risk Measure Scale (ARMS) to evaluate potential risks