Considerations of AI in teaching

For staff teaching and supporting students, it is important to approach the use of artificial intelligence and generative AI from a positive perspective: these technologies lend themselves to innovative uses that can make teaching and learning more supportive, effective, and inclusive. 

To make the best use of generative AI tools, focus on enhancing student learning and efficiency while ensuring that these tools do not become a replacement for developing students’ key skills; that authenticity is embedded; and that the knowledge, experience, and insights that academics and tutors bring to the classroom are maintained. It is important to develop students’ critical thinking, creativity, and problem-solving, and to incorporate such tools to best effect into teaching and learning activities and assessments. 

Although knowledge of generative AI’s potential and making best use of these tools in practice are vital, caution is also advised. Understanding the basics of generative AI tools and how to use them ethically and responsibly is essential. By approaching these technologies with care and attention, teaching staff can help students reap their benefits. Staff can also become more (time) efficient whilst maintaining the integrity and quality of their teaching. 

Microsoft Copilot

Microsoft Copilot is now available to both University of Plymouth staff and students, through the Edge browser and the Microsoft Copilot website. Importantly, when accessed with institutional credentials, Copilot offers commercial data protection, safeguarding searches and data. *If you have difficulty accessing Copilot, this may relate to the nature of your staff contract with the University of Plymouth; please contact the IT help desk for further clarity. 

Below, we provide recommendations for practice:

Familiarise yourself with the technology
  • Put your assessment questions into Copilot, Gemini, ChatGPT, and other LLM tools to see what responses are produced and to compare them. Consider the quality of the responses, then review and rephrase your questions to obtain better ones.

  • Ask the tool(s) to write a bio about yourself and your place of work to identify possible untruths.

  • Produce some examples to discuss with your colleagues. Consider the implications for teaching and assessment practice.

  • Reflect on your values. Be ready to discuss the ethical implications of generative AI with colleagues and students alongside the value of what you are teaching.

  • Be aware that the underlying training data may not be up to date; each generative AI tool draws its responses from a defined training dataset, predominantly sourced from the internet.

Using Generative AI to help make practice more inclusive

The potential of these technologies to ‘level the playing field’ and address inequalities is substantial. Consider carefully how you can embed generative AI tools to make your practice more inclusive, for example: 

  • Use the technology yourself and allow students to translate texts into different languages, find references for a piece of text, or rewrite text to make it more succinct and accessible. 
  • Encourage students to support each other when using the technology in their assessments, e.g., to determine the best structure for an assignment, to help write effectively, and to encourage peer feedback. 
  • Encourage students to critically evaluate generated responses and ask and discuss questions relating to academic factors (e.g. academic referencing; standards expected), punctuation, grammar, sentence structure, narrative, etc. 
  • Redesign your assessments to focus more on the learning process and students’ motivation than on written output. 
  • Generative AI can help speed up the more mundane tasks, allowing students to reflect on and explore the relevance of the material to their personal environment, context, work, and subject area. 
  • Build flexibility and encourage creativity so students have options based on their interests and motivations (choice/angle of topic, medium, format). 

Considering Generative AI in assessment design

  • Generative AI might be used to offer a shortcut for those students who are short on time. Review your course schedule and ask whether you are unknowingly pushing your students to take shortcuts. You may decide to allow the use of the technology for certain aspects of the assignment but change how marks are awarded to focus more clearly on problem-solving and analysis, for example.
  • Provide specific guidelines and constraints for the assessment, such as word limits, formatting requirements, and referencing standards. If you are open to students using generative AI for some assessments, consider assignment coversheets on which students declare its use. 
  • Consider your assessment questions or statements. If there is a challenge or disagreement within your field, could students use generative AI responses to analyse these arguments? 
  • Re-design assessments with less focus on writing skills, e.g. introduce professional conversations/vivas, round-table discussions, role-play, practice-based assessments such as in-tray exercises, case-based scenarios, etc. 

Using Generative AI to improve teaching practice and curricula design

  • Encourage students to critique already-written responses (for example, a critique of an AI-written response as part of a tutorial activity).
  • Encourage students to ‘show their working’, for example, by sharing notes and drafts alongside essays.
  • Use generative AI to plan curricula and assessments. You could, for example, ask for help with lesson plans, marking rubrics, or ideas for a teaching session. The quality of the responses depends on the quality of the questions or instructions you enter. If the responses are not quite what you were looking for, break the question down to make it more specific. For example, you can include information on who the audience is, what the session tries to address, and that you would like, say, three ideas for teaching activities in a particular subject. Use the technology as though you are having a conversation – the interaction can help you drill down to what you need.

Making assessment more authentic

Remember that AI is not human. It cannot critically evaluate the text it draws upon, nor does it know our local area. Therefore, try to use clever (authentic) assessments to prevent or minimise reliance on ChatGPT, such as:  

  • Design assessments to include analysis of class discussion or workplace experience; scenarios/case studies with variables; reports on independent research activities; or similar. Using a combination of different assessment methods, such as oral presentations and problem-solving exercises, is also more inclusive and leaves less opportunity for plagiarism. 
  • Using/creating unique and original prompts can make it difficult for students to find pre-existing answers generated by generative AI.  
  • Encourage students to explain their thought processes and motivations: instead of only asking for the final answer, ask them to reflect, bring in their personal insights, and explain how they went about solving the problem. 
  • Other examples of tasks that are more difficult for AI to do well include: progressive/reflective portfolio-style assignments that are built up over time; interactive oral assessments; programme-level or synoptic assessments; analyses of images or videos.  
  • Consider how students could visually illustrate concepts, connections, theories or approaches in your field or discipline. Text-based tools cannot readily produce visual representations of content, although please note that AI can also generate images, and this area is developing quickly. 

Further considerations for staff and their practice

How to communicate with students about these tools and promote academic integrity?

Before you talk to students about such tools: 

  • ‘Engage early with students to provide information about the capabilities and limitations of AI software tools (such as inappropriate forms of citation and referencing and implicit bias) and how indiscriminate use may not only harm the quality of their education but also undermine confidence in the qualification they are working towards.’ (QAA Briefing Paper)
  • Build trust with students rather than suspicion and include them in conversations about what the institution is doing. Set clear expectations. Be clear about your citation and referencing guidelines and expectations across programmes. Educate students on the importance of academic integrity and the consequences of plagiarism.
  • Should you also develop guidance for generative AI tools, and if so, where would you include it? As mentioned before, encourage students to engage with generative AI and then critique it as a formative assessment activity. 
How to reference the use of such tools?

Check the referencing style used in your course for how to cite generative AI or follow the format of Cite them Right examples for personal communication. 

There are no specific guidelines for citing generative AI for many referencing styles. 

We recommend that you base the reference for generative AI content on the reference style for personal communication or correspondence unless the referencing style has specific guidelines. 

Content from generative AI is a non-recoverable source as it cannot be retrieved or linked, so it is essential to reference it at the point of use. 

Note that both the in-text citations and references begin with the name of the sender of the communication, for letters, emails, texts, and generative AI. 

When referencing the use of ChatGPT or other AI language models, include the full name of the model and any relevant version numbers. See below for examples: 

Harvard 

In-Text Reference Example: 
(OpenAI’s ChatGPTMar23 Version, 2023) 

Reference List example: 
OpenAI’s ChatGPTMar23 Version, AI language model, personal communication, 7 February 2023 

APA 7th 

In-text citation: 
(OpenAI’s ChatGPTMar23 Version, February 7, 2023) 

Reference list: 
Paraphrase from OpenAI’s ChatGPTMar23 Version, AI language model, personal communication, February 7, 2023 

MLA 

In-text citation: 
(Short form Title of source) 
(“Describe the symbolism”) 

Work cited: 
“Title of source” prompt. Name of AI Tool, version, Company, Date content was generated, General web address of tool. 

“Describe the symbolism of the green light in the book The Great Gatsby by F. Scott Fitzgerald” prompt. ChatGPT, 13 Feb. version, OpenAI, 8 Mar. 2023, chat.openai.com/chat. 

If you are writing for publication and need to cite the use of generative AI, you should check the publisher’s information for authors. Different publishers are taking different approaches to whether generative AI is allowed. 

How is copyright affected by the use of such tools?

There is a government consultation on copyright and related rights in the UK.

Artificial intelligence call for views: copyright and related rights – GOV.UK (www.gov.uk)

Content created by generative AI is derived from content that has been previously generated by others. The implications of copyright for reusing this content are therefore unclear: when is the output “inspired” by existing works, and when does it actually infringe them? Where should the line be drawn? A generative AI user may therefore unwittingly end up infringing someone’s copyright if they publish output that resembles an existing work too closely. 

While the content generated may be protected by copyright, it will not be owned by the AI itself. Under European (and US) law, AI cannot own the copyright, as it cannot be recognised as an author and has no legal personality, which is a prerequisite for owning (intangible) assets. However, a generative AI tool elaborates answers from the information it has gathered in its database and creates a new answer, which may be protectable by copyright (European Commission, 2023). 

As case law evolves, the concepts of copyright and ownership rights will become more clearly defined. 

How to detect use?

There are a few ways to spot text that was produced by language models such as Copilot, Gemini and ChatGPT:  

  • The text may be very coherent and well-written, with good grammar and vocabulary.  
  • The text may contain information that is not up-to-date or is not relevant to the current context.  
  • The text may not contain any errors or mistakes, as language models are trained to produce error-free text.  
  • The text may contain repetitive patterns or phrases, as the model generates text based on patterns it has seen in the data it was trained on.  
  • The text may lack the creativity or nuance of text written by a human.  

How to prevent students’ use of such tools?

One way for academics to prevent students from plagiarising using generative AI is to use plagiarism detection software; however, such software is not foolproof and has been known to be inaccurate and produce false positives. Plagiarism detection software scans text for similarities to other existing text and can flag instances of potential plagiarism.  

You can also use closed-book assessments, as these prevent students from searching for answers online, including using generative AI tools. Additionally, educators can teach students about the importance of academic integrity and the consequences of plagiarism, and provide resources on how to properly cite sources and paraphrase text. An important strategy is to re-think your assessment design and use specific prompts and constraints that limit the ability of generative AI to produce plagiarised content, such as encouraging students to create reflections. Where possible, encourage your students to apply their understanding to real-world scenarios or supply them with unique data sets that they can interrogate.  

Update academic integrity policies and referencing guidelines to clarify the use and referencing of such tools.