When it comes to using AI tools, prompts are very much in vogue. Many people wonder what a good prompt looks like if the AI is to perform as desired and deliver good results.
Lecturers at universities ponder this question too – although in their case it relates to the performance of their students. In teaching specifically, lecturers have long been using prompts as tools to formulate tasks and to require students to take certain actions in their learning.
Learning exercises include prompts that require the students to activate certain thought processes, deal with materials in a particular way or show a desired performance. By applying learning exercises that contain “good prompts”, lecturers can invite students to adopt a certain learning behaviour and thereby contribute to guiding their learning process.
What are good prompts in learning exercises?
When it comes to good prompts, a few general rules have been tried and tested – regardless of whether they are intended for AI or for students:
- Formulate prompts so that they are specific: Well-defined requests mean that students can understand exactly what is expected of them. Lecturers should therefore choose clear, precise formulations and also use short sentences that are easy to understand.
It could also be the case, of course, that lecturers do not expect a specific kind of behaviour from the students. Instead, they want to see how students deal with certain situations or questions while applying knowledge they have learned. In that case, the prompts should be formulated as open-ended questions. The students are thus given the freedom to decide how to deal with the learning exercise.
- Provide the context: Prompts should contain relevant background information and/or examples. That way, the students can understand the context of the exercise and thus improve their performance. With prompts in learning exercises it can also make sense to tell the students what role to adopt when doing the task: are they a teacher or a lawyer or a programmer, for example?
- Review prompts critically: Lecturers should examine a learning exercise carefully to check that the prompts really do contain all the necessary information so that the students can show the desired performance. This also includes formulating model answers.
- Adapt prompts: Prompts should be revised based on the results and the students’ answers. If students are not (yet) able to show the desired performance, that might be due to the degree of difficulty and the scope of the learning exercise. The exercise would then need to be revised accordingly and given to the students again.
Beyond these general rules, however, prompts in learning exercises particularly need to take didactic principles into account:
1. Create prompts as learning questions in the first person
Prompts in learning exercises can be structured in the form of instructions for a certain procedure or in the form of learning questions. Both versions require the students to demonstrate a certain behaviour. Prompts given as instructions – e.g. “Find a practical example of the model presented in the video” – show this request openly and the students sense that this is a clear work assignment given by the lecturer.
If a prompt is phrased as a learning question, however, students are more likely to perceive it as an invitation to think and to grapple with the learning content in more depth. If the prompt is also formulated in the first person, it creates a relationship to the students themselves and to their own learning process. Including the first-person perspective has a positive effect on students’ motivation: they get the feeling that their own opinions matter and that answering the question is not primarily about right or wrong.
Two examples of a prompt as a learning question in the first person could be:
- What practical examples can I think of that, for me, validate the model presented in the video?
- What examples can I find that conflict with the model in my opinion?
2. Prompts require the use of learning strategies
Good prompts require students to apply adequate learning strategies when processing learning content (Nückles, 2023). Students choose appropriate learning strategies based on the request implicit in the prompts: for instance, they consider whether they can find practical indications of a theoretical model.
Here, lecturers can require their students to apply both cognitive and metacognitive learning strategies.
Prompts for applying cognitive learning strategies
Cognitive learning strategies can be divided into three categories:
- repetition of knowledge
- organisation of information and learning material
- elaboration of new units of knowledge.
Prompts in learning exercises can require students to repeat knowledge they already have and thereby activate their previous knowledge. Lecturers can, for example, require students to summarise the core content of the last session in a short podcast. That way, students repeat the learning content and at the same time apply a strategy for organising it: they have to clarify the structure of the content and extract its core points.
Apart from that, further examples of prompts for putting organisational strategies into practice include:
- For me, what aspects of text XY are the main points?
- How would I structure the core information from the video if I had to explain it to a fellow student in five minutes?
Learning questions aimed at processing new content and embedding it in existing knowledge structures serve to elaborate knowledge. Helpful here are prompts that ask about connections and links between content that is already known and content that is new. Prompts can also require students to find practical examples of theoretical constructs and encourage them to examine new content critically or to think it through.
The examples given in item 1 are prompts that require students to apply elaboration strategies. Here are some more examples:
- What connections are there between the results of study XY and my own day-to-day experience?
- Where can I see connections between the content of the last presentation on topic XY and the model in text Z in the current learning unit?
- Which aspects of video XY do I find interesting, useful and convincing and which ones not so much?
Prompts for applying metacognitive learning strategies
Metacognitive learning strategies do not refer directly to the processing of new information but involve the active management and monitoring of the students’ own learning (Winne & Azevedo, 2022). On the one hand, they serve to monitor learning progress: the student reflects on whether the current approach is really leading to the desired progress (monitoring). On the other hand, they regulate the learning process: they encourage students to deal with the learning content in a different way if the current approach is not effective (self-regulation). Metacognitive prompts require students to consciously check whether they still have gaps in their knowledge and what options they have for closing them (Berthold, Nückles & Renkl, 2007).
Examples of prompts for applying metacognitive learning strategies include:
- What content in text XY do I already have a good understanding of?
- What connections between today’s seminar session and the last one are still not clear to me?
- What questions does the scientific article raise for me or still haven’t been answered in my opinion?
- What sections of the video should I work through or watch again?
- What options do I have for resolving the difficulty I have with understanding?
3. Using prompts regularly and relating them to learning content
Using prompts in the form of learning questions in the first person can have a long-lasting effect on students’ learning behaviour (Berthold, Nückles & Renkl, 2007). When lecturers use such prompts regularly, students practise the procedure and learn how to approach new learning materials. The medium of the material does not matter: the learning question can relate to written texts, research articles and chapters in textbooks as well as to videos, animations, images or audio files. The important thing is for the prompts to address the relevant learning materials explicitly and not to be formulated only in general terms.
When lecturers use learning questions like this regularly in a lecture or seminar, they should make sure that the prompts are varied and that they keep on requiring a new approach. That way, students become familiar with different learning strategies and, in addition, the variety has a positive effect on the students’ motivation.
Here are two examples of varying prompts that both require the application of organisational strategies:
Session 1: What do I think the most important pieces of core information from video XY are?
Session 2: How would I design a mind map for video XY?
4. Prompts are linked to feedback
Students can only build skills if they know the extent to which their performance corresponds to a model answer and in which areas they still need to improve. That is why they should receive feedback after each learning exercise. Due to the great amount of time and effort involved, however, it is often not possible for lecturers to give students individual feedback on their performance. In that case, lecturers should at least provide general feedback that addresses frequent mistakes, misunderstandings and things that are unclear.
Peer feedback is another way for students to receive feedback, and it can be used directly as a learning exercise. What is important here is to provide the students with a feedback sheet or core questions to answer. That way, lecturers ensure that the feedback is concrete and constructive. To be able to give feedback, the students need to be very familiar with the learning content, and the peer feedback may reveal gaps in their own knowledge. This method therefore also lends itself to students checking their own skill levels. Lecturers can thus kill two birds with one stone – students receive feedback and indirectly monitor the status of their own knowledge as well. You can find an example of how peer feedback can be applied in the blog post “Well-being in digital learning environments”.
Good prompts are an effective tool for lecturers to guide students’ learning behaviour. This applies to classroom situations as well as to designing self-learning phases (Hiltmann, Hutmacher & Hawelka, 2019). However, just as when interacting with artificial intelligence, lecturers should not expect every prompt to lead to the desired result with students. The guidelines highlighted here help reduce misunderstandings and minimise ambiguity.
Berthold, K., Nückles, M., & Renkl, A. (2007). Do learning protocols support learning strategies and outcomes? The role of cognitive and metacognitive prompts. Learning and Instruction, 17 (5), 564-577. doi.org/10.1016/j.learninstruc.2007.09.007
Hiltmann, S., Hutmacher, F., & Hawelka, B. (2019). Selbstlernphasen Studierender unterstützen. In D. Jahn, A. Kenner, D. Kergel & B. Heidkamp-Kergel (Hrsg.), Kritische Hochschullehre. Impulse für eine innovative Lehr-Lernkultur (S. 305-322). Springer VS. doi.org/10.1007/978-3-658-25740-8_16
Nückles, M. (2023, August). Journal Writing as Medium for Thinking and Learning: Instructional Support to Foster Self-Regulated Learning. Keynote at the EARLI Conference, Thessaloniki, Greece.
Winne, P. H., & Azevedo, R. (2022). Metacognition and Self-Regulated Learning. In The Cambridge Handbook of the Learning Sciences (pp. 93-113). Cambridge University Press.
Suggestion for citation of this blog post: Rottmeier, S. (2023, November 16). Good prompts – How do I say it to my students? Lehrblick – ZHW Uni Regensburg. https://doi.org/10.5283/ZHW.20231116.EN