the manifestation of cognitive overload

ATC -- Frazzled Cartoon Lady

The reactions people have to cognitive overload are varied: some get angry, some withdraw, some fall somewhere in between. What is common to all who experience it, though, is feeling overwhelmed, uncomfortable, frustrated and sometimes worthless. Imposter syndrome is common.

Understandably, it’s an experience we want to avoid. It can be exhausting.

How students handle it is largely determined by their temperament, which is affected by a multitude of factors. The more obvious reactions are the extremes: poor behaviour, lashing out, belligerence and complete withdrawal. Of course, poor behaviour and withdrawal are more complex than just cognitive overload, but at times it is certainly a factor, and I find it strange that little to no conversation ever discusses improving behaviour in the same way that we endorse eliminating academic cognitive overload: through incrementally improving cognitive skills. Inculcating new behaviours surely needs the same level of design and commitment?

Perhaps less obvious is cognitive overload in students who externally give few clues that they are experiencing it. Perhaps they are not reacting because they comply with the school's rules or respect authority, or because they don't want to be seen as not understanding what is being taught; peer pressure is huge in all education sectors. Perhaps they are having a difficult time outside of the classroom, most certainly a factor for higher education students who may have lost their employment during COVID. Needless to say, cognitive overload reduces learning.


A practical and relatively simple way to mitigate excessive cognitive load is to design learning sequences that focus on building schemata and include plenty of formative assessment to check learning. Good communication with students also allows you to gauge how they are feeling about their learning, and this can be an extremely useful form of formative assessment too.

It’s not just students who feel it

It’s certainly not just students who experience it. Any time you are under pressure in a new situation you are likely to experience it to some degree as your mind grapples with the new content and searches for relevant schemata to connect it to: the more the pressure and the fewer the connections, the greater the load. You are likely to experience it when you attend a conference where presentations don’t adhere to multimedia principles, in a meeting when you don’t have the relevant background knowledge on the topic being discussed, and when you yourself are presenting or teaching something you don’t fully understand or believe in. The most obvious example, of course, is when your practice is being observed. In all of these situations you are effectively a student, a novice learner. As an educator, it is important to reflect on the feeling of cognitive overload and how easily it can occur, and to use that knowledge when you design and shape the learning experiences of your students, so they experience it less and learn more.

I’m Paul Moss. I’m a learning designer at the University of Adelaide. Follow me on Twitter @edmerger

“ATC — Frazzled Cartoon Lady” by campbelj45ca is licensed under CC BY-SA 2.0

chunking lectures – it’s a bit of a no-brainer

Breaking a lecture up into distinct chunks or sections is a bit of a no-brainer. It follows directly from the implications of cognitive load theory, specifically that the brain can only process a small amount of new information at once. Presenting more information than the brain’s architecture can handle overloads working memory, and usually produces a significant decrease in learning.

Breaking your lecture into chunks gives students a chance to process each chunk before new material is presented. Designing opportunities for students to be active in processing the content also helps them understand it, and eventually transfer it into long-term memory.

So, here’s a possible live-streamed lecture design that considers cognitive load implications and the need for students to be active in their learning, and that is very manageable for the lecturer. The model can be applied to both live and recorded lectures, but the recorded lecture needs some more specific context, which I will discuss in another post.

I’ve talked before about the possible mixed-mode future of live lecturing, with live lectures able to facilitate breakout rooms. The model below considers this as a possibility.

Each segment below lists the rationale and the tech to assist.

intro – retrieval quiz
The lesson begins with a retrieval quiz. The benefit of retrieval is enormous: it strengthens the memory of key ideas and content so that the knowledge can be brought to cognition automatically when new learning is presented, without taxing the working memory. The more knowledge students can draw from, the greater the opportunity to delve into higher-order independent learning, so building students’ schemata through retrieval is a bit of a no-brainer. The lecturer places the answers on the screen, and spends 2–3 minutes explaining them if common errors were made.
Tech to assist: Echo 360; Canvas quiz.

teaching – delivering content (10–12 min)
Incremental building towards application is a bit of a no-brainer. The lecturer is conscious of the need to present content clearly and simply, very much aware of the multimedia principles that promote the efficient encoding of new information. They are also aware of the importance of modelling problem solving, and incorporate worked examples into the presentation. Where appropriate, the lecturer connects the new learning to real-world applications, not just to make the content relevant, but more so to build the mental patterns and analogies in the students’ schemata. The lecturer also frequently explains why teaching decisions are being made, so as to strengthen the students’ metacognition.
Tech to assist: PPT slides; document camera. Students can take notes in Echo, raise a confusion flag, and ask a question at a precise point in the live stream.

student activity – strengthening understanding
This gives students a chance to take in what has just been presented, and to think about the concepts before they are presented with more content. Essentially, the student is trying to convert the abstract to the concrete. Giving students the opportunity to complete worked examples, practise solving similarly structured problems, or discuss possible analogies to the content with a peer is valuable at this point in the lecture, and is a bit of a no-brainer.
Tech to assist: breakout rooms; Mentimeter open question; Echo discussions; Canvas discussions.

teaching – delivering content (10–12 min)
Discussion of the last task if necessary; it may not be if students were practising or completing examples. Then deliver content as in the previous teaching segment: clear and simple presentation, worked examples, real-world connections, and explicit reasoning to strengthen metacognition.
Tech to assist: PPT slides; document camera; Echo notes, confusion flag and questions.

formative assessment – checking for learning
A quiz or short-answer opportunity to see whether what you have presented so far has been understood is a bit of a no-brainer. The questions also give students another opportunity to process the content and develop a better understanding.
Tech to assist: questions up on screen; Zoom polling; Canvas discussions as a repository for student answers.

teaching – delivering content (10–12 min)
Check answers first; you may need to pivot the lecture if misconceptions are still prevalent. Then deliver content as in the previous teaching segments.
Tech to assist: PPT slides; document camera; Echo notes, confusion flag and questions.

student activity – strengthening understanding
As before: students take in what has just been presented and think about the concepts, converting the abstract to the concrete through worked examples, practice problems, or peer discussion of analogies.
Tech to assist: breakout rooms; Mentimeter open question; Echo discussions; Canvas discussions.

summary
Recapping key ideas and tying the lecture together: linking it to previous learning and real-world contexts. Discussion and questions asking students to link their learning are a great way to draw attention to the key concepts again, and are a bit of a no-brainer.
Tech to assist: Mentimeter open-ended question.





This is the second instalment of a four-part series aimed at assisting educators in designing a sequence of learning that drives towards the ultimate goal of knowledge transfer. The intro post is here.

Concrete to abstract to concrete  

Delivering new learning in concrete terms, using concrete examples, makes it easier for a student to encode the content. We assimilate or accommodate new information in the context of the patterns, mental models, examples, analogies and experiences that either already were, or have been converted to, the concrete in our brains: the representations that form the basis of our schemata. Kolodner describes this reliance on prior learning as case-based reasoning1. Models help students capture the mental patterns required for the knowledge to become understood. This predisposition towards the concrete might lead one to theorise that the concrete represents understanding. However, a necessary process in the incremental development of a schema is the gradual introduction of abstractions, as they instigate the machinations of knowledge transfer.

A lot of what we teach in higher education is abstract in nature, and necessarily so: ‘The goal of learning an abstract concept is not simply knowledge of one instantiation; it is the ability to transfer, or apply conceptual knowledge to a novel isomorphic situation.’2 However, this creates two issues for learners: abstract content is inherently harder to learn than concrete content, and the links between abstract content and real-world applications can often seem distant at best.

Why is it harder to learn through abstraction? 

It seems our brains are wired to think in concrete terms3. So, if it is harder to learn through abstraction, why not simply avoid it altogether and convert material to the concrete for our students? Well, it all has to do with the facilitation of knowledge transfer. Resnick and Omanson4 assert that ‘learning with concrete objects supports initial understanding of the instructed concept but does not support the transfer of that knowledge to novel but relevant contexts.’ Pashler et al5 concur and extend this: ‘Many experimental laboratory studies and a growing number of classroom based quasi-experiments have found that teaching students about key principles or concepts using only abstract or only concrete representations of those concepts leads to less flexible knowledge acquisition and use than teaching students to recognize and use those key principles across a range of different situations’.   

So how do we get the balance right? It appears that once we have set a foundation using concrete examples, turning to metaphor is the next step. Reece (2003)6 suggests that the human analogical reasoning engaged through metaphor-based environments helps learners incorporate new concepts into their existing mental schemata. She advocates the use of metaphors, believing them to be ‘especially appropriate when learners are introduced to new, abstract concepts’, and cites Jonassen (1981)7, who asserts that metaphor acts as scaffolding; that is, ‘a learner structures the to-be-learned domain, the target, according to the relational structure of the concrete and more familiar domain, the source domain (see structure mapping theory in Gentner, 1983, 1989; Gentner & Markman, 1997).’

In designing effective metaphors, Carroll & Mack8 instruct that ‘within pedagogical applications, an instructional metaphor source domain can be carefully structured so that it replicates the relational structure of the target domain’. This deliberate scaffolding of metaphor is consistent with the need to attend to the incremental building of a schema: the metaphor is designed to build abstraction incrementally, initially making connections to relational structures that are quite close, and then gradually making them more abstract. The student’s schema thereby develops by adding to the bank of patterns it contains.

But a well-developed schema does not necessarily mean that transfer of learning to new contexts is automatic. Gentner, Rattermann & Forbus (1993)9 report that ‘people often fail to access prior cases that would be useful, even when they can be shown to have retained the material in memory’. Transfer, then, requires a lot more attention to design than just acquiring knowledge. One of the ways to achieve it is through analogous examples.

That is the base of the next post.


1. Kolodner, J. L. (2005). Case-based reasoning. In The Cambridge Handbook of the Learning Sciences.

2. Kaminski, J. A., Sloutsky, V. M., & Heckler, A. F. Do children need concrete instantiations to learn an abstract concept?

3. Lawson, A. E., Alkhoury, S., Benford, R., Clark, B. R., & Falconer, K. A. (2000). What kinds of scientific concepts exist? Concept construction and intellectual development in college biology. Journal of Research in Science Teaching, 37(9), 996-1018. 

4. Resnick, L.B., and Omanson, S.F. (1987). Learning to understand arithmetic. In R. Glaser (Ed.),  Advances in instructional psychology (Vol. 3, pp. 41-95). Hillsdale, NJ: Erlbaum.

5. Pashler, H., Bain, P., Bottge, B., Graesser, A., Koedinger, K., McDaniel, M., & Metcalfe, J. (2007). Organizing Instruction and Study to Improve Student Learning (NCER 2007-2004). Washington, DC: National Center for Education Research, Institute of Education Sciences, U.S. Department of Education.

6. Reece, D. 2003. Metaphor and content: An embodied paradigm for learning. Dissertation submitted to the Faculty of the Virginia Polytechnic Institute and State University in partial fulfillment of the requirements for the degree of Doctor of Philosophy In Curriculum and Instruction (Instructional Technology). 

7. Jonassen, D. H. (1981, April 7). Content treatment interactions: a better design model. Paper presented at the Association for Educational Communication and Technology, Philadelphia, PA.

8. Carroll, J. M., & Mack, R. L. (1999). Metaphor, computing systems, and active learning. International Journal of Human-Computer Studies, 51, 385-403.

9. Gentner, D., Rattermann, M. J., & Forbus, K. D. (1993). The roles of similarity in transfer: Separating retrievability and inferential soundness. Cognitive Psychology, 25, 524-575.


the 3 A’s of knowledge transfer: acquire, analogise, apply – intro

This four-part blog series is created to assist educators in designing a sequence of learning that drives towards the ultimate goal of knowledge transfer. Consensus around the notion of transfer in learning is loose, to say the least: some deny its existence, some accept it but differentiate the types of transfer possible (near, far, etc.), and others dedicate entire epistemologies to its achievement. But whatever your position, few could deny that a major goal of education is to be able to apply what has been taught in the classroom to a broader context, whether that be work and/or the advancement of community. These posts therefore attempt to position you to design a sequence of learning that strives, as much as possible, to transfer student learning to new contexts.

INTRO: Towards independent learning and transfer

One approach to facilitating transfer has been to teach students how to learn. The rationale seems sound: if a student understands and practises how to go about learning, then they should be able to do it in a new context, independently. Making students aware of their metacognition is really important, but unfortunately the pedagogy that most often subsumes this direction is a) inspired by the belief that strength and resilience in learning how to learn come from students constructing their own knowledge of the process, and b) characterised by immersing students in contexts where they have to think and find knowledge independently. The ostensible bonus is the potential replacement of an anachronistic teaching practice with a modern, 21st-century, student-centred pedagogy.

But the focus on a modern pedagogy liberating students from the shackles of the sage on the stage, and all the imbalance of power associated with it, misreads the argument of a large body of work* that cautions against the increasing popularity of such discovery/inquiry pedagogy. The sagacity that you need knowledge in a domain to become proficient in that domain, and, more pertinently, that acquiring that knowledge is more efficient when an expert scaffolds the journey for the novice rather than the novice trying to find the knowledge themselves, is not driven by an impulsion to maintain a neoconservative agenda, or to thwart choice or the constructivist prerogative**, but ultimately by a goal to arrive at independent learning faster.

But student learning will be stronger if they have found the knowledge themselves, won’t it? 

Interestingly, for such a widely held notion, I can’t find any evidence to support the idea that learning things on your own creates a stronger understanding than learning them from a teacher or peer, except if you are already quite proficient in the given topic or area of learning (the expertise reversal effect). Determining how we teach the novice learner, who, I contend, makes up quite a large percentage of the modern undergraduate cohort, towards independence needs a pedagogy that is less emotive and more scientific in its design, and one that is conscious of the reality of a curriculum starved of time.

Standing on the shoulders of giants 

Initiating a context where novice students are expected to find knowledge on their own concomitantly initiates a context where they may not make the necessary connections between key ideas, for a host of reasons: they may invest too much time researching irrelevant knowledge; they may not ‘see’ the connections between ideas; they may, as John Sweller states, ‘use general problem-solving strategies such as means-ends analysis when faced with a problem’ and exhaust working memory; or, worse, they may simply not engage with the autonomy of the context and do no work. Because much of what we teach is sequential, the consequence of students not arriving where we want them to be in the curriculum is that learning gaps emerge, and these have to be addressed in the limited time available. Understandably, this ‘extra’ teaching is foregone by most, and that invariably leads to equity issues, with often only the highly motivated, intelligent or culturally literate students able to cope, as they can draw from schemata developed from these cultural and mental literacies. But I contend that even those students could be afforded a better, more efficient pedagogy: one that scaffolds the acquisition of schemata so that more meaningful higher-order thinking can be conducted sooner, and one that facilitates the creative extension of knowledge generated by ‘giant’ scholars.

The imperative of schema 

The reason why it is inefficient not to scaffold the development of a novice’s knowledge base is highlighted by schema theory. When presented with unfamiliar content, we attempt to either assimilate or accommodate it into our schemata, but if the gap between what we have and what is new can’t be bridged, working memory essentially exhausts itself, cognitive dissonance ensues, and little learning, if any, happens at that point. In light of encouraging efficient transfer of knowledge, Dunbar’s finding that novices struggle significantly to encode the deeper structures of problems is pertinent: without sufficient analogies in a schema, making a new context consonant with learnt contexts is troublesome.

The first step in building an appropriate schema is to teach in concrete terms with concrete examples. That is the base of the next post.  

*here are some examples:

Assessment training effects on student assessment skills and task performance in a technology-facilitated peer assessment. Xiongyi Liu and Lan Li. 2013.

Cognitive Load During Problem Solving: Effects on Learning. John Sweller, University of New South Wales. 1988.

Constructivism as a theory for teaching and learning. Simply Psychology. McLeod, S. A. (2019, July 17)

John Hattie on Inquiry Based Learning.

The Use of Advance Organizers in the Learning and Retention of Meaningful Verbal Material. Ausubel, 1960.

What We Know About Learning. Herbert A. Simon. Department of Psychology, Carnegie Mellon University.

Why Education Experts Resist Effective Practices (And What It Would Take to Make Education More Like Medicine). Douglas Carnine

Why Minimal Guidance During Instruction Does Not Work: An Analysis of the Failure of Constructivist, Discovery, Problem-Based, Experiential, and Inquiry-Based Teaching. Kirschner, Sweller, & Clark. 2006.

**well it may be for some, but for me it’s about efficiency in learning  



This is part 2 of an essay on self-regulated learning and whether it needs to be taught for students to become skilled in it. Part 1 is here.

In part 1 I discussed how explicitly teaching and modelling to students how to think with knowledge potentially facilitates their ability to self-regulate such thinking. The proposition has implications for the explicit modelling of thinking critically and creatively. In this post I will expound on Zimmerman and Moylan’s 2009 paper, which theorises that motivation is inextricably linked to these metacognitive processes and, just like everything else connected to learning, needs to be explicitly taught to students in equal measure for them eventually to be able to use the knowledge independently.

Zimmerman and Moylan suggest that there are three differentiated stages in achieving self-regulation. These can be equated with the EEF’s terms: planning, monitoring and evaluation. The diagram below represents the cyclical processes of self-regulation.


It’s a case of which comes first, the chicken or the egg, but in order for students to get their learning off the ground, they need to be motivated to do so. Often in the school sector this may not be intrinsic motivation, with extrinsic rewards and punishments tending to dominate the setting. Upon presentation of a new learning activity, a student will process a range of thoughts evaluating whether they should in fact participate in the endeavour. Students immediately weigh the expectations against any prior experiences or knowledge, drawing on their schemata to ascertain the extent to which they will have to set new goals and strategies to achieve the new learning, while probably concomitantly deciding whether they have any intrinsic interest in the task. If they conclude that they possess neither of these motivators, your work is immediately cut out for you.

Compounding this, students also naturally draw from those schemata the affective responses they have built over time in dealing with similar types of activities or learning experiences. If this audit brings up negative memories, perhaps emanating from a lack of success or serious disinterest, it will heavily impact their motivation to continue. It certainly won’t be the case that ‘if you build it, they will come’. A student’s self-efficacy, their belief that they will be able to engage positively in the task, will most certainly affect their planning, strategy and goal-setting capacity. So, besides forcing students to participate, what can be done to break this thought pattern?

METACOGNITION – Make explicit the possible reactions students may have to a new task: ‘You may have had a negative experience with this type of problem before, but this time is different because…’, ‘You may immediately think there’s no relevance to this task, but…’, ‘You may not have achieved the grade you wanted in the last task, but this time we are going to plan the response better…’. By making such reactions explicit, explaining how demotivating factors can arise, and providing explicit strategies that ‘show’ how a different outcome may eventuate, the teacher is training the student to think about the new context in a new way, and mitigating the risk that poor self-efficacy inhibits impetus.

Also crucial to setting up learning is making explicit the goal orientation of the task. Plenty of research suggests that ‘performance’-oriented goal setting, where students’ motivations to learn are primarily centred on comparison and competing against others, is tellingly inferior to a ‘learning’ goal orientation. Positioning a task as an opportunity to strengthen personal understanding against personal standards has been shown to deepen learning: ‘In this activity, let’s think about how we can incrementally improve our knowledge of the topic…’, ‘I want you to think about what your level of knowledge is on the topic and set yourself a goal of strengthening it by the time we have finished…’, ‘In this task, we are going to concentrate on mastery…’ However, such ambition is made infinitely more difficult in a system predicated on accountability. Nonetheless, a good teacher will explicitly and inexorably focus their students’ attention on setting goals for self-improvement, and on the fact that learning is a continuum that takes time and practice to master. When such purpose is part of the learning culture, once the task is successfully completed the student’s evaluation process positively feeds into and strengthens the self-efficacy required to engage in the next learning context, regardless of how they fared compared to others in the cohort.

This personal-growth rather than competitive epistemology is particularly relevant if you are trying to encourage students who, at the beginning of a course, are working hard but not quite succeeding while observing others around them achieve. These students not only need an explicit discussion of what success means (improvement against your last effort), but precise feedback that articulates the gaps in their knowledge, and, crucially, scaffolded activities that provide the opportunity for observed improvement against the last effort. Mastery pathways not only provide opportunities for incremental success, but also the chance to eventually catch up to the expected standard. Because success is the greatest motivator of all, when those achievements are explicitly labelled for the student, they will adjust their self-efficacy to become more positive.


During the task, drawing students’ attention to how they are solving problems, the progress they are making and the motivation required to do so will facilitate the eventual automaticity of such thinking. Modelling self-questioning and verbalising thinking processes while scaffolding learning through worked and completion examples builds the schema of such processes in students’ minds, and teaching students how to manage time and set up an appropriate learning space should never be treated as assumed knowledge. Providing as many opportunities as necessary to build a culture where students control these learning strategies and can readily select the most appropriate tools for the context they find themselves in should be an ingrained aspect of a teacher’s curriculum design. When students feel such control over the strategies they employ to negotiate the present task, their motivation and self-efficacy will be strong.

The explicit drawing of attention to higher-order thinking processes during the task goes towards developing the schema for doing so in future, independent contexts. As argued in part 1, assuming students will engage in higher-order thinking once knowledge is sufficiently acquired is not a good idea, as they may not do so unless they are highly motivated in the discipline or topic in question. Prompt with questions like ‘So if we know this about …, what would happen if …?’, ‘What is the connection of this idea to the topic we looked at last week?’, ‘What would happen if we combined these two ideas?’, ‘So imagine this scenario…: how would you solve the problem at hand?’ If you model this thinking, students will use the model as a strategy when asked to think about knowledge in new contexts, and being able to do so will boost their confidence in engaging with knowledge in interesting ways. This confidence develops self-efficacy, and thus motivation.


From my experience, one of the most difficult things to do is to get students to reflect on their performance and planning after the event. This is especially difficult if the student entered the task with a performance goal orientation and wasn’t overly successful; the immediate deflation is palpable. Explicitly discussing this with students is important at that very moment. But perhaps most importantly, understanding the causal attributions some students may have applied to their success or failure is necessary to ensure that they are able to benefit from the evaluation.

Many students attribute their experience to fixed ability, which is particularly detrimental if they engaged in the activity with a performance goal and didn’t succeed. The comparison against others, which essentially results in a defeat if unsuccessful, solidifies negative self-efficacy, which in turn negatively influences the planning stage of the next learning moment. If, however, the student can be persuaded that learning is a continuum and that their ability in the task is not fixed but can be improved by effort, practice and good revision and study techniques, then the probability of their motivation being secure for the next task is high.

Unfortunately, over time and repeated negative experiences in learning environments, some students develop entrenched negative evaluations that seriously inhibit motivation to engage in future learning contexts. Procrastination may be a milder symptom of such a state, but more serious and damaging is learned helplessness: a defence mechanism that prevents a student from trying because they believe there is nothing they can do to change an inevitable failure. Often such a state becomes an unconscious default, and it can only be changed by carefully designed, scaffolded learning opportunities that promote success, as well as by making the psychological context explicit. It is time consuming, of course, but a well-constructed audit of a student's performance, including how they approached and revised for the task, will likely find a host of issues that could be rectified. A checklist may help students evaluate their performance in a task, and explicitly discussing how neglecting each element on the list affects performance could motivate a student to alter their preconceived belief that they aren't in control of changing their learning potential.
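One way of making such an audit concrete is a simple checklist tally. The sketch below is purely hypothetical: the checklist items and the student's responses are invented for illustration, not a validated instrument.

```python
# Hypothetical self-evaluation checklist: controllable study behaviours,
# with the student's own honest yes/no answer after a task.
checklist = {
    "Spaced my revision over several days": False,
    "Practised retrieval rather than rereading": True,
    "Attempted past questions under time pressure": False,
    "Reviewed feedback from the previous task": True,
}

# Every unticked item is a controllable cause of the result, which is the
# point of the audit: the outcome was not fixed ability.
neglected = [item for item, done in checklist.items() if not done]
print(f"{len(neglected)} controllable factors to address next time:")
for item in neglected:
    print(" -", item)
```

The value is not the tally itself but the conversation it prompts: each neglected item is evidence against the belief that failure was inevitable.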


Teaching students about motivation and how past experiences affect the present, and helping them identify patterns of behaviour, their 'real' causes, and how they can be adjusted, is as imperative as teaching them content. Making thinking explicit can go a long way towards positively affecting how a student perceives a task and their ability to process, engage with, and succeed in it. The result is that students will willingly drink from the water you have led them to.

The next post will discuss how beneficial it can be for students to understand how learning actually happens.

I’m Paul Moss. I’m a learning designer at the University of Adelaide. Follow me on Twitter @edmerger


Achieving independence and self-regulation in learning is the holy grail of education, but how to go about it is equally mystical. Essential to the quest is developing a rich schema through the building and interaction of knowledge. Belief in explicitly teaching students how to think about their thinking processes (metacognition), and how to evaluate those processes, as an integral part of self-regulation is gaining momentum (EEF). This two-part post seeks to extend the current understanding by discussing whether it is necessary to promote critical and creative thinking inside subject domains. The essay also expounds on Zimmerman and Moylan's (2009) theory that motivation is inextricably linked to both of these metacognitive processes, cannot be omitted from the discussion, and in fact needs to be explicitly taught to students in equal measure. As Kuhn exhorts, 'People must see the point of thinking if they are to engage in it.'


Whilst many argue that labelling skills such as critical thinking and creativity as '21st century' does an injustice to those who for thousands of years exhibited proficiency in them, few could argue against the growing demand for graduates who are strong in these areas in an age of increasingly automated and mechanised jobs. How to equip students with such skills has become the mission of educators, but many well-intentioned educators have erroneously conflated the desired outcome with a direct pedagogy, a mistake succinctly stated by Kirschner: 'the epistemology of a discipline should not be confused with a pedagogy for teaching or learning it. The practice of a profession is not the same as learning to practise the profession.' There are plenty of excellent voices who assent to this notion, none better than Daisy Christodoulou, who specifically points to the fact that thinking critically or creatively relies entirely on a strong bedrock of knowledge and can't be taught in the abstract. If we think about this, it seems rather logical: you can't think about things you have no knowledge of, and most creativity is the accommodation of knowledge already in existence. Such constraints make the application of these skills heavily context and domain dependent. But what tends to be lacking from such unequivocal pedagogy is the answer to this question: once the foundations of knowledge are secure, do students need explicit modelling of how to think critically and creatively with that knowledge? I contend that the answer is yes.

If we consider how learning is characterised by the acquisition of schema, and how crucial modelling is in that continuum, I would argue that modelling how to play with knowledge is no less important than modelling the knowledge itself. However, it is something that is often overlooked in modern curricula for three reasons:  

  • Because we sometimes assume that students will naturally think in these ways  
  • Because of the need to fit in so much content in so little time  
  • Because it is hard to assess, relying on subjective and therefore unstable evaluation 

The first relies on Geary's theory of primary vs secondary knowledge, which holds that once sufficient knowledge is obtained, the mixing and matching and challenging and critiquing of what is understood should become axiomatic. From my experience, though, without continuous prompting by the teacher to engage with the knowledge in this way, such an outcome tends to rely heavily on a student being highly motivated in a specific domain of knowledge, with the less interested but equally capable student content to achieve in assessment without necessarily exploring the content further. What is notable about the self-motivated student, however, is that they still undertake a process of learning how to mix, match and challenge what they know, albeit independently: through experimenting with their thinking and evaluating it, they may eventually arrive at something unique and interesting. But this ostensibly natural skill is actually being practised and refined – and quite possibly inefficiently, compared with what some guidance in the process could afford. When motivation to pursue a discipline is not as high, students need to be prompted to engage in 'higher order' thinking. Interestingly, sometimes it is only after these higher order prompts that real interest and motivation is sparked, and so explicitly provoking them in a learning environment is important.

Sweller's addition to Geary's thesis – 'Organizing general skills to assist in the acquisition of subject matter knowledge may be more productive than attempting to teach skills that we have evolved to acquire automatically' – supports the earlier statement that teaching critical and creative thinking in the abstract is pointless. But the focus on the word 'organising' is crucial here: the conclusion is that it is not enough to assume students will naturally engage with this type of thinking; only through its explicit organisation and modelling will students become able to self-regulate it.

Practising the application of critical and creative thinking needs time and space to be strengthened, which is why the second obstacle is so concerning in educational contexts. The move to non-invigilated exams has certainly made apparent the need for assessment to involve the application of knowledge, but doing so requires a carefully designed curriculum that creates such opportunities in the sequence of learning. I tend to promote a sequence patterned by the rhythm: learn, practise, apply. New knowledge is introduced by the expert; students interact with and practise using the knowledge to confirm understanding; students then apply their knowledge to do something with it. The application doesn't have to be a large project-type task. It may simply be the asking of higher order questions that involve hypothesising, creating analogies, exploring various points of view, wondering whether the content can be applied in other contexts, asking what the connections are to other aspects of the course, or brainstorming with a view to generating new ideas for a real-world context. The latter is especially relevant for the later stages of higher education.

It is such a pattern of learning that models for students how to interact with the understood knowledge now in their possession, a modelling process that observes what Volet (1991) identifies as the necessity of making explicit how an expert thinks. This is relevant not just when the expert is presented with new problems, but also to how they think with the knowledge they already have. Palincsar and Brown (1989) concur: 'By demonstrating the different activities by which subject matter may be processed, problems solved, and learning processes regulated, the teacher makes knowledge construction and utilization activities overt and explicit that usually stay covert and implicit.' Like all learning, the goal is to take the metacognition to automaticity so that the propensity for self-regulation in the next sequence of learning isn't compromised by cognitive overload.


Whether or not this explicit process of thinking within specific domains can be transferred to new contexts remains to be seen, but Anderson, Reder and Simon (2000) arouse our curiosity when they suggest that transfer happens far more frequently than we might think. They cite reading as a prime example, but more specifically they challenge a famous study by Gick and Holyoak, which demonstrated that students were unable to see the abstract similarities between two problems even when they were presented side by side:

'One of the striking characteristics of such failures of transfer is how relatively transient they are. Gick and Holyoak were able to increase transfer greatly just by suggesting to subjects that they try to use the problem about the "general". Exposing subjects to two such analogues also greatly increased transfer. The amount of transfer appeared to depend in large part on where the attention of subjects was directed during the experiment, which suggests that instruction and training on the cues that signal the relevance of an available skill might well deserve more emphasis than they now typically receive – a promising topic for cognitive research with very important educational implications.'

They then continue to suggest that 'Representation and degree of practice are critical for determining the transfer from one task to another, and transfer varies from one domain to another as a function of the number of symbolic components that are shared.' It follows, then, that the claim made by Dignath and Buttner in their meta-analysis on components of fostering self-regulated learning – that 'Providing students with opportunities to practice strategy use will foster the transfer of metastrategic knowledge to real learning contexts' – is valid only if students are able to recognise patterns or connections between contexts where they can apply their metacognition.

As stated earlier, you can't think critically and creatively without a strong foundation of knowledge, and further, some of that thinking will only be relevant in specific domains. But it does seem likely that some of the higher order strategies stated above (hypothesising etc.) could be applied in a range of disciplines, and that a student observing the modelled thinking processes of a teacher in a second context will recognise some (if not many) elements learnt from the first. Once reinforced through this observation, students will begin the regular learning continuum of taking the skills to automaticity through practice. Once that is achieved, applying the thinking in new contexts becomes more possible – it will be up to further research to ascertain whether, having met these conditions, such transfer actually occurs.


Another consideration when teaching critical thinking draws from Kuhn, who exhorts that the development of epistemological understanding may be the most fundamental underpinning of critical thinking. In no uncertain terms, she beseeches teachers to provide the opportunity for students to reach an evaluative level of epistemological understanding, realising that simply possessing an absolute epistemology constrains and in fact eliminates the need for critical thinking, as does a 'multiplist' stance, which allows students a degree of apathy characterised by statements such as 'I feel it's not worth it to argue because everyone has their opinion.' The explicit modelling of an evaluative epistemology, where students are encouraged to accept that people have a right to their views while understanding that some views can nonetheless be more right than others, sets up a learning culture where students see the 'weighing of alternative claims in a process of reasoned debate as the path to informed opinion, and they understand that arguments can be evaluated and compared based on their merit' (Kuhn, 1991). Such a pedagogy may answer an interesting question posed by Martin Robinson: 'Should the result of a good education include all students thinking the same or thinking differently?'

The third obstacle also looms large. Assessing creativity in particular is difficult because of its subjectivity. Rubrics are notoriously imprecise as a reliable reference for determining the success or failure of creativity: what I may think satisfies one element of a rubric may be argued against by a colleague, and maintaining consistency even with myself in marking is difficult. And if we don't assess, will students not particularly interested in the topic lose motivation, making the process a challenging one to manage? I think the answer lies within the answer to Martin Robinson's question: surely we don't want everyone robotically programmed. We want students to engage critically and creatively with concepts and participate in the building of a dynamic and interesting world, so we have to have faith that the knowledge taught to our students, when learnt well, will provide avenues for curiosity that engage them to participate. Such an epistemology then satisfies stakeholder desires to employ graduates who can think critically and creatively in a modern workplace.

So how is motivation linked to it all?

 In the next post, I will extrapolate on Zimmerman’s imperative that metacognition is inextricably linked to motivation, and how educators can ensure they incorporate both in learning design.  


Anderson, J. R., Reder, L. M., & Simon, H. A. (2000). Applications and misapplications of cognitive psychology to mathematics education. Texas Educational Review.

Dignath, C., & Buttner, G. (2008). Components of fostering self-regulated learning among students: A meta-analysis on intervention studies at primary and secondary school level. Metacognition and Learning.

Geary, D. (2001). Principles of evolutionary educational psychology. Department of Psychological Sciences, University of Missouri at Columbia.

Palincsar, A. S., & Brown, A. L. (1989). Classroom dialogues to promote self-regulated comprehension. In J. Brophy (Ed.), Advances in research on teaching, Vol. 1 (pp. 35–67). Greenwich, CO: JAI Press. 

Sweller, J. (2008) Instructional Implications of David C. Geary’s Evolutionary Educational Psychology, Educational Psychologist, 43:4, 214-216, DOI: 10.1080/00461520802392208

Volet, S. E. (1991). Modelling and coaching of relevant metacognitive strategies for enhancing university students’ learning. Learning and Instruction, 1, 319–336. 

Zimmerman, B., & Moylan, A. R. (2009). Self-regulation. In Handbook of Metacognition in Education. Routledge.



Teaching to the test doesn’t work. But teaching students about the test is imperative. Not only that, exam performance IS a thing, and you can assist students to get better at that performance. It’s all about mitigating cognitive load.

GAME TIME – Any sportsperson will tell you that match fitness is everything. Regardless of how much you prepare, you never achieve the same level of fitness and game knowledge as you do by actually playing. Why? Because when the real thing happens, not only do nerves and adrenaline consume vast amounts of energy, preventing the ability you have from coming to the surface, but lots of other unexpected occurrences happen, all increasing cognitive load and bringing on exhaustion more quickly. The cognitive load can be so debilitating that the player has to rely on muscle memory to get through. When a student sits an exam, adrenaline and anxiety will naturally surge through their veins. Helping them revise the content is a must, but helping them become more familiar with the game/exam context is just as critical, and this can be achieved by training students to automaticity with exam technique.


1. Exam layouts

Show students, and get them used to, the layout of the online exam. The more they see the module and layout of the exam and understand the expectations of each section, the less pressure they'll feel when they see the real thing.

Of particular importance for students completing exams online is detailing the processes involved if they experience technical issues. Take them through the procedures so that if something happens during the exam they don't lose all confidence and panic. Also, ensure students have read the academic integrity policy, and discuss it repeatedly – the more you talk about academic integrity, the more of it you'll get.

Cognitive load

                          Student A – no training   Student B – training
Before beginning exam               20%                     20%
Exam layout                          5%                      0%

2. Question requirements

Ensure students know what each question is demanding of them.

How long is a piece of string?

What does a short answer look like? What gets you full marks? What does a long answer look like? What gets you full marks? How much working out is necessary? How much detail is required?

Don't expect students to guess the answers to these questions. Students who have to worry about what constitutes a good answer expend valuable cognitive resources. Model the expectations by showing previous examples, past exams, etc.

Manageable student cognitive load

                          Student A – no training   Student B – training
Before beginning exam               20%                     20%
Exam layout                          5%                      0%
Exam content                        30%                      0%


3. Time training

Training students in the timing of exam questions will significantly reduce cognitive load. It's one thing to know what a question demands of you, but another to actually do it in a stressful environment. If a student isn't used to the pressure of time, the longer the exam goes on, the greater the likelihood of their cognitive load increasing and their performance dropping as they panic while time evaporates. So get them to practise a mock of a section of the exam: let them experience what it's like to type in the allocated time (do their fingers get tired?), or what it's like to upload work if necessary. The more practice they get the better, but if you are running out of lesson time to train students, at least give them the chance to practise once – just one section that requires an upload process, for example.


The other aspect of time training is in helping students to set personal timers. Obviously, the online exam doesn’t have all the usual cues that an invigilated exam offers: a large clock, a warning by the invigilator of 5 minutes to go, and even the cues of students completing and organising their work on the next desk. But an advantage of online exams is that students can set their own alarms to negotiate each individual section of the exam, and not accidentally spend too much time on a certain section:

Manageable student cognitive load

                          Student A – no training   Student B – training
Before beginning exam               20%                     20%
Exam layout                          5%                      0%
Exam content                        30%                      0%
Exam timing training                20%                      0%
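The personal-alarm suggestion above can be sketched in code. This is a hypothetical helper: the start time and per-section minutes are invented for illustration, and a student would of course just as happily use a phone timer.

```python
# Hypothetical sketch of the 'personal alarms' idea: given an exam start
# time and planned minutes per section, compute when each section's alarm
# should ring so no single section swallows the whole exam.
from datetime import datetime, timedelta

def alarm_times(start, section_minutes):
    """Return the clock time at which each section should be finished."""
    alarms, elapsed = [], 0
    for minutes in section_minutes:
        elapsed += minutes
        alarms.append(start + timedelta(minutes=elapsed))
    return alarms

start = datetime(2021, 6, 14, 9, 0)              # exam starts at 9:00 am
for i, alarm in enumerate(alarm_times(start, [30, 45, 30, 15]), start=1):
    print(f"Section {i}: finish by {alarm:%H:%M}")
```

Running the sketch with a 9:00 am start and sections of 30, 45, 30 and 15 minutes produces finish times of 09:30, 10:15, 10:45 and 11:00.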

4. Editing their work

Rereading responses is difficult for exhausted students at the end of a lengthy exam. It is usually at this point that they feel a sense of relief, and the last thing they want to do is reread what they've done. Of course, it's madness not to: rereading catches silly mistakes, particularly in multiple choice questions, as well as content errors. Even checking for structural, punctuation and/or spelling issues could benefit the overall grade.

So, I have to build that practice into their normal way of working, so that it becomes part of the process and not an add-on. This can really only be achieved by repeatedly getting students to do it physically: at the end of each 'mock' assessment, stop the test and have students spend four to five minutes dedicated to proofreading, and explain the rationale, repeatedly. I always tell my students they WILL lose more marks through errors (which they can fix) than they can gain by writing more in the last five minutes. But unless it is a normal way of working, exhausted students won't do it automatically.

Manageable student cognitive load

                          Student A – no training   Student B – training
Before beginning exam               20%                     20%
Exam layout                          5%                      0%
Exam content                        30%                      0%
Exam timing training                20%                      0%
Editing responses                    5%                      0%

5. Being professional

Not panicking in certain situations is crucial to reducing cognitive load. Taking students through possible scenarios will help calm them if a situation arises in the exam. For example: if you're running out of time, what should you focus on to get the most marks? What do you do if you can't answer a question – do you panic and lose focus for the rest? Should you move on and come back to questions? Are you aware that the brain warms up, and so coming back later may be easier than it is now? This last point is absolutely crucial to convey to students. As the exam progresses, the exam content itself may trigger or cue retrieval of content that couldn't previously be recalled, so teaching students this metacognitive notion could make a significant difference to their overall performance.

Manageable student cognitive load

                          Student A – no training   Student B – training
Before beginning exam               20%                     20%
Exam layout                          5%                      0%
Exam content                        30%                      0%
Exam timing training                20%                      0%
Editing responses                    5%                      0%
Being professional                  10%                      5%

As you can see from the very much made-up numbers, the cognitive load experienced by Student A is significantly greater than Student B's, and would indubitably affect performance in the exam. The student's knowledge would have to fight a great deal to break through the pressure.
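The made-up figures in the tables can be totalled to make the comparison concrete. The sketch below uses the same illustrative numbers; none of them are measured data.

```python
# Illustrative only: percentages are the made-up figures from the tables
# above, paired as (Student A - no training, Student B - training).
loads = {
    "before beginning exam": (20, 20),
    "exam layout":           (5, 0),
    "exam content":          (30, 0),
    "exam timing":           (20, 0),
    "editing responses":     (5, 0),
    "being professional":    (10, 5),
}

total_a = sum(a for a, _ in loads.values())
total_b = sum(b for _, b in loads.values())
print(f"Student A (no training): {total_a}%")  # 90%
print(f"Student B (training):    {total_b}%")  # 25%
```

On these invented figures, the untrained student carries more than three times the load of the trained one before any knowledge is retrieved at all.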


The more you do something the better at it you get, provided of course you’re doing it the right way. Students don’t really get that many opportunities to learn to negotiate the exam environment on their own, especially in the current context of moving to online non-invigilated exams, and so providing them with such training is critical. 

I’m Paul Moss. I’m a learning designer at the University of Adelaide. Follow me on Twitter @edmerger, and follow this blog for more thoughts on education in general.  

Is it even possible to set an online open book mathematics exam?

When trying to offer advice on how to modify exams for the coming semester exams, some subjects have presented with unique issues. Mathematics, for example, has the unenviable dilemma of not being able to set calculation type questions as students can simply type them into an online calculator and be presented not just with the solution, but the workings out too.
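To see how little effort this takes a student, even an offline computer-algebra library solves a routine calculation question instantly. SymPy is used here purely as an illustration, assuming it is installed; online tools go further and show the full working as well.

```python
import sympy as sp

x = sp.symbols("x")
# A routine calculation question: solve x^2 - 5x + 6 = 0.
solutions = sp.solve(sp.Eq(x**2 - 5*x + 6, 0), x)
print(solutions)  # [2, 3]
```

If a single library call reproduces the expected answer, an unsupervised exam built from such questions measures typing speed, not understanding.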

The remedy offered to other subjects that require numerical calculations, such as statistics and accounting – randomising questions, both through the formula question type in Canvas and through question banks – is not appropriate for mathematics.
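For the subjects where randomisation does work, the idea behind a formula-style question can be sketched as follows. This is a hypothetical generator, not the actual Canvas feature; the question template and number ranges are invented for illustration.

```python
import random

def make_variant(seed):
    """Generate one randomised variant of a simple-interest question.

    Hypothetical example: real question banks (e.g. Canvas formula
    questions) parameterise a template in a broadly similar way, so each
    student sees different numbers but the same underlying task."""
    rng = random.Random(seed)  # one seed per student => reproducible marking
    principal = rng.randrange(1000, 10001, 500)
    rate = rng.choice([2.5, 3.0, 4.0, 5.5])
    years = rng.randint(1, 5)
    answer = round(principal * rate / 100 * years, 2)
    return {
        "question": f"What simple interest accrues on ${principal} "
                    f"at {rate}% p.a. over {years} years?",
        "answer": answer,
    }

print(make_variant(seed=42)["question"])
```

Because every variant has a different numeric answer, copying a neighbour's result is useless, which is exactly why this remedy suits statistics and accounting but not a mathematics question whose whole solution an online solver can reproduce.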

The only hope of confidently reducing the amount of 'Googling' during the exam is to create more complex questions: questions that require deeper understanding, or the application of knowledge, which also requires deeper understanding. Whilst this is of course the ultimate goal of any subject, if such application hasn't been taught, then the likelihood of students producing quality answers in exams is limited. If the amount of content introduced means that only superficial understanding is possible – a breadth rather than depth approach – then the question types in the exam can't change just because it is now open book: students simply wouldn't have been prepared sufficiently, and the exam would not produce valid inferences.

In defence of mathematics, many of the calculation questions that an ordinary invigilated exam would test are designed to strengthen fundamental processes and skills required for further study in the discipline. Building this schema is essential for being able to apply understanding in further contexts. But open book exams now pose a large threat to such curriculum design. It may be that in the future a depth rather than breadth approach is the only feasible option, so that deeper understanding of less content opens the opportunity to assess the application of knowledge, and thus mitigates against cheating.

Baby with the bathwater?

However, there is something that mathematics exam designers should also be conscious of before eliminating all questions that a student could simply look up. The beginning of an exam should really be designed to ease students into the process, to provide a quick boost as they solve a question they find relatively easy. The anxiety that almost always accompanies sitting a university examination is immediately partially assuaged, which reduces cognitive overload and allows a student to think more clearly. Exams that begin with very difficult problems can throw off students' confidence significantly, even those who know enough to pass. It may be that you still set those initial questions as fundamental skill questions that could be looked up, knowing that the majority, who won't need to look them up, will benefit from gaining some confidence in the initial stages of the exam, setting up better attempts at the more difficult questions later on.

In the end, it’s not about those who will cheat, it’s about those who won’t.



Few would argue that a goal of education is for knowledge to be able to be transferred from one context to another. However, making it happen is not as easy as it seems, and this has implications for epistemological decisions needing to be made in designing curricula, exams, and indeed, deciding on an institutional ethos.

From research discussed below, knowledge transfer relies on two conditions:

  • transfer is usually only possible when a student possesses a relatively well-developed schema: the closer to expert the better
  • the transfer needs to happen within or close to the known and acquired domain of knowledge.


What characterises an expert is their acquisition of schema. Experts tend to have lots of knowledge about a subject, but knowledge that is organised and elaborated in how it all connects together. Particularly important, in terms of knowledge transfer, is the expert's ability to see the underlying deep structure of problems, regardless of surface differences. It is this ability to make analogies with what they have previously encountered that improves not only the encoding of new content, but also its retrieval:

  • Experts are better than novices at encoding structure in examples and recalling examples on the basis of structural commonalities (Dunbar, 2001). For example, Novick (1988) found that students completing a second set of mathematics problems all recalled some earlier problems with similar surface features to the present problems, but students with high Mathematics SAT scores recalled more structurally similar problems and were also better at rejecting the surface features than were students with low scores.
  • The reason for this is that when experts think about problems, they draw on their large reserves of schema, which have evolved, through practice and deliberate exposure to worked examples, to contain the deeper structural features of question types. Novices tend to do the reverse, only being able to identify the surface structural characteristics and thus using an inefficient means-end solving strategy (Sweller, 1998). The issue with this is that it heavily taxes working memory and often overloads cognition. What's worse, such taxing ultimately prevents the problem from becoming part of the schema for future use – a double loss.

The implications of this for education are enormous. The need for schema is irrefutable, from Bartlett to Ausubel and even to Bruner: but for novice students to develop it efficiently, they need to engage in learning that builds knowledge over time and experience, through examples they can store and eventually make analogies with, and interestingly, as Sweller states above, not through problem solving.

So, here’s how transfer can be developed:

  • a student learns from an example, which, under the right conditions (retrieval), is then stored in their long-term memory. At this point, only the surface structure of the problem is recognised.
  • The student then encounters another example with a similar surface structure. Now the student has two models to draw from, though only surface characteristics are likely to be seen.
  • The student is then provided with another example, but this time the surface structure is different while the deeper structure is analogous. The teacher at this point must direct student attention to the analogous deeper connections, as students usually won't see them for themselves, as demonstrated by Gick and Holyoak's work with Duncker's tumour problem – see the study below.
  • Repeating this process builds the student's repertoire of problems they can draw from to make analogies with. The more they have, the greater the chance of them behaving like an expert: identifying the deeper structural components and working forward with the problem, using less cognitive load, and adding yet another example to the schema.

How to deliver the analogous examples

Gentner, Loewenstein and Thompson (2003) conducted a study to ascertain the most efficient delivery combination. The study used two negotiation scenarios, one about shipping and one about travelling, as a means of training students to be better negotiators. Four delivery contexts were investigated:

  • separate examples, where students were presented with both examples on separate pages and asked questions about each text
  • comparison examples, where students saw both examples on the same page and were directed to think about the similarities between the two stories
  • active comparison, where students were presented with the first example on one page, with its solutions carried over to a second page that presented the second example alongside questions about the similarities between the two
  • a group that had no training

Clark and Mayer (2008) adapted the findings and presented them graphically:

The results showed that active comparison was a far superior technique for training the students.

Implications for exam design

There are 2 considerations in this regard:

  • When designing open-book exams that rely on the application of knowledge (in the current climate, primarily to mitigate cheating), it is important to consider the cognitive conditions required for transfer to take place. If you have taught your students a range of examples that facilitated analysis of deeper structural connections, then your exam questions can test understanding of those deeper structural connections. If you haven’t taught your students in such a way, then your question choice will be limited to more surface-level questions. If you ‘jump’ to deeper structural questions in an attempt to make the exam harder, to compensate for the openness and accessibility of the content, then the results may well be invalid, as you will have tested for something students weren’t capable of doing.
  • On the other hand, knowing that you can safely change the superficial elements of a question and still test ‘real’ understanding, because transfer is difficult if the concept isn’t truly understood, also mitigates cheating, as students can’t simply rely on their notes. If they can’t make the connections, an indicator of a novice learner, then they can’t benefit from the notes as an expert would; the expert, ironically, probably wouldn’t need them anyway.

Duncker’s tumour problem

A problem that has been studied by several researchers is Duncker’s (1945) radiation problem. In this problem, a doctor has a patient with a malignant tumour. The patient cannot be operated upon, but the doctor can use a particular type of ray to destroy the tumour. However, the ray will also destroy healthy tissue. At a lower intensity the rays would not damage the healthy tissue but would also not destroy the tumour. What can be done to destroy the tumour?

Gick and Holyoak used this story to test the transfer of knowledge. Before attempting the tumour problem, one group of students was given the fortress story below, and another group was given a second story alongside it. Both additional stories have superficial differences from the tumour case but similar structural, or convergent, features. They found that most students who tried to solve the tumour problem on their own had difficulty, and those with the aid of one story still struggled, but those with the aid of 2 stories could see the convergent abstract similarities. In other words, they were able to see the deeper structural analogies.

A small country was ruled from a strong fortress by a dictator. The fortress was situated in the middle of the country, surrounded by farms and villages. Many roads led to the fortress through the countryside. A rebel general vowed to capture the fortress. The general knew that an attack by his entire army would capture the fortress. He gathered his army at the head of one of the roads, ready to launch a full-scale direct attack. However, the general then learned that the dictator had planted mines on each of the roads. The mines were set so that small bodies of men could pass over them safely, since the dictator needed to move his troops and workers to and from the fortress. However, any large force would detonate the mines. Not only would this blow up the road, but it would also destroy many neighbouring villages. It therefore seemed impossible to capture the fortress. However, the general devised a simple plan. He divided his army into small groups and dispatched each group to the head of a different road. When all was ready he gave the signal and each group marched down a different road. Each group continued down its road to the fortress so that the entire army arrived together at the fortress at the same time. In this way, the general captured the fortress and overthrew the dictator.


Clark, R. & Mayer, R. (2008). e-Learning and the Science of Instruction. Pfeiffer, San Francisco, CA.


I’m Paul Moss. I’m a learning designer. Follow me on Twitter @edmerger


Participation is crucial in any learning environment, and a Zoom session is no different. Participation encourages attention, which is a requisite for learning. If students aren’t attending to the content or discussions on offer, they have no chance of encoding that content and then being able to use it at a later time: in other words, learning it. Being skillful in ensuring participation is therefore imperative.

Varying the way students are asked to participate is a powerful way to encourage engagement. Zoom supports participation in several different modes, some of which are not possible in a regular face-to-face session. Here’s how a teacher/tutor can engage students in a Zoom session:

  • Immediate quiz/questions
  • Explaining your method
  • Non-verbal feedback
  • Verbal questions
  • Written questions
  • Polls/quizzes
  • Breakout rooms
  • Screen sharing
  • Using the whiteboard
  • Modifying content


Immediate quiz/questions

Because of the way our memories function, recapping content from previous sessions is essential for helping knowledge move into long-term memory, where it can then be recalled automatically to assist in processing new information. Students who arrive on time to your Zoom session should immediately be put to work, either doing a 3- or 4-question quiz on previous learning, or producing short answers to a question or 2, both shared from your screen. This does 2 things: firstly, it activates prior knowledge that will assist in today’s learning, and secondly, it gets students involved straight away. Latecomers also won’t miss the new content. Answers to the quiz are briefly discussed, and then the current session begins with students’ minds active.


Explaining your method

By articulating up front the strategies you will employ in the session, you are likely to alleviate students’ anxieties about some of the processes they’ll experience, and therefore encourage participation. Explaining why you are repeating questions, why you are talking about things from previous sessions, why you are asking for different types of responses and feedback, why you are insisting everyone responds before you move on, why you are using polls, and why you are so keen on student participation and its effect on learning will help students feel more comfortable during the session and more able to participate.


Non-verbal feedback

You will have to turn on NON-VERBAL FEEDBACK in the settings:

Getting students to indicate yes or no, or give a thumbs up, encourages participation. Whilst you can’t guarantee that such assessment for learning truly proves students have understood your question, as students could just be guessing, or responding to avoid being asked why they haven’t, it still gets students involved. Even if a student responds only to avoid a follow-up question when the tutor sees they haven’t answered, they are still actively listening, which is a condition of learning. Varying the type of questions can also generate some humour and fun in a session: asking if students are breathing, or if they know that Liverpool football club is the best team in the world, for example. Non-verbal feedback is best used in triangulation with other assessment for learning options, such as verbal questions:


Verbal questions

Effective questioning is a powerful way to assess for learning and guarantee participation. The key is to ask, wait for students to process the question, and then check a number of answers before saying whether they are right or wrong. Repeat the question at least 3 times during the processing stage. Keeping questions ‘alive’ is important for encouraging participation, because as soon as you provide an answer the majority of students will stop thinking about it; they have no need to keep thinking. Allowing time for students to think activates the retrieval process as they search their minds for connections to previously encoded information. By randomly choosing students to answer, you not only get a sense of their levels of understanding, which allows you to pivot the next sequence if necessary, but you also keep students on their toes as they realise they may be called on next. This random selection of students will work even in a very large tutorial.
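If you keep a class roster, random cold-calling is trivial to automate, which helps it scale to very large tutorials. A minimal sketch in Python (the names below are placeholders, not a real class list):

```python
import random

# Hypothetical roster; replace with your actual class list
students = ["Aisha", "Ben", "Chen", "Divya", "Ed", "Fatima"]

def cold_call(roster, n=1):
    """Pick n distinct students at random to answer the current question."""
    return random.sample(roster, n)

# Draw one student for the question just asked
print(cold_call(students))
```

Drawing without replacement within a round (as `random.sample` does) ensures the same student isn’t picked twice for one question.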

Sometimes it’s the little things. Be aware that you might naturally tend to favour interacting with those you can see in the session. Those without their cameras on, as in the image below, may not get asked as many questions, so an awareness of this, and conscious questioning of unseen students, will encourage broad participation in the session.


Written questions

Using the chat to elicit answers and check for learning encourages participation, and is a variation on simply listening and answering verbally. Having students write down an answer shows whether or not they know the content. Dedicating time in a session to this process not only varies the type of participation, but can be a great indicator that students have the required knowledge to continue. Opening up the chat for student-to-student interactions also encourages participation, as some will answer questions and feel empowered in the process, and some will just enjoy the interactions. It is important, though, that the chat area is monitored, as it can lead to the wrong kind of participation, like students just chatting in the classroom/lecture theatre, which means they are not paying attention to the content. You can’t write/read and listen at the same time. I write about that here.


Polls/quizzes

Using the poll function in Zoom is easy. You have to ensure it is turned on in the settings:

Once you’ve designed your questions, preferably before the session, you can then launch the poll.

Students then participate by responding. You then share the results, which at this point are anonymous, with the whole group. This serves as an assessment for learning opportunity, and you can pivot the session based on the answers if necessary. In answering the questions, students’ minds are activated as they search for knowledge in their schemata. There is an art to designing effective polls and multiple choice questions, and I discuss that art form here.  

A Canvas quiz can also be incorporated into the Zoom session. The advantage of this is that it offers a variety of question types that further encourage participation. There are many other apps too, such as Quizizz, Kahoot, and Mentimeter, but these should be used with caution if not supported by your institution, as students may not want to sign up for platforms that essentially require them to surrender their data.


Breakout rooms

Sending students into groups to discuss a concept or problem is a fantastic way to encourage participation. Homogeneous groups tend to work best, because those with vastly different levels of developed schema tend not to engage with each other as well as those with closer skill levels. It can sometimes benefit a more knowledgeable student to help a peer, but this relies on effective teaching skills to work, and in reality that is a big ask of a student. So setting groups up before a session may be your best bet.
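One simple way to set up homogeneous groups before a session is to sort students by a recent quiz score (as a rough proxy for current schema development) and chunk the sorted list. A minimal Python sketch; all names and scores here are hypothetical:

```python
# Hypothetical roster of (name, recent quiz score) pairs
roster = [("Aisha", 9), ("Ben", 4), ("Chen", 7), ("Divya", 8),
          ("Ed", 3), ("Fatima", 6), ("George", 5), ("Hana", 9)]

def homogeneous_groups(students, group_size):
    """Sort by score, then chunk, so each group holds similar skill levels."""
    ranked = sorted(students, key=lambda s: s[1])
    return [ranked[i:i + group_size] for i in range(0, len(ranked), group_size)]

# Print each group's members for assignment to breakout rooms
for group in homogeneous_groups(roster, 4):
    print([name for name, _ in group])
```

This is only a sketch; any proxy for skill level will do, and you may prefer to tweak the groups by hand afterwards.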

Providing guidance on what students should do in their groups is crucial, and it is worth popping in to each group to see how it is progressing. As Tim Klapdor, an online expert at Adelaide University, suggests: ‘Encourage discussion by promoting the students’ voice. Use provocation as a tool for discussion. Ask the students to explain and expand on concepts, existing understanding and their opinions on topics. Get students to add to one another’s contributions by threading responses from different students. Promote a sense of community by establishing open lines of communication through positive individual contributions.’ Appointing a member of the group as a scribe is also worth doing, so that when the group returns to the main session they are able to share their screen and discuss their work/findings/solutions.


Screen sharing

Getting students to share their screen encourages participation. This is especially effective coming out of a breakout room, but can be used at any point in a session. A student may be asked to demonstrate their working of a problem, an answer to an essay question, etc., and the tutor can use it as a model for providing feedback. Of course, caution should be used here, with only positive/constructive feedback provided.


Using the whiteboard

Sharing the whiteboard and getting students to interact with the content you or they put on there is a great way to encourage participation. You could model your thinking process in this medium, explaining or annotating examples to discuss how students could gain a better understanding of the content. You could also have students annotate the board, asking them to underline key words, complete equations, etc. Getting multiple students to add their own annotations is probably more beneficial with smaller groups, such as in the breakout rooms. Unfortunately, in Zoom you can’t paste an image onto the whiteboard, only text.


Modifying content

I firmly believe that only a very small percentage of students will be genuinely unwilling to participate in this medium. Such students could be expected to use the chat option and ‘send to the host’ only, for example, to ensure they are still participating. If you have tried all of the above strategies and your students are still not really getting involved, it is likely that they just don’t know the answers. As humans, we naturally want to succeed, and non-participation may indicate that you need to strip things back a bit and return to some foundational knowledge. It doesn’t matter what you think students should know; it is about what they actually do know, and the relevant development of their schema. It is better that you facilitate the construction of knowledge and provide questions that students will know the answers to, so they can build up their confidence in participating. By doing this, you will slowly but surely build their schemata so they will want to get involved consistently.

Online participation is essential for the session to be effective. If you have other tips and advice on how to encourage participation, please let me know and I’ll add them to the list.

I’m Paul Moss. I’m a learning designer. Follow me on Twitter @edmerger