Progress, anxiety and context

One of the challenges when teaching first year students with a diverse range of maths backgrounds is that you’ll encounter students with maths anxiety. Maths anxiety is exactly what it sounds like: it prevents students from engaging with material that would otherwise be within their grasp. Whether they’re Maths A students who have always struggled, Maths B students who got through without really understanding how, or mature age students for whom high school maths is a distant yet painful memory, some students will almost refuse to attempt anything new.

Mathematics and statistics are vital skills for scientists, whether they realise it or not at the start of their studies. We’ve built and rebuilt SEB113 over the years at QUT, putting together a teaching team culture, a topic list and order, and a set of teaching activities that aim to provide students with a learning environment in which they are made responsible for their own learning but have plenty of support when they stumble.

The main ideas that have featured strongly since the unit’s inception have been:

  • Problems worth solving in modern science require computation
  • Mathematical and statistical models allow us to go further than just data summaries
  • We need to be mindful of what the data are, how they were collected, what they represent, and how best to treat them

To this end, we’ve structured the unit around blended learning for individual engagement with lecture and lab material, problem-based learning in group environments, the use of a cohesive suite of packages in R, regular progress quizzes, and authentic assessment. The aim here is to encourage engagement, demystify programming, and build confidence with topics from Senior Mathematics B. That confidence and willingness to engage are then leveraged to develop an understanding of new topics, as well as the curiosity and skills to learn further mathematics and statistics.

Diagnosing unknown unknowns

The first step in becoming responsible for one’s own learning is to diagnose one’s current level of competence relative to the unit’s assumed knowledge. We run a diagnostic quiz just before semester starts that gives students an awareness of their level of competence relative to Maths B. Students who score poorly across mathematical foundations, algebra, calculus, and statistics and probability are encouraged to enrol in MZB101, a first year unit developed for Bachelor of Education students that covers calculus at a Mathematics B level.

Previously, students were encouraged to take one of QUT’s Maths B bridging courses, but these run at times that don’t align with the semester timetable, so it can be difficult to catch up before semester begins, particularly when you’re not yet sure which university course you’ve been accepted into. Prior to the introduction of the MZB101 pathway, approximately 10% of the students had not studied Maths B and had very little opportunity to understand, at the start of the semester, just how much they didn’t know.

Those students not taking the diagnostic tend to score poorly at the end of the semester, which indicates that their maths anxiety is not just holding them back from learning new material but also preventing them from assessing what they currently know. I suspect that for those students, the strategy for the semester is to hope they can learn it all without ever stopping to check that they actually understand. I try to dissuade students from assuming that merely being around maths will make them better at it: one needs to actively participate in classroom activities, ask for clarification, etc. in order to improve. Are students who aren’t doing the diagnostic actually engaging with the weekly prep material, or are they just clicking through to access the teaching material for that week? Perhaps we could tie the adaptive release to a short comprehension quiz (one or two questions) drawn from the prep material.

Progress quizzes

For those students who stay in SEB113, we have a set of progress quizzes throughout the semester to check that they’re keeping up to date. Previous discussions within the faculty have centred on whether they’re worth enough marks for students to take them seriously, whether they’re too frequent for students to handle, whether it’s best to allow one, three or an unlimited number of attempts, and whether students stress out about getting all the marks they can. For my part, I’m in favour of regular learning checks that offer the intrinsic reward of knowing you’re on track and understanding things, and the extrinsic reward of quizzes worth anywhere from 10% to 30% of the semester’s marks.

Recent research indicates that high school students (and I’ve got them not long after high school) find that retrieval practice quizzes (i.e. recall rather than problem solving) help alleviate anxiety about later major pieces of assessment (Agarwal et al. 2014). These quizzes are different to homework activities; for us, homework tends to be question sets that build familiarity with the mechanics of a mathematical technique, or computer lab exercises that give students additional practice analysing and visualising data sets.

Nguyen and McDaniel (2015) reviewed a number of articles about testing as a learning tool and concluded that quizzes which provide opportunities to extend a student’s knowledge within a familiar application can help reinforce students’ knowledge of key concepts, so long as the quiz questions are tailored to the learning outcomes of the unit. Not all quizzes are created equal, so perhaps as instructors we should abandon quiz banks that ask context-free questions irrelevant to students’ classroom experiences and domain knowledge, as they risk diminishing student performance.

Supervised classroom discussions

I have begun introducing more discussion into the workshop activities in SEB113 by having students interpret the results of a piece of analysis and discuss it within their group before raising it with a tutor, either separately or as part of a full room discussion. In the paper helicopter experiment we run in Week 1 (Annis 2005), we show students the form of the paper helicopter, describe the experiment, and ask them to determine what variables other than the length and width of the helicopter blade may affect its time of flight as it is dropped from Level 5 to Level 4 of P Block at QUT. Rather than just conducting the experiment as written, students are encouraged to consider the experimental setup and identify environmental factors and/or properties of the helicopter which may affect its flight. This has students thinking about the context of the data collection, which we reinforce in later weeks when we summarise (Week 2 workshop), visualise (Week 3 workshop) and model (Weeks 8–13) the times of flight and discuss sources of variation.
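To give a flavour of what those later weeks build towards, a first pass at summarising and visualising a group’s flight times in R might look something like the sketch below. This is a minimal illustration with made-up data and column names, not the actual workshop code; it assumes the dplyr and ggplot2 packages.

```r
library(dplyr)
library(ggplot2)

# Hypothetical flight times (seconds) for three blade lengths (cm)
flights <- data.frame(
  blade_length = rep(c(8, 10, 12), each = 6),
  time = c(2.1, 2.3, 2.0, 2.4, 2.2, 2.5,
           2.6, 2.8, 2.5, 2.9, 2.7, 3.0,
           3.1, 3.3, 3.0, 3.4, 3.2, 3.5)
)

# Summarise the centre and spread of flight times for each design
flights %>%
  group_by(blade_length) %>%
  summarise(mean_time = mean(time), sd_time = sd(time))

# Visualise within-design variation, one box per blade length
ggplot(flights, aes(x = factor(blade_length), y = time)) +
  geom_boxplot() +
  labs(x = "Blade length (cm)", y = "Flight time (s)")
```

Even a small summary table like this gives groups something concrete to discuss: why the times vary within a design at all, and whether the between-design differences are bigger than that within-design spread.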

Elaborative learning strategies, which require students to offer an explanation for a stated fact, can be very useful for linking new information to what the student already knows (Endres et al. 2017). A number of workshops on mathematical modelling now have students discuss what other systems could be modelled in a similar way to the one we have investigated. I’d like to move some of this discussion up to the front of the workshop, with students recalling what they already know about oscillatory systems in the natural world that might be modelled as a simple harmonic oscillator. One way we’re trying to relieve maths anxiety is to show that mathematical and statistical skills and techniques allow students to answer questions already familiar to them from prior experience, such as figuring out how high a ball rises when thrown or modelling how the growth of a tree slows over time.
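As a concrete example of the kind of familiar question I mean, the height of a thrown ball can be modelled as a quadratic in time, and fitting that model to a handful of measurements recovers the constant-acceleration story students half-remember from school. The sketch below uses invented measurements, not data from any of our workshops.

```r
# Hypothetical heights (m) of a thrown ball recorded over time (s)
ball <- data.frame(
  t = seq(0, 1.2, by = 0.2),
  h = c(1.5, 2.3, 2.8, 2.7, 2.3, 1.7, 0.5)
)

# Fit the constant-acceleration model h(t) = h0 + v0 * t - (g / 2) * t^2
fit <- lm(h ~ t + I(t^2), data = ball)
coef(fit)

# The intercept estimates the release height h0, the coefficient on t the
# initial vertical velocity v0, and -2 times the t^2 coefficient estimates g
```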

Weaning students off reliance on teachers

This is all part of an overall strategy to introduce, reinforce, and implement a range of mathematical and statistical techniques on a week to week basis, improving confidence not through repetition but through guided exploration. By using supervised problem-based learning in the workshops, we aim to have students discuss their way to an understanding of the results and to check that understanding with the workshop tutors. Many of the early workshops, and early activities in later weeks, provide code for solving the problem. We gradually reduce the amount of code provided within and across the workshops, expecting that increased familiarity with the process of using RStudio to solve a problem (in a supportive group structure) will build confidence and skills. By the end of the semester, students should be able to write code unaided and use information retrieval strategies, reading help files and web resources for guidance on how to complete the task. An example of this scaffolding is sketched below.
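To make the scaffolding concrete, an early workshop might provide a complete, runnable chunk for students to run and interpret, while a later workshop leaves the key decisions to them. This is a hypothetical sketch of the approach, not taken from the SEB113 materials.

```r
# Early in semester: complete code is provided, students run and interpret it
conc <- data.frame(
  dose     = c(1, 2, 4, 8, 16),
  response = c(2.1, 3.9, 8.2, 15.8, 32.5)
)
fit <- lm(response ~ dose, data = conc)  # fit a straight line
summary(fit)                             # interpret the slope and intercept

# Later in semester: only a skeleton is given, students fill in the gaps
# fit2 <- lm( ~ , data = )   # which response, which predictors, which data?
# summary(fit2)              # what do the coefficients mean in context?
```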

References

Agarwal, Pooja K., Laura D’Antonio, Henry L. Roediger, Kathleen B. McDermott, and Mark A. McDaniel. 2014. “Classroom-Based Programs of Retrieval Practice Reduce Middle School and High School Students’ Test Anxiety.” Journal of Applied Research in Memory and Cognition 3 (3): 131–39. https://doi.org/10.1016/j.jarmac.2014.07.002.

Annis, David H. 2005. “Rethinking the Paper Helicopter: Combining Statistical and Engineering Knowledge.” The American Statistician 59 (4): 320–26. http://www.jstor.org/stable/27643703.

Endres, Tino, Shana Carpenter, Alf Martin, and Alexander Renkl. 2017. “Enhancing Learning by Retrieval: Enriching Free Recall with Elaborative Prompting.” Learning and Instruction 49: 13–20. https://doi.org/10.1016/j.learninstruc.2016.11.010.

Nguyen, Khuyen, and Mark A. McDaniel. 2015. “Using Quizzing to Assist Student Learning in the Classroom: The Good, the Bad, and the Ugly.” Teaching of Psychology 42 (1): 87–92. https://doi.org/10.1177/0098628314562685.