Narrative (1400 words)

A1. Design and plan learning activities and/or programmes of study

After a couple of years teaching a pre-established undergraduate chemistry laboratory experiment, I gained some insight into how lab sessions work, what students think about and want from them, and how they could be improved. I therefore took the initiative to develop a new undergraduate experiment based on my own research work (see attachment). My main objective was to provide a learning activity that would enrich the undergraduate chemistry curriculum by fostering the higher levels of Bloom’s Taxonomy.1-2

Practical laboratory activities are particularly valuable because they are intrinsically problem-based learning activities.3 More than conveying academic material, laboratory tasks provide an environment in which students can develop critical thinking through reciprocal teaching, discovery and collaborative learning. Such practical learning tasks have long been recognised as an important component of science courses, being rated as one of the factors that most influence students’ interest in, and engagement with, their subjects.4 Practical teaching has a great impact on science students’ experience and is key to successful undergraduate teaching and strong student academic achievement.5 However, irrelevant or uninteresting laboratory sessions can cause students to disengage from their subject area.6 When designing this experiment, I sought to employ a range of pedagogical approaches to address these issues. In particular, I invested heavily in the development of critical thinking by fostering the student-as-researcher approach7, and in motivating students beyond the “classroom” by making the experiment itself a transformative experience,8 as discussed below.

In simple terms, to complete the laboratory activity I have developed, titled “Photochemistry of Sunscreen Molecules”, students will first perform an extraction from commercially available sunscreen lotions and record the corresponding UV/Vis absorption spectra. From these results, students will be asked to reflect on the relationship between absorption of radiation and sun protection factor (SPF), and to compare the efficacy of sunscreen brands A and B. In the following stage, students will be guided to build on their previous knowledge of physical chemistry to predict the ideal photophysical behaviour of a sunscreen molecule – i.e. it should neither fluoresce nor phosphoresce, instead dissipating the absorbed energy non-radiatively. Students will then measure the fluorescence of two samples of sunscreen molecules using an in-house-built fluorometer and decide whether these molecules would be suitable for sunscreen use. One of the sample molecules should be found to be suitable for sunscreen use and should, in fact, also appear in the ingredients list of sunscreen products: this relates the scientific knowledge acquired in the experiment back to “real life”. The post-lab tasks, which are formally assessed, require students to interpret their results in light of knowledge gained in their physical chemistry modules, with questions specifically tailored to connect the laboratory task to the material covered in those modules.
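To make the reflection point concrete, the relationship students are asked to consider can be summarised via the Beer-Lambert law. The sketch below is a simplified, monochromatic treatment (real SPF values are determined in vivo from erythemally weighted transmission across the UV range) and is not part of the formal experiment script:

```latex
% Beer-Lambert law: absorbance A and transmittance T of the sunscreen film
A = -\log_{10} T \quad\Longleftrightarrow\quad T = 10^{-A}

% In a simplified monochromatic model, the protection factor is the
% reciprocal of the fraction of incident UV transmitted through the film:
\mathrm{SPF} \approx \frac{1}{T} = 10^{A}

% e.g. a film of absorbance A = 1 transmits 10% of the incident UV,
% corresponding to a nominal protection factor of about 10.
```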

This laboratory activity addresses student engagement in a number of ways. First, I designed the experiment around a topic students can relate to – commercially available sunscreen products – so that it provides a transformative experience, as defined by Pugh et al. (2017).8 This means that students should be able to see their day-to-day interaction with sunscreens and ultraviolet radiation from the perspective of the new knowledge they have acquired from this experiment. This pedagogy has been reported by several authors to enhance student experience and engagement and to encourage deep, lasting learning.8-10 Moreover, I designed the pre- and post-lab activities, as well as the experimental protocol, with a student-as-researcher approach in mind: beyond following the protocol, students are expected to critically analyse their results and use that analysis to guide subsequent steps in the experiment. In particular, the post-lab tasks were specifically set so that students have the opportunity to interpret their results in light of previous knowledge and are guided towards their own conclusions. The whole project is tailored as an introduction to research, so that students are trained as researchers early in their academic careers. Finally, this undergraduate experiment also brings much-needed reinforcement to the teaching of spectroscopy (the field of physical chemistry that deals with light–matter interactions) in undergraduate chemistry laboratory classes.

I have received positive feedback on this laboratory activity from other teaching staff within the department. The project is still at an early stage, so I have not yet had the opportunity for students to trial it and provide feedback; this is planned before the new experiment is implemented in the undergraduate chemistry curriculum, so that any alterations can be made in response to that feedback.

 

A3. Assess and give feedback to learners

My main experience with assessment and feedback relates to marking post-laboratory questions for the laboratory activities I help students with, for which I did not define the assessment criteria. My role in this assessed work was to mark it according to pre-set criteria and provide feedback to students. While marking and feedback are handled on Moodle for this module, I also give students my e-mail address so that they have an open communication channel through which they can request further detail or clarification if needed. An example of a student who asked for such further feedback – and my response – is given in the attachments, with the relevant elements of the feedback highlighted.

The importance of feedback has been extensively reviewed in the literature, with good feedback reported as a powerful tool for enhancing learning and hence student performance.11-13 However, for feedback to be effective and thus act as a learning instrument, it needs to observe the Seven Keys to Effective Feedback, as defined by Scherer (2016): “goal-referenced; tangible and transparent; actionable; user-friendly (specific and personalized); timely; ongoing; and consistent” feedback.13 I agree with these criteria for good feedback and strive to include these elements in the feedback I give to students. Several of these aspects have also been shown to matter to students in work by Poulos and Mahony, which explores the students’ perspective on the effectiveness of feedback.14 Their research revealed that students value consistent, transparent, clear and timely feedback, in line with Scherer’s suggestions.13,14

Goal-referenced feedback consists of comments that remind students of the goal of the activity, or of the learning outcome they may not be demonstrating. I find this especially important for skills that are not core subject knowledge – such as presentation of data, written communication and the ability to be concise – which students tend to overlook despite their importance. In the attached document I demonstrate how I highlight what the learning outcomes are, and why they matter (highlighted in green/blue). I also strive to be tangible and transparent when giving feedback, i.e. being clear and specific and using informal language that avoids confusion with technical terms (decoding, as Small and Attree (2016) call it).15 Moreover, I make my comments actionable by clearly stating how the student can improve (highlighted in green in the attachment). Highlighted in yellow is an example of how I make my feedback user-friendly by acknowledging the student’s individual needs and difficulties. Finally, I do my best to deliver feedback in a timely and consistent manner. The one element of Scherer’s Seven Keys to Effective Feedback13 with which my feedback does not comply is continuity: because of the occasional nature of my teaching, I see students only once and therefore cannot provide ongoing feedback. This is something I genuinely regret, because following a student’s progress and giving regular feedback would allow me to personalise my teaching and feedback to students’ individual needs, since I would have more opportunity to get to know them. This is something I will pursue in my teaching career, and I will always advocate that students be accompanied and supported throughout their academic careers and learning journeys (hence the continued insistence on personal tutors).16,17

Finally, while I have less experience with designing assessment, I did plan the (assessed) post-lab questions for the undergraduate experiment I am developing, as discussed in the section above. My main concern when designing these questions was that they should not only assess students’ understanding but also prompt critical thinking and support learning by being thought-provoking and intellectually engaging: essential questions, as defined by McTighe and Wiggins (2013).18 For example, instead of asking students to “plot absorbance vs. wavelength for each SPF sample”, I formulated the question so that they need to think about how to present their data, i.e. “plot your UV/Vis data clearly demonstrating the relationship between SPF value and absorbance within the same brand”. I believe that understanding by design,19 that is, carefully formulating tasks and questions to support understanding and learning, is key to achieving meaningful assessment that acts as a tool for learning. A sketch of the kind of plot this rewording invites is given below.
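The sketch below is a hypothetical illustration of such a plot; the SPF values and spectra are mock data generated for demonstration, not results from the actual experiment:

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical UV/Vis spectra for one sunscreen brand at three SPF values.
# In the real experiment, students would load their measured data instead.
wavelength = np.linspace(280, 400, 200)  # UV range in nm

def mock_spectrum(peak_absorbance):
    """Gaussian-shaped absorbance band centred in the UVB region."""
    return peak_absorbance * np.exp(-((wavelength - 310) / 30) ** 2)

spectra = {15: mock_spectrum(0.6), 30: mock_spectrum(1.1), 50: mock_spectrum(1.6)}

fig, ax = plt.subplots()
for spf, absorbance in sorted(spectra.items()):
    ax.plot(wavelength, absorbance, label=f"SPF {spf}")

ax.set_xlabel("Wavelength / nm")
ax.set_ylabel("Absorbance")
ax.set_title("Brand A: absorbance increases with SPF")
ax.legend()
plt.show()
```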

 

References

  1. D. R. Krathwohl, B. S. Bloom and B. B. Masia, Taxonomy of Educational Objectives: The Classification of Educational Goals, Handbook II: Affective Domain, David McKay, New York, 1964.
  2. D. R. Krathwohl, Theory Into Practice, 2002, 41(4), 212-218.
  3. A. Walker et al., Essential Readings in Problem-Based Learning, Purdue University Press, Indiana, USA, 2015.
  4. S. C. Barrie et al., International Journal of Science Education, 2015, 37(11), 1795-1814.
  5. A. Hofstein and V. N. Lunetta, Review of Educational Research, 1982, 52, 201-217.
  6. W. Rice et al., Tertiary Science Education in the 21st Century, Australian Learning and Teaching Council, Sydney, 2009.
  7. L. Smyth, F. Davila, T. Sloan et al., Higher Education, 2015, 72, 191-207.
  8. K. J. Pugh et al., The Journal of Experimental Education, 2017, 1-29.
  9. B. C. Heddy et al., Science Education, 2013, 97(5), 723-744.
  10. E. W. Taylor, International Journal of Lifelong Education, 2007, 26(2), 173-191.
  11. S. Askew (ed.), Feedback for Learning, Routledge, New York, 2000.
  12. S. M. Brookhart, How to Give Effective Feedback to Your Students, ASCD, Alexandria, 2017.
  13. M. Scherer (ed.), On Formative Assessment: Readings from Educational Leadership, ASCD, Alexandria, 2016.
  14. A. Poulos and M. J. Mahony, Assessment & Evaluation in Higher Education, 2008, 33(2), 143-154.
  15. F. Small and K. Attree, Studies in Higher Education, 2016, 41(11), 2078-2094.
  16. P. Ghenghesh, Journal of Further and Higher Education, 2017.
  17. H. Fry, S. Ketteridge and S. Marshall, A Handbook for Teaching and Learning in Higher Education, Routledge, New York, 2009.
  18. J. McTighe and G. Wiggins, Essential Questions: Opening Doors to Student Understanding, ASCD, Alexandria, 2013.
  19. G. Wiggins and J. McTighe, Understanding by Design, ASCD, Alexandria, 2005.