PDF Domain 2: Evaluation of Teaching

Sharon Flynn
4 min read · May 29, 2017

--

Continuing my series of posts reflecting on my professional development within the National Forum's Professional Development Framework, this post considers the second element of Domain 2: Professional Identity, Values and Development in Teaching and Learning.

Element 2.2: Evaluation of teaching and impact on student learning, based on self/peer review/peer observation, student feedback and/or other evidence.

Our CELT website has lots of useful advice related to the evaluation of teaching, so it makes sense for me to start there. I'll consider two aspects of my teaching:

  1. The full module that I teach, CEL263 Learning Technologies. Each year this has a group of approximately 12–14 students whom I meet regularly, and it incorporates module design, delivery and review, as well as assessment.
  2. One-off workshops.

Student Feedback

While I completely agree that we should collect student feedback, use it to inform our teaching, and close the feedback loop, I have come to the conclusion that we now run too many formal end-of-module questionnaires, and that our students are suffering from survey fatigue.

When I lectured to larger classes, or taught as part of the programme I was responsible for, I regularly collected student feedback, focussed on the questions I wanted answered rather than on the standard feedback form.

Now, for my module, with only 12–14 students, formal module evaluation is not particularly useful. Instead, I get feedback through ongoing communication with my students throughout the module. For each workshop session we have, my students write a workshop report (a blog post in our group blog). This gives me a good sense of how the workshop has gone, what each person has learned, and whether or not they enjoyed it.

As part of the programme review each year, the class rep for the PG Dip will elicit feedback on the module, and will speak to this as part of the programme board meeting. There is rarely anything raised that I’m not already aware of, but I take this feedback on board.

Longer term, I have monitored the activities of my students and former students beyond the module, to see what impact the module has over time. I am particularly interested to see whether participants have become technology champions or influencers in their discipline, or have engaged in scholarship based on their use of technology in teaching and learning. [1]

For one-off workshops I sometimes collect immediate feedback at the end. However, I am more interested in whether the workshop has influenced behaviour in the longer term, so I find the immediate feedback of little value.

I always invite participants to email me, or get in contact via Twitter, after a workshop if they have any follow-up questions or issues. Occasionally this will result in a thank-you message.

For example, I received a thank-you tweet following a recent webinar on Academic Integrity.

Or by email, following a 3-hour workshop on the same topic:

Many thanks for an excellent workshop in RCSI yesterday, it was very useful.
Your comments about the Turnitin similarity index cannot be stressed enough. In an area driven by MCQ exams and clinical evidence, interpretation of that “silver bullet” number can challenge both students and staff.

Probably the best sign of a successful workshop is if I get invited back again.

Peer Review and Observation

I developed the current University protocol for Peer Review and Observation, along with Dr Tim Murphy. However, I have never undergone a formal peer review of my teaching.

On the other hand, I teach academic staff. So you could argue that I am constantly being peer reviewed. I believe that the “students” who make up my class would very quickly tell me if I was wasting their time.

One very nice thing happened last year, something I was secretly very chuffed about. I hope that by sharing it here I do not come across as boastful. Two of my students (academic staff taking CEL263) nominated me for a President's Teaching Award. Since I, wearing another of my many hats, co-ordinate the awards, I couldn't consider going forward. But it really brought a smile to my face, and I can now understand why nominated staff members express their delight that their students took the trouble to make the nomination.

Reflective Practice

In more than 20 years of teaching, I believe that I have developed as a reflective practitioner. After each workshop I reflect on how well it went, what didn't work, and what might work better. Usually the reflection is not formal in any way; I don't keep a teaching journal, for example.

However, if something does work well, or if I want to share the achievements of my students, I will sometimes write a blog post. Sharing good practice is an important element of reflective practice. I am less likely to share experiences that don’t work well, but maybe that’s something that I should do.

A recent example where I reflected on a successful workshop is from October 2016, when I wrote of the Visitors and Residents workshop I ran as part of CEL263.

[1] Flynn, S. (2015) “Learning Technologists: Changing the Culture or Preaching to the Converted?”, in Hopkins, D. (ed.) The Really Useful #EdTech Book.
