Sharing the Practice of Performance Analysis

Exploring Medium as a vehicle

Darrell Cobner
15 min read · Jan 26, 2014

In a recent post on the Clyde Street blog, “Sharing the Practice of Performance Analysis”, I asked:

“Should the conversations about academic papers migrate to platforms, including blogs, that are more conducive to conviviality and that might facilitate/accelerate change? Would this conversational approach enable the immediate flow of ideas between research and practice? This would enable more people to contribute to discussion and work in applied contexts.”

I recently shared this link within a string of comments on an associated blog post about the importance of an online presence. I thought the Medium platform was worth exploring to add more dynamism to commenting on, editing and sharing online content as a more collaborative event.

To test the platform conceptually, I have chosen to broadcast an unpublished report that has sat dormant on my computer for years. It addresses Digital Learning Objects, a topic that is hopefully relevant and connects the content with more animated mechanisms for knowledge transfer. It also helps illustrate the importance of bridging theoretical principles and literature from other areas, such as educational technology and data visualisation, into the Performance Analysis discipline.

Screencasts are powerful mechanisms that allow a narrator to visually reinforce their thoughts, providing a richer, more dynamic story to help convey information to a reader/observer. For example, here is a screencast that was used to complement Slideshares from an open online course (#OAPS101) compiled in November 2013.

As for the content of the report (below), which was completed in 2008, more recent literature may now be available; nevertheless, it may still be worth sharing and maturing in its own right. Essentially, it is intended to provide a useful resource to pilot and explore Medium as a tool for creating working documentation, and to consider its application as a technique for crowdsourcing in a pending project.

So, let’s see if this generates any interaction…

INTRODUCTION

Previous involvement in elite sport created an inherent desire to transfer acquired IT skills into the academic world by adapting teaching philosophies, curricula, mechanisms of learning and assessment protocols. This provides a distinguishable link with the field of educational technology, which is concerned with the design, development, utilisation, management, and evaluation of processes and resources for learning (Luppicini, 2005). Being given the role of module leader for a new technology-driven subject, ‘performance analysis’, prompted the development of original resources to create a more student-centred learning environment, devised to progressively equip students with the tools, vocational skills and knowledge of the theory, processes and statistical procedures needed to become employable as a performance analyst.

The aim of the module was to create a synergy between the learning process and the learner’s interaction with the processes and tools of the trade. In planning the module, an assumption was made that the students would possess the necessary computer competence to start the course. Unfortunately, basic IT skills were not universally apparent across the learners, which prompted an intervention to assist the students in completing one of the coursework tasks. The aim of the intervention was to pilot new technology to create an automated learning management system capable of accommodating all learners, particularly those who experienced difficulties with the coursework requirements (Peter and Vantroys, 2005).

REVIEW OF LITERATURE

The teacher’s role in the learning process comprises all activities that impart knowledge, facts, ideas or skills (Anohina, 2005). The relationship between instruction and imparting knowledge is intricate, and the impact of computers further complicates the e-learning paradigm (von Brevern, 2004). Digital tools are increasingly being used to support teaching in higher education (Wake et al., 2007). Innovative multimedia assets can dynamically disseminate information independently of time and place (Anohina, 2005), thus creating an electronic classroom (Bender and Vredevoogd, 2006). These assets have been termed ‘learning objects’ (e.g. Boyle, 2003) or, more recently, ‘nuggets’ (Bailey et al., 2006). A learning object is defined as any entity, digital or non-digital, that may be used for learning, education or training (IEEE, 2002). Nuggets are primarily comprised of tasks that learners undertake in a particular context in order to attain specific learning outcomes (Bailey et al., 2006). Assets, when correctly used, can make hard-to-implement instructional methods such as simulation more feasible (Roblyer et al., 2004), and can assist instructors in prompting and promoting interaction within the group (Hu, 2006). However, integrating technology into higher education is not merely the presentation of assets. Peter and Vantroys (2005) suggested that activities should provide a relevant pedagogical scenario for the use of learning objects rather than a “passive” consumption of them, which reinforces the view of Boyle (2003) that ‘a learning object should be based on one learning objective or clear learning goal’.

Educators are increasingly expected to author digital resources, but technical support is generally needed from additional experts (Di Iorio et al., 2006; Bailey et al., 2006). Peter and Vantroys (2005) implied that the teacher is ‘an expert in a specific domain’ who collaborates with a pedagogical engineer to link requirements to potential solutions, and with a resource provider to design and construct the learning object. Hsu and Sharma (2006) highlighted that integrating technology into curricula takes time and effort, as illustrated by the model adapted from Riley (2007) and Wang and Woo (2007) (Figure 1). However, neither of these supporting roles is strictly necessary if the teacher is able to interpret their own ideas and has the technological competence and infrastructure to execute the task.

Figure 1. Three-level model of change in practice through technology integration (adapted from Riley 2007 and Wang and Woo, 2007)

Peter and Vantroys (2005) represented the development of a unit of study from the initial recognition of the need, through the analysis and design process, to the point where the object is assembled into an accessible format. The completed object is subsequently used by the students for testing, and the tutor evaluates its impact through observation and feedback (Figure 2).

Figure 2. Design process of unit of study (Peter and Vantroys, 2005)

Traditional IT user guidelines resemble commands scripted in Microsoft Word with annotated screen grabs to support the instruction. This format has recently been superseded by intuitively designed software that records on-screen activity with associated verbal narration. In an educational environment, tools of this nature are referred to as animated demonstrations (ADs) (Despotakis et al., 2007). ADs are advanced instructional mechanisms that assist the user by mirroring navigation around the software interface; the user is able to replicate the mouse movements and key presses to execute the tasks.

Although there are obvious strengths to the concept, previous studies have produced conflicting results regarding learning efficiency. The main concerns are the longevity of knowledge retention (Palmiter and Elkerton, 1991) and detraction from learning due to mimicry (Atlas et al., 1997): the presence of the teaching aid potentially does not place enough demand on the metacognitive processes of the learner.

Despotakis et al. (2007) investigated users’ attitudes towards animated demonstrations, which were praised for their “authentic representation of task sequences”. Detailed interview data, however, highlighted navigational inefficiencies and a failure to meet individual learning requirements, although the appraisal was influenced by the nature of the software program, students’ knowledge and prior experience of the application, and the narrator’s characteristics. Nevertheless, ADs have the capacity to “simulate one-on-one teaching in an exceptional manner”, giving learners autonomy over their learning (Despotakis et al., 2007).

METHODS

An AD was constructed to assist with the systematic process of data collection using the performance analysis software, with iShowU adopted as the screen-recording tool. The recording was made without prompts or a script to maintain a natural flow and avoid a robotic style of presentation, while remaining as specific and succinct as possible and appropriately paced. The tutorial took 3 hours to construct and the final product lasted 18 minutes. A screen grab does not fully demonstrate the functionality of the AD, but it illustrates the interface followed by the observer (Figure 3).

Figure 3. Screen grab of the AD

The intention was to pilot the concept of the AD with a small group of end users and to acquire qualitative, voluntary feedback (Hsu and Sharma, 2006). As with the sample selected by Despotakis et al. (2007), the users were being exposed to ADs for the first time. The AD was deployed to the students on a Performance Analysis module (n=49) to assist the data collection for a piece of coursework.

The coursework requirement was to use the kappa statistic to assess the inter- and intra-observer reliability of two time-motion analysis software packages. The initial process required of the student was a complex, multi-phase, systematic procedure to create, code and extract information from the software. The tutorial additionally demonstrated the follow-up procedures for transferring the information into a Microsoft Excel template to calculate inter-observer reliability values using kappa.
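For readers unfamiliar with the statistic, Cohen’s kappa expresses agreement between two observers beyond chance: kappa = (Po - Pe) / (1 - Pe), where Po is the observed proportion of agreement and Pe is the agreement expected by chance from each observer’s label frequencies. The coursework itself used the Excel template described above; the short Python sketch below is purely illustrative, and the function name and time-motion labels are hypothetical.

    from collections import Counter

    def cohens_kappa(coder_a, coder_b):
        # Cohen's kappa for two observers coding the same sequence of events.
        assert len(coder_a) == len(coder_b)
        n = len(coder_a)
        # Observed agreement: proportion of events both observers coded identically.
        p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n
        # Chance agreement: derived from each observer's marginal label frequencies.
        freq_a, freq_b = Counter(coder_a), Counter(coder_b)
        p_e = sum(freq_a[label] * freq_b[label] for label in freq_a) / n ** 2
        return (p_o - p_e) / (1 - p_e)

    # Hypothetical time-motion codes from two observers of the same footage.
    obs_1 = ["walk", "jog", "sprint", "stand", "jog", "walk"]
    obs_2 = ["walk", "jog", "stand", "stand", "jog", "walk"]
    print(f"kappa = {cohens_kappa(obs_1, obs_2):.2f}")  # kappa = 0.77

A value approaching 1 indicates near-perfect agreement; a value near 0 indicates agreement no better than chance.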

The tutorial was uploaded onto a designated multimedia server a week before the coursework submission date. The timing was intended to encourage the students to work independently on the task rather than rely on this digital support mechanism from the outset.

RESULTS

The study followed an investigative approach to assess whether the AD would be utilised and well received. Unfortunately, there was a poor response to the voluntary request for feedback: only 15 responses from 49 students (31%). This was partly attributed to the late upload of the tutorial, on the assumption that some students had already completed the task; on reflection, this indicated that the initial practical demonstration lecture had equipped the students with the skills to conduct the data collection.

The feedback from the current cohort was consistent with that of Despotakis et al. (2007), with compliments about the accuracy and ease of use of the AD. A summary of the key comments is presented in Table 4 and the complete verbatim statements appear in the Appendix; a couple of extracts were:

“Thank you for the amount of help the tutorial gave us. Every time we came across a problem we were able to find the area we were stuck with, watch and listen to what had to be done next. The instructions were simple and easy to follow and the verbal narration helped me to understand the process as I learn better through hearing instructions.”

“Just a quick email to let you know how useful I thought the video tutorial was. Despite doing much of the work in the lab sessions, I had forgotten aspects and found the tutorials to be a good reminder as well as a good way of gaining little hints and tips to help me with my work! They were greatly appreciated and found to be a good learning resource!!”

Table 4. Summary of voluntary comments by the students.

DISCUSSION

There is a complex relationship between technology and the teaching-learning environment (Wake et al., 2007). The field of performance analysis requires constant interaction with video and computers to provide information to coaches; therefore this was made an essential component of the module content and assessment protocol. The curriculum was strategically designed to get the students ‘hands on’ with relevant software. However, skill gaps were identified in IT competency and it was necessary to prescribe supplementary learning material to offer personalised learning (Atif et al., 2003).

During the initial demonstration lecture it was apparent that students had different IT proficiencies, with some struggling to keep up with the progression of the lecture; these students were the predominant targets for this project. Using multimedia, instructions can be represented in graphical and/or auditory form; this multi-sensory method could improve organisation and automaticity for individuals with learning difficulties, who may have problems with sequencing (Beacham, 2008). Creating a virtual instructor allowed individuals to progress through the task at their own pace, empowering them to feel more independent as learners. Importantly, the AD was not intended as a substitute for face-to-face teaching, but as part of a blended learning approach, reinforcing previous lecture material. Producing the AD was initially time-consuming, but it eradicated the need to repeat the process on an individual basis, particularly for non-attendees.

The voluntary feedback process elicited only 15 responses, limiting a full assessment of the impact of the AD. According to Lanzilotti et al. (2006), e-learning system quality is “the extent with which technology, interaction, content and offered services comply with expectations of learners and teachers by allowing them to learn/teach with satisfaction”. In hindsight, a more scientific appraisal of the AD would consider four aspects:

1) the impact of the technology (complementary media selection, enriched content, overcoming barriers);

2) interaction (between students and between student and lecturer, the presentation of the interface and the activity level of the system, errors identified, emphasis on learning the content);

3) content (the educational process, learner-centred design, learning path decisions, facilitating understanding); and

4) services (support mechanisms, ease of navigation) (Figure 5).

Figure 5. TICS framework for the quality of e-learning systems (Lanzilotti et al., 2006)

Animated demonstrations have been criticised for being inflexible (Despotakis et al., 2007) and for users’ “aversion to explicitly restrictive navigation” (Bailey et al., 2006). In this study, a library of clips was created to give the user immediate access to key bookmarks in the tutorial. This allowed instant access to specific elements rather than following a predefined path, and let the students replay the content until they were able to comprehend and replicate the procedures. In contrast to the aforementioned literature, individual needs were accommodated.

It could be argued that the AD produced was instructional and didactic. However, direct instruction has its place for the “procedural knowledge needed to perform simple and complex skills and for declarative knowledge that is well structured and can be taught in a step-by-step fashion” (Arends, 1998). Although the practice was goal-orientated, future ADs will require a more problem-solving approach to fit Luppicini’s (2005) definition of educational technology. Anohina (2005), however, attempted to differentiate the terminology used in education, suggesting that instruction and training imply a more vocational, practical method of learning, which reinforces the ‘hands on’ nature of the module.

The definitions and terminology used in the challenging education-technology conundrum are diverse, vague and largely unsettled, leaving debate as to where the AD actually fits. Ryan et al. (2000) used the term ‘resource-based learning’, an umbrella term incorporating themes encountered in the current project (computer-aided, individualised and flexible learning) while additionally bringing in a project-based, student-centred element. This definition completely encapsulates the AD.

CONCLUSION

A successful AD was designed and piloted; it facilitated remote, independent learning and reduced the time required to support students individually. It was a goal-orientated, complementary practice that enthused the sample population. However, was this because of the scaffold it provided through part of the coursework objectives, which ultimately contribute to their degree?

Technology was harnessed and incorporated into a teaching repertoire. Boyle (2003) stated that ‘good eLearning resources are expensive to produce’. In contrast, in this case study the software used to create the AD cost £10 and production took 3 hours of work (assuming the AD was indeed ‘good’). Interaction with stylish multimedia assets engaged users (von Brevern, 2004) and enriched the learning environment and student experience (Hu, 2006; Bailey et al., 2006). It gave the students the chance to work independently, applying technology used in the field and exposing them to transferable IT skills (for example, copying and pasting between packages).

On reflection, the AD was deemed successful and a complementary intervention to assist student-centred learning. The decision to release the AD only a week before submission was justified in that it prevented the students from becoming dependent on the resource, but could be criticised in that it potentially helped the poorly organised students more.

REFERENCES

Anohina, A. (2005). Analysis of the terminology used in the field of virtual learning. Educational Technology and Society, 8(3), 91-102.

Arends, R. (1998). Learning to teach. New York: McGraw-Hill.

Atif, Y., Benlamri, R. and Berri, J. (2003). Dynamic Learning Modeller. Educational Technology and Society, 6(4), 60-72.

Atlas, R., Cornett, L., Lane, D. M. and Napier, H. A. (1997). The use of animation in software training: Pitfalls and benefits. In Quinones, M. A., and Ehrenstein, A. (Eds.), Training for a Rapidly Changing Workplace: Applications of Psychological Research. Washington DC: American Psychological Association, 281-302.

Bailey, C., Zalfan, M.T., Davis, H.C., Fill, K. and Conole, G. (2006). Panning for Gold: Designing Pedagogically-inspired Learning Nuggets. Educational Technology and Society, 9(1), 113-122.

Beacham, N. (2008). The potential of multimedia to enhance learning for students with dyslexia. http://www.skillsforaccess.org.uk/articles.php?id=150 <accessed 22 Mar 2008>.

Bender, D.M. and Vredevoogd, J.D. (2006). Using Online Education Technologies to Support Studio Instruction. Educational Technology and Society, 9(4), 114-122.

Boyle, T. (2003). Design principles for authoring dynamic, reusable learning objects. Australian Journal of Educational Technology, 19(1), 46-58. Available from http://www.ascilite.org.au/ajet/ajet19/boyle.html <accessed 18 Apr 2008>.

Despotakis, T.C., Palaigeorgiou, G.E. and Tsoukalas, I. A. (2007). Students’ attitudes towards animated demonstrations as computer learning tools. Educational Technology and Society, 10(1), 196-205.

Di Iorio, A., Feliziani, A.A., Mirri, S., Salomoni, P. and Vitali, F. (2006). Automatically Producing Accessible Learning Objects. Educational Technology and Society, 9(4), 3-16.

Hernández-Ramos, P. (2006). How Does Educational Technology Benefit Humanity? Five Years of Evidence. Educational Technology and Society, 9(4), 205-214.

Hsu, P.-S. and Sharma, P. (2006). A Systemic Plan of Technology Integration. Educational Technology and Society, 9(4), 173-184.

Hu, B.-Y. (2006). Book review: Integrating technology into higher education (M. O. Thirunarayanan and Aixa Perez-Prado). Educational Technology and Society, 9(1), 359-360.

IEEE (2002). Draft Standard for Learning Object Metadata. Available from http://ltsc.ieee.org/wg12/files/LOM_1484_12_1_v1_Final_Draft.pdf <accessed 18 Apr 2008>.

Lanzilotti, R., Ardito, C., Costabile, M.F. and De Angeli, A. (2006). eLSE Methodology: a Systematic Approach to the e-Learning Systems Evaluation. Educational Technology and Society, 9(4), 42-53.

Littlejohn, A. (2003). Reusing Online Resources: a sustainable approach to e-learning. London: Kogan Page.

Luppicini, R. (2005). A Systems Definition of Educational Technology in Society. Educational Technology and Society, 8(3), 103-109.

Palmiter, S. and Elkerton, J. (1991). An evaluation of animated demonstrations for learning computer-based tasks. In Proceedings of the SIGCHI conference on Human Factors in Computing Systems: Reaching Through Technology. New York: ACM Press, 257-263.

Peter, Y. and Vantroys, T. (2005). Platform Support for Pedagogical Scenarios. Educational Technology and Society, 8(3), 122-137.

Riley, D. (2007). Educational Technology and Practice: Types and Timescales of Change. Educational Technology and Society, 10(1), 85-93.

Roblyer, M.D., Edwards, J. and Havriluk, M.A. (2004). Integrating educational technology into teaching (4th Ed.). NJ: Prentice Hall.

von Brevern, H. (2004). Cognitive and Logical Rationales for e-Learning Objects. Educational Technology and Society, 7(4), 2-25.

Wake, J.D., Dysthe, O. and Mjelstad, S. (2007). New and Changing Teacher Roles in Higher Education in a Digital Age. Educational Technology and Society, 10(1), 40-51.

Wang, Q. and Woo, H.L. (2007). Systematic Planning for ICT Integration in Topic Learning. Educational Technology and Society, 10(1), 148-156.

Westera, W. (2005). Beyond functionality and technocracy: creating human involvement with educational technology. Educational Technology and Society, 8(1), 28-37.

APPENDIX

Examples of feedback:

The video tutorial was really useful, was very easy to use and explained everything very well.

Great online tutorial. Was a great help, especially as we could stop and try it out!

It went through each stage well and explained clearly all the points I needed to know, giving demonstrations.

I found the video tutorial useful. I didn’t know how to copy a section of the screen on a Mac computer, but now I do. Happy Days. Plus it was useful when we were merging the timelines together.

The video tutorial was quality!!!!! Really easy to go through and was a great help. Cheers!

I found the video tutorial a really helpful guide when having to use the Sportscode package. It was useful in reminding me of information from previous lab sessions.

I found the tutorial very helpful. It helped me understand the process and because of it I was able to carry out my work efficiently.

The video tutorial was very helpful. Thank you.

The video tutorial was pretty good. I had already finished my results, but using the timeline made it easy to skip through and double check I had performed kappa testing correctly. Verbal instructions were also very helpful.

Very interesting, I attended all labs and still found it useful as someone was talking you through the process, as well as just showing you. Well presented. Very useful. Would like to have more in the future for reference and guidance.

The video tutorial was an excellent idea and it was extremely helpful. Lets hope it shows in my coursework.

The tutorial was a great tool in going through the software. A great step by step guide. I needed it. Cheers.

I used the tutorial — it was brilliant. I followed it as the video was playing — very useful. The voice sounded like someone I’ve met before

Thank you for the amount of help the tutorial gave us. Every time we came across a problem we were able to find the area we were stuck with, watch and listen to what had to be done next. The instructions were simple and easy to follow and the verbal narration helped me to understand the process as I learn better through hearing instructions.

Just a quick email to let you know how useful I thought the video tutorial was. Despite doing much of the work in the lab sessions, I had forgotten aspects and found the tutorials to be a good reminder as well as a good way of gaining little hints and tips to help me with my work! They were greatly appreciated and found to be a good learning resource!!
