In comparison to last week, I ploughed through this week's work, probably because I have a separate education tech training event on this week – the ACODE Learning Technology Teaching Institute – and wanted to clear the decks to give it due focus.
This week in MOOC land saw us looking at the evaluation phase of a project. Once more, the videos and examples were a little (ok, entirely) focussed on K-12 examples (with a detour into ed tech startups that was slightly underwhelming), but there were enough points of interest to keep my attention.
I even dipped back into GoAnimate for one of the learning activities, which was fun.
The primary focus was on considering barriers to success before starting an implementation, and it did introduce me to a very solid tool created by Jennifer Groff of MIT that maps potential barriers across six areas. Highly recommend checking it out.
Groff, Jennifer, and Chrystalla Mouza. 2008. "A Framework for Addressing Challenges to Classroom Technology Use." AACE Journal 16 (1): 21–46.
As for my contributions, as ever, I’ll add them now.
Barriers and Opportunities for using PollEverywhere in uni lectures.
There are a number of potential barriers to the success of this project; however, I believe that several of them have been considered and hopefully addressed in the evaluation process that led us to our choice of PollEverywhere. One of them only came up on Friday, though, and it will be interesting to see how it plays out.
1 School – Commercial Relationships
Last week I learnt that a manager in the school that has been interested in this project (my college is made up of four schools) has been speaking to a vendor and has arranged for them to come and make a presentation about their student/lecture response tool (Pearson's Learning Catalytics). Interestingly, this wasn't a tool on my radar in the evaluation process – it didn't come up in my research at all. A brief look at the specs for the tool (without testing) indicates, though, that it doesn't meet several of our key needs.
I believe that we may be talking to this vendor about some of their other products, but I'm not sure what significance this has for our consideration of this specific product. The best thing that I can do is to put the new product through the same evaluation process as the one that I have selected and make the case based on the selection criteria. We have also purchased a licence for PollEverywhere for trialling, so this project will proceed anyway. We may just need to draw the pilot group from the other schools.
2 School – Resistance to centralisation
Another potential obstacle may come from one of our more fiercely independent schools. They have a very strong belief in maintaining their autonomy and “academic freedom” and have historically resisted ideas from the college.
There isn't a lot that can be done about this other than inviting them to participate and showcasing the results once the piloting phase is complete.
3 School – network limitations
This is unfortunately not something that we can fully prepare for. We don't know how well our wireless network will support 300+ students trying to access a site simultaneously. This was a key factor in the decision to use a tool that enables students to participate via SMS/text, website/app and Twitter.
We will ask lecturers to encourage students to use varied submission options. If the tool proves successful, we could potentially upgrade the wireless access points.
4 Teacher – Technical ability to use the tool
While I tried to select a tool that appears to be quite user-friendly, there are still aspects of it that could be confusing. In the pilot phase, I will develop detailed how-to resources (both video and print) and provide practical training to lecturers before they use the tools.
5 Teacher – Technical access
PollEverywhere offers a plug-in that enables lecturers to display polls directly in their PowerPoint slides. Lecturers don't have permission to install software on their computers, so I will work with our IT team to ensure that this is made available.
6 Teacher – Pedagogy
Poorly worded or timed questions could reduce student engagement. During the training phase of the pilot program, I will discuss the approach that each teacher intends to take with their questions (e.g. consider asking "Did I explain that clearly?" vs. "Do you understand that?").
Beyond the obvious opportunity to enhance student engagement in lectures, I can see a few other potential benefits to this project.
Raise the profile of educational technology
A successful implementation of a tool that meshes well with existing practice will show that change can be beneficial, incremental and manageable.
Open discussion of current practices
Providing solid evidence of improvements in practice may offer a jumping-off point for wider discussion of other ways to enhance student engagement and interaction.
Showcase and share innovative practices with other colleges
A successful implementation could lead to greater collegiality by providing opportunities to share new approaches with colleagues in other parts of the university.
This isn't incredibly detailed yet but it is the direction I am looking at. (Barrier numbers in brackets.)
- Develop how-to resources for both students and lecturers (3)
- Identify pilot participants (1,2)
- Train / support participants (3,4,6)
- Live testing in lectures (5)
- Gather feedback and refine
- Present results to college
- Extend pilot (repeat cycle)
- Share with university
Oh and here is the GoAnimate video. (Don’t judge me)