Tuesday, 27 July 2010

Take me to your leader...

We have had an interesting few weeks in which our Pro Vice Chancellor, our Executive Sponsor and chair of our Senior Management Working Group, has left the university. This could have had a big impact on the project, and it left us with a number of questions. Who would become our new "leader"? How would they feel about Enable, and how would it fit with their role in the university? These were big questions. Thankfully our Project Director, as head of department, was facing the same questions and knew the right people to talk to, and, temporarily at least, we are now reporting to another member of the executive, the Deputy Vice Chancellor. This was good news, and better was to follow. Thanks to the practice we’ve had getting the message across to senior management, and to being further into the project, we were able to communicate what the Enable project was about and how it related to university plans. This seemed to resonate with him and with the Director of Academic Development. We are now busy writing new documentation about how Enable can help with some possible executive-led initiatives they are interested in exploring. What appears to be useful is that the focus is less on immediate quick wins and more on an understanding that making useful changes will take time and planning. More on that later!
Thanks to the work of Enable, and the cross-dissemination we have been doing with the other institutions in the area, including the workshop we ran in June around managing change, the Deputy Vice-Chancellor and the Director of Academic Development have requested that Prof Mark Stubbs come to talk to them and others involved in Enable about the work being done within the CDD programme at MMU. This is all very positive for the message we are communicating about managing change and information within our institution.

Wednesday, 7 July 2010

Reflections from Day 2 (pm) EISTA 2010

Academic integrity was discussed over lunch, as it was the subject of the last presentation of the morning, given by Greg Williams, who linked reducing cheating to strong instructional design (ID): a systematic approach to curriculum development using ADDIE (Analysis, Design, Development, Implementation and Evaluation). He believes that ID staff need to work closely with subject matter experts to get the best out of it. This sounds very similar to the work of the LDI team at Staffordshire University! We discussed the technologies that are supposed to help with cheating, such as Turnitin, and a colleague mentioned a YouTube video showing students how to get around this (and other) technologies. Greg spoke about the fact that a lot of e-learning assessment is low level (true/false, multiple choice etc.) and needs to be reviewed to support strong learning objectives that are more specific to learners and use real-world examples, and that the tutor needs to be involved beyond the start of the assessment, reviewing drafts and feeding back on work already taking place.

The afternoon covered the session I was participating in. The presentation was kindly recorded by Nasir Butrous, who also presented in the session; I will be editing and publishing the recording soon, but I have a video backlog at the moment from the workshop in Stafford too. We had some good discussions around the role of a trusted individual in managing change in an institution, which was very useful. In the same session we had a presentation on industry-based assessment (ERP), which looked at supporting learners using relevant software in the classroom and getting employers engaged in developing the assessment so that learners have real-world examples to work with. Hiram Bollaert spoke about learning objects, and I was surprised that this seemed a new term for a lot of the audience, along with IMS and SCORM. Hiram and I had an interesting discussion after his presentation about finding a SCORM player that does more than display SCORM packages but also does all the tracking you want, and that is open/free/cheap and can be multilingual. Hiram discussed how the learners were the creators of the learning objects, which helps them reflect, develops their ICT skills and encourages them to manage their knowledge. As the objects are displayed to a wider audience, it also gives the learners recognition beyond their tutor.

Nasir spoke about how he had analysed online access patterns against student performance: by looking at when learners accessed content online and at their performance at the end of the course, he could see when the best time would be for tutor intervention to improve performance.
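As a rough illustration of the kind of analysis Nasir described, here is a minimal sketch in Python/pandas that correlates weekly online access counts with final marks to suggest where intervention might matter most. The column names, figures and threshold logic are my own invented assumptions, not taken from his study.

```python
# Hypothetical sketch: correlate weekly VLE access counts with final marks
# to suggest a point for tutor intervention. All data here is invented.
import pandas as pd

# Each row is a student: weekly access counts plus their final mark (0-100).
data = pd.DataFrame({
    "week_1": [5, 2, 8, 0, 4],
    "week_2": [3, 1, 9, 1, 6],
    "week_3": [7, 0, 10, 2, 5],
    "final_mark": [62, 40, 78, 35, 66],
})

week_cols = [c for c in data.columns if c.startswith("week_")]

# Correlate each week's access counts with the final mark.
correlations = data[week_cols].corrwith(data["final_mark"])

# The week whose access pattern most strongly tracks final performance is a
# candidate intervention point: low access there flags at-risk students early.
best_week = correlations.idxmax()
print(correlations)
print(f"Strongest link to final mark: {best_week}")
```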

Saturday, 3 July 2010

Reflections from Day 2 (am) EISTA 2010

The first day was very focused on curriculum change; the second day focused more on supporting blended learning using online tools and different approaches. A number of presentations looked at how "e" can replace paper, and whether "e" was in fact better. I noticed that in some cases there was an assumption that it would be better, while in others, where stakeholders were engaged from the beginning, there was recognition that user perceptions were that it would in fact be much harder, more time consuming and require extra resources not necessarily available to the tutor. Projects that captured those perceptions at the start of their work were better able to demonstrate their success when, at the end, those same stakeholders said that the "e" approach (for example to learner evaluation of teaching) actually improved the process and made them think more about the questions and the approach they had previously used. This relates to Enable nicely: we need to be able to show how people feel now, and how things are better due to intervention, support and adoption of new approaches. It also highlights issues for some initiatives that have not captured that information, which has made it harder for them to measure the impact of their work.
A particular paper that caught my attention focused on studies of software and whether the learnability of a particular tool affected its adoption into the mainstream. Interestingly, the work demonstrated that if the need/motivation to use a technology is high enough, it will become adopted even with negative learnability. This came as no real surprise, as it reflects my own experience of using technology, and that of people I talk to in the university with regard to some of our internal systems. It raises the question: should we (as a university) be looking at replacing these systems when staff are motivated to use them regardless? It's a sticky one, as surely with more positive learnability the tools might be used in more intuitive ways, and the information used to support learning, rather than being treated as the "chore" it can seem at the moment.
More reflections from EISTA coming soon!

Thursday, 1 July 2010

EISTA 2010 Day One - Managing the curriculum

This is a quick summary of the first day of the EISTA 2010 conference. The main focus of the day, for me, was the presentations on transforming curriculum. The first thing I noticed was that although the speakers (from Drexel University and the University of Texas in the USA, along with AEFIS) talked about assessment, they were discussing what we would call curriculum review, that their award programme structures were very similar to ours, and that they are experiencing the same problems with how they manage their curriculum. The presentations didn't address this from an institutional perspective, but rather from a faculty one. They were also presenting the benefits of using AEFIS, who also presented and mentioned that they were looking for academic partners from across the world.

The first presentation talked about creating and sustaining change, linking reviews to student outcomes. They noted the same issues with change that have been encountered by Enable: culture, hierarchy, communication, and a lack of understanding of the reason for change. They discussed a faculty top-down approach to change and how they managed communication to set up a new review process (the old one did not work; people never did it as they felt it was too much effort, and the one-size-fits-all approach does not work). They then talked about their implementation of AEFIS to track the development and review of course curriculum to improve the student experience. They now use the system as a central point to deliver resources to faculties.

The second presentation focused on how curriculum mapping, at first with Excel and then with AEFIS, helped them recognise that course units were often not being assessed in the best way to promote the student experience, and that they had too many units at introduction level (14 programming units at introduction level and none at the emphasise, reinforce or apply levels, so the students could say "hello" but not hold a conversation in any of them). The mapping exercise involved looking at 14 different learning outcomes and then breaking them down into 70 performance criteria (this breakdown took a couple of months of research as they did not start from scratch, nor would they recommend that anyone try it from scratch). They said that the entire process of mapping took them 16 months in Excel, but with AEFIS a lot of the work is automated and could be done in a day! We then did a nice paper exercise mapping curriculum outcomes against level of learning and means of assessment, manually identifying issues with the curriculum; the presenter pointed out that the software helps but can't do everything.
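To make the mapping exercise concrete, here is a small hypothetical sketch of how outcomes, units and levels might be tabulated to flag outcomes that never progress beyond introduction. The outcomes, units and level names below are invented examples (and far fewer than the 14 outcomes / 70 criteria described), not the presenters' data.

```python
# Hypothetical curriculum map: the level at which each unit addresses an
# outcome, used to flag outcomes stuck at "introduce". All entries invented.
LEVELS = ["introduce", "emphasise", "reinforce", "apply"]

# (learning outcome, unit, level at which the unit assesses it)
curriculum_map = [
    ("programming", "CS101", "introduce"),
    ("programming", "CS102", "introduce"),
    ("teamwork",    "CS101", "introduce"),
    ("teamwork",    "CS201", "reinforce"),
    ("teamwork",    "CS301", "apply"),
]

# Collect the set of levels covered for each outcome.
coverage = {}
for outcome, unit, level in curriculum_map:
    coverage.setdefault(outcome, set()).add(level)

# Report outcomes that are only ever introduced, or have gaps at other levels.
for outcome, levels in coverage.items():
    missing = [lvl for lvl in LEVELS if lvl not in levels]
    if levels == {"introduce"}:
        print(f"{outcome}: only introduced, never reinforced or applied")
    elif missing:
        print(f"{outcome}: no coverage at {', '.join(missing)}")
```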

The next presentations I attended focused on managing the curriculum using analytics to improve and inform learning. The first stressed the importance of being able to use good-quality data, and of selecting the most important data from the avalanche that faculties and institutions collect. We were shown a model for developing learning analytics, with the focus on the individual student learning experience, to support optimising the student's path through the course; this seemed to link to APEL and allowing students to use what they already know to skip units.

The second discussed the development of an Instructional Decision Support System. This system is designed to link student characteristics (learning styles etc.), student performance, instructor characteristics, learning outcomes and instructional methods to inform faculty decisions (yes, that was straight off the handout!). After collecting data on students in different courses they were able to show, for example, that on one course the students were sequential learners but the teacher was using a global learning approach, and as such the feedback from the course was not positive; by adjusting her teaching to fit sequential learners over two years, feedback from the students was much more positive.
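Purely as an illustration of the kind of matching such a system might surface, here is a tiny sketch comparing a cohort's dominant learning style with the instructional method in use. The style categories, method names and compatibility table are my own assumptions, not the presenters' model.

```python
# Hypothetical sketch: flag a mismatch between a cohort's dominant learning
# style and the current instructional method. Styles, methods and the
# compatibility table below are invented for illustration.
from collections import Counter

# Which instructional methods are assumed to suit which learner style.
SUITS = {
    "sequential": {"step_by_step", "worked_examples"},
    "global": {"big_picture_overview", "project_based"},
}

def advise(student_styles, current_method):
    # Find the most common learning style in the cohort.
    dominant, _count = Counter(student_styles).most_common(1)[0]
    if current_method in SUITS.get(dominant, set()):
        return f"'{current_method}' already suits the mostly {dominant} cohort"
    return (f"cohort is mostly {dominant}; consider switching from "
            f"'{current_method}' to one of {sorted(SUITS[dominant])}")

# Example: 18 sequential learners, 6 global, taught with a global-style method.
print(advise(["sequential"] * 18 + ["global"] * 6, "big_picture_overview"))
```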

The final presentation in this area was around creating EduApps: support documentation and applications for faculty staff so that they can focus on innovative teaching. These "apps" are not necessarily technological solutions, but can be guidance notes on using different technologies. They are developed based on faculty issues, and the faculties that raise the issues become owners of the solution. The main criteria for the apps are that they should be focused on a particular problem, pose low risk for faculty adoption, and not require a learning curve or a significant investment of time or resources to use.

Despite the great promotional work done by the universities and by AEFIS themselves, I did have a concern that they spoke about being the one system for universities, fitting into different areas, rather than looking at how to support universities with their different existing systems (along the lines of the work already happening in the UK around FSD and SOA). Although this could just be me, and unfortunately I could not follow up on it during lunch as they did not provide any and I really need to eat at the moment!