I have registered for the University of Canberra's free workshop Monday on "Work Integrated Learning - a disciplined approach". The pre-reading is Ruge and McCormack (2017) on "reframing" assessment for employability. This is something I am comfortable with, as I teach computing and engineering students, where there are clear professional roles with defined skills.
As a computer professional I have been involved with setting national and international skills standards and the accreditation of university programs. I have then designed university courses to meet these standards. But this would be a challenge for those from disciplines which are not so vocationally orientated. I suspect the greatest challenge for those facilitating the workshop will be the reluctance of some academics to see themselves as vocational educators, or worse "trainers". ;-)
Ruge and McCormack (2017) report on a five year Australian study of students in Building and Construction Management at the University of Canberra. This is clearly an industry related discipline and "Constructive alignment" should be easy for the construction industry. ;-)
The researchers' suggestions are not surprising, including linking curriculum to industry requirements and scaffolded assessment. The one suggestion I have difficulty with is the use of reflection to have students think about their own learning. Having been through several reflective writing exercises myself, as a student and as an HEA Fellowship applicant, I find the process perplexing.
My computing and engineering students find it equally perplexing to be told to reflect on their learning. What worked well this semester was to reframe the capstone reflective writing exercise for ANU Techlauncher students as a job application, based on what they had learned in the course. They were still reflecting on what they learned, but in a way relevant to their short-term goal: getting a job.
Ruge and McCormack (2017) discuss educational design principles for learning through authentic assessment. In theory this is not that difficult: simply provide assessment tasks which simulate what the graduate will have to do in the workplace. As an example, I teach students how to estimate the carbon emissions due to the use of ICT in an organization, so I have the students do that and assess the results. The terms I use to describe this assessment item come from the learning objectives for the course, which in turn come from the skills definitions of the relevant international professional standard.
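To give a sense of what the task asks of students, the core of such an estimate can be sketched as energy use multiplied by an emissions factor. This is a minimal illustrative sketch only; the function name and all the figures below are hypothetical placeholders, not the actual exercise or real emission factors.

```python
def ict_emissions_kg(devices, hours_per_year, power_kw, grid_factor_kg_per_kwh):
    """Rough annual CO2-equivalent emissions (kg) from running ICT equipment.

    devices: number of devices of this type
    hours_per_year: operating hours per device per year
    power_kw: average power draw per device, in kilowatts
    grid_factor_kg_per_kwh: kg CO2e emitted per kWh of grid electricity
    """
    energy_kwh = devices * hours_per_year * power_kw
    return energy_kwh * grid_factor_kg_per_kwh

# Hypothetical example: 100 desktop PCs, 2,000 hours/year each,
# drawing 0.1 kW, on a grid emitting 0.8 kg CO2e/kWh.
print(ict_emissions_kg(100, 2000, 0.1, 0.8))  # 16000.0 kg CO2e/year
```

The real exercise is, of course, more involved (embodied emissions, data centres, and so on), but the arithmetic skeleton is this simple, which is what makes it a tractable authentic assessment task.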
However, designing and delivering authentic assessment is a complex and time-consuming process. The students are either undertaking the task in a real workplace or in a simulation of one. Having students in workplaces requires specialized supervision skills.
Students will not necessarily be interested in, or understand, the alignment of assessment with the profession's requirements. This understanding may only come years later.
Promises of "personalized" formative feedback can create unrealistic expectations in students and an unrealistic workload for staff. As with lectures, which students say they want but do not attend, detailed feedback is something expected but not necessarily used. On a course on assessment, as a student, I read research results indicating that students did not read detailed feedback. I thought this nonsense: I spent a lot of time composing feedback, and surely the students read it. My assignment on feedback reflected that view. When the assignment came back from the assessor, I looked at the mark on the front (which was okay), then flicked it aside. At that point I stopped and realized I was a student exhibiting the very behavior I had claimed students did not have: I had not read the feedback. Since then I have adopted the practice of very brief feedback next to the mark.
Ruge, G., & McCormack, C. (2017). Building and construction students’ skills development for employability–reframing assessment for learning in discipline-specific contexts. Architectural Engineering and Design Management, 1-19.