Thursday, May 14, 2026

AI for Higher Education Faculty: Course by the Digital Education Council

After congratulating myself for completing the short course "AI Literacy for All" by the Digital Education Council, I discovered there was another one, "Certificate in AI for Higher Education - AI for Faculty", which perhaps I should have done instead. The "All" course was very short and simple, with content I already knew. The Faculty course is much longer (perhaps too long), with more depth (perhaps too much, and much I already knew).

While providing a good introduction to AI for teaching staff, the course goes into too much detail on pedagogy, policy and projects. Much of this detail is material trained educators, administrators and project managers should already know. If they don't, they are only going to learn enough from a short course like this to get themselves, and their institution, into trouble. 

I suggest flipping the design: present the last section, from Singapore Management University (SMU), first. SMU provide practical techniques which educators could use to improve their teaching and reduce workload. The other material on policy and theory could be built on the back of that (or skipped for those of us already trained in it). As it is, the SMU material appears to be tacked on the end as an afterthought. Also I am still not sure exactly who, or what, the Digital Education Council is: how long has it been around, where is it based?

Synthetic Video Presentations

I had one "Wow!" moment in the course, with a video by Tamas Makany at SMU about how to use an avatar for a training video. They explained how they used HeyGen software to create a video presentation featuring a synthetic version of themselves. It took me a minute to realize that what I was looking at on the video was an example of what was being discussed: it looked like a video of a real person.

For several years I have been using text-to-speech software to create short video slide shows. These have a voice with an Australian male accent, which sounds remarkably like me. This uses a much simpler process than the one Dr Makany describes. I create a PowerPoint presentation, with the narration for each slide in the notes. An online tool then turns that into a video. Being able to add a synthetic talking head would be useful. Dr Chris Poskitt, also from SMU, described his use of a tool for adding questions to a presentation, but unfortunately I couldn't work out what the tool was. The closed captions on the video were not available in English, which made it difficult to follow. Also the course system kept switching to high-resolution video, which I kept switching back to low resolution so my slow home wireless Internet connection could cope.
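The first step of this workflow can be sketched in standard-library Python: a .pptx file is just a zip archive of XML, with the speaker notes stored as DrawingML text runs under ppt/notesSlides/. This is only an illustration, not the (unnamed) online tool I use; the make_demo_pptx helper is a hypothetical stand-in for a real presentation file:

```python
# Sketch: pull the per-slide narration script out of a .pptx file's notes.
# A .pptx is a zip archive; speaker notes live in ppt/notesSlides/notesSlideN.xml
# as DrawingML <a:t> text runs.
import io
import re
import zipfile
import xml.etree.ElementTree as ET

A_NS = "{http://schemas.openxmlformats.org/drawingml/2006/main}"

def extract_notes(pptx_bytes):
    """Return a list of narration strings, one per notes slide, in slide order."""
    notes = []
    with zipfile.ZipFile(io.BytesIO(pptx_bytes)) as z:
        names = [n for n in z.namelist()
                 if re.match(r"ppt/notesSlides/notesSlide\d+\.xml$", n)]
        # Sort numerically so notesSlide10 comes after notesSlide9.
        names.sort(key=lambda n: int(re.search(r"(\d+)", n).group(1)))
        for name in names:
            root = ET.fromstring(z.read(name))
            runs = [t.text or "" for t in root.iter(A_NS + "t")]
            notes.append(" ".join(runs).strip())
    return notes

def make_demo_pptx(slide_notes):
    """Hypothetical helper: build a minimal pptx-like zip in memory,
    with just enough structure for extract_notes to work on."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as z:
        for i, text in enumerate(slide_notes, start=1):
            xml = (f'<p:notes '
                   f'xmlns:a="http://schemas.openxmlformats.org/drawingml/2006/main" '
                   f'xmlns:p="http://schemas.openxmlformats.org/presentationml/2006/main">'
                   f'<a:t>{text}</a:t></p:notes>')
            z.writestr(f"ppt/notesSlides/notesSlide{i}.xml", xml)
    return buf.getvalue()

demo = make_demo_pptx(["Welcome to the course.", "This slide covers AI avatars."])
print(extract_notes(demo))
# → ['Welcome to the course.', 'This slide covers AI avatars.']
```

The extracted strings would then be fed, slide by slide, to whatever text-to-speech service is available.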

Course No Substitute for Teacher, Policy, or Project Training

The course gives a potted overview of teaching, but anyone teaching should know this already. There is also a potted introduction to making AI policy. However, as a trained educator, I know about teaching and just need to know how to apply AI to it. As an experienced bureaucrat I know how to run policy processes, and don't need to be spoon-fed this in an AI course. Similarly, as a certified computer professional I know how to run a project. The overviews of these areas in the course might be enough to give an untrained person the false impression they could do teaching, policy making or IT project management.

This course starts with a potted history of online learning and the application of AI to it. This history is somewhat short-sighted, in that it claims online learning started around the 1980s. I guess there isn't time for anything more detailed, but I am not sure why online learning needs to be mentioned at all, as it is a separate topic from AI. You can do online learning without AI, and you can use AI for learning in a classroom. Is the Digital Education Council pushing a specific agenda?

For me, the most significant insight from the course was the use of existing learning tools enhanced with AI. Tools such as drill-and-practice quizzes (Quizlet), lab simulations and chatbots can be created using AI. This can be done by the students themselves, with teacher guidance. This is an appealing approach, as it takes away some of the tedium of creating the content for the tools, while retaining the teacher's oversight. It also involves students actively in their learning.

More usefully, the Digital Education Council points to research it has done showing students across the world say they are already using AI. This is a useful wake-up call for educators who just wish AI would go away. But some of the claims made I am not so sure about, for example that AI giving students instant feedback is a good thing. We need to encourage students to think about the topic and the work they have done. If they get instant feedback from an apparently authoritative source, that cuts out reflection. Also, some of the pedagogy theory in the course is a bit dated, such as the use of Bloom's Taxonomy.

The "AI Literacy for All" course had videos with transcripts in multiple languages. However, this course appears to only have closed captions in English. This is a problem, as I like to speed-read through the transcript. Also some of the videos have no accompanying text.

The course covers more than just generative AI, including chatbots, which are applicable in education. In 2018 I took part in a workshop at the University of Wollongong, as part of the IEEE International Conference on Teaching, Assessment, and Learning for Engineering (TALE). Chi-Un Lei, Yuqian Chai, Xiangyu Hou, and Vincent Tam from the University of Hong Kong took us through using the IBM Watson Assistant chatbot engine. My chatbot answered questions about an assignment deadline. It parsed the student's request for an extension and said "No". ;-)
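For flavour, the core of such a chatbot can be sketched as keyword-to-intent rules. This is my own toy illustration, not the IBM Watson Assistant configuration from the workshop; the intents and canned answers are invented:

```python
# Toy rule-based course chatbot: match keywords in the student's question
# to the first canned intent that fits, in priority order.
INTENTS = [
    # Extension requests are checked first, so they always get the firm "No".
    ({"extension", "extend", "more time"},
     "No. Assignment extensions are not granted."),
    ({"deadline", "due", "when"},
     "The assignment is due Friday at 5pm."),
]

def reply(question):
    words = question.lower()
    for keywords, answer in INTENTS:
        if any(k in words for k in keywords):
            return answer
    return "Sorry, I can only answer questions about the assignment."

print(reply("Can I get an extension on the assignment?"))
# → No. Assignment extensions are not granted.
print(reply("When is the assignment due?"))
# → The assignment is due Friday at 5pm.
```

Real chatbot engines replace the keyword matching with trained intent classifiers, but the intent-to-response structure is much the same.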

Some of the uses of AI proposed have ethical difficulties, such as analysis of a student's performance across all courses and on phone calls. Other proposed uses are not new, such as analysis of the student's performance & personalization of lessons. Claims for these have been overstated. In a well-designed course it is possible to monitor students' performance & provide feedback via the regular assessment.

Offering students ways to summarize their notes misses the point that the ability to summarize is one of the key ways students learn & is a skill valued by employers.

No Dogfooding

Something which struck me a little way into my second Digital Education Council course on AI was the lack of dogfooding: that is, doing what you advocate others do. These are courses on how AI could be used in education, so where is the use of AI in the courses themselves?

Despite talking about how AI could be applied, these are very conventional online courses. There is a fixed linear structure: for each module you watch a video, do some readings, write some reflections, complete a multiple-choice quiz, then move to the next module. The only apparent use of AI is in some of the reflections, where the student is asked to make use of AI for an exercise, then discuss the results. This is not necessarily a bad thing: it is a proven format for online education, which I am comfortable with. But where is the AI-powered education DEC talks about, but doesn't do?

Something which adds to the difficulty of using AI with the DEC AI courses is that copying and pasting has been turned off in some modules. This also makes normal, non-AI study hard. Normally I would copy a question I am asked, edit that to produce the answer, then paste it back. I would also write the answer in an editor which has spell checking. The AI courses, in my web browser, allow spell checking but not correction. So I know I have words spelled wrong, but not what the right word is. I have a couple of goes at guessing the correct spelling, then give up and leave the misspelled word in. This is demoralizing and insulting.

Western Blokes in Suits Bias

The early videos mostly feature Western males, in dark business suits or expensive Silicon Valley casual clothes. The dress standard gets more casual as the course progresses and the range of presenters increases, but there is only one female presenter. The voices change between videos, and I started to wonder if this was all AI generated, until the last, more friendly and scruffy, professor appeared, surprisingly from Singapore Management University. It would help if there was a more diverse range of presenters, although I must confess I am sitting here in my tweed coat, with tortoiseshell glasses on. ;-)

Too Much Stuff

While talking about good education design, the course suffers from a problem common to such modules: too much stuff. The course took me weeks to do, even without the absurdly large number of optional readings. What is the point of offering a short course and then filling it with enough readings for a semester long program? 

While an experienced online student of education (seven years, three qualifications, in two countries), I struggled to keep going with this course, hour after hour, day after day, week after week. Around about the end of module 2, I started to lose the will to live. ;-)

Benefit from a More Systematic Approach to Learning Design

Many of the steps recommended for the use of AI in education also apply generally. To use AI you need to know what you are teaching and how you want to assess it. Those steps are also important in education using pencil and paper.

Advertorial?

Some sections of the course described the benefits of particular products in such detail that they started to sound like a paid advertisement.

Too Busy

While most of the course videos are slides with talking heads, a few are very busy animated product demonstrations. Text appears, a pointer moves around, windows pop up and disappear. This can be a bit overwhelming.

Weak on Privacy 

While the course repeatedly warns about privacy, it demonstrates many tools which are hosted in the cloud in the USA. For institutions in Australia, Europe, and other parts of the world with privacy legislation, this makes those tools unusable.

Questions Require Knowledge Not in the Course

Several times the multiple-choice questions asked about topics I don't remember being in the course. Perhaps I am just not a very attentive student, but perhaps the course was revised, material deleted, and the questions not updated. As a student this was disconcerting, but generally I could guess the correct answer by eliminating all the implausible options.

AI for Transfer Credit

One example which got my attention was AI for course transfer credit. I volunteered to help assess students' applications for course credit. Eight long years later, I am still "temporarily" doing this thankless task. ;-)

If AI could be used to help assess course credit that would reduce the drudgery of the task, and as the course points out, improve equity.

Students Not Concerned About Value for Money

There are some interesting results from the Digital Education Council's survey of students internationally, although it covered a relatively small number of students. What I found surprising was that concern about the value, and value for money, of their degree was the least of their worries when it came to AI. Privacy was the top concern, which is not what I expected. A survey of staff had even fewer participants, and while views of AI were more for than against, there is likely respondent bias in this survey (those keen on AI would be more likely to complete it).

Integrity?

The section of the course on integrity goes far beyond just AI and could be omitted. As an example, fake certificates used for enrollment predate AI by thousands of years. AI didn't cause this problem, even if it makes the problem worse. Solving the problem is not for educators to do.

No document presented by the student can be trusted. In every case you have to check with the institution it claims to be from, to see if the student really got those results. Some institutions have difficulty with this. When enrolling in Canada, I proudly presented my digital certificate from ANU, but was asked for a certified paper copy. I had to pay to have this printed, signed, placed in an envelope, seals placed over the envelope seams, the seals each signed, the envelope placed in another envelope and couriered to Canada. This was accepted happily, but was all nonsense, as I could have easily faked it.

One section of the course cited multiple universities which have decided Turnitin can't detect AI use. This is hardly a revelation, as anyone who has spent a few minutes trying it will see. ANU turned off Turnitin's AI detector on 1 January 2024.

Building AI

The section on building AI was unnecessary, and wrong in parts. AI is a specialized form of IT, and so IT professionals are needed for this. But the course suggests academics can build an AI system themselves, without mentioning the need for competent professionals. This is dangerous advice.

The second option was for academics to do an AI startup, that is, inventing the AI themselves, which is just silly. This is the equivalent of a catering company needing a computer, so soldering one together. While one company did do that 75 years ago, it is not an approach needed today. ;-)

The third option presented was to cooperate with corporations and startups, but without mentioning that this is a high-risk strategy (about 75% of computer projects fail). The most obvious, low-risk, sensible option is not mentioned: simply buy an AI service. There is no need to build it, invent it, or collaborate; just buy an already built product.

As a computer professional, I am often asked for recommendations on software and hardware. My advice is to buy what your friends and colleagues use. What you buy will be more likely to work, and when you have problems you can ask people you know for help. The mistake many individuals and organisations make is to buy something new, on the assumption it must be better.

Decentralization

The course warns of the problems of decentralization, with different parts of the university having different approaches to AI, thus confusing the students. In a way this is understating the problem, with different instructors having different approaches: the decentralization goes all the way down to the individual. 

On the other hand, decentralization doesn't matter so much to the student, as the university is just one organisation they have to deal with. At one point I was enrolled in two universities and a vocational education college at the same time, each with its own rules and timetables. Almost all students will also be working, so will be exposed to at least two sets of AI rules: their employer's and their institution's.

Guidelines

The section of the course on university guidelines was useful, as it included many examples, including an Australian one. 

Communication

The course got a little silly when it suggested communicating AI policy by email, intranet and web pages. What next, suggesting what grade of paper is best for AI posters? ;-)

More seriously, communication of AI policy is important, but is no different from that of many other policies which need to be communicated. Just as no special strategy is needed for AI project management, none is needed for communications.

Fantasy Future

The last section of the course did not start well, presenting a future where robotics & AI provide leisure time for the population & more time for students to explore their interests. This future has been predicted for other technological developments applied to learning, from correspondence courses to the Internet. However, these predictions did not take into account unchanging human nature. Traditional educational institutions have been able to accommodate each technological development, with students & the community rejecting radical changes to educational delivery.

Despite this, I was most impressed with the last section of the course, provided by SMU. Perhaps DEC should have left it to SMU to provide the entire course. But still, much of this was about the processes of education, which anyone teaching at a university should have learned in their basic teacher training (I did). One part I did not like was a 55-minute video. This was an interesting conversation between two experts, but 55 minutes is far, far too long. It should have been broken into smaller chunks.

One curious metaphor used was "hands on the wheel" of AI. My new car is controlled with voice and buttons. There is a steering wheel, but most of the time the car steers itself. That might be a better metaphor for AI in education: I select the destination, the car proposes a route, which I amend as needed and then confirm. I then monitor the car's systems while it drives itself.

Singapore Education Leadership

Singapore's universities previously provided me with career- and life-changing inspiration. In 2013 I was giving a talk in Singapore on e-learning. A member of the audience mentioned that Singapore had held an e-learning week after a SARS outbreak, to prepare for future pandemics. As a result I was mentally and technically prepared for the COVID-19 outbreak a decade later. The SMU section of this course may well prove to be a similar pivotal moment in my teaching.

Acronyms

The educational acronyms used in the course were getting a bit silly, particularly the ABCDEF framework. What next, an alphabet framework? Copilot wrote me one. ;-)

A–Z Educational Acronym (All Letters in Order)

  1. Active
  2. Building
  3. Curiosity
  4. Drives
  5. Engaged
  6. Focused
  7. Growth,
  8. Helping
  9. Innovative
  10. Judgement
  11. Kindle
  12. Lifelong
  13. Mastery.
  14. Nurturing
  15. Open
  16. Problem‑solving
  17. Questions
  18. Reinforces
  19. Skilled
  20. Thinking,
  21. Uplifting
  22. Valuable
  23. Wisdom
  24. Xpanding
  25. Youthful
  26. Zeal.

Output

Here is one of the exercises I completed as part of the course. I am still not sure what I was supposed to do with it, as there was nowhere to submit it. This was done with the aid of MS Copilot:

Learning Activity Brief: Exploring Project Reports with Microsoft Copilot

Learning Goal

Students will develop a clear understanding of the structure, purpose, and quality expectations of professional project reports in computing. By the end of the activity, students should be able to:

  • Identify the essential components of a high‑quality computing project report

  • Distinguish between technical detail, justification, and reflective analysis

  • Evaluate the clarity, coherence, and professionalism of report writing

  • Apply these insights to improve their own project reporting practices

Role of AI (Microsoft Copilot)

Students will use Microsoft Copilot as an exploratory and analytical tool. Copilot will support:

  • Information gathering — e.g., asking Copilot to explain typical report structures, compare academic vs industry reports, or summarise best‑practice guidelines

  • Critical evaluation — e.g., prompting Copilot to critique sample report excerpts or identify weaknesses in clarity, structure, or justification

  • Reflection — e.g., asking Copilot how a reader might interpret a section, or what improvements would strengthen a report’s argument

Copilot is not used to generate a full project report. Instead, it acts as a catalyst for discussion, critique, and deeper understanding.

Activity Workflow (1 hour, small groups)

  1. Initial Prompting (10 min) Groups ask Copilot to outline what makes an effective computing project report.

  2. Deep Dive (20 min) Each group selects one component (e.g., requirements, design rationale, testing, evaluation) and uses Copilot to explore expectations, common pitfalls, and examples.

  3. Critical Challenge (15 min) Groups provide Copilot with a short, deliberately flawed excerpt (provided by the instructor) and ask it to critique and suggest improvements.

  4. Synthesis (15 min) Groups consolidate their findings into a concise artefact.

Student Product

Each group produces a one‑page “Project Report Quality Guide” that includes:

  • A definition of the chosen report component

  • Key quality criteria

  • Common mistakes students make

  • A short example of improved writing based on Copilot’s critique

  • A brief reflection on how AI supported (and sometimes limited) their understanding

This product must be written in the students’ own words, with AI‑generated text clearly identified or paraphrased.

Assessment

This activity contributes to formative assessment and may be graded on participation or as a low‑stakes submission. Assessment focuses on:

  • Accuracy and clarity of the group’s explanation of their report component

  • Depth of analysis in identifying quality criteria and pitfalls

  • Critical engagement with AI — evidence that students questioned, validated, or refined Copilot’s output

  • Quality of the improved example — demonstrating understanding, not AI‑dependence

  • Reflection on the role of AI in academic and professional writing

Rubric criteria can be aligned with:

  • Understanding of report conventions

  • Critical thinking

  • Communication quality

  • Responsible AI use 


