Teaching Portfolio CS-1P Programming - Glasgow University - Quintin Cutts

--- Introduction
--- Content Summary
--- Acknowledgements

--- Teaching Philosophy
--- Institutional Context

Course Structure
--- Aims, Objectives, Content
--- Delivery Methods
--- Assessment

--- Commenting on Content
--- Use of Voting Handsets
--- Laboratory Examination
--- Written Examination
--- Continuous Assessment
--- Overcoming Blocks

--- New Course Rationale
--- Personal Learning

Laboratory examination

As a group, those of us in the Disciplinary Commons who used a laboratory exam format wrote a paper about the similarities and differences between our approaches. A copy is available here.

Format of the Lab Exam

This description expands on that given in the assessment section elsewhere in this portfolio.

Students are given the lab exam question ten days to two weeks before the exam session itself. See this file for an example of this year's first lab exam question.

The students can use the time between the handout of the question and their two-hour lab slot in the final week of the semester to prepare a solution to the question.

In the final week, we aim to create the exam conditions that might be found in a typical written exam. The lab is isolated from the network. The students come to their usual laboratory slot that week, bringing nothing into the lab with them. They use an exam account only valid for this exam slot, and any work they complete will be submitted using an electronic system.

Students are expected to use the 110 minutes of the lab slot to type in the solution they have prepared previously. They have the original question available to them for reference, as well as a crib sheet of the main Ada constructs we've covered, some blank paper for rough work, and access to the printer for print-outs of their code. The AdaGIDE programming environment also provides the language reference manual and a guide to common programming errors.

Why this format?

Until five years ago, we had continuous summative assessment of exercises throughout the module, plus written exams midway through and at the end. Plagiarism in the exercises was rife, and we knew this through the use of detection software, backed up by our own eyeballing of likely cases. Reporting these plagiarism cases formally took excessive amounts of time, and in addition we were, in our view, poorly supported by the university.

From an educational point of view, I had never been happy with summatively assessing the students' skill-development process, which is how I view the exercises a student completes during the course. I was keen to separate the development of their practical skills from its summative assessment. A lab exam was the obvious answer, but the traditional approach - get them all into a lab, give them an unseen problem, let them get on with it - seemed untenable with over 450 students. How could we find a lab large enough?

Instead, we opted to use the students' standard lab slots in an exam week. We were reluctant to develop numerous different exams for that single week just to ensure that each student sat a genuinely unseen paper. Hence we chose the approach described above, where all students see the question in advance. This may seem an odd step, but we believe it is still viable from an assessment viewpoint.

What is being assessed?

This exam format is clearly not assessing problem-solving skills. We have no way of ensuring that the students have solved the problem on their own - indeed, there is plenty of circumstantial evidence that some students lift a complete solution from their friends.

However, the students' practical skills are being thoroughly assessed. They must be able to get the program into the machine. They must be able to diagnose and fix errors. They must be able to test their programs against a variety of input data that they will need to generate during the exam.

Can't they just memorise an entire solution?

We believe that a student will not be able to memorise a program of reasonable size perfectly. Whilst they may be able to memorise the overall structure of the program, they will need to augment this with a reasonable understanding of programming concepts, the programming language itself, and debugging skills in order to get a fully-working program.

This belief is supported by observations, taken in almost every lab exam, of the student who has tried the memorisation technique. They come into the lab, wait expectantly for the start signal looking positively pregnant with code, and then, on the mark, it spews out of their fingers into the program file as fast as they can release it. This process may take around 10 minutes. They then try to compile it, find numerous errors, and spend the next 100 minutes trying, but failing, to fix them. Usually, they have failed to learn the proper syntax for a particular construct, and have additionally not learned the skills to resolve this - for example, making use of the crib sheet of Ada constructs, or the available language reference manual.

In advance of the exam, we warn them of this situation, which we have seen so often, and let them know of the resources available to help, but many fail to take heed of our advice.

I will find an example of this kind of "stuck student" from the electronic copies we have of all their submissions, to include here as an artefact. I will also include a complete solution to give an idea of the size of program that we think is not memorisable.

If not here, where are problem-solving skills assessed?

We use a traditional written exam to assess problem solving. Although the format is possibly somewhat artificial, we do think that if a student can produce a reasonable outline or plan for solving a problem, and write the basics of a program for it, then they have gained a thorough grounding in problem solving. We accept that their code will be flaky (and tell them this), but it is the overall structure we are assessing here.

It does, of course, depend on a principally top-down programming style, and on reflection our weaker students are probably not experienced enough to program this way - for me, top-down design works once a thorough range of patterns is well learned, so that the process can be guided by this knowledge.

The written exam aspect is covered in more detail in a later section.