About a year-and-a-half ago, some students and I wrote OMSI (Online Measurement of Student Insight), an online software tool that improves examinations for students and saves instructors much of the time and drudgery currently spent administering exams. It is written in a mixture of Python and R. (Python because it was easier to find students for the project, R because it built upon an earlier system I had developed entirely in R.)
I will describe it below, but I wish to say at the outset: I NEED TESTERS! I’ve used it successfully in several classes of my own so far, but it would be great to get feedback from others. Code contributions would be great too.
From the project README file:
Students come to the classroom at the regular class time, just as with a traditional pencil-and-paper exam. However, they take the exam on their laptop computers, using OMSI, which downloads the exam questions and enables the students to upload their answers.
This benefits students. Again from the README:
- With essay questions, you have a chance to edit your answers, producing more coherent, readable prose. No cross-outs, arrows, words squeezed in above a line, no points off for unreadable handwriting.
- With coding questions, you can compile and run your code, giving you a chance to make corrections if your code doesn’t work.
In both of these aspects, OMSI gives you a better opportunity to demonstrate your insight into the course material, compared to the traditional exam format.
It is a great saver of time and effort for instructors, as the README says:
OMSI will make your life easier.
OMSI facilitates exam administration and grading. It has two components:
- Exam administration. This manages the actual process of the students taking the exam. You get electronic copies of the students’ exams, eliminating the need for collecting and carrying around a large number of papers, and making it much easier to share the work among multiple graders. As noted in “Benefits for students” above, OMSI enables the student to turn in a better product, and this benefits the instructor as well: better exam performance by students is both more gratifying to the instructor and makes for easier, less frustrating grading.
- Exam grading. OMSI does NOT take the place of instructor judgment in assigning points to exam problems. But it does make things much easier, by automating much of the drudgery. For instance, OMSI automatically records the grades assigned by the instructor and automatically notifies students of their grades via e-mail. Gone are the days in which the instructor must alphabetize the papers, enter the grades by hand, carry an armload (or boxload) of papers to class to give back to students, retain the stragglers not picked up by the students, and so on.
Here is an R example showing sample exam questions. At the server, the instructor would place the following file:
```
QUESTION -ext .R -run "Rscript omsi_answer1.R"
Write R code that prints out the mean of R's built-in Nile dataset,
starting with observation 51 (year 1921).

QUESTION -ext .R -run "Rscript omsi_answer2.R"
Write an R function with call form g(nreps) that will use simulation to
find the approximate value of E(|X - Y|) for independent N(0,1) variables
X and Y. Here nreps is the number of replications. Make sure to include a
call print(g(10000)) in your answer, which will be run by OMSI.

QUESTION -ext .R -run "Rscript omsi_answer3.R"
Suppose X ~ U(0,1). Write an R function with call form g(t) which finds
the density of X^2 at t, for t in (0,1). Make sure to include a call
print(g(0.8)) in your answer, which will be run by OMSI.

QUESTION
Suppose an article in a scientific journal states, "The treatment and
nontreatment means were 52.15 and 52.09, with a p-value of 0.02. So there
is only a 2% chance that the treatment has no effect." Comment on the
propriety of that statement.
```
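For concreteness, here is what a student’s answer files for the three coding questions might look like. These are hypothetical answers written for this post, not part of OMSI; only the file names, which match the -run directives above, come from the exam file.

```r
# omsi_answer1.R -- mean of the Nile series from observation 51 on
# (Nile covers 1871-1970, so observation 51 is the year 1921)
print(mean(Nile[51:length(Nile)]))

# omsi_answer2.R -- simulate E(|X - Y|) for independent N(0,1) X and Y
g <- function(nreps) {
   x <- rnorm(nreps)
   y <- rnorm(nreps)
   mean(abs(x - y))
}
print(g(10000))  # true value is 2/sqrt(pi), about 1.128

# omsi_answer3.R -- density of X^2 at t, for X ~ U(0,1):
# P(X^2 <= t) = sqrt(t), so the density is 1/(2*sqrt(t))
g <- function(t) 1 / (2 * sqrt(t))
print(g(0.8))
```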
At the client side, OMSI presents these questions to the student. After the student enters an answer and hits Save and Run, the student’s code is run in a pop-up window, displaying the result. When the student hits Submit, the answer is uploaded to the instructor’s server. There is a separate directory at the server for each student, and the answer files are stored there.
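Since each student’s answers land in his/her own directory, the instructor can check on the uploads with a couple of lines of base R. (The submissions path below is a hypothetical stand-in for wherever your server keeps the per-student directories.)

```r
# List every uploaded answer file under the (hypothetical) submissions root
subs <- list.files("submissions", recursive = TRUE, full.names = TRUE)
# Count the answer files uploaded so far by each student
table(basename(dirname(subs)))
```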
Again, the autograder does NOT evaluate student answers; the instructor does this. But the autograder greatly facilitates the process. The basic idea is that the software will display on the screen, for each student and each exam problem, the student’s answer. In the case of coding questions, the software will also run the code and display the result. In each case, the instructor then inputs the number of points he/she wishes to assign.
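To make that concrete, here is a rough sketch, in plain R with hypothetical file and directory names, of the kind of loop the grading component automates. This is not OMSI’s actual code, just the idea:

```r
# Sketch only: show each student's answer to problem 1, run it, and
# record the instructor-assigned points. Paths and names are hypothetical.
gradeproblem1 <- function(root = "submissions", outfile = "grades.csv") {
   studentdirs <- list.dirs(root, recursive = FALSE)
   grades <- data.frame(student = basename(studentdirs), points = NA_real_)
   for (i in seq_along(studentdirs)) {
      ans <- file.path(studentdirs[i], "omsi_answer1.R")
      cat(readLines(ans), sep = "\n")  # display the answer on the screen
      system(paste("Rscript", ans))    # run the code, show the result
      grades$points[i] <- as.numeric(readline("points to assign: "))
   }
   write.csv(grades, outfile, row.names = FALSE)
   grades
}
```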
The package is easy to install and use, from both the student’s and the instructor’s points of view. See the README for details.