Notes and resources from the February 21 Practice Exchange on assessment.
Resources mentioned in the session
The following resources were mentioned as useful for designing information literacy instruction (although they are ostensibly aimed at students):
Badke, William. Research Strategies: Finding Your Way Through the Information Fog, 3rd ed. (New York: iUniverse, 2008). http://go.utlib.ca/cat/6386623
Booth, Wayne C., Gregory G. Colomb & Joseph M. Williams. The Craft of Research, 3rd ed. (Chicago: University of Chicago Press, 2008). http://go.utlib.ca/cat/7673451 (ebook)
Notes from the discussion
Assessments can serve different purposes, e.g., measuring learning outcomes vs. demonstrating accountability.
Things that have worked
ADVANCE SURVEYS: In preparation for a series of sessions, sending an advance survey asking students about things like their level of study, area of research, and what they hope to get out of the session.
ON-THE-SPOT TOPIC SELECTION: A multiple-choice Poll Everywhere-style survey at the beginning of class lets students identify the areas they most want/need instruction on from among a list of topics you’ve prepared for. This lets you target your instruction and helps with the problem of trying to cram too much into a session. It assumes students are good judges of what they need to know, so its usefulness probably varies by group.
3-2-1 AT THE BEGINNING OF A CLASS: Used at the start of a session, a 3-2-1 is a great icebreaker that lets you take the temperature of a class while getting everyone involved right off the bat.
Pre-tests can be enormously useful, but some care is needed when using them lest they have unintended consequences or ‘backwash’. One participant noted that a multiple choice pretest targeting at-risk students may have had a demoralizing effect on the very group it was intended to help.
More on 3-2-1s & other forms of assessment
3-2-1 FOR POST-ASSESSMENT: For example: 3 things I learned, 2 things I can use right away, 1 thing I have a question about (muddiest point). This engages students in reflecting on what they’ve just been taught and provides us with feedback. TIP: Make the last section a ‘tear-off’ so you can collect and review the muddiest points while students take home their recollections of ‘things I learned’ and ‘things I can use right away’. You can also do this type of assessment with online tools like SurveyMonkey and Google Docs.
Online polls are probably not as useful for small groups. A simple show of hands might be used instead, for example.
ACTIVE LEARNING ON EVALUATING GOOGLE RESULTS: Students were asked to search on topics like the Chernobyl disaster, postmodern art, and consumer trends in tablet use. The topics were chosen to produce search results reflecting strong, sometimes opposing viewpoints. Students quickly became animated and involved. They were asked to assess things like: are any of the results scholarly, who are the authors, what are their affiliations, which result do you think is best, when was the information last updated, do the authors cite any evidence, would you use this in an assignment? The exercise takes about 35 minutes plus 5-10 minutes of discussion, so it is time-consuming and probably most feasible within a multi-class series or a course, but it quickly engages students with the practices and norms of information literacy.
USING ASSESSMENT RESULTS TO UPDATE LIBGUIDES/FAQS: Tell students in your session that filling out the assessment form is important because you will update the LibGuide or other resource with their questions and your answers. Students really like this and it works very well; however, it involves a significant amount of after-class work for you.
FIND OUT WHAT THEY ‘REALLY’ DON’T KNOW: Newer students in particular often simply don’t know what they don’t know. Early assessment can help determine the right jumping-off point for a session.
Given the limited impact of one-shot sessions, should we be doing 3-2-1-type assessments more and other types of assessments less?
Short surveys sent to faculty are useful for a number of reasons, not least because faculty are often reluctant to give critical feedback face to face right after a session. A PDF containing some examples of short faculty surveys compiled by Rita Vine is attached. These surveys try to obtain data on whether a session affected the quality of student assignments.
Faculty assessment addresses some of the limitations of student self-assessment, as well as timing issues (i.e., it allows post-assignment assessment).
At one university, performance assessment for librarians includes demonstrating the impact of teaching sessions, for which faculty input is often sought.