ProQuest Databases @ U of T – Liaison Update Forum

Links and materials from the June 14, 2016 Liaison Update Forum, which reviewed the methodology and communications options related to the discontinuation of selected ProQuest databases.

  • Library website: “Libraries’ approach to collection building” (includes links to Confluence-posted content for library staff)
  • Presentation slides: PQC methodology June14
  • Proquest Central documentation (in Confluence)
  • Sample questions and responses from today’s discussion: Each scenario was reviewed by a small group of librarians, who developed a response strategy. The responses were then shared with the full meeting for additional suggestions and changes.
Scenario worksheets (images):

  • Scenario 1
  • Scenario 2
  • Scenario 3
  • Scenario 4, Group 1
  • Scenario 4, Group 2
  • Scenario 5


Radical Change in Library Assessment Called for by Elliott Shore at Northumbria Conference

In his address at the 2013 Northumbria Conference on Performance Measurement in Libraries & Information Services, Elliott Shore, Executive Director of the Association of Research Libraries, calls for a major shift in the ways that libraries think about and gather data:

“There was a time when the research library had a monopoly on research—if you wanted to do research, you had to use the library, literally, physically. We lost that monopoly over the last 20 years but our historical-legacy thinking and practice have not come to terms with this loss. In fitting our measures to our goals, we need to realize this fundamental truth if we want to have a fighting chance and not focus on the library solely, but the world of information in which we now live.”

Read more on Shore’s address

Who enrolled in U of T MOOCs and why?

Are you surprised that about 70% of all those enrolled in U of T MOOCs already had undergraduate or graduate degrees? And that the primary reasons for enrolling were to further job-related skills and/or because the course looked like fun?

These initial demographic trends remind us that there are many reasons that people sign up for MOOCs. They may take a little or a lot from the program. They may only watch one or two videos — and get precisely what they needed or wanted. They may complete the entire course and seek credentials. Or not.

See the June 16, 2013 Demographic Report on Coursera MOOCs


Feb 21 Practice Exchange on Assessment notes

Notes and resources from the February 21 Practice Exchange on assessment.

Resources mentioned in the session

Some of the resources mentioned in the session are posted below:

Books

These were mentioned as useful resources for designing information literacy instruction (although they are ostensibly aimed at students):

Badke, William. Research Strategies: Finding Your Way Through the Information Fog, 3rd ed. (New York: iUniverse, 2008). http://go.utlib.ca/cat/6386623

Booth, Wayne C., Gregory G. Colomb & Joseph M. Williams. The Craft of Research, 3rd ed. (Chicago: University of Chicago Press, 2008). http://go.utlib.ca/cat/7673451 (ebook)

Notes from the discussion

Assessments can have different purposes, e.g. measuring learning outcomes vs. demonstrating accountability.

Things that have worked

ADVANCE SURVEYS: In preparation for a series of sessions, send an advance survey asking students about things like their level of study, area of research, and what they hope to get out of the session.

ON-THE-SPOT TOPIC SELECTION: A multiple-choice, Poll Everywhere-style survey at the beginning of class lets students identify the areas they most want or need instruction on from a list of topics you have prepared for. This lets you target your instruction and helps with the problem of trying to cram too much into a session. It assumes students are good judges of what they need to know, so its usefulness probably varies by group.

3-2-1 AT THE BEGINNING OF A CLASS: Using a 3-2-1 at the beginning of a session is a great icebreaker that allows you to take the temperature of a class while getting everyone involved right off the bat.

Pre-tests can be enormously useful, but some care is needed when using them lest they have unintended consequences or ‘backwash’. One participant noted that a multiple choice pretest targeting at-risk students may have had a demoralizing effect on the very group it was intended to help.

More on 3-2-1s & other forms of assessment

3-2-1 FOR POST-ASSESSMENT: For example: 3 things I learned, 2 things I can use right away, 1 thing I have a question on (the muddiest point). This engages students in reflection on what they’ve just been taught and provides us with feedback. TIP: Make the last section a tear-off so you can collect and review the muddiest points while students take home their recollections of ‘things I learned’ and ‘things I can use right away’. You can also do this type of assessment with online tools like SurveyMonkey and Google Docs.

Online polls are probably not as useful for small groups; a simple show of hands might work instead.

ACTIVE LEARNING ON EVALUATING GOOGLE RESULTS: Students were asked to search on topics like the Chernobyl disaster, post-modern art, and consumer trends in the use of tablets. The topics were tailored to produce search results that generate strong, sometimes opposing, viewpoints. Students quickly got very animated and involved. They were asked to assess things like: are any results scholarly, who are the authors, what are their affiliations, which result do you think is best, when was the information last updated, do the authors cite any evidence, and would you use this in an assignment? The exercise takes about 35 minutes plus 5–10 minutes of discussion, so it is time-consuming and probably most feasible within a multi-class series or a course, but it quickly engages students with the practice and norms of information literacy.

USING ASSESSMENT RESULTS TO UPDATE LIBGUIDES/FAQS: Tell students in your session that filling out the assessment form is important because you will update the LibGuide or other resource with their questions and your answers. Students really like this and it works very well; however, it involves a significant amount of after-class work for you.

FIND OUT WHAT THEY ‘REALLY’ DON’T KNOW: Newer students in particular often simply don’t know what they don’t know. Early assessment can help determine the right jumping-off point for a session.

Given the limited impact of one-shot sessions, should we be doing 3-2-1-type assessments more and other types of assessments less?

Faculty feedback

Short surveys sent to faculty are useful for a number of reasons, not least because faculty are often reluctant to give critical feedback face to face right after a session. A PDF of sample short surveys for faculty, compiled by Rita Vine, is attached. These surveys try to obtain data on whether a session affected the quality of student assignments.

Faculty assessment addresses some of the limitations of student self-assessment, as well as timing issues (i.e., it allows post-assignment assessment).

At one university, performance assessment for librarians includes demonstrating the impact of teaching sessions, for which faculty input is often sought.


New ALA project: Assessment in Action

http://www.ala.org/acrl/AiA

ALA has received a substantial multi-year grant for “Assessment in Action: Academic Libraries and Student Success” (AiA).  Undertaken in partnership with the Association for Institutional Research (AIR) and the Association of Public and Land-grant Universities (APLU), the grant will support the design, implementation and evaluation of a program to strengthen the competencies of librarians in campus leadership and data-informed advocacy.

From the press release: “Each participating institution will identify a team consisting of a librarian and at least two additional team members as determined by the campus (e.g., faculty member, student affairs representative, institutional researchers, or academic administrator). The librarian team leaders will participate in a one-year professional development program that includes team-based activities carried out on their campuses.”


Recent publications from the RAILS project

“Happy RAILS to You! Using Rubrics for Authentic, Reliable, and Convincing Learning Assessments”
ALA Annual Conference
Anaheim, CA, June 2012
Poster

“A Multi-Institution Study of Rubric Assessment: Lessons Lived and Learned”
Association for the Assessment of Learning in Higher Education Annual Conference
Albuquerque, NM, June 2012
PPT

“500 Students, 50 Raters, and 5 Rubrics Later: What We Learned from an Authentic, Collaborative, and National Assessment Project”
Presented by Megan Oakleaf, Jackie Belanger, Ning Zou, and Carroll Wilkinson
LOEX Conference
Columbus, OH, May 2012
PPT
