Liaison Update Forum June 19 2017 – Unpaywall & Report on University Professor metrics

Slide decks from the June 19 2017 Liaison Update Forum

  1. Stephanie Orfano and Mariya Maistrovskaya: Unpaywall and other tools to bypass publisher paywalls
  2. Gail Nichol, Mindy Thuna, Klara Maidenberg, Susan Barker, Heather Cunningham, and Stephanie Orfano: Metrics & the University Professor Submissions

 


Electronic Resource Management Lifecycle and Workflow at UTL

Slide deck from the March 2017 staff presentation by Weijing Yuan, Marlene van Ballegooie, and Klara Maidenberg

Do you wonder what happens behind the scenes to acquire and manage access to e-resources? Are you curious to know why some resources have multiple access points and some have various restrictions? Have you heard talk of the COUNTER standard and wish you knew what it was and how to use it? If so, we hope you’ll join us for a presentation titled The eResources Lifecycle. This presentation will help you become familiar with the e-resource management workflow and will cover licensing, access, assessment and associated challenges.


Using Adaptive Research Consultations to Support Scholars More Effectively

Megan Potterbusch

From the April SHARE Update, this posting by Megan Potterbusch, 2016–17 National Digital Stewardship Resident (NDSR) at the Association of Research Libraries:

“Although I enter research consultations with questions in mind—and often on paper if there is a particular library or technical goal to illuminate, I try not to assume that my current favorite tool or tools will be the best answer to whatever the researcher’s current challenge might be or even that I already know the right solution to a given challenge. Instead, I gather resources and best practices throughout my work and mentally file them away to call on when the situation warrants it. My goal is to better understand researchers’ workflows and challenges from a human-centered service perspective, and to adapt my questions and solutions to the needs I hear arise in their answers—always seeking to gain a better understanding.”

Read the full post


Introduction to Elsevier’s CiteScore

Liaisons and other librarians working with faculty should be aware of Elsevier’s recent release of a new bibliometric called the CiteScore Index (CSI). This metric will be a direct competitor to Thomson Reuters’ (now Clarivate Analytics’) ubiquitous Journal Impact Factor (JIF). The two metrics are similar in that both purport to measure the impact of an academic journal as the ratio of citations the journal receives to the citable content it publishes.
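Both metrics share the same basic shape: citations received within a window, divided by citable items published in that window (they differ in window length and in what counts as a citable item). A minimal sketch of that ratio, using illustrative numbers rather than real journal data:

```python
def journal_impact_ratio(citations, citable_items):
    """Generic impact-style ratio: citations received to a journal's
    recent content divided by the number of citable items it published.
    JIF and CiteScore are both variations on this shape; they differ in
    the citation window and in which item types count as citable."""
    if citable_items == 0:
        raise ValueError("no citable items published in the window")
    return citations / citable_items

# Illustrative only: 500 citations to 200 citable items gives 2.5
print(journal_impact_ratio(500, 200))  # 2.5
```

Because the two metrics choose different windows and different definitions of “citable item,” the same journal can score quite differently under each, which is exactly the divergence discussed below.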

While the JIF is based on content indexed in the Web of Science database, CSI will be based on the content in Scopus, which indexes a significantly larger number of titles (roughly 22,000 compared to 11,000).

If a journal’s impact is a consistent and measurable attribute, it stands to reason that its impact rank and score would be very similar regardless of who calculates the metric. However, preliminary analyses show that this is not the case. Librarians might wish to read the findings of early comparisons by Carl T. Bergstrom and Jevin West (developers of yet another metric, the EigenFactor). Surprising no one, they report that Elsevier journals seem to enjoy a boost in ranking under the new CiteScore, while the scores for Nature and Springer journals (now owned by the same company, and a major competitor to Elsevier in this space) are lower than their Impact Factors would suggest. Additionally, journals published by Emerald, which performed poorly compared to journals from other publishers in the same disciplines during our own analysis, have also seen a boost from the new metric.

These findings underscore the fact that reputational metrics are neither impartial nor objective and are subject to the influences of the entities that produce them. Librarians should be prepared to engage in critical evaluation of these metrics and to answer questions from faculty.

(Thank you to Klara Maidenberg, Assessment Librarian, for providing this information.)


Using short stories and drawings in information literacy instruction

From Navroop Gill:

Yesterday [June 16] at our meeting, I shared a little about the speakers I had seen at WILU, David Brier and Vicky Lebbin of the University of Hawaii at Manoa. Their approach to information literacy incorporates short stories and drawings, which they have found to be highly engaging for students.

I’ve attached their handouts, which provide ideas for how to structure lessons using these techniques (just a note: they read through hundreds of short stories to find ones suitable for IL!).

Their articles if you’re interested:

Brier, D. J., & Lebbin, V. K. (2015). Learning information literacy through drawing. Reference Services Review, 43(1), 45-67. http://simplelink.library.utoronto.ca/url.cfm/505223

Brier, D. J., & Lebbin, V. K. (2004). Teaching information literacy using the short story. Reference Services Review, 32(4), 381-385. doi:10.1108/00907320410569734 http://simplelink.library.utoronto.ca/url.cfm/505227

 


ProQuest Databases @ U of T – Liaison Update Forum

Links and materials from the June 14, 2016 Liaison Update Forum reviewing methodologies and communications options related to the discontinuation of selected ProQuest databases.

  • Library web site: “Libraries approach to collection building” (includes links to Confluence-posted content for library staff)
  • Presentation slides: PQC methodology June14
  • ProQuest Central documentation (in Confluence)
  • Sample questions and responses from today’s discussion: Each scenario was reviewed by a small group of librarians, who developed a strategy for response.  The responses were shared with the entire meeting for additional suggestions and changes.
Scenario 1

Scenario 2

Scenario 3

Scenario 4 Group 2

Scenario 4 Group 1

Scenario 5


SPARC-CARL Webinar on Supports for Tri-Agency Policy


The recording and slides of the SPARC-CARL Webinar, “Library and Research Services Supports for the Tri-Agency Open Access Policy on Publications,” are now available on the CARL website: French-language session | English-language session


Webinar: Research Impact Metrics for Librarians

Thank you to everyone who attended today’s webinar, Research Impact Metrics for Librarians: Calculation & Context (May 19, 2016). It was a great overview of the challenges of metrics. Although the presentation focused on the sciences, the slides may be helpful to all of us who need to become better acquainted with the benefits and limitations of key metrics tools.

You can now view the presentation on demand at your convenience with audio.

Additional documents:
