
How an Academic Library Built a Research Impact and Intelligence Team

Library positions dedicated to research impact and intelligence are rare. Our team shows what they can accomplish.

During recent decades, universities have faced increasing pressure to demonstrate their value and impact by contributing to real-world problem-solving and meeting broader societal needs. The reasons for this increased pressure are complex and numerous—reflecting socio-economic and socio-political considerations, globalization and intensifying competition, and growing demands for accountability and demonstrable public value. At Virginia Tech, our library’s research impact and intelligence team, of which we are all members, supports institutional strategy, researcher visibility, and decision-making in response to these demands. In this article, we’ll outline the emergence of research impact and research intelligence work in libraries, trace the development of our department, and illustrate how analytics, research information management, and consultation services are operationalized alongside ongoing efforts to promote responsible interpretation and use of research metrics.

What Are Research Impact and Research Intelligence?

Until about 2016, research impact librarian positions did not exist in research libraries in the United States. Today, a quick search reveals about 35 US libraries where “research impact” is either part of a job title or a significant component of a position’s responsibilities. We suspect there are many more. Of course, that does not mean bibliometric-based services, often at the heart of research impact services, were not being provided prior to the creation of these positions; librarians have used bibliometrics for a long time. But as a dedicated, titled position focused on demonstrating impact, the research impact librarian is relatively new, especially in comparison to similar roles, such as subject-based liaison or acquisitions librarians.

In the US, research intelligence positions are even rarer. Throughout the US, the UK, Europe, and Australia, the term “research intelligence” is explicitly promoted as a service in roughly 25 libraries, with the highest numbers in mainland Europe and the UK. In the United States, the term is more often used outside the library—it is typically associated with research offices—and the people performing these services often have limited formal training in bibliometrics. Conversely, in the UK and the Netherlands, librarians routinely perform research intelligence work, often alongside subject matter experts, research development professionals, institutional effectiveness units, and information scientists.

While “research impact” can mean different things, our team defines research impact, fundamentally, as the demonstrable effects that research has—or can have—on society, innovation, and the broader research enterprise. Research impact can be evidenced in many forms, such as qualitative narratives, but in this context research metrics often serve as proxy measurements for it. Research impact as a service, then, can be defined as a collection of actions involving data collection, analysis, consulting, and reporting to demonstrate the effects of research.

Research intelligence can be defined as collecting data and reports on a particular research ecosystem and investigating, analyzing, and reporting on it to support research strategy, with the aim of strengthening future research impact. While the process is not entirely linear, research intelligence work typically precedes the research itself in order to maximize its eventual impact; the activities, tools, and goals of research impact and research intelligence are deeply interwoven.

Why Research Impact and Intelligence Services Matter

There is a growing mountain of information in the world, with some scholarly databases indexing over 250 million scholarly works (e.g., OpenAlex, Lens.org). The data connected to those works is a rich resource to explore and understand. Research impact and intelligence services allow institutions to leverage that data to demonstrate their value, plan strategically, obtain funding, recruit talent, and more; these services can also help individuals, groups, and organizations demonstrate their impact for purposes such as promotion, tenure, and rankings.
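
As a rough illustration of the scale and accessibility of this data (not our production workflow), the sketch below asks the public OpenAlex API how many works it indexes overall and how many are affiliated with a single institution. The ROR identifier is a placeholder to be replaced with a real one.

import requests

# Query the public OpenAlex API for a count of all indexed works and for
# works affiliated with one institution. Illustrative sketch only.
BASE_URL = "https://api.openalex.org/works"

def work_count(filter_expr=None):
    """Return OpenAlex's count of works matching an optional filter."""
    params = {"per-page": 1}
    if filter_expr:
        params["filter"] = filter_expr
    response = requests.get(BASE_URL, params=params, timeout=30)
    response.raise_for_status()
    return response.json()["meta"]["count"]

print("All indexed works:", work_count())
# Replace the placeholder with your institution's ROR identifier.
print("One institution:", work_count("institutions.ror:https://ror.org/REPLACE-ME"))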

Libraries—with their experience in metadata associated with scholarly works, scholarly communication, and scientific research—are particularly well positioned to provide research impact and intelligence services. They can also collaborate effectively with researchers, research offices, administrators, and their parent institutions.

History of Research Impact and Intelligence at Virginia Tech

In 2016 and 2018, Virginia Tech’s University Libraries created two novel positions. First, in 2016, co-author Connie Stovall assumed the role of assistant director for strategic research, which was developed in part to support a newly established unit on campus now known as Innovation & Partnerships (I&P). I&P assists faculty and units in developing corporate partnerships, launching start-ups, and managing the university’s technology transfer services. Connie was charged with translating information into insights, which she did by collecting, analyzing, and synthesizing data from literature databases, analytics databases, marketing and finance databases, government funding databases, and the web. Her work included:

  • Conducting “landscape scans” or “gap analyses”—targeted, exploratory analyses of a research domain that use curated keyword searches and visual summaries to assess where scholarship is concentrated and where gaps, including imbalances in terminology or conceptual framing, may exist
  • Identifying internal and external collaborators, competitors, and corporate partners to support proposals and academic-corporate initiatives
  • Benchmarking external programs, centers, and institutes through analysis
  • Evaluating the marketability of innovation
  • Identifying funding opportunities and performing retrospective funding analysis
  • Reporting on academic job candidates’ research outputs, funding, and patents and the post-graduation outcomes of their students to inform hiring decisions

Connie synthesized, analyzed, and presented findings in slides, often including visualizations created with dashboard analytic tools, such as Tableau, and scientometric visualization tools, such as VOSviewer.
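
Commercial analytics platforms and VOSviewer do the heavy lifting in practice, but the core move behind a landscape scan can be sketched simply: run a curated keyword query against an open scholarly index and see where the results concentrate. Below is a minimal sketch, assuming the public OpenAlex API and an illustrative query rather than one of our actual scans.

import requests

# Minimal landscape-scan sketch: count works matching a curated keyword query,
# grouped by publication year. The query string is illustrative; a real scan
# layers several curated queries and feeds results into Tableau or VOSviewer.
def landscape_scan(query, group_by="publication_year"):
    response = requests.get(
        "https://api.openalex.org/works",
        params={"search": query, "group_by": group_by},
        timeout=30,
    )
    response.raise_for_status()
    # Each group carries a key (here, a year) and the number of matching works.
    return response.json()["group_by"]

for group in landscape_scan('"research impact metrics"'):
    print(group["key_display_name"], group["count"])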

In 2018, the provost approved another position in a different division—the research impact librarian—which was filled by co-author Rachel Miles. Rachel was tasked with helping faculty, researchers, and units advance research impact, researcher visibility, and profile management through coordinated research information management (RIM) services. She supported the discovery, tracking, and communication of research, scholarly, and creative works; promoted best practices in researcher identifiers and profiles (e.g., ORCID and institutional researcher profiles); and contributed to the development and use of the university’s research information management system (RIMS). While Rachel’s responsibilities initially emphasized research impact analysis, metrics-informed consultations, and customized reporting, they evolved to focus on RIMS leadership, researcher visibility, and profile curation.

Today, Rachel is responsible for a coordinated portfolio of services spanning research information management systems and data quality, researcher visibility and profiles, research impact analysis and assessment, and the development and assessment of tools, training, and workflows that support the discoverability and communication of Virginia Tech scholarship.

As time went on, given the commonalities inherent in their roles and skillsets, Rachel and Connie regularly interacted. They saw the value in establishing a more cohesive set of interwoven services, more consistent messaging, and titles that captured the attention and respect of the senior administrators they often supported. Thus, a new department was established, with Connie at the helm as director: Research Impact and Intelligence.

To develop the department, and true to the tenets of research intelligence practice, Connie gathered and analyzed information about research impact and research intelligence services globally to identify best practices and ideas for strengthening services. Finding information on research impact was significantly easier because of the Research Library Impact Framework Initiative; however, she often had to look abroad, including to the United Kingdom and the Netherlands, for model research intelligence units and library positions. She also studied research development offices, university hospitals, law offices, and corporate business or competitive intelligence operations.

Upon the department’s official creation in late 2022, Rachel, who had been promoted to a coordinator role, brought with her one part-time student who was responsible for baseline researcher data curation. On the research intelligence side, Connie supervised an engineering research analyst and two graduate students.

Since the department’s creation, the demand for its services has grown, prompting the hiring of five additional staff members, including a research impact librarian, a research intelligence data scientist, one part-time research intelligence associate, and one computer science graduate assistant. We regularly collaborate with the Office of Research & Innovation, Faculty Affairs, the Office of Analytics & Institutional Effectiveness, and Innovation & Partnerships, and those collaborations spur many more projects. Additionally, we have provided at least 25 consultations to other universities interested in offering similar services. Our current services, some of which naturally overlap, include:

Research Intelligence and Analysis

  • Program, proposal, and funding intelligence (current and retrospective)
  • Market, industry, and external benchmarking analysis
  • Research gap and landscape analysis
  • Discovery of potential collaborators and competitors
  • Data-driven recruitment and workforce intelligence

Research Impact and Evaluation

  • Impact analysis and reporting
  • Responsible use and communication of research metrics
  • Science visualization and analytic dashboards

Research Information Management and Visibility

  • Support for the institution’s RIMS, Symplectic Elements (Elements), for faculty activity reporting (FAR); its connected public-facing profile system, Virginia Tech Experts; and related external researcher profiles and persistent identifiers (e.g., ORCID)
  • Researcher data curation and quality assurance in Elements
  • Integration of researcher identifiers and profile systems

Engagement, Consultation, and Capacity Building

  • Consultations and workshops on research intelligence and impact
  • Departmental liaison services and embedded partnerships

The tools and metrics we commonly utilize are listed in Tables 1 and 2.


Table 1: Common Tools Utilized

  • Literature Databases and Scholarly Data Sources: Web of Science; Scopus; Google Scholar; OpenAlex; Dimensions; The Lens; Elements
  • Research Analytics Platforms: InCites; SciVal; Dimensions Analytics; Academic Analytics; Publish or Perish; The Lens
  • Altmetrics and Policy Data Sources: Altmetric Explorer; Overton; Policy Commons
  • Patent and Innovation Databases: Google Patents; USPTO; The Lens; Dimensions
  • Funding and Institutional Data Sources: Grants.gov; USAspending.gov; NSF HERD; IPEDS; Elements
  • Market, Industry, and Financial Intelligence: MarketResearch.com
  • Visualization and Network Analysis Tools: Tableau; VOSviewer; Excel; Inkscape; Adobe Acrobat Pro; SciVal
  • Data Analysis and Processing Tools: Excel; Python; R; OpenRefine
  • Research Metrics Dashboard Development: Django; Vue.js; PostgreSQL; REST API; Git; SciVal/Scopus APIs; data pipelines (ETL); AWS
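
The “Research Metrics Dashboard Development” stack in Table 1 is easiest to picture as a small extract-transform-load (ETL) pipeline feeding a web application. The sketch below shows one hypothetical extract-and-load step; for illustration it pulls from OpenAlex rather than the SciVal/Scopus APIs, and the table name, columns, and connection string are assumptions rather than our production code. In practice, results like these sit behind a Django REST API with a Vue.js front end.

import requests
import psycopg2

# Hypothetical ETL step for a research-metrics dashboard: extract work-level
# metrics from an open scholarly API and load them into PostgreSQL.
def extract(ror_id):
    response = requests.get(
        "https://api.openalex.org/works",
        params={"filter": f"institutions.ror:{ror_id}", "per-page": 25},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["results"]

def load(works, dsn):
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        for work in works:
            cur.execute(
                """
                INSERT INTO work_metrics (openalex_id, title, publication_year, cited_by_count)
                VALUES (%s, %s, %s, %s)
                ON CONFLICT (openalex_id)
                DO UPDATE SET cited_by_count = EXCLUDED.cited_by_count
                """,
                (
                    work["id"],
                    work.get("title"),
                    work.get("publication_year"),
                    work.get("cited_by_count"),
                ),
            )

# Placeholder ROR ID and connection string; substitute real values.
load(extract("https://ror.org/REPLACE-ME"), "postgresql://localhost/metrics")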


Table 2: Common Metrics Utilized (by metric level: output/publication, author, journal/venue, and group/unit)

Bibliometrics (academic attention and impact)
  • Output/Publication: Publication counts; citation counts; citation percentiles; field-weighted citation impact (FWCI) (e.g., OpenAlex, SciVal); citations per publication
  • Author: h-index; h-5 index; FWCI (author); citation counts; collaboration counts; citations per publication; citation percentiles
  • Journal/Venue: Journal Impact Factor; CiteScore; SNIP; journal quartiles; acceptance rate; top 1%/10%/25% journals
  • Group/Unit: Collaboration impact; national/international collaboration rates; co-citation percentile rank; FWCI

Altmetrics (attention and dissemination)
  • Output/Publication: Altmetric Attention Score; news media mentions; policy citations; syllabi mentions; social media mentions; clinical guideline mentions; monograph library holdings; views/downloads
  • Author: NA
  • Journal/Venue: Altmetric Attention Score
  • Group/Unit: Policy citations; policy source sector analysis; news media mentions; social media mentions; clinical guideline mentions

Funding Metrics
  • Output/Publication: Grant counts; total funding dollars; funding source
  • Author: PI funding totals; proposals submitted; grants funded
  • Journal/Venue: NA
  • Group/Unit: Total grants and contracts; funding trends over time

Innovation Metrics
  • Output/Publication: Patent counts; publication-to-patent citations; patent-to-patent citations
  • Author: NA
  • Journal/Venue: NA
  • Group/Unit: Patent-to-patent citations; startups formed; academic–corporate collaboration % and impact; international collaboration % and impact

Recognition and Esteem
  • Output/Publication: Book reviews and ratings
  • Author: Awards and honors; fellowships
  • Journal/Venue: NA
  • Group/Unit: Prestigious group or institutional awards
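
Several of the bibliometric entries in Table 2 reference field-weighted citation impact (FWCI). Conceptually, FWCI is a ratio: a work's actual citations divided by the average citations received by comparable works (same field, publication year, and document type), so a value of 1.0 means a work is cited exactly as expected for its field. A minimal worked example with invented numbers:

# Field-weighted citation impact (FWCI): actual citations divided by the
# average citations of comparable works (same field, year, and document type).
# Values above 1.0 indicate more citations than the field-normalized
# expectation. The numbers below are invented for illustration only.
def fwci(actual_citations, expected_citations):
    return actual_citations / expected_citations

# An article cited 30 times, where comparable articles average 12 citations,
# has an FWCI of 2.5 (cited 2.5 times as often as expected).
print(fwci(30, 12))  # 2.5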

Responsible Research Evaluation

Embedded in all of this work is communicating about responsible use of metrics—or responsible research evaluation. While research metrics can provide insight on many fronts, interpreting them requires expertise, which includes understanding the importance of context. Just as significant, experts recognize how readily metrics, like key performance indicators (KPIs), can be gamed and know that it happens often. Our work includes challenging conversations, often with senior administrators. Sometimes we must explain the problems of imposing one’s disciplinary expectations on another discipline, particularly that it is unrealistic and can significantly damage trust and relationships across the university. When having tough conversations, we focus on upholding our values and point to peer-reviewed studies as evidence.

Since her arrival in 2018, Rachel, now assistant director for research impact and information management, has actively engaged in university governance through sustained service in the faculty senate, including as a senator and, in 2025, as faculty senate president. Building on earlier faculty senate subcommittee efforts to examine research assessment practices, she chaired a university-wide task force on the responsible use of research metrics in 2022. The task force was charged with advancing concrete actions to support a healthier, fairer, and more transparent research culture at Virginia Tech. Drawing on international best practices—particularly the Leiden Manifesto for Research Metrics—the task force drafted the Virginia Tech Statement on the Use of Research Metrics, which was approved by the faculty senate in 2023 and subsequently endorsed by the university council and the president in 2024. This work represents an ongoing institutional commitment, with continued efforts to operationalize responsible research assessment principles across policies, systems, and practices at the university.

In addition to engaging deeply with the peer-reviewed literature on both the benefits and limitations of research metrics, all full-time members of the team have completed training with Leiden University’s Centre for Science and Technology Studies (CWTS). This training extends beyond use of the VOSviewer science mapping tool to include broader instruction in researcher impact indicators and responsible research evaluation practices. Team members also actively participate in professional communities focused on bibliometrics and research intelligence, including the LIS-Bibliometrics listserv and the Bibliomagician blog, and regularly attend key international conferences, such as Canada’s Bibliometrics and Research Impact Community (BRIC) Annual Conference, the US-based Research Analytics Summit, Europe’s Science & Technology Indicators (STI) Conference, and the Nordic Workshop on Bibliometrics and Research Policy. Engagement with the Research Intelligence Netherlands Network further situates this work within an international community of practice.

In addition to these ongoing professional development activities, as a co-principal investigator on an Institute of Museum and Library Services (IMLS) conference grant, Rachel helped organize and promote a 2024 US-based conference on the SCOPE Framework. The conference supported administrators, librarians, research managers, and vendors in developing effective, context-sensitive processes for responsible research evaluation and reflects continued leadership in advancing fair and transparent research assessment practices.

Operationalizing Research Impact, Intelligence, and RIMS

Our team’s work to date has focused on translating research impact, intelligence, and RIMS principles into concrete, scalable services that support both researchers and institutional decision-making. Core activities include the development of customized reports, dashboards, and analyses for departments, colleges, and administrative units that surface patterns of scholarly attention, influence, collaboration, and funding. Scalable research intelligence depends on reliable data integration across platforms, which we support through identity resolution, affiliation verification, and data normalization workflows that reduce duplication and strengthen the reliability of metrics-informed reporting. This data stewardship work helps ensure that analytic outputs are both accurate and appropriately contextualized. These services are designed to support strategic planning, proposal development, and resource allocation; we continue to expand awareness of these services among campus administrators to increase their visibility and use.
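
As one concrete illustration of this data stewardship, the sketch below shows a hypothetical deduplication pass that normalizes DOIs and collapses records from different source systems before metrics are computed; the field names and sample records are assumptions, not our actual schema.

# Hypothetical deduplication pass run before metrics reporting: normalize DOIs
# (lowercase, strip resolver prefixes) and merge records from different source
# systems that refer to the same work. Field names are illustrative.
def normalize_doi(doi):
    if not doi:
        return None
    doi = doi.strip().lower()
    for prefix in ("https://doi.org/", "http://doi.org/", "doi:"):
        if doi.startswith(prefix):
            doi = doi[len(prefix):]
    return doi

def deduplicate(records):
    merged = {}
    for record in records:
        key = normalize_doi(record.get("doi")) or record.get("title", "").strip().lower()
        existing = merged.setdefault(key, {})
        for field, value in record.items():
            existing.setdefault(field, value)  # first value seen for a field wins
    return list(merged.values())

records = [
    {"doi": "https://doi.org/10.1000/XYZ123", "title": "Sample Work", "source": "Scopus"},
    {"doi": "10.1000/xyz123", "title": "Sample Work", "cited_by_count": 7, "source": "Elements"},
]
print(deduplicate(records))  # one merged record instead of two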

In parallel, we provide sustained support for RIMS and annual faculty activity reporting through a combination of workshops, asynchronous training, and consultations. Recent efforts include the launch of the Researcher Identity Challenge Course, a modular, self-paced program that guides researchers through setting up and connecting online scholarly profiles, including their Elements record and, optionally, their public-facing Virginia Tech Experts profile. We have also offered Elements profile curation services, training student workers to follow standardized workflows to import and verify data from CVs and external systems, improving the completeness and accuracy of scholarly records in support of both faculty review processes and institutional reporting.
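
Profile curation of this kind usually starts from a researcher's persistent identifier. As a hedged sketch (not our Elements workflow), the code below pulls the works attached to an ORCID iD from ORCID's public API so a curator can compare them against what the RIMS already records; the iD shown is the example record used in ORCID's own documentation.

import requests

# Fetch the works attached to an ORCID iD from ORCID's public API
# (https://pub.orcid.org), as a starting point for comparing an external
# profile against records already in the RIMS. Illustrative only.
def orcid_work_titles(orcid_id):
    response = requests.get(
        f"https://pub.orcid.org/v3.0/{orcid_id}/works",
        headers={"Accept": "application/json"},
        timeout=30,
    )
    response.raise_for_status()
    titles = []
    for group in response.json().get("group", []):
        summary = group["work-summary"][0]  # first summary in each grouped work
        titles.append(summary["title"]["title"]["value"])
    return titles

# 0000-0002-1825-0097 is ORCID's documented example iD.
for title in orcid_work_titles("0000-0002-1825-0097"):
    print(title)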

Beyond training and infrastructure, we provide expert consultations to faculty, students, and administrators on the responsible use of research data and metrics. For example, we collaborate with the Academic Resource Management unit by supplying and interpreting bibliographic data used to inform the allocation of university funds to colleges. One of the most frequently requested research intelligence services is competitive intelligence for grant proposals, in which the team delivers tailored data, visualizations, and interpretive guidance to strengthen applications. For the past two years, we’ve provided this service to applicants to Virginia Tech’s Destination Areas program, through which the provost’s office supports seed grants to advance transdisciplinary research addressing problems of global importance.

Building Capacity and Advancing Responsible Research Evaluation

Looking ahead, we are focused on continued growth in professional capacity, cross-campus collaboration, and national engagement. In 2024, we launched the US Research Impact & Intelligence Community of Practice Slack channel to address a gap in the US research ecosystem, which currently lacks a dedicated national organization or conference focused on research impact and intelligence. This emerging community provides a foundation for sustained networking and knowledge exchange, and we intend to collaborate with peer institutions to offer opportunities for future programming and workshops.

At the institutional level, we aim to deepen partnerships with university leadership and pursue additional avenues for embedding responsible research evaluation into policy and practice at Virginia Tech. Planned efforts include exploring external funding opportunities, facilitating structured discussions with academic leaders, and contributing guidance related to promotion and tenure processes through frameworks such as the SCOPE Framework. Collectively, these activities reflect an ongoing commitment to fostering a research culture that is healthier, fairer, and more transparent—one that balances quantitative indicators with qualitative judgment and disciplinary context.

Conclusion

Collectively, our work demonstrates how a library-based research analytics and intelligence team can support strategic planning, proposal development, and decision-making across the university. Through a combination of analytic expertise, data stewardship, and close collaboration with campus partners, we deliver actionable insights while maintaining sensitivity to disciplinary context and institutional needs. Building on this foundation, we also play a critical role in guiding the responsible interpretation and use of metrics, helping ensure that analytic outputs are applied in ways that are fair, transparent, and aligned with the values of a healthy research culture.
