Summary

A robust system for attracting, training, and retaining highly qualified personnel is essential for Canada to navigate changing priorities arising from US-Canada relations, defence concerns, AI, climate change, and the demand for critical minerals. Canadian universities are vital to attracting and training talent. Other research organizations such as Mitacs, the Canadian Light Source, and Genome Canada play crucial roles in this ecosystem by offering internships, scholarships, and specialized infrastructure to nurture talent.

It is essential to systematically measure the outcomes of talent development initiatives to ensure they support Canada’s ability to respond to these shifting priorities. Research organizations can now take full advantage of online career data to track where alumni are working, how their careers progress, and whether training programs are effectively preparing individuals for roles that advance national interests. Additionally, comparing the outcomes of program participants to control groups—individuals who did not participate in the same initiatives—provides a clearer picture of the program’s true impact. This evidence-based approach empowers research organizations and policy makers to make informed decisions that strengthen Canada’s capacity to tackle evolving national challenges.

This article summarizes several common organizational, contextual, and technical issues and how to address them, using real examples from research organizations that are exploring or have implemented studies of their alumni’s career data.


Introduction

Canadian universities and other research organizations are vital to attracting and training talent. Panelists at the 2025 CSPC conference frequently recognized Canada’s excellence in training talent through this ecosystem of research organizations, and yet called on us all to do better, rising to challenges such as US-Canada relations, defence, AI, climate, and demand for critical minerals. This need was recently underscored by the CCA Expert Panel on the State of Science, Technology, and Innovation in Canada, whose report argued, “STI talent is key to enabling Canada to navigate these changing pressures and demands.”

Many panelists discussed ideas for improving the attraction and training of talent, suggesting models such as work-integrated learning or business-skills programs for science graduate students. Regardless of the proposed solution, they agreed that we need data and metrics to identify which programs are having the most impact. Yet the data and metrics needed to enable evidence-based decisions about talent development through research are often lacking.

Many research organizations contribute to talent development

Canadian universities represent a major force in talent development, conferring around 8,000 doctoral and 70,000 master’s degrees annually. A wide range of other research organizations provide valuable, complementary contributions: for example, Mitacs has facilitated over 100,000 internships. National research facilities such as TRIUMF, the Canadian Light Source, and Ocean Networks Canada collectively enhance the education of thousands of students and post-doctoral researchers who access their unique infrastructure. Not-for-profit health and life sciences organizations—including Genome Canada, the Stem Cell Network, the Centre for Aging + Brain Health Innovation, Brain Canada, and BioCanRx—significantly benefit students and emerging researchers through tailored talent development programs and awards that open up research opportunities for highly qualified personnel (HQP). Additionally, the federal tri-councils (NSERC, CIHR, SSHRC) collectively distribute nearly 5,000 scholarships and fellowships each year, with provincial agencies awarding a comparable number across Canada.

While research organizations throughout Canada excel at demonstrating the impact of their knowledge outputs, there is a pressing need to better highlight the contributions made through the development of HQP. Without robust evidence of the influence these organizations have, it becomes challenging for Canada to properly recognize organizations driving meaningful change and to motivate others to enhance their efforts in talent development.

The career data needed to show impact from talent development is publicly available

Research organizations, more than any other kind of organization, should know that if a phenomenon is real, there is likely a way to measure it. After all, that is what research is all about.

The good news is that, thanks to the recent surge in publicly accessible online career data, you can indeed measure the impact of talent development. This wealth of information provides ample opportunity to demonstrate the influence you have had on your alumni: the students, post-docs, and early-career researchers who have benefited from your organization.

Thanks to the availability of publicly accessible career data, you can now objectively measure and report on several key outcomes, including:

  • Retention rates within your research field, target industry sector, or high-tech sectors, as well as within your province and across Canada
  • Pace of career advancement, reflecting leadership roles and increased earning potential
  • Length of employment with organizations, signalling a strong match between acquired skills and job requirements
  • Engagement in further academic pursuits or advanced degrees
  • Variations in career trajectories and outcomes based on gender or other demographic characteristics

And most importantly, you can benchmark your findings against a control group to enable meaningful interpretation of the results.
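Once profiles are transcribed into a common schema, metrics like these reduce to simple aggregate calculations. The sketch below is a minimal, hypothetical illustration: the records, field names, and keyword list are all assumptions for the example, not a real dataset or schema.

```python
# Hypothetical career records as they might be transcribed from public
# profiles. Field names and values are illustrative only.
alumni = [
    {"name": "A", "country": "Canada", "title": "Director of Research"},
    {"name": "B", "country": "Canada", "title": "Data Scientist"},
    {"name": "C", "country": "USA", "title": "Professor"},
]
control = [
    {"name": "X", "country": "Canada", "title": "Analyst"},
    {"name": "Y", "country": "USA", "title": "Manager, Operations"},
    {"name": "Z", "country": "USA", "title": "Postdoctoral Fellow"},
]

# Assumed keyword list for detecting leadership roles in job titles.
LEADER_WORDS = ("director", "manager", "vp", "chief", "head")

def rate(records, predicate):
    """Share of records satisfying a predicate (0.0 if the group is empty)."""
    if not records:
        return 0.0
    return sum(predicate(r) for r in records) / len(records)

in_canada = lambda r: r["country"] == "Canada"
is_leader = lambda r: any(w in r["title"].lower() for w in LEADER_WORDS)

for label, group in (("alumni", alumni), ("control", control)):
    print(label,
          f"retention in Canada: {rate(group, in_canada):.0%}",
          f"leadership roles: {rate(group, is_leader):.0%}")
```

The same `rate` calculation applied to a control group is what turns a raw percentage into an interpretable benchmark.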

Issues and Challenges

1) Issue: You are not a degree-granting institution. Response: You are most likely enriching the university-based education and training of highly qualified personnel by providing specialized experiences, resources, and support that complement formal academic programs.

Research organizations often provide access to unique opportunities that enhance students’ and post-docs’ education. Examples include hands-on experiments at national research infrastructures, access to unique and powerful datasets, use of virtual infrastructure such as leading-edge industrial software for micro-chip design, specialized training initiatives, travel funds for valuable conferences and other opportunities, scholarships, internships, and distinctive networking experiences.

Real-world example: An HR manager at a major research facility claimed they provided no special training opportunities to mention to the funder, yet our alumni study found the facility greatly influenced students’ education and career paths. Visiting the unique research site to conduct experiments motivated many to pursue research careers, leading to increased funding based on these findings.

Real-world example: An officer of a granting body reported that they are not directly involved in training students. Yet they award thousands of scholarships that enable students to focus on their research without the distraction of a side job to put food on the table. The impacts of those scholarships are measurable and should be credited to the granting body.

2) Issue: As a newly established organization, you may not have many alumni yet. Response: You can highlight achievements from predecessor organizations or from your research field.

New organizations have limited track records. Yet new organizations are often created from a restructuring or federation of previous organizations. In such cases, the history of HQP training from precursor organizations is still relevant. Even without a precursor organization, studying alumni from a specific research field (e.g., astronomy) remains valuable.

Real-world example: A study of alumni who worked with neutron beams at the former Canadian Neutron Beam Centre showed that this experience influenced their pursuit of advanced academic degrees and led to jobs in high-tech industries where science and technology skills are in demand. The results have supported successful grant proposals for new projects in the same area, helping to highlight the benefits Canada can expect from investing in neutron beam research infrastructure.

Real-world example: To demonstrate the impact of astronomy and supporting investment in large infrastructure, astronomers review PhD and Master’s alumni from Canadian astronomy research programs and report their findings.

3) Issue: The organization doesn’t prioritize long-term impact measurement in training. Response: This is a legacy issue; those who engage will benefit, while those who don’t may miss out.

Due to limited data, research organizations and funders have traditionally relied on short- and medium-term metrics like course satisfaction, number of trainees, or job placement rates. With the rise of accessible long-term career data, more organizations are adopting improved impact measurements, which funding bodies will increasingly expect.

Real-world example: The University of Toronto’s 2022 Career Outcomes Study identified over 90% of its 16,000 PhD graduates using public sources, helping its programs better align with the career paths graduates typically follow beyond research professorships.

Real-world example: In our experience, research organizations often worry that alumni analysis may be unproductive, but after a successful first project, they plan regular updates.

4) Issue: Your organization is short-staffed and budgets are tight. Response: It’s important to see the value and understand that it doesn’t have to cost a lot.

Tracking alumni career data is often seen as yet another extra duty for you or your staff, done ‘off the side of a desk.’ If you’re connected to a university, you may have reached out to the alumni office for assistance, only to find that their reasons for collecting alumni data differ from yours: the information they provide doesn’t help you assess impact, and they lack the resources to take on a special project for your needs.

To control expenses, consider starting with a small pilot project using a subset of your alumni to assess its value before committing significant resources to a larger study. This approach is effective whether you manage the project internally or outsource it.

Real-world example: A physics research organization conducts alumni analysis internally by hiring interns for online data collection and basic analysis.

Real-world example: A major research facility uses administrative and communications staff to survey alumni, gather online data, and analyze findings. While the process is labour-intensive, staff are relieved that this process will only occur periodically in line with their multi-year funding cycle.

Real-world example: Some research organizations outsource data collection and analysis. Some complain that tech firms charge prohibitive amounts for simple data-scraping services. Others report a good return on investment with meaningful insights delivered in a full report—insights that are useful for government relations and reporting to funders.

5) Issue: Your organization doesn’t have good alumni records. Response: You can often create a sample list of your alumni using public records.

Many research organizations have no record of many of the beneficiaries of their programs. They maintain records of some alumni, such as former staff (e.g., interns, co-op students, and summer students), but not others, such as students who used their research services or students whose stipends were supported by their research grants.

Research organizations often keep publication lists that, when cross-referenced with sources like university websites, LinkedIn, and thesis databases, help identify which authors were students or postdocs. This process yields a sample of alumni who participated in research relying on the organization’s infrastructure or services.

If a publication list is unavailable, one can be compiled by automated searches of scientific publication databases for mentions of the organization or its specific infrastructure, equipment, datasets, experiments, or methods. Manual curation of the automated findings will enhance the result.
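As a sketch of that first automated pass, the snippet below flags publications whose acknowledgements mention an organization’s infrastructure. The records and keyword list are illustrative placeholders, and the flagged candidates would still go through manual curation.

```python
# Illustrative publication records, as might be exported from a bibliographic
# database; the fields and keywords are placeholders, not a real schema.
KEYWORDS = ("example beamline", "example national facility")

publications = [
    {"title": "Stress mapping in welds",
     "acknowledgements": "Experiments were performed at the Example Beamline."},
    {"title": "A survey of methods",
     "acknowledgements": "Supported by a university grant."},
]

def mentions_org(pub):
    """True if the acknowledgements text contains any organization keyword."""
    text = pub["acknowledgements"].lower()
    return any(keyword in text for keyword in KEYWORDS)

# Candidate publications flagged for manual curation.
candidates = [p for p in publications if mentions_org(p)]
```

In practice the same matching would also be run against titles, abstracts, and affiliation fields, since acknowledgement text is not always indexed.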

Real-world example: A researcher wants to examine the career outcomes of women versus men graduates from Canadian universities in STEM fields using LinkedIn’s extensive database. Since most users display their academic credentials and institutions on their profiles, gathering alumni data from individual universities isn’t necessary.

Real-world example: A researcher wanted to examine the career outcomes of HQP trained in her specific field. A pilot analysis successfully used keywords from that field to identify Master’s and PhD graduates via online sources.

Real-world example: A research organization lacked a publication list because it was formed from a merger of multiple centres, each with its own practices. Spurred by the need to perform an alumni analysis, staff searched online for past publications and curated the list. New publications are now logged to keep it up to date, and the organization is taking the next step of recreating an alumni list from the publication data.

6) Issue: You have questions about data privacy. Response: Public data is public. Personal data requires protection. Alumni studies are benign and generally low risk.

Data-scraping should be conducted ethically. Choose companies that access only public data. Publicly available data, like LinkedIn profiles, is accessible to anyone. Platforms cannot control use of information once they publish it, and the public has a right to use any public data. However, the platform may still have social and legal obligations to inhibit misuse of personal information, which is one reason they try to limit data collection volumes.

Generally, linking alumni lists with public data for aggregate research on career outcomes doesn’t require individual consent or violate privacy laws. Privacy regulations often permit personal data to be used for statistical research or program evaluation without explicit permission. However, when feasible, it’s still considered good practice to ask for consent directly.

Real-world example: A privacy officer at a major research facility reviewed an alumni study plan and found that most personal data used was already published through university graduate lists and LinkedIn profiles. Data subjects would only be included in the study if they published their career profile on LinkedIn. Results would be reported in aggregate. The proposed use of personal data met legal requirements for reasonableness. The privacy officer approved the plans noting low risk of sensitive information release and high potential value.

Real-world example: A granting agency has begun automating collection of personal data through web interfaces required to access services. The consent collected through these interfaces includes consent for following their careers through public data on online platforms such as LinkedIn.

7) Issue: You are unsure how to best use career data after you have collected it. Response: The critical steps involve systematically mapping and benchmarking the data, followed by establishing causal links.

Self-reported data often lacks consistency and is difficult to aggregate for analysis. For meaningful interpretation, alumni data should be compared to other groups, but few research organizations attempt this, assuming it is too complex and not worth the effort; as a result, little thought or work is invested in devising a benchmarking strategy.
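One simple benchmarking approach, assuming an outcome can be expressed as a proportion (for example, the share of each group holding management roles), is a standard two-proportion z-test. The counts below are hypothetical, chosen only to show the calculation.

```python
from math import sqrt

def two_proportion_z(hits_a, n_a, hits_b, n_b):
    """Z statistic for the difference between two proportions (pooled SE)."""
    p_a, p_b = hits_a / n_a, hits_b / n_b
    pooled = (hits_a + hits_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical counts: 40 of 100 alumni vs. 20 of 100 control-group peers
# hold management roles.
z = two_proportion_z(40, 100, 20, 100)
# |z| > 1.96 suggests the gap is unlikely to be chance at the 5% level.
```

A statistically significant gap is still only a correlation; establishing why the gap exists requires qualitative evidence such as alumni interviews.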

Real-world example: A research organization’s training manager emphasized benchmarking: “It’s easy to get data, but if you have nothing to compare it to, it’s hard to interpret. We did a bibliometric study that was a waste of time because it wasn’t benchmarked. Having a control group for our alumni studies would be very valuable, though we hadn’t considered it possible.”

Real-world example: A major research facility discovered that mapping alumni job titles to corporate hierarchies, and comparing them with peers from the same research field, showed its alumni were twice as likely to hold executive or management roles.

Real-world example: A major research facility conducted a benchmarked study revealing positive correlations between utilization of their facility and career outcomes, including pursuit of advanced degrees, career advancement, and retention within Canada. To enhance understanding of these findings, a representative sample of alumni participated in interviews. The interviewees identified factors stemming from their experiences at the facility that influenced their career decisions and accomplishments. Several commonalities emerged among participants, supporting the validity of the causal interpretation.

Conclusion

Most Canadian research organizations, whether or not they grant degrees, play a valuable role in training highly qualified people. But few assess their long-term impact on alumni careers sufficiently to inform funding and policy decisions. Despite some obstacles, organizations can use public data like LinkedIn profiles to measure impact. Properly analyzing and comparing this information helps produce useful insights for reporting, evaluating programs, and strategic planning.

Acknowledgement

This article reflects the original thought and voice of the author. AI tools were used to support the writing process.