Interpreting the data

EI Team, RRADS

This page provides more information about the Engagement and Impact (EI) assessment, including the ARC's requirements for EI, data requirements, and interpretation of the results.

What is a national Engagement and Impact (EI) assessment?

EI is a new national assessment that examines:

  • how well researchers in Australian universities engage with end users beyond academia.

  • what kinds of impacts are occurring outside of academia as a result of research undertaken by Australian universities.

  • how well Australian universities support their researchers to deliver research which has an impact beyond academia.

Why have a national EI assessment?

To encourage innovation and entrepreneurship by:

  • encouraging researchers to better engage with users of their research
  • encouraging greater focus on delivering research impacts
  • encouraging Australian universities to support their researchers in these activities

ERA and EI run as companion exercises: specific ERA metrics (the matrix of indicators below) are used in the Engagement narrative. ERA continues to assess research quality and acknowledges and encourages blue sky research, while the EI assessment encourages universities to look beyond the academic sector.

What is the 2018 assessment framework?

The 2018 EI assessment framework is available through the ARC website.

How are the Impact case studies structured?

Impact is assessed using qualitative information in the form of impact studies.

These impact studies include:

  • Approach to impact
  • Details of the impact
  • Associated research

The impact studies will allow for the inclusion of measures or indicators of impact.

What is the ARC definition of impact?

The contribution that research makes to the economy, society, environment and culture beyond the contribution to academic research.

What is the ARC definition of engagement?

The interaction between researchers and research end-users outside of academia, for the mutually beneficial transfer of knowledge, technologies, methods or resources.

What is an end-user?

An individual, community or organisation external to academia that will directly use or directly benefit from the output, outcome or result of the research.

What is Aboriginal and Torres Strait Islander research?

Aboriginal and Torres Strait Islander research means that the research significantly:

  • Relates to Aboriginal and Torres Strait Islander peoples, nations, communities, language, place, culture or knowledges and/or

  • Is undertaken with Aboriginal and Torres Strait Islander peoples, nations, or communities.

Can institutions attach supporting evidence and testimonials to impact studies?

No, institutions must not attach separate supporting evidence and testimonials to their impact studies. All information relevant to the impact study must be contained within the impact study itself.

Why was FoR 11 split into two?

FoR 11 was split into two (Biomedical and Clinical Sciences & Public and Allied Health Sciences) because of the size of the code.

Biomedical and Clinical Sciences:

  • 1101 Medical Biochemistry and Metabolomics
  • 1102 Cardiovascular Medicine and Haematology
  • 1103 Clinical Sciences
  • 1105 Dentistry
  • 1107 Immunology
  • 1108 Medical Microbiology
  • 1109 Neurosciences
  • 1112 Oncology and Carcinogenesis
  • 1113 Ophthalmology and Optometry
  • 1114 Paediatrics and Reproductive Medicine
  • 1115 Pharmacology and Pharmaceutical Sciences
  • 1116 Medical Physiology

Public and Allied Health Sciences:

  • 1104 Complementary and Alternative Medicine
  • 1106 Human Movement and Sports Science
  • 1110 Nursing
  • 1111 Nutrition and Dietetics
  • 1117 Public Health and Health Services
  • 1199 Other Medical and Health Sciences

How will the ARC calculate the low volume threshold for FoR 11 now that it has split into two FoRs?

The threshold is applied separately to each part of FoR 11 - Biomedical and Clinical Sciences and Public and Allied Health Sciences. For example, if a university met the threshold for Biomedical and Clinical Sciences but not Public and Allied Health Sciences, the university would only submit for Biomedical and Clinical Sciences. Where the university meets the low volume threshold in both units of assessment, it will need to submit to EI for both.
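
As a minimal sketch of this submission logic only, assuming a hypothetical threshold value and simplified output counts (the figures below are placeholders, not the ARC's published threshold):

```python
# Hypothetical sketch of the per-unit low volume threshold check.
# LOW_VOLUME_THRESHOLD and the counts below are placeholders, not ARC figures.

LOW_VOLUME_THRESHOLD = 50.0  # placeholder apportioned output count


def units_to_submit(output_counts):
    """Return the units of assessment that meet the low volume threshold."""
    return [unit for unit, count in output_counts.items()
            if count >= LOW_VOLUME_THRESHOLD]


# Example: the university meets the threshold for one unit only,
# so it submits to EI for that unit alone.
counts = {
    "Biomedical and Clinical Sciences": 120.0,
    "Public and Allied Health Sciences": 30.0,
}
print(units_to_submit(counts))  # ['Biomedical and Clinical Sciences']
```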

What are the reference periods?

The reference periods for EI assessment are listed in the following table.

Stage Reference period Years
Impact study 1 January 2011 to 31 December 2016 6
Associated research 1 January 2002 to 31 December 2016 15
Engagement 1 January 2014 to 31 December 2016 3

Can information that is outside the reference period be included in the impact study?

As outlined in the EI 2018 Submission Guidelines (Section 2.2), the reference period for the impact study is 1 January 2011 to 31 December 2016 (6 years). The reference period for the associated research is 1 January 2002 to 31 December 2016 (15 years). While a reference period is not specified for the approach to impact, the approach must be retrospective and within the context of the impact study.

Information that relates to activity or outcomes outside of the reference period should not be included. Impact studies may, however, refer to external evidence that verifies the claims being made, for example an auditor’s report on return on investment, even if this was published after the impact reference period.

What are the ARC indicator principles?

The following ten indicator principles were used by the Australian Research Council (ARC) to guide the development of the pilot methodology and set the framework for the EI 2018 assessment methodology:

  • robust and objective: objective measures that meet a defined methodology that will reliably produce the same result, regardless of when and by whom the principles are applied.
  • internationally recognised: while not all indicators will allow for direct international comparability, they must be internationally recognised measures of research engagement and impact. Indicators must be sensitive to a range of research types, including research relevant to different audiences.
  • comparability across disciplines: indicators will take into account disciplinary differences and be capable of identifying comparable levels of research engagement and impact.
  • not discourage interdisciplinary and multidisciplinary research: indicators will not discourage institutions from pursuing interdisciplinary and multidisciplinary research engagement and impact.
  • research relevant: indicators must be relevant to the research component of any discipline.
  • repeatable and verifiable: indicators must be repeatable and based on transparent and publicly available methodologies.
  • time-bound: indicators must be specific to a particular period, as defined by the reference period.
  • transparent: it should be possible for all data submitted against each indicator to be made publicly available, to ensure the transparency and integrity of the process and outcomes.
  • behavioural impact: indicators should drive responses in a desirable direction and not result in perverse unintended consequences. They should also limit the scope for special interest groups or individuals to manipulate the system to their advantage.
  • adaptable: recognising that the measurement of engagement, and the assessment of impact over time, require adjustment of indicators for subsequent exercises.

What is the EI 2018 rating scale?

The ratings are established by a panel of experts based on the indicators and the narratives provided. One of three ratings will be given: High, Medium or Low.

Rating Scale descriptors

Impact

Rating Description
High (H) The impact has made a highly significant contribution beyond academia. A clear link between the associated research and the impact was demonstrated.
Medium (M) The impact has made a significant contribution beyond academia. A clear link between the associated research and the impact was demonstrated.
Low (L) The impact has made little or no contribution beyond academia.

Approach to impact

Rating Description
High (H) Mechanisms to encourage the translation of research into impacts beyond academia are highly effective and well-integrated within the field of research (FoR). Mechanisms for translating research facilitated the impact described.
Medium (M) Mechanisms to encourage the translation of research into impacts beyond academia are effective and integrated within the FoR. Mechanisms for translating research facilitated the impact described.
Low (L) Mechanisms to encourage the translation of research into impacts beyond academia are not effective or integrated. The mechanisms for translation did not facilitate the impact described.

Engagement

Rating Description
High (H) The FoR is characterised by highly effective interactions between researchers and research end-users outside of academia for the mutually beneficial transfer of knowledge, technologies, methods and resources. Research engagement is well integrated into the development and ongoing conduct of research within the FoR.
Medium (M) The FoR is characterised by effective interactions between researchers and research end-users outside of academia for the mutually beneficial transfer of knowledge, technologies, methods and resources. Evidence that research engagement is incorporated into relevant parts of the research process within the FoR and/or that research engagement is improving.
Low (L) The FoR has little or no effective interactions between researchers and research end-users outside of academia for the mutually beneficial transfer of knowledge, technologies, methods and resources. Little or no evidence that research engagement is incorporated into the research process or that research engagement activities are being developed.

How are the ARC review panels structured?

The assessments for EI 2018 will be undertaken by panels comprising a mix of distinguished academic researchers and highly experienced research end-users. There are five assessment panels for EI 2018.

What Impact types and indicators were used?

Impact types and indicators

The impact case studies and engagement narratives were analysed for impact types, impact indicators and engagement indicators. For this process, the VV Impact Tracker impact indicator taxonomy and the WHO and CSIRO impact indicators were reviewed to inform the categorisation of the impact types and indicators.

Impact

The Impact indicators were grouped into 7 impact types (academic, economic, environmental, health and welfare, public policy, societal and reach). The Impact section contained 200 unique indicators.

Definitions:

  • Academic: Academic esteem, teaching, learning and informing pedagogy and curricula, publications, reach
  • Economic: Economy, commerce and organisations, government funding, spin-off companies, international development, grants, institutional support, profits, philanthropy, engagement
  • Environmental: Planet, air, plant life, wildlife
  • Health and Welfare: Improving health and wellbeing, practitioners and professional services

  • Public policy: policy development, influence on professional standards, guidelines or training, engagement, public services and law

  • Societal: Creativity, culture and society, public discourse, public engagement, world view and human understanding

  • Reach: Reach was defined as a change in national or international reach, change in online presence and uptake of technologies, services, etc.

Approach to impact

The Approach to impact indicators were grouped into 12 impact types (academic, engagement, awards, communications, economic, health and welfare, institutional support, multidisciplinary, PhD students, public policy, patent and reach). The approach to impact section contained 121 unique indicators.

Definitions:

  • Academic: Publications, fellowships, expert evidence, PhD students, philanthropic support, professional engagement, reviews

  • Engagement: Collaborations, workshops, keynote addresses, royal commissions, memberships, conferences, community engagement, panel memberships, films, industry and government investments, industry engagement, partnerships, government engagement, honorary professorship, congress, plenary addresses, seminars, studies, committees, pro bono services, stakeholder meetings, trials

  • Awards: Internal awards, external awards, fellowships

  • Communications: Media, media reach, government publications, publications, online presence, radio interviews, social media

  • Economic: Government funding, internal funding, external funding, grants, institutional support, In kind support, industry engagement, partnerships, spin-off companies

  • Health and welfare: Reach, use of public services, treatment outcomes, online presence

  • Institutional support: Infrastructure, knowledge, internal funding, internal supervision (PhD students), grants, honorary doctorates, diversity strategy, professional staff

  • Multidisciplinary: Multidisciplinary focus and knowledge

  • PhD students: Internal supervision, institutional support, industry partnerships, engagement

  • Public policy: Influencing public policy, public advice, guidelines, expert witness, committee memberships, inclusion in parliamentary enquiries, change in policy

  • Patent: Patents granted

  • Reach: Media reach, online presence, surveys

Engagement

The Engagement narrative had 10 engagement types (engagement, communications, public policy, institutional support, economic, reach, academic, HDR students, patents and awards).

Definitions:

  • Engagement: Collaborations, workshops, keynote addresses, royal commissions, memberships, conferences, community engagement, panel memberships, films, industry and government investments, industry engagement, partnerships, government engagement, honorary professorship, congress, plenary addresses, seminars, studies, committees, pro bono services, stakeholder meetings, trials, book sales, public lectures, co-authorship of research outputs with research end-users
  • Communications: Media, media reach, government publications, publications, online presence, radio interviews, social media
  • Public policy: Influencing public policy, public advice, guidelines, expert witness, committee memberships, inclusion in parliamentary enquiries, change in policy
  • Institutional support: Infrastructure, knowledge, internal funding, internal supervision (PhD students), grants, honorary doctorates, diversity strategy, professional staff
  • Economic: Government funding, internal funding, external funding, grants, institutional support, In kind support, industry engagement, partnerships, spin-off companies
  • Reach: Media reach, online presence, surveys
  • Academic: Publications, fellowships, expert evidence, PhD students, philanthropic support, professional engagement, reviews
  • PhD students: Internal supervision, institutional support, industry partnerships, engagement
  • Patents and Awards: Patents granted, internal awards, external awards, fellowships

What weighting does the EI 2018 assessment give to the engagement narrative and the indicators?

Assessment panels make a holistic judgement about the performance of a Unit of Assessment (UoA). There are no weightings applied to the individual components of the engagement submission and panels may focus on aspects of the qualitative statements or indicators that are particularly relevant for different disciplines.

How is Full-time equivalent (FTE) calculated and used for EI indicators?

FTE will be calculated using all staff with an employment status of employed and a function of either Research only or Research and Teaching, as submitted by institutions in the relevant part of their ERA 2018 submissions. The FTE amount is used in the total Higher Education Research Data Collection (HERDC) income per FTE indicator.

As stated in the EI 2018 Framework, employed staff recorded with a function of 'other function' in ERA 2018 will not be included in this calculation.
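
As a minimal sketch of this calculation, assuming simplified staff records (the field names, income figure and FTE values below are illustrative only and are not drawn from ERA or HERDC data):

```python
# Illustrative sketch of the total HERDC income per FTE indicator.
# Field names and figures are assumptions for the example only.

ELIGIBLE_FUNCTIONS = {"research only", "research and teaching"}


def total_fte(staff_records):
    """Sum FTE for employed staff with an eligible function; 'other function' staff are excluded."""
    return sum(
        s["fte"]
        for s in staff_records
        if s["employment_status"] == "employed" and s["function"] in ELIGIBLE_FUNCTIONS
    )


def herdc_income_per_fte(total_herdc_income, staff_records):
    """Divide total HERDC income by the eligible FTE count."""
    fte = total_fte(staff_records)
    return total_herdc_income / fte if fte else 0.0


staff = [
    {"fte": 1.0, "employment_status": "employed", "function": "research only"},
    {"fte": 0.5, "employment_status": "employed", "function": "research and teaching"},
    {"fte": 1.0, "employment_status": "employed", "function": "other function"},  # excluded
]
print(herdc_income_per_fte(3_000_000, staff))  # 2000000.0 (illustrative)
```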

How were Impact case studies chosen?

Potential case studies were collated from Expressions of Interest (EOIs) and faculty nominations. REDCap surveys, interviews and the ARC's strict reference periods were used to filter the potential case studies. The final case studies submitted to the ARC were selected after rigorous review by the EI steering committee.

How was readability measured?

We used two automatic readability checker websites, which take a sample of your writing and calculate the number of sentences, words, syllables and characters in the text. These counts are then plugged into six popular readability formulas. The resulting scores give an assessment of the complexity of a text and can help determine its reading level and audience, such as whether it can be understood by someone who left full-time education as a college graduate.
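
To illustrate the kind of calculation these checkers perform, the sketch below computes one common formula, the Flesch Reading Ease score, using a deliberately naive syllable counter. It is not the tool we used, and its output will differ slightly from the websites' results.

```python
import re


def count_syllables(word):
    """Very rough syllable estimate: count groups of consecutive vowels."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))


def flesch_reading_ease(text):
    """Flesch Reading Ease = 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words)."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    n_words = max(1, len(words))
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (n_words / sentences) - 84.6 * (syllables / n_words)


sample = "The impact has made a highly significant contribution beyond academia."
print(round(flesch_reading_ease(sample), 1))  # higher scores indicate easier reading
```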

Can universities publish some or all of the information submitted in the Impact study on their website?

A university may use any material that it submitted within its impact study for the EI 2018 impact assessment for the purposes of disseminating examples of research impact. In doing so, universities must not use the EI templates or any other format developed by the ARC for the purposes of EI.

What is the University EI strategy going forward?

The EI strategy going forward is currently under consultation with the Engagement and Impact steering committee. For further information please contact:

Will there be an analysis of the national results?

Yes, there will be. The analysis of the national results is in progress. Please visit the ARC EI Home Page for updates.