News

Highlights of an analysis of European funder policies and their compliance with FAIR

31st October 2019 | News, Open Access, Open Data

This month, SPARC Europe contributed to a policy analysis that’s being produced as part of the FAIRsFAIR EC research project. Our focus: a review of funder policies to determine their compliance with the FAIR principles, covering requirements for data/metadata sharing, long-term stewardship, accessibility, legal interoperability and the timeliness of sharing.

Our newly released report, “Insights into European research funder open policies and practices,” which was the result of a recent survey of European funders, served as a jumping-off point for SPARC Europe’s analysis. We reviewed a total of 17 European funders with research data policies, going into greater depth on select issues than we had in the aforementioned report; our task was to determine whether and how funders are addressing certain key recommendations from Turning FAIR into reality, the final report and action plan produced by the European Commission Expert Group on FAIR Data.

Here, we are sharing highlights from our analysis that will be included in the upcoming report to be issued by the FAIRsFAIR project later this year.

Synergies between FAIR and current research funder policies

  • Eight of the 17 funders included in the review have policies dedicated to research data alone, while six provide a combined policy for research publications and research data.

  • 11 funders offer a definition of what is meant by “research data”, though notable differences in clarity and comprehensiveness exist between funders. Six funders offer no definition. This points to the need for a single, clear and universally aligned definition of research data for researchers.

  • 12 funder policies permit exceptions to data sharing. Nine of these policies require that a justification be made, often in a DMP (data management plan), something that we hope will become standard practice over time.

  • At present, nine funders require DMPs, one recommends them, one requests them, and one has an output management plan that extends the scope of the DMP to include software and other outputs. Clearly, more alignment is needed in how research data management is documented and practised.

  • Of those that require, request or recommend a DMP, five call for it to be submitted at the pre-award stage, while six stipulate that it be submitted post-award, leaving funders split on when best to assess data management practice.

  • A little more than half reference the management of researchers’ data-related intellectual property (IP) rights. In these cases, scant guidance is provided on how to apply the policy. Only three specify a preferred licence.

  • 11 funders allow costs associated with making research data available to be justified; five do not, and one provides for costs through an output management plan.

  • When it comes to guidance on how to comply, nine funders reference specific data repositories or scientific databases, and one funder asks grantees to investigate whether the repository where they plan to deposit their work is compatible with FAIR. Just under half reference specific research infrastructure. More could be done to promote authoritative resources amongst funders.

  • Ten funders provide some kind of guidance, training or support, though this isn’t always referenced in the policy document.

Gaps between FAIR and current funder policies

  • Seven of the 17 funders require data sharing and five suggest it. This reveals an opportunity to increase access to data if more funders were to mandate data sharing. It is also noteworthy that seven funders do not refer to the term “Open Access” in relation to research data, even though this is core to gaining access to it.

  • Only seven funder policies directly reference FAIR, which is somewhat surprising given the clear policymaking in this area, such as the Turning FAIR into reality report.

  • Only six funders require a data availability statement indicating where and under what conditions data is available; such a statement is essential to providing optimal access to research data.

  • 14 funders make no reference to an expectation of data citation, which would promote data re-use.

  • The majority of funders make no reference to how long data should be made available, while 10 make no reference to preserving data at all.

  • Further limiting access to research data is the absence of references to specific standards or protocols; only three funders mention them. As an example, 15 of the 17 funders reviewed do not call for the use of a researcher identifier, such as ORCID.

  • As far as legal issues are concerned, fewer than half of the selected funders refer to data-related legislation, whereas nine do not. Data protection is not mentioned by the majority of policies (11).

  • Only five funders refer to data and research integrity; this is surprising, as sound, ethical scientific practice relies on access to research data to guarantee reproducibility and transparency.

  • Just over half of the funders reviewed (nine) monitor their policies, whereas eight do not specify whether or not they do. Three policies reference sanctions for non-compliance, whereas 14 give no indication of their use. Numerous funders are known to be exploring this area in more depth.

  • No policies have PIDs (persistent identifiers), which limits access to them over time. Most were accessible in HTML; eight were in PDF format, while one was in ODT.

Next year, we will develop recommendations with our colleagues based on the results of this landscape analysis. The final report, which will encompass the above highlights, is expected to be published by FAIRsFAIR by the end of 2019.