New Knowledge Exchange Publication: “The Value of Research Data”
Posted: April 16th, 2013 | Author: Sven | Filed under: Report, Workshop | Tags: Knowledge Exchange, Report, Research Data

Last week I was in Berlin to take part in the workshop “Making Data Count: Research data availability and research assessment”, hosted by Knowledge Exchange (KE), a cooperation between five research funders established in 2005.
The aim of the workshop was to bring together experts and stakeholders from research institutions, universities, scholarly societies and funding agencies in order to review, discuss and build on possibilities for implementing a culture of sharing and for integrating data publication into research assessment procedures.
The workshop was very informative. It was interesting to see which questions regarding research data are currently most discussed in many European states, which challenges are most pressing, and where funders and other stakeholders see an urgent need for solutions.
The slides of many presentations are already available online, and several dozen tweets reflect the numerous discussions.
Besides the presentations, five breakout sessions gave all participants the opportunity for further discussion. The sessions reflected some of the most pressing challenges in the field of research data publication:
a) Quality assurance for data publications
b) Enhanced publications: Linking data to other research information
c) New metrics and citation systems to measure the impact of data publication
d) Codes of conduct for sharing data
e) Research assessment procedures at universities and funding agencies
In the course of the workshop, KE also presented a new report: “The Value of Research Data – Metrics for datasets from a cultural and technical point of view.” The report presents a first landscape study of the creation and promotion of data metrics and of assessing the use of datasets in scientific work as a tool to stimulate data sharing. It also recommends that data sharing and data publication be adopted more broadly among scholars. To support this, a reward system that includes data metrics should be developed for scientists – a very important approach in my eyes.
Other important steps are to reduce the costs of data publication and to address researchers’ existing negative perceptions of data publication. Standards will need to be developed for the preservation, publication, identification and citation of datasets – a job a research library could usefully fulfil.
The report reviews the current literature on data sharing and data metrics and presents the outcomes of interviews with the main stakeholders in these areas. It also analyses existing repositories and tools in the field of data sharing that are especially relevant to the promotion and development of data metrics.
This report will be of interest not only to major stakeholders in science, but also to research funders and scientific institutions, as it provides them with more knowledge about tools to promote and reward data sharing and data publication within their scientific communities. In this landscape study, all major stakeholders have been considered in order to summarise the main views, problems and challenges that need to be tackled in the development of metrics for datasets, and in the generalisation and promotion of data-sharing activities.
pictures: edawax.de
text: parts from press release
FYI, here is the KE press release:
Research assessment should take data sharing into account
The workshop found that data sharing should be rewarded and that universities and funders have an important role to play in this process. Metrics, however, are not yet sufficiently developed to be used as a measure in assessment.
Data sharing should be considered normal research practice; in fact, not sharing should be considered malpractice. Research funders and universities should support and encourage data sharing. There are a number of important aspects to consider when making data count in research and evaluation procedures. Metrics are a necessary tool for monitoring the sharing of datasets. However, data metrics are at present not very well developed, and there is not yet enough experience of what these metrics actually mean. It is important to implement the culture of sharing through codes of conduct in the scientific communities.
These are some of the key findings from the workshop ‘Making Data Count – research data availability and research assessment’ which took place 11 and 12 April 2013 in Berlin. The event brought together researchers, research funders, publishers, infrastructure providers, policy makers and technical experts to discuss whether data sharing could be incorporated in research assessment. At the workshop the study ‘The Value of Research Data – Metrics for datasets from a cultural and technical point of view’ was presented and discussed.
At present there is insufficient experience with alternative metrics, and it is therefore hard to judge their value; more experiments are needed in this regard. Alternative metrics are not (at present) considered suitable as a measure of scientific quality. Peer review (both before and after publication) still has an important role to play. However, alternative metrics can be useful in showing the broader attention paid to research data. The citation of datasets should become standard behaviour among researchers.
Funders should give more weight to the sharing of research datasets when judging project proposals and should offer funds to make sharing possible. Institutions should provide training, support and awareness-raising among both junior and senior researchers. Learned societies and research communities can encourage data sharing by establishing codes of conduct. And finally, the underlying infrastructures (e.g. the use of identifiers) must be in place and easy to use.
The report of the workshop is now available online and can be found at: http://www.knowledge-exchange.info/Default.aspx?ID=576
Here you will also find links to the video recordings of the presentations, the slides and photographs taken at the event.