Open data quality – the next shift in open data? – Open Knowledge International Blog

lterrat's bookmarks 2017-05-31

Summary:

"Improving quality by improving the way data is produced

Many data quality metrics are (rightly) user-focussed. However, it is critical that governments, as data producers, better understand, monitor and improve the inherent quality of the data they produce. Measuring data quality can incentivise governments to design data for impact by raising awareness of the quality issues that would otherwise make data files practically impossible to use.

At Open Knowledge International, we target data producers and the quality issues of data files mostly via the Frictionless Data project. Notable projects include the Data Quality Spec, which defines essential quality aspects for tabular data files; GoodTables, which provides structural and schema validation of government data; and the Data Quality Dashboard, which enables open data stakeholders to see data quality metrics for entire data collections “at a glance”, including the number of errors in a data file. These tools help to develop a more systematic assessment of the technical processability and usability of data.

A call for joint work towards better data quality

We are aware that good data quality requires solutions that work together. Therefore, we would love to hear your feedback. What are your experiences with open data quality? Which quality issues hinder your use of open data? How do you define data quality? What could the GODI team improve? Please let us know by joining the conversation about GODI on our forum."
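For readers wanting a concrete sense of the "structural and schema validation" mentioned in the summary above, the following is a minimal sketch (not part of the original post) using the goodtables Python package that underlies the GoodTables service; the file name and Table Schema fields are hypothetical examples.

```python
# Rough sketch of a structural and schema check with the goodtables
# Python package (pip install goodtables). The file name and the inline
# Table Schema below are hypothetical, for illustration only.
from goodtables import validate

report = validate(
    'procurement.csv',            # hypothetical government data file
    schema={'fields': [           # assumed Table Schema for that file
        {'name': 'contract_id', 'type': 'integer'},
        {'name': 'supplier', 'type': 'string'},
        {'name': 'amount', 'type': 'number'},
    ]},
)

# The report summarises quality problems such as blank rows, duplicate
# headers, or values that do not match the declared field types.
print('valid:', report['valid'], 'errors:', report['error-count'])
for table in report['tables']:
    for error in table['errors']:
        print(error['code'], '-', error['message'])
```

A report like this is the kind of per-file error count that a Data Quality Dashboard could aggregate across an entire data collection.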

Link:

https://blog.okfn.org/2017/05/31/open-data-quality-the-next-shift-in-open-data/

From feeds:

Open Access Tracking Project (OATP) » lterrat's bookmarks

Date tagged:

05/31/2017, 14:59

Date published:

05/31/2017, 10:59