Currently, Wikidata and other Wikibase instances are populated from external data sources mostly manually, using ad-hoc data transformation scripts. These scripts are typically run once and then abandoned. Given the heterogeneity of the source data and of the languages used to transform it, the scripts are hard or impossible to maintain, and they cannot run periodically in an automated fashion to keep Wikidata up to date.
In this session, we would like to demonstrate work in progress from [https://meta.wikimedia.org/wiki/Grants:Project/MFFUK/Wikidata_%26_ETL our project], which utilizes [https://etl.linkedpipes.com/ LinkedPipes ETL], a tool for building data transformation pipelines, to load data into Wikibase instances and Wikidata.