Big Spatio-Temporal Data
In the age of steadily advancing digitalization, we are dealing with ever-increasing amounts of data. This trend inevitably also affects geoinformatics, the discipline concerned with spatio-temporal data.
Big spatio-temporal data is generated by sensors, for example on satellite missions, as well as by private users in crowdsourcing projects like OpenStreetMap. Such data is difficult to manage not only because of its sheer volume but also because of its heterogeneity and semantic complexity. Dedicated systems are therefore needed that can cope with the size of the data, its spatio-temporal reference, and its semantics.
A first step in organizing such data volumes is the use of efficient geo-database systems and of cluster computing for parallel data processing. Furthermore, integrating existing programming interfaces and providing new ones is essential to address these challenges in an interdisciplinary and sustainable way.
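The idea behind parallel processing of spatio-temporal data can be illustrated with a minimal sketch: observations are assigned a spatial key (here a simple regular lon/lat grid), the data is split into partitions, each partition is processed by a separate worker, and the partial results are merged. The sample points and function names are illustrative assumptions, not part of any specific system; in practice the data would come from a geo-database or a distributed file system.

```python
from multiprocessing import Pool

# Hypothetical sample of (lon, lat, timestamp) observations (illustrative only).
POINTS = [(13.40, 52.52, 1), (2.35, 48.86, 2), (13.41, 52.50, 3), (-0.13, 51.51, 4)]

def cell_of(lon, lat, size=1.0):
    """Assign a point to a regular lon/lat grid cell (a simple spatial key)."""
    return (int(lon // size), int(lat // size))

def count_cells(chunk):
    """Count observations per grid cell within one data partition."""
    counts = {}
    for lon, lat, _ in chunk:
        key = cell_of(lon, lat)
        counts[key] = counts.get(key, 0) + 1
    return counts

def parallel_density(points, workers=2):
    """Split the data, process the partitions in parallel, merge partial results."""
    chunks = [points[i::workers] for i in range(workers)]
    with Pool(workers) as pool:
        partials = pool.map(count_cells, chunks)
    merged = {}
    for part in partials:
        for key, n in part.items():
            merged[key] = merged.get(key, 0) + n
    return merged

if __name__ == "__main__":
    print(parallel_density(POINTS))
```

Real cluster frameworks follow the same partition-process-merge pattern, but with distributed storage, spatial indexing, and fault tolerance on top.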
The next step is to ensure timeliness and data quality, which can be achieved, for example, via intrinsic evaluation criteria and machine learning.
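Intrinsic evaluation means judging data quality from the dataset's own metadata rather than against external ground truth. A minimal sketch under assumed metadata fields (revision count, number of contributors, attribute completeness, all hypothetical): a heuristic score that treats more revisions, more contributors, and richer attributes as indirect evidence of higher quality for a crowdsourced feature.

```python
# Hypothetical crowdsourced features with assumed metadata fields
# (versions, contributors, has_name); illustrative only.
FEATURES = [
    {"id": "a", "versions": 7, "contributors": 4, "has_name": True},
    {"id": "b", "versions": 1, "contributors": 1, "has_name": False},
]

def intrinsic_score(feature):
    """Heuristic score in [0, 1]: revision history, contributor diversity,
    and attribute completeness serve as intrinsic quality indicators."""
    score = min(feature["versions"], 5) / 5 * 0.4       # capped revision count
    score += min(feature["contributors"], 3) / 3 * 0.4  # capped contributor count
    score += 0.2 if feature["has_name"] else 0.0        # attribute completeness
    return round(score, 2)
```

In a machine-learning setting, such indicators would serve as input features to a trained quality model instead of fixed hand-tuned weights.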
Taken together, these steps enable entirely new forms of automated analysis of enormous spatio-temporal data volumes.