The 18-month project is funded by the German Federal Ministry for Economic Affairs and Energy (BMWi).
The project will demonstrate the integration of the innovative Earth datacube paradigm – i.e., analysis-ready spatio-temporal raster data – into commercial Earth Observation (EO) services as well as public cloud infrastructures. To this end, the world-leading European datacube technology (in database lingo: an "Array Database"), rasdaman, will be installed on the German Copernicus hub, CODE-DE, as well as in the commercial hosted processing environment of cloudeo, to offer exemplary federated analytics services.
Complementing the batch-oriented Hadoop service already available on CODE-DE, rasdaman will offer important additional functionality, in particular an interactive paradigm of "any query, any time, on any size", strictly based on open geo standards and federated with other data centres, in particular the geo services offered on cloudeo. On this platform, third parties can establish novel, specialized services in a fast, flexible, and scalable manner.
For industry and research, this will mean enhanced access to value-adding services supporting collaboration across disciplinary and geographical boundaries. The BigDataCube project is managed by Peter Baumann, Professor of Computer Science at Jacobs University, who emphasises: "Open standards for datacube analytics are available today, and have been proven on multi-Petabyte services. Now it is time to convince data users how they can benefit from such services on a routine basis."
Start: 01 Jan 2018
Duration: 18 months
Find us: www.bigdatacube.org
Led by Jacobs University, the BigDataCube project is developing flexible and scalable services for massive spatio-temporal Earth Observation (EO) data, offered as datacubes. This paradigm replaces millions of EO files with a few massive multi-dimensional space/time objects, such as 3D image timeseries and 4D weather forecast cubes. This way, raster data become ready for spatio-temporal analysis in the large.
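The stacking idea above can be sketched in a few lines; this is a minimal illustration using NumPy as a stand-in, not the rasdaman engine itself, and the scene data are synthetic:

```python
# Minimal sketch of the datacube paradigm: many single-time 2D rasters are
# stacked into one 3D (time, y, x) object, so a per-pixel time series becomes
# a simple array slice instead of reads across many separate files.
import numpy as np

# Hypothetical stand-ins for six EO scenes (one 2D raster per acquisition date).
scenes = [np.full((4, 5), t, dtype=np.float32) for t in range(6)]

# The datacube: one 3D space/time array instead of six separate files.
cube = np.stack(scenes, axis=0)          # shape: (time=6, y=4, x=5)

# Spatio-temporal analysis "in the large" reduces to array operations:
pixel_timeseries = cube[:, 2, 3]         # all time steps at one pixel
snapshot = cube[4]                       # one full image at time step 4
mean_over_time = cube.mean(axis=0)       # temporal average per pixel

print(pixel_timeseries)                  # [0. 1. 2. 3. 4. 5.]
```

A 4D weather forecast cube follows the same pattern with one more axis, e.g. (forecast_time, altitude, y, x).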
The goal of BigDataCube is to enhance access to value-adding services that support collaboration across disciplinary and geographical boundaries for industry and research. The massively simplified Big Data handling benefits users of existing services as well as new businesses, e.g., in agro-informatics: they do not need to develop or deploy complex technology and manage all the data themselves, but can use the data readily, thereby freeing resources for their core business. Hence, on the BigDataCube platform, third parties can establish novel, specialized services in a fast, flexible, and scalable manner.
Concretely, the project deploys the European datacube engine, rasdaman, in two infrastructures:
- the German Copernicus hub, CODE-DE, and
- the commercial hosted processing environment of cloudeo.
Further, the CODE-DE and cloudeo services will be federated, allowing users to combine datacubes from both services without having to download them first.
BigDataCube employs the multi-award-winning pioneer Array Database system, rasdaman, which enables "any query, any time, on any size" on massive n-D datacubes.
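To make "any query, any time" concrete: rasdaman queries datacubes through open OGC standards such as WCPS (Web Coverage Processing Service). The sketch below only constructs such a request as a URL; the endpoint host, coverage name, and time axis label are hypothetical placeholders, not the project's actual service:

```python
# Hedged sketch of an OGC WCPS request against a datacube service.
# Coverage name "S2_NDVI_Cube", axis label "ansi", and the host are hypothetical.
from urllib.parse import urlencode

wcps_query = (
    'for $c in (S2_NDVI_Cube) '                      # a 3D (time, y, x) datacube
    'return encode($c[ansi("2018-06-01")], "tiff")'  # one time slice, as TIFF
)

# WCPS queries are commonly sent as key-value-pair GET requests to a WCS endpoint:
params = urlencode({
    "service": "WCS",
    "version": "2.0.1",
    "request": "ProcessCoverages",
    "query": wcps_query,
})
url = "https://example-datacube-service/ows?" + params  # hypothetical endpoint

print(url)
```

The same query shape scales from a single slice to cube-wide aggregations, which is what the "on any size" claim refers to: the server evaluates the expression next to the data instead of shipping files to the client.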
Internationally recognized experts are teaming up for developing the next generation of services:
The Large-Scale Scientific Information Systems research group at Jacobs University has a worldwide reputation in Array Databases and is actively shaping the OGC, ISO, and INSPIRE datacube standards.
rasdaman GmbH is a technology leader in high-performance Array Databases for serving datacubes and, together with Jacobs University, a shaper of datacube standards.
cloudeo is a leading specialist in scalable geo-infrastructure, bringing together data, software, and processing power with its GeoMarketplace as a one-stop shop for GeoServices.
Photo: Heike Hoenig (rasdaman GmbH)
The BigDataCube project team
(from left to right):