It depends on what data we will be able to get. For example, there's GLiM - the Global Lithological Map, a 250 m resolution map of surface rock types - but when we contacted them they wrote that their data are not available for non-academic use.
It seems that many academic projects have adopted this strategy lately - fighting for grants, they keep their data exclusive, yet they don't offer commercial licenses either, because the data were obtained via public funding. Another case is the forest maps, a project OT is cooperating with (providing a special build for visualization, for free). Yet we can't get the distribution models from them ... all we can get is "publicity", which here means more exposure in academic circles, but that's for naught if we can't get anything useful in exchange.
So it looks like we'll have to collect and process the raw data ourselves, but obviously that will take much longer ...
... well, I've always said science would be degraded by its economic side. And for quite some time now, publicly funded science has been out-financed into corporate-like legal behavior ... not just some of it, but more like most of it. (I see that quite heavily in the chemistry field, where even grant projects may, in some legal whirlpool, be handled almost like patented material by certain "specific private economic subjects".)
... I'm also sure that if our "system" crashes, among the first things to disappear will be a ton of science papers from a lot of servers, and for good - just because of the way they're handled globally.
What is the raw data here? ... you mean going through all the primary local information on vegetation distribution and pasting it piece by piece into your own global data pack? For the whole planet?