Descartes Labs launches its new platform for analyzing geospatial data
Descartes Labs, a well-funded startup based in New Mexico, provides businesses with geospatial data and the tools to analyze it in order to make business decisions. Today, the company announced the launch of its Descartes Labs Platform, which brings that data together with the tools data scientists (including those with no background in geospatial analysis) need to work with the imagery, analyze it and build machine learning models on top of it.
Descartes Labs CEO Phil Fraher, who took this position only a few months ago, told me that the company’s current business often includes a lot of consulting work to get its customers started. These customers span the range from energy and mining companies to government agencies, financial services and agriculture businesses, but many don’t have the in-house expertise to immediately make use of the data that Descartes Labs provides.
“For the most part, we still have to evangelize how to use geospatial data to solve business problems. And so a lot of our customers rely on us to do consulting,” Fraher said. “But what’s really interesting is that even with some of our existing customers, we’re now seeing more early adopters, more business and analysis teams and data scientists being hired, that do focus on geospatial data. So what’s really exciting with this launch is we’re now going to put our platform tool in the hands of those particular individuals that now can do their own work.”
In many ways, the new platform gives these customers access to the same tools and data that Descartes Labs' own team uses, letting them collaborate with the company on their problems and use the new modeling tools to build solutions for their individual businesses.
“Previously, a data science team at a company that’s interested in this kind of analysis would also have to know how to wrangle very large-scale or petabyte-scale Earth observation data sets,” Fraher said. “These are very unique and specific skillsets and because of that kind of barrier to entry, the adoption of some of this technology and data sources has been slow.”
To enable more businesses to start working with this data (and become Descartes Labs customers), the company is betting on industry-standard tools: hosted Jupyter notebooks, Python support and a set of APIs. The platform also includes tools to transform and clean the incoming data from Descartes Labs' third-party partners in order to make it usable for data scientists.
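As a rough illustration of the kind of notebook workflow the platform targets, a data scientist might derive a vegetation index from multispectral imagery and train a simple model on it. The sketch below simulates the imagery with random arrays; it does not show Descartes Labs' actual APIs, and the band names and labels are made up for illustration.

```python
# Minimal sketch of a notebook-style Earth observation workflow.
# The imagery is simulated with random arrays standing in for tiles
# that would normally be fetched through a platform's data APIs.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(seed=0)

# Pretend tile: 256x256 pixels with red and near-infrared bands (reflectance 0..1).
red = rng.random((256, 256))
nir = rng.random((256, 256))

# A common derived feature in remote sensing: NDVI (normalized difference vegetation index).
ndvi = (nir - red) / (nir + red + 1e-9)

# Toy per-pixel labels ("vegetated" vs. "not vegetated") just to demonstrate model fitting.
labels = (ndvi > 0.2).astype(int).ravel()

# Stack per-pixel features and train a simple classifier.
features = np.stack([red.ravel(), nir.ravel(), ndvi.ravel()], axis=1)
model = RandomForestClassifier(n_estimators=50, random_state=0)
model.fit(features, labels)

print("training accuracy:", model.score(features, labels))
```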
“It’s not just like some simple ETL-like data processing pipeline,” Descartes Labs’ head of Engineering Sam Skillman noted. “It’s something where we have to combine very in-depth data science, remote sensing and large-scale compute capabilities to bring all of that data in in a way that normalizes it and gets it ready for analysis.”
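To make that point slightly more concrete: "normalizing" here means more than file conversion, since imagery from different sensors arrives at different scales and resolutions that have to be reconciled before pixels are comparable. A deliberately simplified sketch of one such step follows; the sensor names and scale factors are invented for illustration, not real calibration values.

```python
# Simplified sketch of one normalization step: rescaling raw digital numbers
# from two hypothetical sensors onto a common 0..1 reflectance scale.
# The sensor names and scale factors are illustrative only.
import numpy as np

SCALE = {"sensor_a": 1 / 10000.0, "sensor_b": 1 / 65535.0}  # made-up factors

def to_reflectance(digital_numbers: np.ndarray, sensor: str) -> np.ndarray:
    """Map raw digital numbers to a common 0..1 reflectance scale."""
    return np.clip(digital_numbers * SCALE[sensor], 0.0, 1.0)

a = to_reflectance(np.array([1200, 4500, 9800]), "sensor_a")
b = to_reflectance(np.array([8000, 30000, 64000]), "sensor_b")
print(a, b)  # values from both sensors are now directly comparable
```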
All of this analysis is handled in the cloud, of course.
The new platform is now available to businesses that want to give it a try.