A Chasm No Longer for Geospatial? It’s an Abyss that’s Bigger than Ever.
In 2008, I wrote an article called “GIS Next – A Chasm Crossed” as the editor of Directions Magazine. At that time, I suggested that the diversity of applications and players in the market had expanded significantly, and that geographic information systems would become embedded in other enterprise software rather than existing as a niche solution. Applications such as BI tools and enterprise databases had already begun to support geographic data primitives for standard geospatial queries and visualizations. I also believed that the acronym “GIS” would disappear, and indeed, today the terms “geospatial” and “location intelligence” more aptly describe this technology sector and are more widely used in the vernacular.
Today, almost 15 years later, the chasm has not only been crossed; it seems to no longer exist. Geospatial technology has indeed become embedded in so many applications that maps and data are table stakes in enterprise software, from Salesforce to Snowflake. The very expectation of geoprocessing capabilities within everything from CRM and BI to cloud platforms suggests that whatever market education was required 15 years ago is at least less necessary today. And in the mobile app space, maps are considered a valuable, sometimes indispensable, visualization tool. Bottom line: geospatial is not special anymore.
Here, however, I want to challenge my own theory. Because of the increasing volume of location-based data, the capabilities and complexity of geospatial technology have had to advance as well. Both mobility data and satellite imagery are feeding a hunger for more hyperlocal analysis of everything from consumer expenditures to targeted mobile advertising to advanced AI for feature recognition in remotely sensed Earth observation data. As a result, the chasm between what the professional geospatial technology community is producing and what prospective users know is available is huge.
Let me explain further. Today’s business intelligence users of geospatial data and technology are just now becoming comfortable with the basics, such as isochrone and isodistance analysis of retail trade areas or origin-destination models for transportation analytics, to name just two examples. Fifteen years ago, trade areas based on ZIP codes or census block groups were good enough. That’s not the case today. Business users understand the value of location data and are using it to the best of their abilities. And if there was any silver lining to the pandemic, it was Johns Hopkins University’s dashboard showing the geographic extent of COVID-19. It was visually informative. It provided answers, not just data. Drilling down on specific regions where infections were spiking led to policy action.
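To make the isodistance idea concrete, here is a minimal sketch in Python; the store coordinates are hypothetical, and shapely and pyproj are assumed to be available. A straight-line buffer is only a first approximation of a trade area; a true drive-time isochrone would come from a routing engine such as OSRM, Valhalla, or openrouteservice.

```python
# Minimal isodistance sketch (illustrative only; coordinates are hypothetical).
from pyproj import Transformer
from shapely.geometry import Point
from shapely.ops import transform

store_lon, store_lat = -73.9857, 40.7484  # hypothetical store location (WGS84)

to_meters = Transformer.from_crs("EPSG:4326", "EPSG:3857", always_xy=True)
to_degrees = Transformer.from_crs("EPSG:3857", "EPSG:4326", always_xy=True)

# Project to a metric CRS, buffer by 2 km, then project back for mapping.
store_m = transform(to_meters.transform, Point(store_lon, store_lat))
ring_m = store_m.buffer(2_000)  # 2 km straight-line "isodistance" ring
trade_area = transform(to_degrees.transform, ring_m)

print(trade_area.bounds)  # bounding box of the approximate trade area
```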
In addition, we are seeing larger corporations, such as insurers, deploy cloud-native geoprocessing to perform address verification, data quality checks, and geocoding on policyholder records, work that requires a highly accurate address fabric. Accurate addresses, in turn, support more complex risk models, because the amount of data that can be incorporated into those models has grown and improved. But as we at Korem have observed in working with our insurance, banking, and retail clients, the use of these data and software is far from ubiquitous, and others are seeing it too.
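As a rough illustration of that geocoding and address-verification step (not the specific stack any of our clients use), here is a minimal sketch using the open-source geopy library and the public Nominatim service; a production insurer would typically rely on a commercial geocoder backed by a curated address fabric, and the sample address is purely illustrative.

```python
# Illustrative geocoding/address-verification sketch using geopy + Nominatim.
from geopy.geocoders import Nominatim

geocoder = Nominatim(user_agent="address-check-demo")  # hypothetical app name

policyholder_address = "1600 Pennsylvania Ave NW, Washington, DC"
result = geocoder.geocode(policyholder_address)

if result is None:
    print("Address could not be verified; flag for manual review")
else:
    print(result.address)                    # normalized address string
    print(result.latitude, result.longitude) # coordinates for risk modeling
```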
IDC, the market research firm, is observing similar trends and has reported that 54% of enterprises are challenged by a lack of geospatial intelligence. Boston Consulting Group reports that only 15% of companies surveyed qualify as “location intelligence” leaders. And according to a survey conducted by Splunk, 55% of the data organizations collect goes unused, either because they don’t know they have it or don’t know how to use it.
The software tools that now incorporate more geospatial analysis (let’s pick on Tableau) are much better today at supporting the types of queries users expect will give them answers. Tools such as Alteryx, for example, offer an enterprise designer function that helps users create very sophisticated workflows to answer complex questions through data integration. These are fantastic tools, but they also require dedicated, experienced users.
And so, here’s the gap: the BI tools are better and the data integration tools are very comprehensive, but users need both more experience with those tools and a better understanding of the types of data necessary to answer complex questions. More tools and more data put more pressure on providers to keep adding functionality. The gap between the complexity of the technology and those who need to know what’s possible therefore continues to widen, if for no other reason than that the demand for data analytics keeps growing.
In short, the gap between technology and expertise keeps widening as the technology, the appetite for new data, and the solutions built on both continue to advance.
You may want to argue that this is the case for all disruptive technologies, especially today. Here, I’ll make the argument that “spatial is special.” Why? Because the Earth changes every day; people are constantly in motion; vehicles are more computers than they are rubber and steel, and are becoming even more connected to road and satellite sensors; every financial transaction is tagged by location.
To give you an idea of how much location-based data is being generated: according to Strategy Analytics, connected cars will generate 300 terabytes of data every year and rely on multi-gigabyte high-definition maps. In an article published by Data Center Frontier, commercial satellite imagery companies are reported to be collecting upwards of 100 terabytes of data per day, or roughly 36 petabytes per year. (By comparison, Earthweb reports that YouTube stores 76 petabytes of video per year.) Foursquare reports that it has collected more than 100 million points of interest globally, each with attribution. Ecopia reports 173 million building footprints and over 240 million addresses. WhatsApp messages, tweets, Facebook posts, Strava workouts, and data from pollution sensors and smart meters all broadcast location data. The numbers add up.
Let’s take an example closer to home. We had a customer that wanted to understand more about consumer behavior and activity near its physical locations. They determined that they needed building footprint data, but downloading records from every county and city government where data existed became tedious. They were simply unaware that nationwide building footprint data, like that mentioned above, existed.
“I did not have the time to go to 3,000 county websites and pull down 3,000 county websites’ worth of GIS data and then spin that into something,” said Ben Edelman, Principal Consultant at Dun & Bradstreet. Our client didn’t know what he didn’t know.
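For illustration only, here is a minimal sketch of what working from a single nationwide (or state-level) building footprint file can look like with geopandas; the file names below are placeholders, not real download links, and openly licensed footprint datasets typically ship as (Geo)JSON that geopandas reads directly.

```python
# Illustrative sketch: count building footprints inside a trade area
# instead of stitching together thousands of county downloads.
import geopandas as gpd

# Placeholder paths (assumptions, not real URLs or product names).
footprints = gpd.read_file("RhodeIsland.geojson")          # statewide footprints
trade_area = gpd.read_file("trade_area.geojson").to_crs(footprints.crs)

nearby = gpd.clip(footprints, trade_area)                   # footprints in the area
print(f"{len(nearby)} building footprints fall inside the trade area")
```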
Today, it’s nearly impossible to know every data product vendor and every provider of software capable of processing geospatial data. Recently, Safegraph attempted to map just the geospatial data ecosystem and identified approximately 150 unique vendors. On top of that, more non-traditional GIS vendors are offering geo-enabled software. The ecosystem for geospatial technology is now huge and growing. The chasm between the need for highly accurate, up-to-date data and the awareness of what is actually available is wider than ever, and users are staring into an abyss.