High-Fidelity Data and New Tools for GEOINT

06 July 2015 by Coleman McCormick

Two weeks back I was in Washington, DC at the GEOINT Symposium, the major industry event for the geospatial intelligence market, put on by the excellent USGIF. Each year a growing number of companies with a primarily commercial focus showcase their technology for government users, a sign that the government community is shifting more and more toward off-the-shelf and consumer tech.

This year’s theme was “opening the aperture”, a term that can be interpreted in a couple of ways. USGIF CEO Keith Masback’s take on the theme is that the use of geospatial intelligence is expanding beyond the national and homeland security spaces. It now sits “alongside business intelligence, location-based technologies, remote sensing, and analytics” in the business world as well.

Geotagged photos from Flickr + Picasa, Helsinki, Finland

I would add another way to interpret the theme: “opening the aperture” also means consumers of GEOINT are expanding their areas of interest to accept, process, and understand information from new avenues, including microsatellites, commercial spatial data sources, and the Internet. Widening the net on the data and tools used to solve GEOINT problems gives users new methodologies to apply and new datasets to crunch in search of answers.

The big-picture resources, like near real-time satellite feeds of whole countries, Twitter streams of billions of data points, and sensor-based feeds that produce petabytes of data per day, are all excellent. But they require the innovative people showcasing their tech at GEOINT to build the processing and analytical tools that extract meaning from these firehoses of content.

The industry has to be careful, though, about putting too much confidence in easy-to-get data and black-magic analytical processing engines. Sometimes I sense a trivialization of the problems the industry faces in using data (particularly spatial data), which leads to bad assumptions about which technology can be applied to which problem. Just listen to the buzzwords in the community about “big data”, “Hadoop clusters”, and “dataviz” and it starts to sound like magic wands you point at your problem for an answer. I’m a believer that all of these technologies have immense value when applied to the right problem, and when it’s well understood how they should be employed.

I’m particularly interested in how Fulcrum can play a role in this process. We can now take advantage of new streams like Internet-based data sources (think tweets and Foursquare check-ins) as a means of geographic focus. Fulcrum’s value is in high-fidelity, ground-observed information. And because it requires an active rather than a passive means of data capture, getting the most out of the toolset means focusing your data collection activities on the areas of highest importance. Mining the seams of social media can give indicators of where that focus should be turned. Which neighborhood in the city is the most mentioned on Twitter? What types of venues and events are people visiting in the neighborhood?
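As a rough illustration of that idea, here’s a minimal sketch of ranking neighborhoods by how many geotagged posts land inside them. The neighborhood names and bounding boxes are entirely hypothetical, and a real workflow would pull points from a social media API and use actual polygon boundaries rather than simple boxes:

```python
from collections import Counter

# Hypothetical neighborhood bounding boxes: name -> (min_lon, min_lat, max_lon, max_lat)
NEIGHBORHOODS = {
    "Kallio": (24.94, 60.17, 24.97, 60.19),
    "Kamppi": (24.92, 60.16, 24.94, 60.17),
}

def neighborhood_for(lon, lat):
    """Return the first neighborhood whose bounding box contains the point, else None."""
    for name, (min_lon, min_lat, max_lon, max_lat) in NEIGHBORHOODS.items():
        if min_lon <= lon <= max_lon and min_lat <= lat <= max_lat:
            return name
    return None

def rank_neighborhoods(points):
    """Count geotagged posts per neighborhood, most-mentioned first."""
    counts = Counter()
    for lon, lat in points:
        name = neighborhood_for(lon, lat)
        if name:
            counts[name] += 1
    return counts.most_common()

# Toy sample of (lon, lat) points, standing in for a parsed tweet/check-in feed
sample = [(24.95, 60.18), (24.96, 60.175), (24.93, 60.165)]
print(rank_neighborhoods(sample))  # [('Kallio', 2), ('Kamppi', 1)]
```

The ranked output is the kind of signal that could tell a field team which areas deserve active, ground-observed collection first.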

Because Fulcrum can also serve as a means of work dispatch to users in the field, analysts with questions derived from these new buckets of data have a way to ping their field resources, whether that’s a soldier on the ground in the government case, or a regional sales rep in a commercial business studying competitors’ strategies and tactics in distribution. Fast access to the right resources in the field at the right time can save immense resources and ramp up the ROI for everyone.

Bryan recently posted another piece that’s relevant here, highlighting a few of the thousands of Fulcrum users and how they’re using Fulcrum to drive value to their organizations while actually saving money and getting better answers. One of the most relevant things on my mind as I browsed the latest capabilities being showcased at GEOINT was how powerful new resources for back-office data analysis and visualization can help steer field resources in the right direction.

Photo: Eric Fischer

About the author

Coleman is a geographer and our Executive VP, working every day with our customers to bring better data management capability to their operations.

