Collaboration and Commercialization in the GEOINT Business

By Rafael de Ameller, Robert J. Farnsworth, David Gauthier, Lisa Spuria, and Steven Truitt

This article was originally published in USGIF’s State & Future of GEOINT Report 2017.

The geospatial industry is at an inflection point similar to the microelectronics sector in the ’70s and ’80s. What was once a capital-intensive and single-purpose industry dominated by government influence and a few large entities is becoming publicly available and multipurpose with myriad smaller contributors. This trend has wide-ranging effects, many of which are discussed in this State of GEOINT Report, although none is more important for the future of the geospatial industry than how the definition of products and services will drive collaboration and competition.

Collaboration emerges when there is sufficient diversity that partnership is economically efficient. Collaboration often coexists with competition, because an environment with many connections between organizations creates opportunities for both invention and rivalry. The coupled growth of collaboration and competition incentivizes incumbents and customers to focus on core values, and it is both the motive for and the governing factor of innovation and the range of potential value the geospatial community can collectively deliver.

This article covers two main trends in collaboration and commercialization and draws conclusions from their intersection. The first trend is the shift from a government-centric to a commercial-centric environment. The second concerns the fundamental nature of geospatial products and services: how commoditization, specialization, and niches evolve from the structure of the industry. We close with a discussion of where the geospatial community would do well to cooperate versus where ruthless competition is for the better.

Passing the Baton: From Government to Commercial

Many commonplace technological advances began with federal investments and requirements. These investments did not have immediately obvious commercial applications or any plan for transitioning them to civilian life, but they proved worthwhile on a national scale. The military especially has a strong historic influence on technology, with many high-tech programs initially shrouded in secrecy or grown out of necessity to solve social challenges. However, as observed repeatedly, numerous technological advances have found their way into commercial products, advanced society, and sparked entirely new industries.

Many of the phenomenal technological changes of the last few decades stemmed from major research and development investments by federal government agencies and departments. Indeed, in her book, The Entrepreneurial State, Mariana Mazzucato wrote, “Truly radical innovation needs patient, long-term, committed finance. This type of finance is hard to find in the short-termist private sector. So it’s no surprise that modern capitalism has seen the increased role of the state in providing patient capital and directly investing in innovation development.”[1]

Programs such as NASA’s mission to the moon and the Defense Advanced Research Projects Agency’s (DARPA) development of ARPANET resulted in profound changes to global life — today we are launching commercial imaging satellites, planning for civilian travel to Mars, and the world is connected globally via the internet. Other investments, which perhaps do not receive the same level of notoriety, have had many of the same results:[2]

● U.S. Air Force/RAND/DARPA investment in artificial intelligence has driven advances in machine learning.

● DoD/NIST/DARPA focus on GPS for military uses now enables location analytics for smartphones, automobiles, fitness trackers, etc.

● National Laboratories investment in supercomputing for nuclear programs drove major advances in computing power and applications.

● U.S. Air Force investment in red LED lights has been a catalyst for the replacement of fluorescent and incandescent lights.

● Seismic imaging developed for nuclear testing by the National Labs/DOE has had major implications for oil and gas exploration.

GEOINT, too, is following this trend of moving technology from the government sector to the private sector as cost efficiencies and commercial applications are identified. Specifically, the emerging trends are: a move away from a government-only analytic workforce and products toward commercially developed analytic products; a focus on developing automation tools to meet the coming persistent imagery environment; and the creation of analysis-as-a-service, through which consumers primarily purchase information derived from imagery and other sources rather than the raw data itself.

In-depth analysis of GEOINT sources and specific products such as maps has traditionally been produced and consumed by governments and large organizations. These specialized products were developed to assist decision-makers on policy, military operations, civil engineering, land use, disaster response, and other problem sets. In recent years, with the arrival of commercial imagery, the commercial market has developed a range of analytic products, some industries faster than others. In-depth analysis of imagery, a specialized tradecraft, has been applied to a number of new problem areas and has driven the training of new analysts in its use.

These shifts in the GEOINT enterprise, as a result of long-term investment by the federal government, represent the current revolution underway and epitomize the full life cycle of new technologies making the shift to commercially enabled capabilities.

Government’s Shift to Consumer

As industrial capabilities and academic knowledge of geospatial technology and analytics continue to grow, it is inevitable that innovation and advancements in commercial technology will outpace government-run research and development. This trend has appeared in many technology sectors over time, including the modern-day transistorized computer and cloud solutions. Whereas computer hardware manufacturing is capital intensive and took years to extend its reach beyond government-funded development, the deployment of technology services seemingly happened overnight.

As with technology, the role of government will change as new GEOINT services become available and commercial markets drive innovation. As automation achieves trusted status and commercial markets witness the opportunities offered by frequent coverage, the government sector will reach developmental maturity. GEOINT Analysis-as-a-Service (GaaS), available to all consumers across industries and government, will come to the forefront. As this occurs, government agencies will no longer need a large analytic workforce to perform traditional imagery analysis and can either downsize or shift those analysts to more difficult or complex tasks. Commercial analytics companies will be able to provide automated and detailed analysis, and industries looking for new insights will be able to subscribe to these sophisticated GEOINT analytic services.

At a minimum, the U.S. government will need to understand the relative value of many new service providers and ensure it does not purchase the same core services many times over. Like today’s GPS, a universal geospatial service may launch thousands of useful lightweight applications for the public. Yet this raises concerns for public safety and consumer protection. Should the government also publish its knowledge of the market and recommended services to ensure private citizens get a fair service for the cost? Growing pressure for regulation will develop as the industry gains momentum and questions such as these arise. Topics will possibly include the governance of international standards, a means of accrediting service providers, and the monitoring of market forces.

The industry must recognize the strategic significance of geospatial information among nation-states, and therefore the government’s need to maintain a healthy commercial base, and balance that pressure against the economic pressure to outsource. For example, the medical community routinely outsources radiological expertise to doctors who are experts in exploiting medical imaging sensor data.[3] It remains to be seen whether the United States will do the same with the analysis of crop health, urban parking optimization, or perhaps even military intelligence.

The Commercial Innovation Explosion

The insights companies gain by adopting geospatial intelligence capabilities have led to improvements in performance and productivity as well as reductions in cost. Industries and professionals increasingly require GEOINT to remain competitive. Implementation of a geospatial capability has to date been defined by four major categories: collection, storage, processing, and visualization.

Commercial companies small and large, governments, and nonprofit organizations around the world are taking advantage of geospatial tools. Software solutions increasingly allow companies with limited budgets to leverage geospatial data without investing in infrastructure or staff. Empowering individuals to make better decisions while interacting with mapping apps and services has brought geospatial technologies to all market verticals and promoted geospatial innovation. This growth in the available market has spurred fierce competition among providers of similar services and strong collaboration among those that provide different services, all in order to meet the rising demand for an integrated whole.

From geo-tagged, crowdsourced data collected from smartphones to organizations installing networks of sensors and geo-tracking their entire workforce, hardware and software firms are continuously innovating to bring collection solutions to an expanding market. These continuously growing streams of data require storage capable of accumulating all of the information. Cloud technology providers allow organizations to benefit from massive economies of scale on shared infrastructure, enabling data accessibility and increasing efficiency while reducing the cost of scalable storage.

In addition to continuously improving open-source and commercial desktop software for processing, geospatial software firms partnering with cloud computing providers now offer Software-as-a-Service (SaaS) web-based GIS solutions. This ability to offer specific services is a direct effect of collaboration. The relatively low barriers to entry for both consumers and producers have led to strong competition and a growing range of options. As a result, many industries, relieved of the burden of maintaining in-house desktop or server GIS software, have finally brought geospatial solutions into their enterprise business intelligence. Visualization is how insights from geospatial data become operational and useful for decision-making. Open standards, APIs, data access, and software bring GIS tools previously available only to specialists to the entire workforce, and visualization service providers can now form viable businesses around niche needs due to lower costs.
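To make the role of open standards concrete, the sketch below shows how a client might request a rendered map from a web GIS service using the OGC Web Map Service (WMS) standard. The endpoint URL and layer name are hypothetical placeholders, not a real service.

```python
# Minimal sketch: request a rendered map from a (hypothetical) WMS endpoint.
# Because WMS is an open standard, the same request works against any conforming server.
import requests

WMS_ENDPOINT = "https://example.com/geoserver/wms"  # hypothetical service URL

params = {
    "service": "WMS",
    "version": "1.3.0",
    "request": "GetMap",
    "layers": "demo:land_use",        # hypothetical layer name
    "crs": "EPSG:4326",               # coordinate reference system
    "bbox": "38.8,-77.1,39.0,-76.9",  # lat/lon bounds (WMS 1.3.0 axis order for EPSG:4326)
    "width": "512",
    "height": "512",
    "format": "image/png",
}

response = requests.get(WMS_ENDPOINT, params=params, timeout=30)
response.raise_for_status()
with open("land_use.png", "wb") as f:
    f.write(response.content)
```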

The Nature of the Beast: Geospatial Products and Services

The products and services available in the GEOINT environment are the other side of the equation in understanding how organizations collaborate and compete. Just as there are trends in the government and commercial sectors, there are trends within each product and service category. This section describes critical characteristics and commonalities of available products and services, broken down into the rough categories of remote sensing, extracted information, and detailed analysis. Competition within and collaboration across sectors and these product and service categories will be the defining influence on GEOINT for the foreseeable future.

Remote Sensing

Remote sensing, especially from high altitudes or space, is a technically difficult problem whose complexity is bounded by the physical characteristics of the world. This means that over time incremental refinements make common what was once extremely expensive. This is playing out today with the widespread implementation of overhead imagery satellites — a capability almost exclusively reserved for large governments not more than a decade ago. Diminishing costs will mean these capabilities continue to become more impressive and more common.

As these capabilities grow, the requirement to automate and streamline processes becomes ever more important. Initial space-based imagers provided reels of film unique to each platform and to the specific analysts who viewed them, often taking weeks or months to yield the first insight. Today, one can look at a variety of web-based maps and see orthomosaics or 3D renderings seamlessly stitched from multiple sources — all done autonomously and in near real time. This occurred as a result of standardization to a few commonly accepted imagery formats so that, within reason, any analyst can use products from any sensor.
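As a small illustration of why common formats matter, the sketch below reads a GeoTIFF, a widely accepted delivery format for georeferenced imagery, using the open-source rasterio library. The file name is a placeholder, and the same few lines apply regardless of which sensor or vendor produced the scene.

```python
# Minimal sketch: inspect any vendor's GeoTIFF with the same standard tooling.
import rasterio

with rasterio.open("scene.tif") as src:          # placeholder file name
    print(src.crs)                               # coordinate reference system
    print(src.bounds)                            # geographic footprint
    print(src.count, src.width, src.height)      # band count and pixel dimensions
    band1 = src.read(1)                          # first band as a NumPy array
```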

This standardization, the ability to depend on repeatable resources, has led to the growth of downstream businesses and wide adoption of the basic mapping and navigating capabilities geospatial information has enabled. As these trends persist, direct competition will continue to drive optimizations and turn once specialized capabilities into commodities — so long as the community’s collective agreement to collaborate using common formats remains. It is imperative for both open competition among collectors and the continuation of collaboration among consumers that the collected source data (imagery or otherwise) continues to fit standard formats and converge to well-accepted representations.

Extracted Geospatial Information

The discovery of new insights about the world and patterns of activity using advanced geospatial data analytics requires the benefits of scale and interoperability across multiple data sets. Achieving the required level of scale demands widespread extraction of geospatial information across all sources and phenomenologies. One can think of this as akin to “atom smashing” in particle physics, which is used to reveal the existence of hidden particles and sub-particles that make up the structure of the universe. Similarly, geospatial experts can use humans and machine learning algorithms to “smash” little bits of vector geospatial information out of every image, video, data cube, or piece of free text. This might reveal a new bridge, a change detection result, a vehicle tracklet, or the room number in a building at a certain postal address. Achieving the necessary interoperability demands that all sources adhere to appropriate standards and that the resulting information be geospatially tagged, so its metadata can describe its own structure to the entity seeking to use it and enable automated discovery.
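To make this concrete, the sketch below expresses one such “smashed-out” piece of information as a GeoJSON feature so it can be geospatially tagged, discovered, and combined with other sources. Every field name and value is an illustrative assumption rather than an established schema.

```python
# Minimal sketch: one extracted observation represented as a GeoJSON feature.
extracted_feature = {
    "type": "Feature",
    "geometry": {
        "type": "Point",
        "coordinates": [-76.95, 38.92],            # lon, lat of the detected object
    },
    "properties": {
        "observation": "new_bridge",               # what was extracted (hypothetical)
        "observed_at": "2017-01-15T14:32:00Z",     # time of the source collection
        "source_type": "electro-optical imagery",  # phenomenology of the source
    },
}
```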

The greatest challenge for utility in such a data-rich environment will be the lineage and pedigree of the extracted content. How is a pattern recognition algorithm to know the accuracy and precision of the source that generated a change detection result? What is the probability of a false positive? With regulated standardization, a piece of vector data could be expected to self-report both the spatiotemporal accuracy and precision of the original sensor content from which it was extracted and the receiver operating characteristic (ROC) curves, accuracy, and precision of the algorithm(s) used to create it. This context is needed to correctly interpret one sub-particle of information in a universe of data. Without standards, however, data could harm GEOINT analysis through misunderstanding or be unusable and therefore of no value. This presents a challenge for the GEOINT Community to solve: We need to acknowledge and account for the richness of information both in and behind our insights, yet there is not yet a well-accepted way to capture and convey it. As a result, a large market of geospatial products and services is being made available with undetermined value due to a lack of standards.
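The sketch below suggests what a self-reported lineage block attached to such an extracted feature might contain. The field names, values, and schema are hypothetical illustrations of the idea, not an existing standard.

```python
# Minimal sketch: a hypothetical lineage block carried with each extracted feature.
feature_properties = {"observation": "new_bridge"}  # properties of an extracted feature

feature_properties["lineage"] = {
    "source_sensor": {
        "horizontal_accuracy_m": 5.0,    # spatial accuracy of the source imagery
        "timestamp_accuracy_s": 1.0,     # temporal accuracy of the collection
    },
    "extraction_algorithm": {
        "name": "change_detector_v2",      # hypothetical algorithm identifier
        "probability_of_detection": 0.92,  # one operating point on its ROC curve
        "false_alarm_rate": 0.05,
    },
}
```

With fields like these, a downstream pattern recognition algorithm could weigh a change detection result by the accuracy of the sensor that produced it and the false-alarm behavior of the algorithm that extracted it.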

There are several ways this environment can evolve: the emergence of a regulator; self-organization through independent standards organizations such as the Open Geospatial Consortium; walled gardens controlled by one or a few large companies; or a continuation of the current uncertainty. The most direct path is for the government or another watchdog organization to provide market knowledge and consumer advice; it could endeavor to inspect, investigate, and accredit third-party products and services and even publish its value determinations. Regardless of the path and implementation, the use of standards to bring rigor to published lineage information for data, products, and services could have a widespread positive effect. Vendors in this environment would compete by provisioning higher-quality services of known value, with independent proof of value to differentiate their superior offerings from the noise of unvalued and lower-quality content. This would be a welcome result of competition and collaboration. Alternatively, a market could develop whereby purveyors of extracted content and analytics form exclusive business relationships with specific remote sensing vendors, allowing them to compete based on exclusivity rather than product quality. In that case, a customer would be left to choose among many vendors whose services have equally unverifiable lineage and unknown value. This is the most likely (and undesirable) effect of the walled garden scenario.

In-Depth Analysis

The pinnacle of value created by geospatial content is in-depth analysis that can lead to strategic business or policy decisions. This type of analysis requires professionals who understand the factual content, the underlying measurements, the limitations of both, and the context within which their customer operates. By combining all of these, along with background on the subject, a skilled geospatial analyst can draw remarkable conclusions about a wide variety of topics.

One attribute of detailed analysis is that analysis created for one purpose or organization will rarely resemble that of another. For example, a military analyst may identify the patterns of an adversary and use them to predict future behaviors, while a very similar process may lead a business intelligence analyst to identify popular times for competing stores. The end products will look different and inform different needs, even though the processes used may be similar.

Additionally, analysis often incorporates information proprietary to the organization that sponsors it. The very questions being asked of GEOINT can be quite sensitive. This may be innocuous for some, but for others it serves as a strong disincentive to share in-depth analysis. Even for those who are willing or eager to share their in-depth analysis, the value it provides can be opaque to others not in similar positions. Yet this is not always true; for example, deep analysis of the distribution of food in a disaster area can be of common concern, while a detailed understanding of the resulting food preferences is less so. This variance, and the time involved in making use of or even understanding deep analysis, places large barriers to entry on mass production.

Conclusion

Actionable insight is found when the products and services available can be combined in a useful way to answer (or pose) questions. There are several major conclusions that can be drawn.

First, there is a large opportunity in the near term to make information extracted from already available data broadly accessible. The one commoditized product in the GEOINT world is that of remotely sensed data. With nearly all types of sensors, there is a reasonably interchangeable common product expected as an output. This common basis can be used to build the next level of products and services, and in the process to explore and standardize the range of extracted information. Furthermore, the shift from a government-dominated remote sensing constellation to a commercially owned and operated one will only accelerate this opportunity. The more commonality there is at the information level, the faster the demand growth will become as additional businesses depend on these streams of information.

Second, analysis and the generation of actionable knowledge will continue to be specialized and custom-built until there is more growth and agreement on standard practices at lower levels. Even if the extraction of factual information rapidly becomes a commodity, the application of those facts to a particular business or mission will remain unique for the foreseeable future as a result of the many ways it can be applied. What will change, and presents an opportunity for analysis organizations, is the potential for automation and detail that can be applied to an analysis problem. This ability to customize answers to the problem at hand will remain a key differentiator for custom product generation in the next few years, while the increasing availability and reliability of data and information will reduce the costs of doing business.

A strong influence on the expected specialization of analysis is the increase in data sources, particularly the seeming inevitability of persistent imagery coverage. This growth will give rise to new applications of imagery and provide new insights to many industries, resulting in even more capability that will require the development of standard approaches. Currently, a heavy focus of the persistent imagery industry remains on new collectors: getting them built and launched. Lagging behind are efforts to address the analysis of the data collected by these sensors. As the collection platforms mature, the industry will take a more aggressive approach to image analytics by developing more sophisticated automation tools to realize the full potential of these new sensors. Automation will be one of the biggest changes to the GEOINT enterprise in decades. The realization of scalable, automated feature extraction and automated change detection will transform imagery analysis by taking the tedium out of mundane tasks that currently consume large numbers of analyst hours. This shift to a greater reliance on automation will free analysts to focus on more complex analytic problems, growing this portion of the industry.
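As a simple illustration of the automated core of such tools, the sketch below flags pixel-level change between two coregistered, radiometrically comparable images. A production pipeline would add registration, normalization, and filtering, and the threshold value here is an arbitrary assumption.

```python
# Minimal sketch: threshold-based change detection between two coregistered images.
import numpy as np

def detect_change(before: np.ndarray, after: np.ndarray, threshold: float = 30.0) -> np.ndarray:
    """Return a boolean mask of pixels whose value changed by more than the threshold."""
    diff = np.abs(after.astype(np.float64) - before.astype(np.float64))
    return diff > threshold

# Synthetic example: a small bright patch appears in the "after" image.
before = np.full((100, 100), 50, dtype=np.uint8)
after = before.copy()
after[40:50, 40:50] = 200                      # simulated new feature
mask = detect_change(before, after)
print("changed pixels:", int(mask.sum()))      # prints: changed pixels: 100
```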

Finally, most organizations that consume geospatial information will move away from the ability to produce content as a necessary business function and instead move toward the consumption and application of independently provided services. This is a natural consequence of specialization that began with niche knowledge and grew into large industries, and has been proven in many industries all over the world. In the initial microelectronics analogy, the move away from integrated design-fab firms toward licensed and outsourced design, manufacturing, assembly, and marketing was instrumental in dropping the costs of electronic products and opening huge markets. Therefore, most organizations that rely on geospatial understanding will shift to a consumption model with a dedicated industry to provide the final products and constituent components they need to make better decisions.

[1] Mariana Mazzucato, The Entrepreneurial State. New York: Anthem Press, 2013.

[2] Examples from research included in: Patricia Panchak, “Major Technology Advances that Began with Federal Research Funding and Support,” IndustryWeek, February 6, 2014 (slideshow).

[3] James Brice, “Globalization Comes to Radiology.” Diagnostic Imaging (2003). http://web.mit.edu/outsourcing/class1/DI-radiology-1.htm.

To learn more about USGIF, visit the Foundation’s website and follow us on Facebook, Twitter, or LinkedIn.


USGIF is a 501(c)(3) nonprofit educational foundation dedicated to promoting the geospatial intelligence tradecraft.