Why your mapping and data platform should cater to university research and teaching.

First, I should note that this is my first Medium post. I intend for future posts to be more “teaching” than “preaching,” but these are the thoughts that prompted me to sit down and write 3000 words, so here you are, reading it.

The following thoughts, requests, opinions and rants are based upon conversations I find myself having (over and over again) with education reps (or whoever has the HigherEd users glommed onto their job description) from major geospatial technology vendors. Some of these thoughts are about your tools, from the perspective of a manager at a VERY large technology organization, and some are specific to the Higher Education world of supporting spatial technologies for research and teaching at a university.

Why should you care about serving university research and teaching?

“Everything is related to everything else, but near things are more related than distant things.”
Waldo Tobler’s First Law of Geography

There are currently more than 20,000 faculty, staff and students at Stanford University. If Waldo is right (and he is), every one of those people has a map or “spatial answer” that they, their students, patrons or stakeholders could benefit from. That’s 20,000 potential users, keen on answering questions that have never been asked with tools that have never been available before at scales that have never been possible.

The National Center for Education Statistics estimates that: “In fall 2017, some 20.4 million students are expected to attend American colleges and universities.” That’s 20.4 million people, all looking for a competitive edge over their classmates, colleagues and future co-applicants for jobs inside and outside of research and teaching. 20.4 million potential users, with millions more every year.

Catering to Higher Education is a marketing strategy, and it should be one of your primary marketing strategies. All you need to do to access this vast user base of future entrepreneurs, developers and executives is help the “Geo Gatekeepers” in academia do their jobs.

What’s so hard about supporting geospatial technologies in a university?

If you’ve never worked as an academic technologist at a research university, then you have no idea of the scale at which you have to work to support a patron base of that size and diversity with something like geospatial technologies. The field itself is so broad that it’s impossible for someone called, for instance, “Geospatial Instruction and Support Coordinator” to know everything and serve everyone at the same level. Often this “coordinator” is an underpaid and overworked academic staff member, without the benefit of teaching assistants, funding, or space, and with little more than enthusiasm at their disposal. Often, they are librarians, tasked with selecting, cataloging and providing reference services on vast collections of geospatial data in addition to the “GIS Support services” they provide by default because they know the most about it.

Everything they do has to scale.

The idea that one person, or even a small team of people, can effectively support the use of the entire, ever-evolving universe of spatial data technologies for a patron base numbering in the tens of thousands is ludicrous, but that is exactly what is happening at colleges and universities across the world.

We need you to make better user administration tools.

Universities are collections of schools, departments, programs, centers, labs, courses, libraries, skunkworks and myriad other units, all focused upon their own little corner of the research and teaching world. Not surprisingly, this makes managing access to expensive software and services a nightmare. Because researchers work with sensitive data, or simply want to protect scholarship from plagiarism before publication, those of us who are the gatekeepers for things like geospatial analysis and web mapping platforms need the ability to compartmentalize users and user groups.

One of the things you can do to provide value for ALL of your customers (inside or outside of academia) is to make better administrative tools and interfaces for the people who manage, administer and support these enterprise-level software and data services.

That is, if you are going to give us a “bucket” of resources, we need the ability to create “sub-buckets.” We even need the ability to take those “sub-buckets” and give them to other lab managers, departmental technologists, etc…, and allow them to delegate those “sub-buckets,” themselves (and, maybe even create their own “sub-sub-buckets,” whoa, calm down now).

On the other hand, sharing the resources users put into your platforms is also important. Academic users want to be able to share datasets, maps, visualizations and anything else they upload or produce with colleagues inside and outside of their “silos.” Some platforms have implemented “groups” to facilitate this type of controlled access at the project level, but most are still punting, at this point. It doesn’t have to be byzantine; it just has to work.
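Here’s a minimal sketch of the delegation idea. Every name in it (`Bucket`, `delegate`) is made up for illustration; no vendor’s real API looks like this, but the behavior is exactly what we keep asking for:

```javascript
// A sketch of "sub-buckets": each resource allotment can be carved up
// and handed down the org chart, and each recipient can delegate further.
class Bucket {
  constructor(name, credits, owner) {
    this.name = name;
    this.credits = credits;   // remaining resource allotment
    this.owner = owner;       // the administrator of this bucket
    this.children = [];
  }

  // Carve a sub-bucket out of this allotment and hand it to another
  // administrator, who can then delegate slices of it further down.
  delegate(name, credits, newOwner) {
    if (credits > this.credits) {
      throw new Error(`${this.name} cannot delegate ${credits} credits`);
    }
    this.credits -= credits;
    const child = new Bucket(name, credits, newOwner);
    this.children.push(child);
    return child;
  }
}

// The campus-wide allotment, delegated down the org chart:
const campus = new Bucket('University', 100_000, 'geo-gatekeeper');
const lab = campus.delegate('Spatial History Lab', 5_000, 'lab-manager');
lab.delegate('HIST 205 course', 500, 'instructor'); // a sub-sub-bucket
```

Group-based sharing is the same story: a first-class object that an ordinary lab manager can create and manage themselves, without filing a ticket with you or with us.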

You can’t take it with you.

Universities are, by their very nature, made up of extremely transient populations. Students are rarely here for more than four or five years, staff turnover is high and the “tenure or out” machine ejects faculty looking for career stability. This creates a complicated situation where exiting faculty, staff or students need to be able to “extract, package and/or migrate” the digital scholarship they have created, so that they can take it with them when they leave. This process needs to be as quick and easy as possible for non-programmers. Often, these works have been built by a team of developers no longer at the disposal of the researcher, who doesn’t have time to learn SQL or your API while they are taking finals, moving, looking for another job, etc…
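For illustration, “quick and easy” from our side means something a departing researcher can run once and be done. This sketch assumes a hypothetical paginated `/users/{id}/items` endpoint and a `{ items: [...] }` response shape (your platform’s real export mechanism would differ, and ideally would be a button, not a script); run with Node 18+ as an ES module:

```javascript
// Hypothetical bulk export for a departing user: page through
// everything they own and write it to a portable folder.
import { writeFile, mkdir } from 'node:fs/promises';

async function exportUserContent(apiBase, userId, outDir) {
  await mkdir(outDir, { recursive: true });
  for (let page = 1; ; page++) {
    const res = await fetch(`${apiBase}/users/${userId}/items?page=${page}`);
    const { items } = await res.json();   // assumed response shape
    if (items.length === 0) break;        // no more pages
    for (const item of items) {
      // One self-describing JSON file per map, layer or dataset
      await writeFile(`${outDir}/${item.id}.json`, JSON.stringify(item, null, 2));
    }
  }
}

await exportUserContent('https://api.example.com/v1', 'departing-phd', './my-scholarship');
```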

Pay-to-play services and products are a nightmare for us.

Geocoding, routing, distance matrices, satellite imagery, etc… Ideally, you should just be giving these resources away to students (and, perhaps, even faculty researchers, but definitely students!). If you are going to use a pay-to-play model where services consume “credits” or where the cost of services can be impacted by the actions of individual users, YOU HAVE GOT TO GIVE ADMINISTRATORS EXTREMELY GRANULAR CONTROL OF THOSE SERVICES. Those services have to be tied to a system that lets administrators cap usage or set quotas at the individual user level.
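As a sketch of what “granular control” means in practice (all names here are illustrative; `callVendorGeocoder` is a stand-in for the real paid call), the cap has to be checked per user, before the credit is spent:

```javascript
// Per-user hard caps on a credit-consuming service.
const QUOTAS = new Map([
  ['grad-student-01', 500],  // geocoding credits per term
  ['intro-gis-lab', 100],
]);
const usage = new Map();     // credits consumed so far, per user

async function callVendorGeocoder(address) {
  // Stand-in for the real paid service call
  return { address, lat: 0, lon: 0 };
}

async function geocodeWithCap(userId, address) {
  const used = usage.get(userId) ?? 0;
  const cap = QUOTAS.get(userId) ?? 0;
  if (used >= cap) {
    throw new Error(`${userId} has exhausted their geocoding quota`);
  }
  usage.set(userId, used + 1);        // debit one credit *before* the call
  return callVendorGeocoder(address);
}
```

With something like this exposed in your admin interface, one enthusiastic student script can’t burn a department’s entire annual allotment over a weekend.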

Dedicate more HR to your higher education users.

Want a chance to teach at a certain Silicon Valley University? Want to meet people as stoked to make the world a better place, at scale, as you are? Want to meet your next Director of Machine Learning or Chief Innovation Officer? I’ve got your opportunity to do that, and so much more.

We love visitors. We love being introduced to new tools that make our research and teaching better. We can’t keep up with everything. If you want your platforms to be supported and promoted for use in novel, impactful applications, it means hitting the ground and visiting universities to support your tools by taking on some of the teaching load yourself.

Here’s where a certain Inland Empire company has been really smart for the last thousand years (in GIS time): They dedicate massive amounts of resources to serving the higher education vertical and (spoiler alert) they are now turning their attention to K-12 education, too. Ask any grad of a U.S. university GIS program what tools they are familiar with and you already know what they will say. That is by design, and it’s a product of really smart long-term thinking about how to make your tools the default.

If you don’t have at least one full-time employee dedicated to getting your tools into the hands of university researchers and students, and then supporting those users, you’re missing one of your greatest marketing opportunities. But you’ve got to go even further than that. You need a team!

That’s because it’s not enough just to sell to and support the Higher Ed community remotely. You’ve got to embed. Supporting the research and teaching community means showing up at their GIS Day events and hackathons. It means supporting them through sponsorship of their students and staff at your company events and conferences.

It means showing up at their conferences, like Geo4Lib, Code4Lib, DLF Forum, ALA and other meetings where the people who support these tools in universities meet and trade knowledge. It means staying up-to-date on the tools that universities are creating and using to manage and make their content useful (like IIIF, Fedora, Spotlight, Blacklight and GeoBlacklight) and seeking opportunities to integrate with those tools.

Stop making tutorials and start writing curriculum

Tutorials, workshops and course curriculum are force multipliers. Again, the individuals and small teams who find themselves charged with supporting geospatial technologies on university campuses simply can’t learn and support everything. In my center, we’ve got a great deal of expertise, including cartography, remote sensing, Open Source tools, geologic mapping, field data collection, spatial metadata and more, yet we still barely scratch the surface in terms of being able to support the use of all of the geospatial technologies and tools available, and of interest, to our users.

The platforms that are truly transformational, the ones that are allowing researchers to ask (and answer) questions at scales never before possible and the ones that are fulfilling Jim Gray’s ideal of “sending our questions to the cloud,” have one thing in common…

They are really only accessible through code. And this is how spatial data and services companies can get their foot in the door. Everyone wants to learn to code!

The next time you sit down to write a tutorial about exploiting some obscure feature of your SQL API, stop. Instead, go get your favorite Introduction to JavaScript, tear the Table of Contents out and use it as the outline for your Introduction to JavaScript with the [insert your startup name here] API, or write an Introduction to Spatial Statistics in RStudio, with maps built using Leaflet and your own services.
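Lesson one of that curriculum can be tiny. A complete “hello, map” in Leaflet (a real, open source library) is only a few lines; in your version, the tile URL and attribution would simply point at your own service instead of OpenStreetMap’s:

```javascript
<!DOCTYPE html>
<html>
<head>
  <link rel="stylesheet" href="https://unpkg.com/leaflet@1.9.4/dist/leaflet.css"/>
  <script src="https://unpkg.com/leaflet@1.9.4/dist/leaflet.js"></script>
  <style>#map { height: 400px; }</style>
</head>
<body>
  <div id="map"></div>
  <script>
    // Center the map on Stanford's campus
    const map = L.map('map').setView([37.4275, -122.1697], 14);

    // Basemap tiles: swap this URL for your own tile service
    L.tileLayer('https://tile.openstreetmap.org/{z}/{x}/{y}.png', {
      attribution: '&copy; OpenStreetMap contributors'
    }).addTo(map);

    // A first marker, so lesson one ends with something clickable
    L.marker([37.4275, -122.1697]).bindPopup('Hello, map!').addTo(map);
  </script>
</body>
</html>
```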

I guarantee this content will be useful and used! Because maps are f@#$%g cool, and that is going to appeal to more than data scientists. It’s also going to give the people who are your gateway into academia some handrails for conveying the possibilities of your tools.

There’s a great group called The Carpentries (https://datacarpentry.org/ , https://software-carpentry.org/ ) that is creating fantastic curriculum, using sound pedagogy, for data science. They are focused upon free and open source technologies and basic data carpentry/management with tools like the command line, Git, R, OpenRefine, etc… They are tailoring much of their material to specific applications (Epidemiology, Environmental Science, Social Sciences, etc…). Take a look at what they are doing and see how it can inform the content you are producing internally.

We need to persist the data AND the software

There is this thing we do at universities, called science. It’s based upon something called reproducibility: the idea that if I get a particular result, that result should be reproducible by others, provided they have the means to recreate the research. There’s plenty of debate about whether that’s actually possible in some fields (and I’ve no interest in that debate, here), but it’s certainly the gold standard we are striving for. The catch is that the raw materials of the research need to remain with the research in order for reproducibility to be possible.

This means that our use case is largely incompatible with licensing that doesn’t allow us to retain software and data (like geocodes, satellite imagery, etc…) and to make that software and data accessible to those who might wish to test our results. Government, NGO, commercial and other ‘just in time’ application areas are usually concerned with ‘right now’ and don’t typically have this need, so it’s often a struggle to explain this to vendors, but we’ve been increasingly successful here at Stanford and have seen these accommodations appearing more often in our licensing agreements.

In related news, as scholarship becomes increasingly ‘digital,’ the university professionals tasked with the preservation of the raw materials and products of research (librarians!) are challenged by the daunting task of preserving content that can be rendered useless through browser updates, API deprecations and other changes to the infrastructure that drives interactivity. The problem of how to preserve these ephemeral products with infinite states is non-trivial, but we’re thinking hard about it. Making deprecated software available for non-profit licensing in perpetuity can help, by giving us the ability to ‘freeze’ scholarship using virtualization.

Play well with the rest of the neighborhood (and, with your own house, too!)

Researchers in university settings are asking questions that have never been asked before, at scales they’ve never been asked before. That requires novel approaches to solving research questions, and I can tell you from experience, those approaches are rarely possible WITHIN a single, ready-made platform, no matter how hard that platform tries to be everything to everyone. The really innovative approaches, the ones that create new insights and solutions, come from novel combinations of technologies.

I recently worked on a project to locate highly mobile pastoralists in Ethiopia for a public health survey. The problem was that these populations move constantly, so our solution had to have a maximum time window of two weeks from satellite imagery capture to GPS points in a satellite phone GPS unit in Southern Ethiopia. The success of the project depended upon connecting workflows between OpenStreetMap’s Humanitarian Tasking Manager, DigitalGlobe’s fast capture-to-customer pipeline, Esri’s ArcGIS Online platform for creating user-friendly reconnaissance apps, gigantic 35 HD monitor arrays for manual recon at scale and RStudio for the randomized survey design… and it worked!

But it could have been a much easier system to put together if everyone in the pipeline were providing OGC service endpoints for their data. Some do (though they sometimes implement their own unique “interpretations” of the standards, which makes things tricky), but most don’t. In fact, many spatial data platforms can’t easily speak to one another even internally.

Make things that use the standards that make things talk to other things.
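That’s the whole appeal of the OGC standards: one request shape for every compliant server. As a sketch (the endpoint URL is a placeholder, and this assumes Node 18+ run as an ES module), interrogating any standards-compliant WMS server looks exactly the same, no vendor SDK required:

```javascript
// Ask any OGC-compliant WMS server what it can do. Only the endpoint
// changes between vendors; the request itself is the standard.
const endpoint = 'https://maps.example.edu/wms'; // placeholder URL

const params = new URLSearchParams({
  SERVICE: 'WMS',
  VERSION: '1.3.0',
  REQUEST: 'GetCapabilities',
});

const res = await fetch(`${endpoint}?${params}`);
const capabilities = await res.text();

// Every compliant server returns the same kind of XML document,
// listing its layers, formats and coordinate reference systems.
console.log(capabilities.slice(0, 400));
```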

Yes, we do expect you to give us all of this for free (or at least really cheap!)

This is just simple quid pro quo. We’re providing you with access to millions of potential users who are totally focused upon learning new skills, tools and concepts and putting them to use in novel ways. That’s your ideal user base. Millions of users, eager to learn the bleeding edge, so that they have an advantage over their classmates and eventual co-applicants for jobs in your company, and others. Many of those users, particularly at places like Stanford, will be the Founders, CEOs and CTOs of the ‘next big company,’ and if you’ve made your tools easy for us to administer, teach, support and afford, then your tools will be the ones they think of when their company needs location intel.

It doesn’t all have to be free, of course. We understand that the modern web runs on cloud services and that those services aren’t free. But they are cheap. And that means you can pass those very low operating costs on to higher education users, in the form of very low-cost “token” license fees for access to data and services. The aforementioned Inland Empire company has found an excellent balance in this way, with what seems to be a sliding-scale license fee for their Higher Education Site License. But even for institutions licensing their resources at the higher end of the scale, the return on that investment is hundreds of times what is paid when you consider the “all-you-can-eat” software licensing, liberal credit allotments on web services, free curriculum, support materials, fantastic support personnel, gratis conference registrations and more. It’s their secret weapon, and their HigherEd team is laser-focused on expanding those ROIs all the time, essentially acting as ombudsmen for their users.

You’re already doing a great job

Over my 20 years in the application of spatial data and services in research and teaching, I’ve seen the possibilities explode.

Just five years ago, the idea of doing a complete settlement survey to bring health services to nomadic pastoralists would have been nearly impossible. Now, it’s not only possible, but damn-near trivial.

Laying logistics-focused addressing systems on crowdsourced house and street data in rural Haiti? All the pieces are there, now.

Using satellite imagery and machine learning to stop fishing industry slavery? Been done, now.

We’re in amazing times for people who want to do impactful work and apply technology to making life on this planet better and more hospitable. The tools you are creating are a huge part of that. And we, here in the hallowed halls of academia, want to make sure that we are getting those tools, the best tools, into the hands of the brightest and most earnest people in the world: Our students.

There’s a huge ROI in it for you, if you help us.

You’re welcome.