Battleship vs Coordinated Fleet

The next generation of Information Systems within Higher Education

Around the world, CIOs face the difficult task of building, procuring, and governing platforms for campus infrastructure, with strict requirements around security, identity management, compliance, and governance, all while budgets shrink and operational demands grow. Historically this has produced giant centralized IT systems that resemble a battleship: effective, but difficult to turn.

The merits of centralization seem obvious: fewer vendor relationships, greater control, and ideally more consistent data formats, altogether resulting in lower cost. But that's not what happens. Instead, these infrastructures have started to crack under rising costs, poor usability, inflexibility, and a lack of data portability (which also impedes analytics). Being unable to turn the battleship also makes it impossible for these large institutions to respond quickly to new conditions and unavoidable trends (like mobile).

With the growing number of cloud services and SaaS products, the CIO's job has become even more complex because of the pull toward decentralization. A proliferation of smaller tools seems to mean more plumbing to manage, the loss of a bird's-eye view, and a weaker grip on security. So these two forces, centralization and decentralization, stand in apparent opposition.

And yet the traditional process of purchasing new systems for several million dollars, then hoping nothing goes wrong during a multi-phase (often multi-year!) implementation, continues … and it's clearly crazy. So schools have begun to disaggregate their large ERP systems and embrace the rise of smaller single-purpose tools that together form a more flexible IT infrastructure. This is part of a secular trend in consumer software, where APIs make sharing data with other systems a first-class feature.

“Now, with web services and standardized APIs, it’s far easier to use lighter-weight solutions, or best-in-breed applications, allowing organizations to purchase just what they need and integrate it into a larger technology ecosystem rather than trying to purchase a massive application that covers every eventuality. This allows companies to rapidly innovate to solve unique business challenges.”

— Forrester

The MIT Experiment

John Charles, the new VP of Information Services & Technology at MIT, recently announced a technology vision summarizing these areas of challenge and opportunity:

We are now experimenting with a new, agile operating model that is allowing us to accelerate the pace of IT-enabled innovation at MIT. This practice of agile, iterative experimentation — think big, start small, fail fast, iterate rapidly — is key to enabling innovation.

He goes on to explain in more detail how they intend to achieve their goals:

  • embrace API-centric architectures that will enable IS&T and other IT service providers to quickly respond to requests to extend the functionality of key Institute systems to meet differentiated needs;
  • incorporate the use of emerging data visualization and predictive analytics tools that will better enable us to keep pace with rapidly evolving needs for real-time access to structured and unstructured Institute data; and
  • break down existing barriers to integration and improve support for cloud-based services.

His vision statement recognizes that today's priorities and toolsets will not be the same in a few years. So MIT is shifting its focus away from building a perfect battleship toward integration architectures that can welcome and connect a variety of new systems easily. Data visualization surfaces the truth needed for reform, and removing barriers to integration enables continuous cycles of adaptation and institutional learning.

This is very different from the legacy strategy of first putting everything into a massive database and then making it quarterback various external operations and systems. Those databases were designed to lock down data, not to coordinate complex functions.

APIs Are Mandatory

I believe the most critical ingredient in implementing this mindset successfully is choosing tools with APIs for moving clean data in and out. This may sound obvious to people who only use consumer tools, but it is still a big problem in higher education. Most legacy vendors selling data or information systems still rely on nightly loads via FTP, often sending files in ancient formats full of dirty data that complicates operational workflows. Those brittle, often manual connections between systems are not integrations. They simply introduce more work, more complexity, more risk, and false expectations.

That has to end.
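To make the contrast concrete, here is a minimal sketch of what an API-based pull looks like in place of a nightly FTP drop. The endpoint, token, and field names are all hypothetical; the point is that the data arrives as validated, incremental JSON rather than an opaque batch file.

```python
# Minimal sketch of an API-based sync, assuming a hypothetical REST
# endpoint that returns clean JSON and supports incremental pulls
# via an updated_since filter.
import requests

API_BASE = "https://sis.example.edu/api/v1"  # hypothetical endpoint
TOKEN = "..."  # scoped, revocable credential issued by the vendor

def fetch_updated_applicants(updated_since: str) -> list[dict]:
    """Pull only the records that changed since the last sync."""
    response = requests.get(
        f"{API_BASE}/applicants",
        params={"updated_since": updated_since},
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=30,
    )
    response.raise_for_status()  # fail loudly instead of loading dirty data
    return response.json()["applicants"]

# Contrast with a nightly FTP drop: no validation, no incremental
# filtering, and any format error surfaces hours later in a batch job.
for applicant in fetch_updated_applicants("2015-06-01T00:00:00Z"):
    print(applicant["id"], applicant["status"])
```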

We’re proud to be part of a new generation of enterprise vendors who build API-first, so data can be easily ingested by any of our tools and sent out to the larger ecosystem as needed. This keeps everything synced in real time, which every stakeholder prefers. Here’s a recent case study involving our SlideRoom ATS demonstrating what’s possible when a focused product is connected to a larger established ecosystem.
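Real-time sync in this style is typically driven by webhooks: the tool pushes a small event the moment a record changes, and the receiving system writes it through immediately. The sketch below assumes a hypothetical event payload and a stubbed downstream update; it is not any particular vendor's actual webhook API.

```python
# Minimal sketch of real-time sync via webhooks, assuming the upstream
# tool can POST a JSON event whenever a record changes. The endpoint
# path, payload shape, and downstream call are all hypothetical.
from flask import Flask, request

app = Flask(__name__)

@app.route("/webhooks/records", methods=["POST"])
def on_record_changed():
    event = request.get_json(force=True)
    # Push the change into the campus system of record immediately,
    # instead of waiting for a nightly batch window.
    update_system_of_record(event["record_id"], event["fields"])
    return "", 204

def update_system_of_record(record_id: str, fields: dict) -> None:
    """Stub: write the change through to the downstream system's API."""
    print(f"syncing {record_id}: {fields}")

if __name__ == "__main__":
    app.run(port=8080)
```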

Technology is supposed to make everything easier and more affordable, but many vendors sell products that do the opposite. They reinforce complexity rather than help reform it. Schools shouldn’t consider any system that lacks these core qualities: great usability (for self-service), data interoperability (for coordination), and lower costs (just because).

The goal is to attain a modular IT infrastructure that enables leaders to choose their favorite tools while connecting with larger systems. Get the best of both worlds and destroy that battleship.