My next chapter and hopefully yours: working on MASSIVELY multi & interdisciplinary problems (MMIPs)

dj patil
Feb 22, 2023


I’ve been incredibly fortunate to work on a wide array of problems across industry, academia, and government, and one of the common threads across them is that they all involve multi/interdisciplinary problems. What are multidisciplinary or interdisciplinary problems? They are problems that require multiple areas of expertise, or more than one branch of knowledge, to solve.

Data science itself is the convergence of data, compute, storage, and talent. And many of the best data scientists I know come from varied disciplines like mathematics, physics, political science, psychology, economics, and computer science (and I still assert that a great liberal arts training is one of the best things a data scientist can have).

The challenges facing us as a country and as a species are massive and require a fundamentally multi/interdisciplinary approach. What kinds of problems am I talking about? Drug discovery, public health, and medical treatments now rely on a combination of traditional biology with data, compute, and AI. Sustainable food practices are becoming a combination of biology, ecology, data, hardware, machine learning, and more. Similarly, smarter transportation (e.g., fuel & batteries) and the creation of new materials that are stronger, lighter, and environmentally friendly will require teams with a broad array of skills and expertise.

These problems show that we’re entering the era of Massively Multi/Interdisciplinary Problems (MMIPs).

The landscape has been shifting this way for some time. For example, there is a tectonic shift across both the U.S. and Chinese research and development enterprises toward MMIPs, with the recognition that solutions to these problems will set the stage for the next 100 years (including the next generation of pharmaceuticals and materials, which will leverage large data sets, AI, and lab work). Students have recognized this too; it’s why so many are pursuing double majors or data science degrees, since it enables them to work across multiple fields.

Consider the mission of the U.S. Chief Data Scientist, which we debated extensively at the creation of the role: to responsibly unleash the power of data to benefit all Americans. That mission was focused on MMIPs. It included the launch of the Precision Medicine Initiative (including the NIH’s All of Us program) to enable a world where we have truly affordable, equitable, and accessible tailored medical treatments, as well as our work on community policing, including using data in novel ways to provide transparency and prevent the endless cycles of incarceration.

Our work on the COVID-19 response required a team with a broad array of skill sets from multiple disciplines, and the best ideas always came from the classic adage that, with the right team, 1+1=3 (in this case, 1+1=57). And my work with the late Secretary of Defense Ash Carter and his Technology and Public Purpose Project would range from quantum computing to AI to synthetic biology within 15 minutes of conversation with him.

I’ve also been incredibly fortunate to be an early-stage investor/builder/advisor to companies, technologies, and people working on MMIPs, including:

  • Figma pioneered how design and collaboration can be powered by in-browser approaches.
  • Devoted Health is building a health care plan we’d want for our own loved ones and leverages data and technology to deliver the right care at the right time.
  • Confluent takes Kafka, which we open-sourced at LinkedIn, and allows all organizations to benefit from streaming and log data.
  • Ola has transformed transportation in India using a broad array of solutions.
  • Monte Carlo & Anomalo are tackling some of the biggest challenges facing data scientists: data observability and making sure data is correct.
  • Chronosphere leverages new ways of using data at scale to help remediate cloud issues faster than anyone else.
  • Sumo Logic pioneered understanding what’s happening in your systems with the first cloud offering in the space.
  • Peoplehood is bringing people together in new ways to address our chronic issues of loneliness and isolation.
  • Crisis Text Line, a non-profit, is the largest all-volunteer network of crisis counselors and uses data and technology to ensure people get help fast (and is part of the 988 network).
  • Rebellion Defense is using the latest technology developments to deliver solutions for national security challenges.
  • RelateIQ (acquired by Salesforce) used data science and design to build the most powerful CRM of the day.
  • LinkedIn — well, you all know the story of data science here ;-)

Which leads me to how I’m going to be spending my time in this next chapter and the platforms I’ll be doing it from. I’m excited to announce that I’ve joined the team at GreatPoint Ventures (GPV) as a General Partner. And I’m going to double down on my approach to building companies, founders, and their teams.

Why GPV? GPV has a track record of working on MMIPs that require a deep understanding of the confluence of at least two major areas. Their motto, “make no little plans,” aligns exactly with how I think about building. They approach projects as a team with a multi/interdisciplinary approach, and their companies reflect that, including Vim, Beyond Meat, Activ Surgical, Recogni, Extend, Skyhawk, Excision, Abett, and many more. At their core, they are operators and focus on empowering founders to solve problems. Many say this, but as I’ve found, GPV really walks the talk.

What will I be focusing on? There are four major decadal MMIP trends I’m excited to be working on:

  1. Computation, data, & AI are opening new ways to approach healthcare and life sciences
  2. Modernizing the U.S. national security stack requires new approaches that will transform the sector
  3. Stupid, boring problems in the back office are just beginning to be solved with smart software
  4. Data tooling and infrastructure require new approaches to power the next wave of AI, Data Science, and MMIPs

Why computation, data, and AI + healthcare and life sciences? First, healthcare is now 20% of U.S. GDP and growing. In the U.S., 14 million people receive some form of long-term care, a number that will double by 2050, according to CMS. We’re starting to see major breakthroughs using AI, including AlphaFold, as well as leveraging large data sets for off-label drug solutions. And personalized medicine is a major focus for academia and healthcare organizations. On top of this, every year I meet more and more technologists who want to work on these problems and healthcare veterans who are looking for new ways to attack long-standing problems.

Why modernize the U.S. national security stack? The need to modernize intelligence and defense solutions has been well documented. This includes the need for the Department of Defense to move to the cloud and to execute on Joint All-Domain Command & Control (JADC2). Much of the time people think of weapons systems, but it is actually the “stupid, boring problems”: moving food, fuel, and batteries more efficiently, and tracking spend (it is our taxpayer money, after all). Additionally, the conflict in Ukraine has shown that open-source technologies/intel, drones, and other commodity solutions provide an asymmetric advantage. Add to that, the U.S. is increasingly pivoting to address the threats from China, which will require fresh new thinking.

Why focus on stupid, boring problems? When I’ve been asked to help large organizations, all too often there are what I call stupid, boring problems (SBPs) that are responsible for a lack of agility and speed. SBPs tend to be error-prone, involve lots of process and oversight, and offer low job satisfaction. It’s for this reason they tend to be offshored or have staffing issues (e.g., the Great Resignation). These problems are especially acute in healthcare, banking, and other “paperwork-rich” environments. It’s one of the reasons Gartner expects robotic process automation (RPA) to continue its double-digit growth. And we’re just seeing the beginning of the potential impact as large language models, machine vision, and data science are applied to these formerly unsexy areas.

Why data tooling and infrastructure? Most data tooling is still hard to use, and setting up basic infrastructure remains complex and requires specialized expertise. A decade later, data scientists still are not as productive as they should be, due to a lack of collaboration and access to data. Add to this, we need better tools for protecting privacy and ensuring responsible use of data. And of course, let’s not forget what we found a decade ago, and it is still true today: 80% of a data scientist’s time goes into cleaning data. Finally, I believe that we are going to require new approaches to power the next wave of AI and data science.
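
To make that 80% concrete, here is a minimal, hypothetical sketch of the kind of cleaning work that eats up a data scientist’s day before any analysis can start; the file and column names are made up for illustration.

```python
import pandas as pd

# Hypothetical raw export: duplicates, inconsistent types, and missing values
# are typical of the cleaning work described above.
df = pd.read_csv("visits_raw.csv")

# Drop exact duplicate records.
df = df.drop_duplicates()

# Coerce columns to the types the analysis expects; bad values become NaN/NaT.
df["visit_date"] = pd.to_datetime(df["visit_date"], errors="coerce")
df["age"] = pd.to_numeric(df["age"], errors="coerce")

# Normalize free-text categories before grouping on them.
df["diagnosis"] = df["diagnosis"].str.strip().str.lower()

# Decide how to handle missing values; here, drop rows missing key fields.
df = df.dropna(subset=["visit_date", "age"])

print(df.describe(include="all"))
```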

So, let’s get to work. There’s lots to do. And if you’re working on MMIPs, let’s talk!
