Science takes dose of capitalism, feels better
Startup Portrait #3: Science Exchange
Startup Portraits is an ongoing series of visual stories about the founders of Bay Area startups, their visions, and what they’re learning. In early October I met with the Science Exchange team at their office in Palo Alto, and then two days later met with OncoSynergy, a company that makes heavy use of the Science Exchange platform. For the interview below, I spoke with Elizabeth Iorns and Dan Knox, the married co-founders of Science Exchange. I’ve edited the conversation for length and clarity. Learn more about Science Exchange: Company homepage, AngelList profile.
What does Science Exchange make?
Dan: Science Exchange is an online marketplace for science experiments. Essentially we are a services marketplace that makes it easy for scientists to get research done by collaborating with other scientists using market-based mechanisms.
Elizabeth: Our new tagline is “Order experiments from the world’s best labs.”
What’s the most delightful experience a scientist can have on the platform?
Elizabeth: I think for scientists it’s about seeing the possibilities of what they can do with their research. We definitely have amazing use cases where people have used facilities they would never have had access to—at Harvard, at Duke, at amazing universities. They've been able to make really profound breakthroughs.
Building a product like Science Exchange requires some foresight. Can you tell me if you're anticipating a certain future? And is it potentially one that you see and others do not?
Elizabeth: We definitely do. We think the way scientific research is being conducted is dramatically changing, and we think we're at the center of that change. We've already had considerable foresight in seeing how labs are moving away from individuals doing all their experiments themselves in a single lab, and moving toward teams composed of many people working across multiple institutions.
Already the data is starting to support this. You see the rise of multi-author publications: in 1980, no publications had more than 10 authors. Now 60% of publications are co-authored by investigators from multiple institutions. So as that happens, the traditional ways of collaborating through social mechanisms start to break down. You really have to get what we call "market-driven collaboration" in place, which uses very clear incentives, through payment, and makes expertise available in a very transparent way through a marketplace.
Can you clarify the social mechanisms that aren’t working or don't scale?
Elizabeth: Collaboration is currently based on your personal network as a researcher. You find somebody who can do an experiment for you and you collaborate with them by essentially asking them to do a favor for you that you will repay in the future. That’s not very scalable and it really breaks down when you need 10 different people to work together on one project. It’s also not a very good incentive for getting the best expert to help with your project.
Would you point to any cultural shifts that are influencing your ability to succeed?
Elizabeth: Science right now is undergoing a massive cultural shift. The research system is really, really broken. To think that you can make these little tweaks, and they're going to change all the problems, that’s unrealistic. There’s been the realization that the culture really does not promote good, quality research. It doesn't promote happy people who have satisfactory jobs and lives. It’s a mess—there's extreme exploitation of young people; there are no jobs. There’s tremendous pressure to publish low quality research. There’s no incentive to do things cost effectively. But that’s changing because there are basically no jobs and there’s no money.
But there’s momentum for change. I never would have dreamed, when we started this company two years ago, that we’d now have 10 people and the NIH talking about implementing independent validation. I think that can be at least partially attributed to us starting the Reproducibility Initiative.
Independent validation? Can you tell me about that?
Elizabeth: So we started looking at the quality of research, which is a huge issue. Basically seventy to eighty percent of biomedical research that’s published cannot be independently replicated. So if someone tries to take that result, which pharma does—pharma uses all of the literature as the basis from which they develop drugs—and they say, OK, this new breast cancer treatment is really effective so we'll try to see if it works in our lab and make a drug around it. Seventy to eighty percent of the time they can’t get the same result that was published.
Wow, I didn’t know that.
Elizabeth: It’s really, really shockingly bad.
Dan: It really came to a head early last year. A couple of scientists at Amgen and Bayer came out and announced some of the stuff that had been going on for years, but no one had really talked about. And the Wall Street Journal and New York Times wrote about it, and a congressman said “This is crazy,” and called Francis Collins, the head of the NIH, before a committee to say, “Well, why are we giving you $30B a year if $23B of that is just going to wasted research?”
Elizabeth has always been passionate about this particular area. She started really thinking deeply about what you can do to change this. They've got systems in place to catch fraud, but we're not really talking about fraud. We're talking about poor-quality research and the wrong sort of incentives. And she came up with a really savvy way to address that through the Reproducibility Initiative, which is an effort to get results validated by independent third parties, of whom we have thousands on Science Exchange. Then researchers get a stamp of approval that says, basically, “This research has been double-checked. We're confident that it’s of high quality.”
And this is because of Science Exchange?
Elizabeth: It’s only possible because of Science Exchange. This is the interesting thing: Without Science Exchange you could never do it. People started to talk about this problem, and it was all just, “What are we going to do? There’s no solution.” And then we said: “We have a solution!” We have almost every expertise you can possibly think of represented on Science Exchange. So we can actually get labs to do the experiments required to independently validate a researcher’s results.
Dan: And they’re not biased by what the results are. They don’t care about finding the same thing you did—they care about doing good science. They get incentivized through getting paid, rather than, “I’m trying to get a paper out of this, so I better get this result, because that makes it more interesting.”
Elizabeth: There are a whole slew of perverse incentives in academic research. Like, you pretty much need a positive result, not a negative result, to get published. You need publications in order to graduate and get jobs and grants. And then the whole system is based on a peer network. You have to get peer reviewed to get things published, to get tenure, so your funding, your publications, and your career progression all depend on your peer network. That’s a big problem if you’re going to show that a peer’s research doesn’t work. You’re not going to publish that because there’s little upside and a lot of downside, so people just hide it.
Dan: Last August we launched the Reproducibility Initiative as a partnership with PLoS, an open-access journal, along with Mendeley and Figshare. The response has been phenomenal. Elizabeth has been in a meeting with Francis Collins and Harold Varmus of the NIH. They now know about Science Exchange. They may not agree exactly with our approach, but they talk about a two-year-old company with 10 people.
Elizabeth: At the moment we’re replicating 50 of the most impactful cancer biology papers published from 2010 to 2012, and we anticipate that we can do that for a fraction of the cost, and in one year. We can basically confirm that this enormous body of literature is true and can be robustly built upon. Doing that shows the power of the network.
So one component is enabling new research, but there’s this other component of confirming the existing literature.
Elizabeth: Exactly. And that hasn’t really existed before. You say “science” and everyone thinks about exploration. But it’s also confirmation. There’s an element, certainly in the public’s perception, that results are actually validated and we can be confident that they’re correct. And people think about peer review as that process, which it’s not. Peer review is just a process of assessing whether something is interesting enough based on figures that summarize data that isn’t even present. You don’t replicate the experiment in your own lab. It is just a way to say that this is impactful and makes sense as presented. That’s peer review. We’re saying we need more than that. We need to not only enable novel research to happen more efficiently, but also we can put in place an additional confirmation step to make sure that the research is really high quality.
Dan: The mission of the company is to transform scientific collaboration, but the purpose of us wanting to do that is to really improve the efficiency and quality of scientific research. We don’t want to take what’s already done and just make that cheaper if the same sort of concerns exist about the quality. By trying to remove all these really perverse incentives, the quality of what is done can be improved.
Let’s switch gears for a moment. Tell me about a behavior that you’ve tried to encourage with users that they’ve been resistant to.
Elizabeth: One of the behaviors we really want to encourage is on our supply side, our facilities. We want them to use Science Exchange as their storefront, so they bring on their existing users as well as obtain new users through Science Exchange. The reason we want to do that is, well, it obviously helps to build the platform, but it also gives them a lot of reporting tools to assess the impact their facilities are making, which they have to report on to universities and grant offices. So if they use us, this is all in one place. That’s been a behavior we strongly encourage, but I think many of the facilities feel like they can’t ask their users to do things. They feel that they don’t have enough clout to tell users, “This is the new way of ordering from me.”
Dan: We look at a marketplace like Etsy, which has done this really compellingly for local craftspeople. What can we do to encourage that same behavior and same loyalty with our suppliers?
Elizabeth: They have to trust us. Inherently I think it’s always an issue with startups working with universities. Universities are the antithesis of a startup. They’re big monoliths that have been around for hundreds of years. They’re extremely risk-averse. They’re extremely reluctant to change or to do anything different. And startups are the exact opposite. So as a startup you’re coming there and you’re like: “We should do this. We can do this now. Why don’t we just do it?” And they’re like, “Woah! I can’t do it now! I have to think about it! And I have to form a committee! And I have to get permission!” It’s a very different way of working.
Do you have any examples of strategies that have helped you get around that?
Elizabeth: In the beginning we were way too aggressive. We were like: “Why won’t you do this? There’s no reason to say no.” We were trying to convince them. Now I’ve realized that convincing somebody to change their mind is almost impossible. So what you should do is focus on the people who get it, and then the other people will join. If we hit a roadblock now, we forget it. We move on. I think only just now have we realized who the key people are who see the value of Science Exchange.
Do these people have certain traits?
Elizabeth: Initially we went after the actual people who do the experiments in the labs. They obviously see the benefit of Science Exchange. They want to be a part of the platform. But what we found was that they signed up, but then they’d actually be told by their administrators to take down their listing. So although they had the enthusiasm, they didn’t have the level of administrative power to control their own destiny. Then we’d go really high up, up to the VP of research. We’d be like: “Can we list all of your facilities?” And they’d be like: “What? I’m busy. Go away.” Then we just recently found that finance officers, they have to chase up all the payments of the users of their different facilities. So when they found out that we centralize this and have guaranteed payment, and they only have to chase us, they’re like: “This is amazing! All my facilities should be listed!”
So we finally found that one person who it’s a huge win for. It makes their job way easier. Now we find the person who does the books for the labs, and they’re the ones who get it. It’s amazing how universities, without us, can have hundreds of thousands of dollars in unpaid fees. Or maybe money goes from one university to another—but it never makes it back to the lab that did the work. It’s just kind of floating around.
On the surface, your tagline doesn’t touch on any of this. But on the backend, you’re saying that what’s actually getting you traction is this relatively boring problem. It’s just handling the books.
Elizabeth: On the front-facing side, we do get a lot of traction through the exciting offer of gaining access to new facilities. But I do think you have to solve a problem for somebody who has the power to actually implement it.
Dan: I don’t want to give you the wrong impression that we’re not exciting!
Elizabeth: We’re totally exciting! Oh my god, we’re so exciting!
Dan: That’s just a certain aspect of one part of what we do.
Where do you draw the line between what I’ll call “useful anticipation” of the future you describe, and dangerous speculation?
Elizabeth: We use a lot of data. We form these hypotheses, which are really not just based on ideas, but on domain expertise. We’ve looked at a lot of numbers: How is research changing? It’s hard to argue when you see things like the rise in multiple-author, multi-institution papers. And the rise of core facilities and contract research organizations. Those things didn’t exist until the late 1990s, when suddenly universities saw that it makes no sense to have everybody doing everything themselves, and paying for very expensive equipment that’s duplicated across multiple labs.
Dan: Some things will take longer than others. Our goal is to identify the researchers, or the pocket of research, or the types of institutions where the pain we address is most acute. We’re not going to convert every scientist to this way of doing work in the next five years. We just need to convert enough of them.
What’s your biggest need right now? What would you ask readers to do?
Dan: Our biggest need is for really talented and passionate people who believe in what we’re trying to do, and feel like they can make an impact as a member of the Science Exchange team. We always have a number of open positions, and we’re always looking for people to reach out to us, even if we don’t have a role advertised.