Considdr Milestones

I think the easiest way to recount the ideas we worked on is to place them in the context of the company timeline. I’ll outline the key company milestones here and for each, I’ll link to an article that dives deeper into the key ideas we were pursuing or how our technology worked.

March 2014: Initial Ideation

Considdr was born in a strange moment of existential anxiety. I had just started spring break of my sophomore year in college after finishing a particularly grueling set of midterm exams. I collapsed in the reclining chair in my living room and reflected on the sheer volume of information I thought I had internalized the past few months. Then I panicked. It seemed like I had already forgotten almost everything. If I can’t remember most of what I just learned, what about everything I’ve ever learned?

An uncomfortable line of thinking quickly consumed me. If I can’t remember most of the information I’ve ever learned, what are my beliefs based on? Why do I feel so strongly about any position I hold if I can’t remember the supporting arguments or evidence that went into holding it — let alone those that might undermine it? How can I really believe anything at all?

One possible answer: Belief Trees — a new way to store, visualize, and integrate information. Belief Trees also enable a new kind of search and suggestion approach that aims to make it easier to form well-balanced beliefs: Logical Aggregation. In 2018, the USPTO issued Considdr a patent for Belief Trees and Logical Aggregation; I began work on them in 2014, and the application was officially filed on April 26, 2016. For more in-depth walkthroughs of Belief Trees and Logical Aggregation, see the article links below the video.

This video reflects the general theoretical basis for Considdr, or at least the state of Considdr (previously called The Market of Ideas) in 2016, before it had the benefit of so much input and improvement from my teammates. At the time, I made some big claims about Google, which obviously did not pan out. In retrospect, I hope these claims are viewed more as naïveté than arrogance.

Belief Trees: Storing Reasoning and Crowdsourcing Truth

Logical Aggregation: Leveraging the Implicit Structure in Documents for a New Kind of Search Engine
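To make the Belief Tree idea concrete, here is a minimal sketch of one way such a structure could be represented: a claim with child nodes of supporting or undermining evidence. All names and logic here are my own illustration, not Considdr's actual implementation.

```python
# Hypothetical sketch of a Belief Tree: a claim supported or undermined
# by child nodes of evidence. Structure and names are illustrative only,
# not Considdr's actual (patented) implementation.
from dataclasses import dataclass, field

@dataclass
class Node:
    text: str                  # the claim or piece of evidence
    supports: bool = True      # does this node support its parent?
    children: list["Node"] = field(default_factory=list)

    def balance(self) -> tuple[int, int]:
        """Count supporting vs. undermining leaves beneath this node."""
        pro, con = 0, 0
        for child in self.children:
            if child.children:
                p, c = child.balance()
            else:
                p, c = (1, 0) if child.supports else (0, 1)
            pro, con = pro + p, con + c
        return pro, con

# Example: a belief with two supporting notes and one counterargument
belief = Node("The U.S. should raise the minimum wage")
belief.children = [
    Node("Raises incomes for low-wage workers", supports=True),
    Node("Most studies find small employment effects", supports=True),
    Node("May accelerate automation of entry-level jobs", supports=False),
]
print(belief.balance())  # (2, 1)
```

A structure like this makes the evenhandedness of a belief visible at a glance: the balance of supporting versus undermining evidence is computable at every node.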

April 2014 — May 2016: Research in Polarization

The flimsy foundation of belief piqued my interest in another phenomenon that seemed to be getting worse: political polarization. Are we so divided because our minds aren’t very good at forming accurate beliefs? Why is it so hard to find common ground? As an Economist article in May 2014 put it, given how much information we now have at our disposal, why is it more likely to be weaponized than harnessed as fuel for “a new Socratic age, in which the political classes jointly search for truth”?

To me, this problem of political polarization seemed at its core to be one of inaccurate belief formation — a natural result of our inability to process and retain information in an evenhanded way. Beyond that, it also seemed like a problem that tended to underpin most other problems. If we can’t form accurate beliefs as a society, how can we solve anything else?

This realization gave me a sense of purpose and motivation that I had never felt before. It’s why I dedicated the rest of my college education and free time to researching polarization and why I spent more than half of my twenties trying to build something that might address it.

Fact vs. Faction: Polarization in the Information Age

June 2016 — April 2017: Company and Team Formation

It turns out I wasn’t alone in seeing Belief Trees, Logical Aggregation, and Considdr more generally as a potential tool for overcoming the limits of the human mind to form more accurate beliefs. As I shared what I had been working on over the previous two years with family and friends, I received a ton of encouragement and support. I can’t possibly thank all the people who at this stage were critical to getting Considdr off the ground. My Bowdoin College, Maine, and family networks were particularly impactful.

With summer interns, our team swelled to nine in this time, including my co-founders Hailey and Marcus. We also added our first two advisors, Bill and Jerry, who provided key early and continued mentorship.

June 2017 — August 2017: MVP 1

Over the first few years of research and ideation, I had begun writing code for the earliest version of Considdr: a Rails web application that implemented the barebones functionality of Belief Trees and Logical Aggregation. The goal of the first summer with teammates was to turn that web app into a functional minimum-viable product for college classrooms.

The earliest version of a Belief Tree. For more on the theory behind Belief Trees, see Belief Trees: Storing Reasoning and Crowdsourcing Truth. For more on how the initial MVP evolved, see Considdr: A Social Reasoning Platform (article in progress).

My teammates made monumental improvements to the original barebones app. It went from a confusing, almost unusable app built for no one in particular to a functional classroom tool in a production environment. A huge shoutout to Marcus, Hailey, Son, Alex, James, Maddie, Grace, and Joe. Considdr would have been dead-on-arrival without their insight and hard work. They successfully stood up a functional “social reasoning platform.”

Considdr: A Social Reasoning Platform (article in progress)

September 2017 — May 2018: Beta Testing 1

Starting in September, we began piloting our initial platform in a handful of classes at Bowdoin College. Thanks in particular to Professors Franz and Stone, who were the first to volunteer to test Considdr out in their courses. Students used the platform to build well-evidenced beliefs on questions like “Should the U.S. raise the minimum wage?” (intro economics course) and “Is American Democracy in trouble?” (public opinion course).

Throughout the semester we tracked usage of the platform, interviewed students, and continually made improvements based on their feedback. We were happy with the early response to the product. Of particular importance to us, a third of students surveyed said that Considdr actually helped them change their mind.

Some early positive feedback on the Considdr MVP

The positive feedback was encouraging so we doubled down on the classroom strategy. In the second semester nearly a dozen classes across multiple disciplines were using Considdr in their curriculum. Most expressed an interest in using it again for the following year as well. This stage of the company and product is summarized nicely in a feature that Bowdoin ran in the spring of 2018: “Bowdoin Grads Doing Their Part to End Polarization.”

May 2018: Pivot

Despite what felt like a positive launch of Considdr in the classroom, we quickly realized that survival depended on making a significant pivot. We were running out of capital; our initial monetization strategy of licensing our software to colleges seemed increasingly problematic; and the sales cycle looked too long (along with the long lull of the summer) for us to demonstrate enough growth/willingness-to-pay data that could help us raise more funding.

Students found the most value in the Belief Tree building component of Considdr, in which “notes” would populate for them to “consider” as different arguments and evidence in their Belief Trees. However, one issue was that our platform relied on crowdsourcing to generate notes: students would summarize key arguments and evidence from their readings, and everyone on Considdr could leverage those notes in building their Belief Trees (provided students opted to make their notes public).

Note-taking wasn’t a trivial ask of our users and, while some liked it, most did it primarily because their professors required them to. In the entire year of beta testing, students had generated only 3,000 unique notes that could be used in Belief Tree building. This reliance on crowdsourcing meant that on-boarding was time-consuming and relied on users to understand and apply good summarization practices. Could we automate this process?

Yes. In fact, we developed what we think is a totally new approach to summarization using fairly standard machine learning and natural language processing techniques. We called this approach “summarization by adjacent document.” It enabled us to generate ~3 million “notes” (hereafter referred to as “insights”) in the same time it had taken our initial users to generate 3,000 — and they tended to be more reliable summaries of key points. See “Summarization by Adjacent Document: a New Way to Extract Insights,” linked below, for more on how this works.

Instead of “a social reasoning platform,” we decided to pivot toward “a search engine that helps people build evidence-based beliefs.” We eliminated the social features of the site and pared down the search and Belief Tree construction components of Considdr into one streamlined page.

Summarization by Adjacent Document: a New Way to Extract Insights
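Considdr's actual "summarization by adjacent document" method is described in the article above and isn't reproduced here. As a loose illustration of the general idea the name suggests — using a related ("adjacent") document to score which sentences in a source document carry its key points — here is a generic extractive-summarization sketch. Every name and heuristic in it is my assumption, not the patented technique.

```python
# Illustrative sketch only: rank sentences in one document by word
# overlap with a related ("adjacent") document, keeping top scorers as
# candidate "insights". A generic extractive heuristic, NOT Considdr's
# actual patented method.
import re
from collections import Counter

def tokenize(text: str) -> list[str]:
    return re.findall(r"[a-z']+", text.lower())

def top_insights(document: str, adjacent: str, k: int = 2) -> list[str]:
    """Return the k sentences of `document` that share the most
    vocabulary with `adjacent`."""
    adjacent_counts = Counter(tokenize(adjacent))
    sentences = re.split(r"(?<=[.!?])\s+", document.strip())
    scored = []
    for sentence in sentences:
        # Score = total adjacent-document frequency of shared words
        overlap = sum(adjacent_counts[w] for w in set(tokenize(sentence)))
        scored.append((overlap, sentence))
    scored.sort(reverse=True)
    return [s for _, s in scored[:k]]

doc = ("Streaming competition is rising. The weather was nice. "
       "Netflix faces new streaming rivals.")
adj = "Disney launched a streaming service to compete with Netflix."
print(top_insights(doc, adj))
```

The appeal of an approach in this family is that it needs no human note-taker: salience is inferred from cross-document signal rather than manual summarization.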

June 2018 — December 2018: MVP 2

Rather than just links to articles, search results on Considdr were summaries of important arguments and evidence from full-length articles. Users could search through the ~3 million insights we had indexed (and continued to index in real-time) — and then without leaving the page they could “consider” any relevant insights in one of their Belief Trees, which we began calling “collections” for clarity.

An example search. On the left, you can see insights relating to the competition between Netflix and Disney+. On the right you can see a collection that tracks the user’s current belief about threats to Netflix.

On the technical side of the product, our stack grew considerably in this time — from a standalone app to a robust data pipeline. For more on how Considdr worked from a product and technical perspective see the article linked below.

Considdr: Search Less to Consider More. (article in progress)

January 2019 — December 2019: Beta Testing 2

Beginning in May, before we started writing any code we did much more extensive market testing and analysis. We built simple mockups of one-page application configurations and called anyone who was willing to talk to us and provide feedback — journalists, investment bankers, marketing agencies, strategy consultants, lawyers, medical researchers, and academics to name a few.

After months of conversations and testing, we decided to focus in on market researchers — specifically in the marketing agency and strategy consulting space. The two verticals often had similar use cases that appeared to lend themselves to building evidence-based beliefs with Considdr. They often needed well-supported recommendations on trends in market and consumer behavior and they were required to develop expertise quickly across many different subject domains.

In this time, we saw early traction; landed our first paying customers; and began a trial with one of the big five consulting firms. Considdr was also selected as a 2019 MassChallenge Finalist — a highly competitive accelerator program aimed at identifying startups with widespread potential to impact the world. At the end of the year, Considdr was also named a “Built-in-Boston” top 50 startup to watch in 2020.

Clustering Insight: Finding Signal in the Noise (article in progress)

January 2020 — July 2020: Final Days

In 2020, we continued to try to land customers in consulting and marketing agencies. Our runway was tight and we needed to demonstrate more growth through either usage or paying customers in order to have a shot at raising another round of funding. Unfortunately, Considdr struggled to break through in a big way. We had a difficult time proving concrete ROI and as a result couldn’t find the market traction we had initially thought would be there. Once COVID hit, new leads and sales calls understandably began to dry up. Our trial with the big five consulting firm — which went relatively well based on their early feedback — was put on pause.

We attempted to pivot one last time by looking for partnerships with organizations that might value and integrate our technology and data into their products. Two of the largest academic search engines in the world expressed interest, and we built a couple of proof-of-concept applications in which we applied our technology to academic research papers. Both organizations expressed interest in moving forward, but it became clear that any kind of deal that could provide us an infusion of capital would take months if not longer to complete. By the end of the summer, we had completely run out of capital; exhausted our personal savings; and, quite frankly, were mentally, physically, and emotionally spent. It was time to call it quits.

I’m immeasurably proud of everything my team accomplished over the years and can’t express how grateful I am to have had this opportunity. It was an incredible journey.


