Some Thoughts On Maximizing Curation Throughput

This year I am on a journey to dramatically increase my reading, writing, and overall learning. And thus far, it’s been going relatively well. I am discovering just how much amazing content there is to learn from.

In a single day I can listen to a lecture on Portfolio Theory from an MIT professor, watch Khan Academy videos on partial differential equations, hear a top CEO talk about trends in B2B SaaS companies, tap into the stream of consciousness of some of my favorite modern-day philosophers on Twitter, and still have time to do my Sunday chores.

It has never been easier to learn.

However, this abundance of high-value (and free!) content has also created a pretty challenging problem: the sheer scale of the opportunity cost of everything I consume. The FOMO of potentially reading, watching, or listening to something more interesting/informative/insightful is real.

Twitter has been helpful because I get to interact with individuals who are essentially curating the world around me on an almost second-by-second basis. It has become my recommendation engine for which ideas/books/topics I should prioritize. I also maintain a list of individuals whose writing and thoughts I follow and look forward to on a regular basis. I am hoping to publish that list soon, once I refine it some more.

Curators have become incredibly valuable in my overall knowledge aggregation value chain. Because of the way information is organized (and aggregated) by the internet, curation is where the value in the consumption economy has shifted. But not all curators are created equal, and it has been important for me to focus on the individuals and brands with the best Curation Throughput. Curation Throughput is the amount of content an individual consumes AND synthesizes AND reproduces* in a given time frame.

*By reproduces, I mean their ability to turn the synthesized content into a shareable output (blog, video, podcast, etc.) that takes a fraction of the time to consume compared to the original work.

So currently, I am discovering the individuals and organizations with the best Curation Throughput. I could start reading published papers directly from universities and industry, but because my own Curation Throughput is not that high, I would not gain as much leverage as I would by following someone like Tyler Cowen.

There is a slight tradeoff here, though. What is curation but a human filter? Information passes through the brain of one individual who decides the value of each piece and synthesizes it for others to consume. Those consumers in turn filter that content through their own brains, and so on. There is a risk that the initial core message/learning gets diluted. However, I do not necessarily see it that way. Depending on the curator, they may actually be adding value to that piece of information and giving it much-needed context. That’s the highest-leverage part of the current content renaissance on the internet: it allows me to get really close to the brains (for lack of a better term) of some of the most amazing and experienced thinkers.

Another big realization in this whole journey of maximizing learning has been that it is not enough just to consume. Consumption is great, but producing is where more than 90% of the value comes from, for me personally. I find that you have to participate in the process of curation to fully derive value from what others have curated for you. You have to join the conversation and be a part of the living, breathing dialogue. Writing is my way of increasing my Curation Throughput. The more I write, the more I find myself reading and learning, and vice versa. When you read, you end up creating a multitude of endpoints: vectors of thoughts, ideas, and questions that need to connect. Writing allows you to close off some of those endpoints, but in the process you end up creating far more endpoints than you closed. So you go back to reading. It’s a pretty wonderful feedback loop.

I wish I had started earlier, but I am glad I am doing it now :)