<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:cc="http://cyber.law.harvard.edu/rss/creativeCommonsRssModule.html">
    <channel>
        <title><![CDATA[Stories by Aleksander Molak on Medium]]></title>
        <description><![CDATA[Stories by Aleksander Molak on Medium]]></description>
        <link>https://medium.com/@aleksander-molak?source=rss-f390f1bdd353------2</link>
        <image>
            <url>https://cdn-images-1.medium.com/fit/c/150/150/1*jhdMjWCHyo5wUmzpR09eng.jpeg</url>
            <title>Stories by Aleksander Molak on Medium</title>
            <link>https://medium.com/@aleksander-molak?source=rss-f390f1bdd353------2</link>
        </image>
        <generator>Medium</generator>
        <lastBuildDate>Fri, 15 May 2026 08:42:12 GMT</lastBuildDate>
        <atom:link href="https://medium.com/@aleksander-molak/feed" rel="self" type="application/rss+xml"/>
        <webMaster><![CDATA[yourfriends@medium.com]]></webMaster>
        <atom:link href="http://medium.superfeedr.com" rel="hub"/>
        <item>
            <title><![CDATA[Causal AI at KDD 2024 — Why Companies That Won’t Jump on the Causal Train Now Will Have a Harder…]]></title>
            <description><![CDATA[<div class="medium-feed-item"><p class="medium-feed-image"><a href="https://medium.com/data-science/causal-ai-at-kdd-2024-why-companies-that-wont-jump-on-the-causal-train-now-will-have-a-harder-bdd0671543cf?source=rss-f390f1bdd353------2"><img src="https://cdn-images-1.medium.com/max/2600/1*BdwEdrlke0hP3FeQUS63wA.jpeg" width="4000"></a></p><p class="medium-feed-snippet">Building Causal Expertise is a Process, Not an Event</p><p class="medium-feed-link"><a href="https://medium.com/data-science/causal-ai-at-kdd-2024-why-companies-that-wont-jump-on-the-causal-train-now-will-have-a-harder-bdd0671543cf?source=rss-f390f1bdd353------2">Continue reading on TDS Archive »</a></p></div>]]></description>
            <link>https://medium.com/data-science/causal-ai-at-kdd-2024-why-companies-that-wont-jump-on-the-causal-train-now-will-have-a-harder-bdd0671543cf?source=rss-f390f1bdd353------2</link>
            <guid isPermaLink="false">https://medium.com/p/bdd0671543cf</guid>
            <category><![CDATA[causal-inference]]></category>
            <category><![CDATA[machine-learning]]></category>
            <category><![CDATA[ai]]></category>
            <category><![CDATA[causality]]></category>
            <category><![CDATA[deep-dives]]></category>
            <dc:creator><![CDATA[Aleksander Molak]]></dc:creator>
            <pubDate>Mon, 30 Sep 2024 14:58:02 GMT</pubDate>
            <atom:updated>2024-10-01T16:32:48.274Z</atom:updated>
        </item>
        <item>
            <title><![CDATA[“I’m quite capable of great enjoyment, and I’ve had a great life.”]]></title>
            <description><![CDATA[<div class="medium-feed-item"><p class="medium-feed-image"><a href="https://aleksander-molak.medium.com/im-quite-capable-of-great-enjoyment-and-i-ve-had-a-great-life-55e0164c11b8?source=rss-f390f1bdd353------2"><img src="https://cdn-images-1.medium.com/max/1280/1*26JAdzIuAva25TYexVGuMA.jpeg" width="1280"></a></p><p class="medium-feed-snippet">Two paragraphs on Daniel Kahneman</p><p class="medium-feed-link"><a href="https://aleksander-molak.medium.com/im-quite-capable-of-great-enjoyment-and-i-ve-had-a-great-life-55e0164c11b8?source=rss-f390f1bdd353------2">Continue reading on Medium »</a></p></div>]]></description>
            <link>https://aleksander-molak.medium.com/im-quite-capable-of-great-enjoyment-and-i-ve-had-a-great-life-55e0164c11b8?source=rss-f390f1bdd353------2</link>
            <guid isPermaLink="false">https://medium.com/p/55e0164c11b8</guid>
            <category><![CDATA[daniel-kahneman]]></category>
            <category><![CDATA[psychology]]></category>
            <category><![CDATA[causality]]></category>
            <category><![CDATA[aleksander-molak]]></category>
            <category><![CDATA[personal]]></category>
            <dc:creator><![CDATA[Aleksander Molak]]></dc:creator>
            <pubDate>Fri, 29 Mar 2024 09:26:22 GMT</pubDate>
            <atom:updated>2024-03-29T09:26:22.122Z</atom:updated>
        </item>
        <item>
            <title><![CDATA[Three Mind-Expanding Books for Causal and Non-Causal Data Scientists to Read in 2024]]></title>
            <link>https://aleksander-molak.medium.com/three-mind-expanding-books-for-causal-and-non-causal-data-scientists-ef6b7466b5bb?source=rss-f390f1bdd353------2</link>
            <guid isPermaLink="false">https://medium.com/p/ef6b7466b5bb</guid>
            <category><![CDATA[data-science]]></category>
            <category><![CDATA[book-recommendations]]></category>
            <category><![CDATA[modeling]]></category>
            <category><![CDATA[causality]]></category>
            <category><![CDATA[statistics]]></category>
            <dc:creator><![CDATA[Aleksander Molak]]></dc:creator>
            <pubDate>Sat, 16 Dec 2023 08:02:02 GMT</pubDate>
            <atom:updated>2023-12-16T18:45:48.591Z</atom:updated>
            <content:encoded><![CDATA[<p>Hint: Knowing something about the data generating process will almost always put you ahead of the pack.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*YtHeiX5KRyGQHfGbmsyjMQ.png" /><figcaption><strong>Fig 0. </strong>Three mind-expanding books for causal and non-causal data scientists. Image by yours truly.</figcaption></figure><p>The Data Science Revolution was largely driven by the idea that we can look at the data, find patterns and leverage them to our benefit.</p><p>These ideas were fueled by the rising hopes that growing computational resources and unprecedented data availability would allow us to automate our decisions, scientific discovery and business analyses.</p><p>This turned out to work. At least to an extent.</p><p>The famous investor and mathematician <a href="https://en.wikipedia.org/wiki/Jim_Simons_(mathematician)">Jim Simons</a> was able to successfully exploit predictive techniques in his investment strategies. Neural networks powered (and still power) some (partially) autonomous vehicles.</p><p>But there’s also a second, less visible part to this story.</p><p>Jim Simons also <strong>lost</strong> a lot of money using the predictive paradigm, and autonomous vehicles often fail when facing even slightly unusual conditions.</p><p>The three books we discuss in this blog post have one thing in common.</p><p>They all show how understanding the data generating process (rather than just looking at the patterns in the data) can help us make better decisions and enrich our understanding of the world.</p><p>Each does it differently.</p><h3>1. “The Book of Why: The New Science of Cause and Effect”</h3><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*i4wRlXvUFX5o2OLrw70JFA.jpeg" /><figcaption><strong>Fig 1.</strong> <a href="https://amzn.to/47W1WQL">“The Book of Why” by Judea Pearl &amp; Dana Mackenzie</a>. 
Image by yours truly.</figcaption></figure><p>This book, written by Judea Pearl and Dana Mackenzie, is an absolute classic when it comes to causality and causal inference. It has been eye-opening for an entire generation of data scientists, researchers and practitioners.</p><p>Pearl shows the “why” behind “why” — why it is important to ask <em>why questions</em> and why it’s critical that we understand which methods to use in order to answer them (hint: understanding the structure of the data generating process is crucial).</p><p>Most of my <a href="https://causalbanditspodcast.com/">podcast</a>’s guests either started their journey into causality with this book or read it later in their career.</p><p>Highly recommended to anyone interested in improving their data skills.</p><p>🟡 <a href="https://amzn.to/47W1WQL">“The Book of Why” by Judea Pearl &amp; Dana Mackenzie</a> (print, Kindle, audiobook)</p><h3>2. “Antifragile: Things That Gain from Disorder”</h3><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*AZDB_HldvTCKZwiRT3xg-Q.jpeg" /><figcaption><strong>Fig 2.</strong> <a href="https://amzn.to/3GJyVvT">“Antifragile” by Nassim Nicholas Taleb</a> (print, Kindle, audiobook). Image by yours truly.</figcaption></figure><p>From cherry-picking to linear models and a lack of understanding of fat-tailed distributions, Nassim Taleb is a fierce critic of common practices in science and industry.</p><p>One of the theses in the book is that trying to control randomness in complex systems — although it might seem beneficial in the short term — can badly backfire in the longer run.</p><p>In the book, Taleb shares his belief that talking about causes and effects might not be meaningful in the case of complex, non-linear and potentially cyclic systems.</p><p>I am not convinced by his pessimistic position here. 
We know that meaningful interventions in dynamical systems are possible, but we need to know what we’re doing (Naftali Weinberger explains it <a href="https://youtu.be/UQ8j-DEkB98?si=6DME-Fhw88Cxz_Mj&amp;t=2436">here</a>).</p><p>What Taleb calls “naive interventions” can lead to dramatic, unwanted consequences.</p><p>Other important concepts in the book are <a href="https://en.wikipedia.org/wiki/Fat-tailed_distribution">fat</a>- and <a href="https://en.wikipedia.org/wiki/Long_tail">long</a>-tailed distributions, which can easily derail any traditional learning algorithm trained on finite-sized samples.</p><p>Fat and long tails are critically important in most complex areas, from <a href="https://www.fintechna.com/articles/the-long-tail-of-financial-markets/">finance</a> to <a href="https://youtu.be/rHM0mBXubig?si=yt3jKtqUvqsN8UyA&amp;t=2148">autonomous driving</a>.</p><p>A great and mind-expanding read.</p><p>🟡 <a href="https://amzn.to/3GJyVvT">“Antifragile” by Nassim Nicholas Taleb</a> (print, Kindle, audiobook)</p><h3>3. <strong>“Chaos: Making a New Science”</strong></h3><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*vDVpu2UvVQl-3WhgLASdZg.jpeg" /><figcaption><strong>Fig 3.</strong> <a href="https://amzn.to/4aj3YvR">“Chaos” by James Gleick (print, Kindle, audiobook)</a>. Image by yours truly.</figcaption></figure><p>In the age of machine learning, we got used to the thought that if something is unpredictable, it must lack structure — and that maybe collecting measurements of more variables could make it more predictable.</p><p>The truth is that chaotic systems can produce seemingly random behavior very systematically, based on a set of (possibly) very simple deterministic rules.</p><p>James Gleick’s book is an excellent introduction to <a href="https://en.wikipedia.org/wiki/Chaos_theory">chaos theory</a>, dynamical systems and <a href="https://en.wikipedia.org/wiki/Complex_system">complexity</a>. 
His engaging style and love for a good story (from Oppenheimer to Lorenz and much more) make it a great read!</p><p>If you were ever interested in fractals, emergence, complexity or chaos — this is a must-read.</p><p>🟡 <a href="https://amzn.to/4aj3YvR">“Chaos” by James Gleick (print, Kindle, audiobook)</a></p><p>Each of these books brings a unique perspective to the table, and each presents an idea that questions a broadly accepted status quo.</p><p>Taken together, these books are a great inspiration to question the assumptions that our contemporary data culture takes for granted.</p><p>Sometimes just fitting models to data is not enough to answer the questions that matter most to us.</p><p>Understanding this is power.</p><p>I hope these mind-expanding books will inspire you as much as they inspired me (to say the least, I wouldn’t have written <a href="https://amzn.to/3uUG4Xq">my book</a> if I hadn’t read “The Book of Why”).</p><p>Finally, I’d love to learn from you — what are your favorite mind-expanding books?</p><p>PS: The best way to let me know if you liked this story is by clapping. You can clap more than once 👏🏼👏🏼👏🏼</p><p><a href="https://aleksander-molak.medium.com/">Aleksander Molak - Medium</a></p><blockquote>This article contains affiliate links — if you decide to purchase a book using one of them, a small part of the revenue will help me create more free content for you.</blockquote>]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Jane the Discoverer: Enhancing Causal Discovery with Large Language Models (Causal Python)]]></title>
            <description><![CDATA[<div class="medium-feed-item"><p class="medium-feed-image"><a href="https://medium.com/data-science/jane-the-discoverer-enhancing-causal-discovery-with-large-language-models-causal-python-564a63425c93?source=rss-f390f1bdd353------2"><img src="https://cdn-images-1.medium.com/max/1900/1*swa9bZ-HAwWZo4NFar8MYg.png" width="1900"></a></p><p class="medium-feed-snippet">A practical guideline to LLM-enhanced causal discovery that minimizes the risks of hallucinations (with Python code)</p><p class="medium-feed-link"><a href="https://medium.com/data-science/jane-the-discoverer-enhancing-causal-discovery-with-large-language-models-causal-python-564a63425c93?source=rss-f390f1bdd353------2">Continue reading on TDS Archive »</a></p></div>]]></description>
            <link>https://medium.com/data-science/jane-the-discoverer-enhancing-causal-discovery-with-large-language-models-causal-python-564a63425c93?source=rss-f390f1bdd353------2</link>
            <guid isPermaLink="false">https://medium.com/p/564a63425c93</guid>
            <category><![CDATA[editors-pick]]></category>
            <category><![CDATA[large-language-models]]></category>
            <category><![CDATA[causality]]></category>
            <category><![CDATA[machine-learning]]></category>
            <category><![CDATA[causal-discovery]]></category>
            <dc:creator><![CDATA[Aleksander Molak]]></dc:creator>
            <pubDate>Sun, 22 Oct 2023 15:33:44 GMT</pubDate>
            <atom:updated>2024-07-12T15:39:17.079Z</atom:updated>
        </item>
        <item>
            <title><![CDATA[Causal Python: Five Novel Causal Ideas At NeurIPS 2023]]></title>
            <description><![CDATA[<div class="medium-feed-item"><p class="medium-feed-image"><a href="https://medium.com/data-science/causal-python-five-novel-causal-ideas-at-neurips-2023-13bb68c5ed56?source=rss-f390f1bdd353------2"><img src="https://cdn-images-1.medium.com/max/2600/1*v6qUfE1n6f4QuxLHZ8lliA.jpeg" width="4592"></a></p><p class="medium-feed-snippet">New exciting ideas that marry causality with generative modeling, conformal prediction and topology.</p><p class="medium-feed-link"><a href="https://medium.com/data-science/causal-python-five-novel-causal-ideas-at-neurips-2023-13bb68c5ed56?source=rss-f390f1bdd353------2">Continue reading on TDS Archive »</a></p></div>]]></description>
            <link>https://medium.com/data-science/causal-python-five-novel-causal-ideas-at-neurips-2023-13bb68c5ed56?source=rss-f390f1bdd353------2</link>
            <guid isPermaLink="false">https://medium.com/p/13bb68c5ed56</guid>
            <category><![CDATA[python]]></category>
            <category><![CDATA[machine-learning]]></category>
            <category><![CDATA[causality]]></category>
            <category><![CDATA[causal-discovery]]></category>
            <category><![CDATA[causal-inference]]></category>
            <dc:creator><![CDATA[Aleksander Molak]]></dc:creator>
            <pubDate>Sun, 24 Sep 2023 14:13:08 GMT</pubDate>
            <atom:updated>2023-09-24T16:35:30.684Z</atom:updated>
        </item>
        <item>
            <title><![CDATA[Causal Python — Elon Musk’s Tweet, Our Googling Habits & Bayesian Synthetic Control.]]></title>
            <description><![CDATA[<div class="medium-feed-item"><p class="medium-feed-image"><a href="https://medium.com/data-science/causal-python-elon-musks-tweet-our-googling-habits-bayesian-synthetic-control-187114fc4aa8?source=rss-f390f1bdd353------2"><img src="https://cdn-images-1.medium.com/max/1280/1*fU5OsSQt8i7yuCqeJmcY0w.jpeg" width="1280"></a></p><p class="medium-feed-snippet">Applying Synthetic Control with a Bayesian twist to quantify the impact of a tweet (using CausalPy)</p><p class="medium-feed-link"><a href="https://medium.com/data-science/causal-python-elon-musks-tweet-our-googling-habits-bayesian-synthetic-control-187114fc4aa8?source=rss-f390f1bdd353------2">Continue reading on TDS Archive »</a></p></div>]]></description>
            <link>https://medium.com/data-science/causal-python-elon-musks-tweet-our-googling-habits-bayesian-synthetic-control-187114fc4aa8?source=rss-f390f1bdd353------2</link>
            <guid isPermaLink="false">https://medium.com/p/187114fc4aa8</guid>
            <category><![CDATA[bayesian-statistics]]></category>
            <category><![CDATA[machine-learning]]></category>
            <category><![CDATA[python]]></category>
            <category><![CDATA[causality]]></category>
            <category><![CDATA[causal-inference]]></category>
            <dc:creator><![CDATA[Aleksander Molak]]></dc:creator>
            <pubDate>Sun, 08 Jan 2023 17:18:10 GMT</pubDate>
            <atom:updated>2023-06-08T08:03:56.523Z</atom:updated>
        </item>
        <item>
            <title><![CDATA[Causal Python — Level Up Your Causal Discovery Skills in Python (2024)]]></title>
            <description><![CDATA[<div class="medium-feed-item"><p class="medium-feed-image"><a href="https://medium.com/data-science/beyond-the-basics-level-up-your-causal-discovery-skills-in-python-now-2023-cabe0b938715?source=rss-f390f1bdd353------2"><img src="https://cdn-images-1.medium.com/max/1280/1*-t66eXjm-XhVw0lcLUx13A.jpeg" width="1280"></a></p><p class="medium-feed-snippet">&#x2026;and unlock the best Causal Discovery package in Python!</p><p class="medium-feed-link"><a href="https://medium.com/data-science/beyond-the-basics-level-up-your-causal-discovery-skills-in-python-now-2023-cabe0b938715?source=rss-f390f1bdd353------2">Continue reading on TDS Archive »</a></p></div>]]></description>
            <link>https://medium.com/data-science/beyond-the-basics-level-up-your-causal-discovery-skills-in-python-now-2023-cabe0b938715?source=rss-f390f1bdd353------2</link>
            <guid isPermaLink="false">https://medium.com/p/cabe0b938715</guid>
            <category><![CDATA[causality]]></category>
            <category><![CDATA[causal-discovery]]></category>
            <category><![CDATA[python]]></category>
            <category><![CDATA[deep-dives]]></category>
            <category><![CDATA[causal-inference]]></category>
            <dc:creator><![CDATA[Aleksander Molak]]></dc:creator>
            <pubDate>Sun, 11 Dec 2022 20:46:20 GMT</pubDate>
            <atom:updated>2024-03-24T06:32:25.143Z</atom:updated>
        </item>
        <item>
            <title><![CDATA[Yes! Six Causality Books That Will Get You From Zero to Advanced (2024)]]></title>
            <link>https://aleksander-molak.medium.com/yes-six-causality-books-that-will-get-you-from-zero-to-advanced-2023-f4d08718a2dd?source=rss-f390f1bdd353------2</link>
            <guid isPermaLink="false">https://medium.com/p/f4d08718a2dd</guid>
            <category><![CDATA[causality]]></category>
            <category><![CDATA[book-recommendations]]></category>
            <category><![CDATA[causal-inference]]></category>
            <category><![CDATA[python]]></category>
            <category><![CDATA[machine-learning]]></category>
            <dc:creator><![CDATA[Aleksander Molak]]></dc:creator>
            <pubDate>Mon, 17 Oct 2022 18:52:55 GMT</pubDate>
            <atom:updated>2026-02-09T07:19:49.862Z</atom:updated>
            <content:encoded><![CDATA[<h4>…and you can get 3 of them completely for <strong>free</strong> if you want! 🤗</h4><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*GnoCzWQ8BTuADJx4aByIoQ.jpeg" /><figcaption>The six causal books. Image by yours truly.</figcaption></figure><blockquote>Update 2025: You can now check my free course on causality in the context of experimentation and modern machine learning here: <a href="https://causalsecrets.com/">https://causalsecrets.com/</a></blockquote><h4>Introduction</h4><p>Recent years have brought a sharp increase in interest in causal methods, both in the research community and in industry. One of the challenges that people entering the field face is a <strong>lack of standardized resources </strong>and <strong>terminology</strong>. Causality research has been scattered and divided into sub-fields for decades. One consequence of this is that many newcomers feel <strong>overwhelmed</strong> and <strong>confused </strong>when they enter the field.</p><p>I was in the same spot when I started.</p><p>Today, I have my own <a href="https://amzn.to/48Dc8xO">book on causality in Python</a>, where I summarized my journey and translated the most important causal concepts into Python code, but let’s not get ahead of ourselves.</p><p>In this post I want to share with you <strong>six causal books</strong> that allowed me to <strong>structure </strong>and <strong>speed up</strong> my <strong>causal journey</strong>. I hope they will help you achieve the same!</p><p>And yes, you got it right, you can get <strong>3 of these books</strong> for <strong>free</strong>, <strong>100% legally</strong>, if you choose to! 😯</p><p>For every book I’ll provide you with <strong>5</strong> <strong>bullet points</strong> highlighting the most important <strong>topics </strong>covered in the book. 
I’ll also provide you with <strong>links</strong> to get a <strong>copy</strong> and/or a <strong>free copy</strong> of a book if one is available.</p><p>Let’s start!</p><h4>1. Starting Strong: “<a href="https://amzn.to/3z339Gy">The Book of Why</a>”</h4><figure><img alt="Figure 1. “The Book of Why” by Judea Pearl and Dana Mackenzie. Image by yours truly." src="https://cdn-images-1.medium.com/max/1024/1*6mZ5SEA1YAEKKHSytVzUog.jpeg" /><figcaption><strong>Figure 1.</strong> “The Book of Why” by Judea Pearl and Dana Mackenzie. Image by yours truly.</figcaption></figure><p><a href="https://amzn.to/3z339Gy">“The Book of Why”</a> by the godfather of modern causality, <a href="https://twitter.com/yudapearl"><strong>Judea Pearl</strong></a>, and his co-author, former mathematician <strong>Dana Mackenzie</strong>, is a starting point for many, and not by accident.</p><p>You can think of it as a comprehensive introduction to the field. It’s a mixture of theory, history, storytelling, math and practical exercises. If that sounds like a lot, don’t worry, it’s really well-structured and fun to read! The authors cover the history of causality and the basic theory behind Pearl’s <strong><em>do</em>-calculus</strong>, and share inspiring examples of applications of causal inference in real life and in science. We also get many useful comparisons between the <strong><em>do</em>-calculus</strong> and <strong>potential outcomes</strong> frameworks. These will not only let you learn the basics of the latter, but also grasp the elementary vocabulary that will help you orient yourself in the broader causal world. 
Last but not least, the narrative is built around the concept of <strong><em>The Ladder of Causation</em></strong> — a powerful framework that helps the reader clearly distinguish between associative, interventional and counterfactual modes of analysis.</p><p>Another great thing about this book is that it’s <strong>available </strong>as an <a href="https://amzn.to/3yNcsdP"><strong>audiobook</strong></a>!</p><p><strong>What you’ll learn:</strong></p><ul><li><strong>History of causality</strong></li><li><strong>The Ladder of Causation</strong></li><li><strong>Basics of <em>do</em>-calculus</strong></li><li><strong>Selected concepts of the potential outcomes framework</strong></li><li><strong>Useful applications of causal inference</strong></li></ul><p><strong>Get a copy:</strong></p><ul><li><a href="https://amzn.to/3z339Gy"><strong>Paper </strong>(Amazon)</a></li><li><a href="https://amzn.to/3z339Gy"><strong>Kindle </strong>(Amazon)</a></li><li><a href="https://amzn.to/3SCqVQZ"><strong>Audiobook</strong> (Audible)</a></li></ul><blockquote>Amazon links in this article are affiliate links. For every purchase made using these links, I’ll receive a small amount of the transaction fee that will support my writing. At the same time, it does not change the price for you. Thank you!</blockquote><h4>2. Your Turn: “<a href="https://amzn.to/3TfYYzn">Causal Inference in Statistics — A Primer</a>”</h4><figure><img alt="Figure 2. “Causal Inference in Statistics: A Primer” by Pearl, Glymour and Jewell. Image by yours truly." src="https://cdn-images-1.medium.com/max/1024/1*C5oUd78D255FBGALCufHcA.jpeg" /><figcaption><strong>Figure 2.</strong> “Causal Inference in Statistics: A Primer” by Pearl, Glymour and Jewell. Image by yours truly.</figcaption></figure><p>When I finished reading “The Book of Why”, I wanted more! At the same time, I was not sure which direction to take. I asked my network on Twitter for their recommendations. 
Interestingly, the first reply I received was from <a href="https://twitter.com/yudapearl">Judea Pearl</a>, who recommended <a href="https://amzn.to/3TfYYzn">“Causal Inference in Statistics: A Primer”</a> to me. Whose recommendation could be better?</p><p>It’s a great book with an amazing approach to teaching. It’s relatively short — just a little over 120 pages — yet very content-rich, including exercises.</p><p>The book is divided into 4 parts:</p><ul><li>A review of basic statistics and probability,</li><li>An introduction to graphical models,</li><li>A discussion on interventions,</li><li>A discussion on counterfactuals.</li></ul><p>The book will give you really <strong>solid foundations</strong>, especially if you work through the exercises! I want to add that the theory and practice are mostly limited to discrete and linear cases, yet I see this as an advantage. It allows you to <strong>focus </strong>on what’s <strong>important </strong>from the <strong>causal point of view</strong> rather than being distracted by complex math or fancy estimation methods.</p><p>The book also covers more advanced topics like <strong>mediation</strong>, <strong>direct</strong> and <strong>indirect effects</strong>, and the probability of sufficiency and necessity, and teaches you how to <strong>compute counterfactuals</strong> by <strong>hand </strong>(how cool is that?).</p><p>I read it on my <a href="https://amzn.to/3DiKaub">Kindle</a> but a paper version is <a href="https://amzn.to/3TfYYzn">also available</a>.</p><p><strong>What you’ll learn:</strong></p><ul><li><strong>Graphical models</strong></li><li><strong>Interventions as graph surgery</strong></li><li><strong>Back-door and front-door criteria and inverse probability weighting</strong></li><li><strong>Counterfactuals</strong></li><li><strong>Mediation, probability of necessity and probability of sufficiency</strong></li></ul><p><strong>Get a copy:</strong></p><ul><li><a href="https://amzn.to/3TfYYzn"><strong>Paper 
</strong>(Amazon)</a></li><li><a href="https://amzn.to/3Sj08sx"><strong>Kindle</strong> (Amazon)</a></li></ul><h4>3. Get More Perspective: “<a href="https://amzn.to/43GO22F">Elements of Causal Inference</a>”</h4><figure><img alt="Figure 3. “Elements of Causal Inference” by Peters, Janzing and Schölkopf. Image by yours truly." src="https://cdn-images-1.medium.com/max/1024/1*b2gQBR2esmY0N1jtsn56Aw.jpeg" /><figcaption><strong><em>Figure 3.</em></strong><em> “Elements of Causal Inference” by Peters, Janzing and Schölkopf. Image by yours truly.</em></figcaption></figure><p>After finishing “<a href="https://amzn.to/3TfYYzn">Causal Inference in Statistics: A Primer</a>”, I was hungry for more! In particular, I wanted to learn more about causal discovery.</p><p><a href="https://amzn.to/3slIcCU">“Elements of Causal Inference”</a> by Peters and colleagues is the first book on our list that goes beyond traditional causal inference and extends to <strong>causal structure learning</strong> (aka <strong>causal discovery</strong>). This might sound strange, because the term <strong><em>causal inference</em></strong> is written in glaring, large yellow letters on the cover. The reason is that the authors use the term in a <strong>broader sense</strong> that also <strong>includes causal discovery</strong> (remember what we said about standardized terminology in the intro? That’s just the tip of the iceberg!).</p><p>The book covers the differences between <strong>purely statistical</strong> and <strong>causal models</strong>, assumptions for causal inference, <strong>bi-variate</strong> and <strong>multivariate models</strong>, <strong>semi-supervised</strong> learning, <strong>reinforcement </strong>learning, <strong>domain adaptation</strong> and <strong>time series</strong> models, all seen through a causal lens.</p><p>The math goes beyond discrete and linear cases, and you’ll meet integrals and derivatives here and there. 
The authors share some examples and insights from the world of <strong>physics </strong>— a nice addition to the popular examples from the fields of social sciences and epidemiology.</p><p>Is the book a <strong>complete handbook</strong> for causal <strong>inference </strong>and <strong>discovery</strong>? As the authors state in the introduction — <strong>no</strong>; rather, their “<em>personal taste influenced the choice of topics</em>” (Peters et al., 2017, p. xii), and in my opinion this makes the book <strong>really unique</strong>!</p><p>Before we conclude, let me share two more thoughts with you. If you’re looking for a book full of real-world use cases and solutions to practical problems — you won’t find it here. If, on the other hand, you aim to deepen your understanding of the <strong>mechanics of causal inference</strong> and <strong>discovery</strong>, in particular in relation to <strong>machine learning</strong>, this might be a really good shot! 🤘🏼</p><p>If you’re not sure, don’t worry! 
You can <strong>get this book for free</strong> 😯 in <strong>PDF</strong> format and check if it’s a good fit for you (link below).</p><p><strong>What you’ll learn:</strong></p><ul><li><strong>Theory behind </strong>(some of the) <strong>causal discovery methods</strong></li><li><strong>Causal inference &amp; discovery for bi-variate models</strong></li><li><strong>Causal inference &amp; discovery for multivariate models</strong></li><li><strong>Causality vs episodic reinforcement learning</strong></li><li><strong>Causality and time series</strong></li></ul><p><strong>Get a copy:</strong></p><ul><li><a href="https://amzn.to/3TfYYzn"><strong>Paper </strong>(Amazon)</a></li><li><a href="https://amzn.to/3Sh1hAU"><strong>Kindle</strong> (Amazon)</a></li><li><a href="https://library.oapen.org/bitstream/id/056a11be-ce3a-44b9-8987-a6c68fce8d9b/11283.pdf"><strong>Free PDF </strong>(OAPEN)</a></li></ul><p><a href="https://towardsdatascience.com/causal-kung-fu-in-python-3-basic-techniques-to-jump-start-your-causal-inference-journey-tonight-ae09181704f7">Causal Python: 3 Simple Techniques to Jump-Start Your Causal Inference Journey Today</a></p><h4>4. Deep Dive: “<a href="https://amzn.to/3gvg4Lf">Causality — Models, Reasoning and Inference</a>”</h4><figure><img alt="Figure 4. “Causality: Models, Reasoning and Inference” (2nd Ed.) by Judea Pearl. Image by yours truly." src="https://cdn-images-1.medium.com/max/1024/1*MZijORAdlb1NKlekwMGhTw.jpeg" /><figcaption><strong><em>Figure 4.</em></strong><em> “Causality: Models, Reasoning and Inference” (2nd Ed.) by Judea Pearl. Image by yours truly.</em></figcaption></figure><p>Pearl’s “<a href="https://amzn.to/3gvg4Lf">Causality: Models, Reasoning and Inference</a>” brings over 400 pages of causal deep dives. 
The book covers graphical models, <strong><em>d</em>-separation</strong>, Bayesian causal models, structural causal models, structural equation models (SEM), <strong>model identification, assumptions behind causal inference</strong>, the complete rules of <strong><em>do</em>-calculus</strong>, in-depth discussions on <strong>interventions</strong> and <strong>counterfactuals</strong>, the <strong>probability of causation</strong> and more.</p><p>In addition, the book discusses a bit of <strong>causal discovery</strong>. Chapter 2 covers two <strong>structure learning algorithms</strong> proposed by Pearl and Verma: <strong>IC</strong> and <strong>IC*</strong>. In many places in the book, you can find comparisons between the graph-based approach to causality and Rubin’s <strong>potential outcomes</strong> framework, which allow you to deepen your understanding of the (inter)relations between the two.</p><p>The last part contains <strong>over 60 pages</strong> of reflections and <strong>discussions</strong> with readers.</p><p>All of this adds up to a very <strong>comprehensive resource</strong> on causality that you can use as your <strong>go-to reference</strong> on the topic ⚡⚡⚡</p><p><strong>What you’ll learn:</strong></p><ul><li><strong>Assumptions behind causal inference</strong></li><li><strong><em>Do</em>-calculus</strong> (in-depth)</li><li><strong>Causal discovery</strong> (limited scope)</li><li><strong>Probability of causation</strong></li><li><strong>Interventions and counterfactuals</strong> (in-depth)</li></ul><p><strong>Get a copy:</strong></p><ul><li><a href="https://amzn.to/3gvg4Lf"><strong>Paper</strong> (Amazon)</a></li><li><a href="https://amzn.to/3VMz1ZS"><strong>Kindle</strong> (Amazon)</a></li></ul><h4>5. The World of Econometrics: “<a href="https://amzn.to/3MOINqp">Causal Inference — The Mixtape</a>”</h4><figure><img alt="Figure 5. “Causal Inference: The Mixtape” by Scott Cunningham. Image by yours truly."
src="https://cdn-images-1.medium.com/max/1024/1*j89S6VOXwSV4BmgPDghiRg.jpeg" /><figcaption><strong><em>Figure 5.</em></strong><em> “Causal Inference: The Mixtape” by Scott Cunningham. Image by yours truly.</em></figcaption></figure><p>In the mood for something more practical?</p><p>Scott Cunningham’s “<a href="https://amzn.to/3MOINqp">Causal Inference — The Mixtape</a>” is the first book on the list whose main focus is on <strong>real-world applications</strong> of causal inference methods. The book provides us with a ton of <strong>great practical examples</strong> of applied causal inference from the fields of economics, social policy, epidemiology and more.</p><p>The narrative is enriched with frequent references to hip-hop culture and quotes from hip-hop artists (my favorite example is a quote from <a href="https://en.wikipedia.org/wiki/Chance_the_Rapper"><strong>Chance the Rapper</strong></a> used to explain how to find good instruments when using the <a href="https://en.wikipedia.org/wiki/Instrumental_variables_estimation"><strong>Instrumental Variables</strong> technique</a> ♥️). Each section of the book is accompanied by <strong>Stata</strong> and <strong>R</strong> code snippets.
<strong>Python</strong> code is available in the <a href="https://github.com/scunning1975/mixtape">book’s repository</a> and in the online version of the book (link below).</p><p>The main focus of the book is on methods popular in contemporary econometrics: <strong>regression discontinuity</strong>, <strong>instrumental variables</strong>, <strong>difference-in-differences</strong> and the <strong>synthetic control estimator</strong>. The book contains just enough math to give you a <strong>solid understanding</strong> of the discussed methods. Not too much, not too little.</p><p>The book covers both — randomized and natural — <strong>experiments</strong> and provides us with a comprehensive overview of the <strong>potential outcomes</strong> framework.</p><p>🤫 Psst! Scott Cunningham also hosts a <strong>popular <a href="https://causalinf.substack.com/p/mixtape-the-podcast">podcast</a></strong> on causality. You can find it <a href="https://causalinf.substack.com/p/mixtape-the-podcast"><strong>here</strong></a>.</p><p><strong>What you’ll learn:</strong></p><ul><li><strong>Natural experiments</strong></li><li><strong>Potential outcomes</strong></li><li><strong>Regression discontinuity</strong></li><li><strong>Instrumental variables</strong></li><li><strong>Difference-in-differences &amp; synthetic control estimator</strong></li></ul><p><strong>Get a copy:</strong></p><ul><li><a href="https://amzn.to/3MOINqp"><strong>Paper</strong> (Amazon)</a></li><li><a href="https://amzn.to/3TEwNdc"><strong>Kindle</strong> (Amazon)</a></li><li><a href="https://mixtape.scunning.com/"><strong>Free Online Book</strong></a></li></ul><h4>6. A Unifying Framework? <a href="https://www.hsph.harvard.edu/miguel-hernan/causal-inference-book/">“What If?”</a></h4><figure><img alt="Figure 6. “Causal Inference: What If?” by Hernán and Robins. Image by yours truly."
src="https://cdn-images-1.medium.com/max/1024/1*nKEeqC1x8RiW3t2Vpi1I9w.jpeg" /><figcaption><strong><em>Figure 6.</em></strong><em> “Causal Inference: What If?” by Hernán and Robins. Image by yours truly.</em></figcaption></figure><p>Last but not least, the sixth book I want to recommend to you comes from Harvard’s <strong>Miguel Hernán</strong> and <strong>James Robins</strong>. Both authors are seasoned researchers and well-known figures in the world of epidemiology.</p><p>The book is divided into three parts:</p><ul><li>Causal inference without models</li><li>Causal inference with models</li><li>Causal inference from complex longitudinal data</li></ul><p>Out of the six, this is probably the most balanced book in terms of how much space is dedicated to the <strong>graph-based</strong> vs the <strong>potential outcomes</strong> framework.</p><p>The authors provide us with great insights into <strong>interactions</strong> in the context of <strong>interventions</strong>, <strong>selection bias</strong> and more.</p><p>Part of the book’s uniqueness lies in its discussions of <strong>structural nested models</strong>, <strong>causal survival analysis</strong> and <strong>causal effects of time-varying treatments</strong>. It’s a great read if you want to broaden your horizons!</p><p>The book comes with code in <strong>SAS</strong>, <strong>Stata</strong>, <strong>R</strong>, <strong>Python</strong> ♥️ and <strong>Julia</strong>.
Links to all the repositories are available on the <a href="https://miguelhernan.org/whatifbook"><strong>book’s website</strong></a>.</p><p>Currently, the print version is not available.</p><p><strong>What you’ll learn:</strong></p><ul><li><strong>Interactions</strong></li><li><strong>Selection bias</strong></li><li><strong>Structural nested models</strong></li><li><strong>Causal survival analysis</strong></li><li><strong>Causal effects of time-varying treatments</strong></li></ul><p><strong>Get a copy:</strong></p><ul><li><a href="https://miguelhernan.org/whatifbook"><strong>Free PDF</strong></a></li></ul><h3>Wrapping It Up!</h3><p>In this post, we discussed <strong>six causal books</strong> that will get you from beginner to advanced in causality. Each book offers something unique that you cannot find in the others. Three of the books mentioned in this article can be read <strong>for free</strong> — either online or as a PDF.</p><p>If you want to <strong>jump-start</strong> your <strong>causal journey</strong> in Python <strong>today</strong>, check out <a href="https://medium.com/towards-data-science/causal-kung-fu-in-python-3-basic-techniques-to-jump-start-your-causal-inference-journey-tonight-ae09181704f7"><strong>this post</strong></a>.</p><p><strong>Good luck</strong> with your <strong>causal journey 💪</strong> and let me know your thoughts in the comments and/or reach out on <a href="https://www.linkedin.com/in/aleksandermolak/"><strong>LinkedIn</strong></a>!</p><blockquote><strong>Interested in causality and Causal Machine Learning in Python?</strong></blockquote><blockquote>Subscribe to the email list to get <strong>exclusive free content on causality</strong> and updates on my <strong>upcoming</strong> book on <strong>Causality in Python</strong>: <a href="https://causalpython.io"><strong>https://causalpython.io</strong></a></blockquote><p>❤️ <em>Interested in getting more content like this?
Join using this link:</em></p><p><a href="https://aleksander-molak.medium.com/membership">Join Medium with my referral link - Aleksander Molak</a></p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=f4d08718a2dd" width="1" height="1" alt="">]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Causal Python — 3 Simple Techniques to Jump-Start Your Causal Inference Journey Today]]></title>
            <description><![CDATA[<div class="medium-feed-item"><p class="medium-feed-image"><a href="https://medium.com/data-science/causal-kung-fu-in-python-3-basic-techniques-to-jump-start-your-causal-inference-journey-tonight-ae09181704f7?source=rss-f390f1bdd353------2"><img src="https://cdn-images-1.medium.com/max/2003/1*76onYxRL_E3-VTqyPoWJ0g.jpeg" width="2003"></a></p><p class="medium-feed-snippet">Learn 3 techniques for causal effect identification and implement them in Python without losing months, weeks or days for research</p><p class="medium-feed-link"><a href="https://medium.com/data-science/causal-kung-fu-in-python-3-basic-techniques-to-jump-start-your-causal-inference-journey-tonight-ae09181704f7?source=rss-f390f1bdd353------2">Continue reading on TDS Archive »</a></p></div>]]></description>
            <link>https://medium.com/data-science/causal-kung-fu-in-python-3-basic-techniques-to-jump-start-your-causal-inference-journey-tonight-ae09181704f7?source=rss-f390f1bdd353------2</link>
            <guid isPermaLink="false">https://medium.com/p/ae09181704f7</guid>
            <category><![CDATA[programming]]></category>
            <category><![CDATA[causal-inference]]></category>
            <category><![CDATA[dowhy]]></category>
            <category><![CDATA[python]]></category>
            <category><![CDATA[causality]]></category>
            <dc:creator><![CDATA[Aleksander Molak]]></dc:creator>
            <pubDate>Tue, 27 Sep 2022 19:02:13 GMT</pubDate>
            <atom:updated>2023-06-08T08:25:41.637Z</atom:updated>
        </item>
        <item>
            <title><![CDATA[Understanding Contrastive Representation Learning through Alignment and Uniformity on the…]]></title>
            <description><![CDATA[<div class="medium-feed-item"><p class="medium-feed-image"><a href="https://aleksander-molak.medium.com/understanding-contrastive-representation-learning-through-alignment-and-uniformity-on-the-6e0763187c00?source=rss-f390f1bdd353------2"><img src="https://cdn-images-1.medium.com/max/1448/0*bvv6QU1aG6hSc2Mm" width="1448"></a></p><p class="medium-feed-snippet">A Polish translation of an article, that we wrote with Mike Erlihson, PhD as a part of #DeepNightLearners series.</p><p class="medium-feed-link"><a href="https://aleksander-molak.medium.com/understanding-contrastive-representation-learning-through-alignment-and-uniformity-on-the-6e0763187c00?source=rss-f390f1bdd353------2">Continue reading on Medium »</a></p></div>]]></description>
            <link>https://aleksander-molak.medium.com/understanding-contrastive-representation-learning-through-alignment-and-uniformity-on-the-6e0763187c00?source=rss-f390f1bdd353------2</link>
            <guid isPermaLink="false">https://medium.com/p/6e0763187c00</guid>
            <category><![CDATA[python]]></category>
            <category><![CDATA[representation]]></category>
            <category><![CDATA[contrastive-learning]]></category>
            <category><![CDATA[deep-learning]]></category>
            <category><![CDATA[unsupervised-learning]]></category>
            <dc:creator><![CDATA[Aleksander Molak]]></dc:creator>
            <pubDate>Mon, 02 May 2022 18:59:40 GMT</pubDate>
            <atom:updated>2022-06-28T05:00:35.595Z</atom:updated>
        </item>
    </channel>
</rss>