One small complaint: you mention Moore’s Law as one of the reasons ML is having a renaissance; while it’s true that there’s an exponential growth trend involved, the trend in question is not actually Moore’s Law (which ended sometime between 2005 and 2007, depending on who you ask). Moore’s Law is about the number of transistors that can fit on a single die — a number whose hard limit we’re bumping up against (because of the size of atoms); mechanisms for getting around this limit don’t involve transistors as such, so the moniker doesn’t apply to them.
In popular science writing, Moore’s Law has become shorthand for all forms of exponential growth that affect the performance-per-dollar or performance-per-inch of computer hardware. CS people sometimes use it this way too, but (when it matters) they make distinctions — and there are a number of other “laws” that are essentially similar in nature but apply to different metrics: Kryder’s Law for storage density, Koomey’s Law for performance per watt. And then there’s Engelbart’s Law (of which all these other accelerating-performance laws are arguably special cases): that the performance of human beings increases exponentially in all sorts of contexts and domains.
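All of these “laws” share the same mathematical shape — a fixed doubling period — so they differ only in what metric is doubling and how fast. A quick sketch of that shared form (the doubling periods below are rough, commonly cited figures, not authoritative constants):

```python
# Illustrative sketch: all the exponential hardware "laws" have the
# same form, differing only in the metric and the doubling period.

def projected_growth(years: float, doubling_period_years: float) -> float:
    """Multiplicative improvement after `years`, given a doubling period."""
    return 2 ** (years / doubling_period_years)

# Moore's Law (classic formulation): transistor count, ~2-year doubling.
print(round(projected_growth(10, 2.0)))   # ~32x more transistors per die in a decade

# Koomey's Law: computations per joule, roughly ~1.6-year doubling historically.
print(round(projected_growth(10, 1.6)))
```

The point being: when someone invokes “Moore’s Law,” it usually matters which metric and which doubling period they actually mean.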