Thanks for this question; you're not the first to ask, so I'll probably write a longer post at some point, but TL;DR:

I believe that statistics as a field evolved the way it did because of a lack of computational resources. When you don't have fast computers, you have to rely on shorthand calculations to make inferences: you compute summary values, develop heuristics for how those values should behave, and use them to compare populations. If one were developing the field today, I think it would look very different. Fast computers let you work directly with the underlying probability models. The methods that allow this, like MCMC, can be shown to be formally equivalent to traditional approaches where those apply. In an important sense these methods are more fundamental, since they make your assumptions about how the data were generated explicit and let you derive the summary statistics most appropriate for your problem.
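To make this concrete, here's a minimal sketch of working directly with a probability model via a Metropolis sampler (the simplest flavor of MCMC). The setup is entirely hypothetical: data assumed to come from a Normal with unknown mean and known standard deviation 1, a flat prior on the mean, so the posterior is proportional to the likelihood. In this case the posterior mean should land close to the sample mean, the same summary a classical analysis would reach for; the point is that here it falls out of the model rather than being assumed up front.

```python
import math
import random
import statistics

random.seed(0)

# Hypothetical data: assume each point is drawn from Normal(mu, 1)
# with mu unknown. (True mu used for simulation: 0.5.)
data = [random.gauss(0.5, 1.0) for _ in range(200)]

def log_likelihood(mu):
    # Log-likelihood of Normal(mu, 1), up to an additive constant.
    return -0.5 * sum((x - mu) ** 2 for x in data)

def metropolis(n_samples=5000, step=0.1):
    """Minimal Metropolis sampler with a flat prior on mu."""
    mu = 0.0
    samples = []
    for _ in range(n_samples):
        proposal = mu + random.gauss(0.0, step)
        # Accept with probability min(1, posterior ratio),
        # computed on the log scale to avoid overflow.
        if math.log(random.random()) < log_likelihood(proposal) - log_likelihood(mu):
            mu = proposal
        samples.append(mu)
    return samples

samples = metropolis()
burned_in = samples[1000:]  # discard burn-in
posterior_mean = statistics.fmean(burned_in)
```

With a flat prior, `posterior_mean` approximately recovers the sample mean, and the spread of `burned_in` quantifies the uncertainty directly, without appealing to a sampling-distribution argument. Swapping in a different data-generating assumption only requires changing `log_likelihood`.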

Is that helpful? Happy to go into more detail if not.