The Problem: Dedekind’s UNreal UNkind cut.

I see problems with both a strictly mathematicians’ approach AND a strictly particle physicists’ approach. Yes, Michael Atiyah’s “Fine Structure Constant” paper is vague in many respects, both mathematically and physically; it is not detailed enough in places. But everybody is vague in their expositions, because that is the nature of modeling ideas. With one side saying “I don’t understand the mathematics” and the other side saying “I don’t understand the physics,” we have a problem here. It is a good discussion to have, but watch out for the move “I don’t understand it, therefore it is not applicable.”
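For readers joining the argument midstream, the constant in question is the dimensionless fine structure constant, which Atiyah’s paper claimed could be derived by purely mathematical means:

$$\alpha = \frac{e^2}{4\pi\varepsilon_0 \hbar c} \approx \frac{1}{137.035999}$$

The measured value is not in dispute; the dispute is over whether any pure mathematics can fix it.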

Sean Carroll says: “Renormalization teaches us …”. To me, this raises a red flag. Renormalization is a mathematical trick used by particle physicists: faced with “things are too complicated,” they choose a narrow band (a gauge), treat it as a “thin enough” 2D Euclidean approximation, and ASSUME that the “real number” constants (mixing angles and/or scale factors) give you precise process information in higher measure spaces (e.g., 3D, orthogonal, simplicial, Lie groups). This reminds me of Nima Arkani-Hamed’s very informative but EQUALLY VAGUE lectures (hand-waving via tensors, assuming “real numbers” and Hermitian products as a foundation) on the “problems with” coming up with a theory of quantum gravity.
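To be fair to the other side, renormalization does make a definite, testable claim about those “real number” constants: they run with scale in a computable way. The standard one-loop running of the QED coupling (a textbook result, quoted here only as an illustration, not as part of Carroll’s or Arkani-Hamed’s arguments) is

$$\alpha(Q^2) = \frac{\alpha(m_e^2)}{1 - \dfrac{\alpha(m_e^2)}{3\pi}\ln\dfrac{Q^2}{m_e^2}}$$

Whether that machinery amounts to more than a well-calibrated “thin enough” approximation is exactly the point under dispute.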

It is good that the mathematicians and the physicists enjoy dismissing each other, so that comparative science and relational complexity have a wide-open lattice to play with. It’s Dedekind’s unreal, unkind cut.
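For readers who missed the pun: a Dedekind cut is the classical construction of the real numbers, a partition of the rationals $\mathbb{Q}$ into two nonempty sets $(A, B)$ with every element of $A$ below every element of $B$ and $A$ having no greatest element. For example, the cut

$$A = \{\, q \in \mathbb{Q} : q \le 0 \ \text{or}\ q^2 < 2 \,\}, \qquad B = \mathbb{Q} \setminus A$$

defines $\sqrt{2}$. The “real numbers” that both camps lean on are built from exactly such cuts, which is presumably what the title is needling.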