This is a really interesting thought.
As a system increases in complexity, it inherently faces increased risk. It's easy to agree that if a complex system is not created thoughtfully (or refined through eons of trial and error), it will have a higher risk of failure. Given this, my supposition is that a system can be arbitrarily complex, but only with a corresponding level of thoughtfulness in its design.
So I think the goal should not be to avoid making systems too complex; it should be to create them in a thoughtful manner. If the level of thought or quality of design isn't at least equal to the system's complexity, the system shouldn't be attempted.
However, to my knowledge, there is no agreed-upon way to measure the level of thought applied to the development of a system. There are, on the other hand, some conceivable ways to measure complexity.
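For software, one of those conceivable measures is cyclomatic complexity, which roughly counts the independent paths through a piece of code. Here's a minimal sketch (not the official McCabe tooling, just a rough branch-counting approximation over Python's `ast` module) to make the idea concrete:

```python
import ast

# Rough cyclomatic-complexity estimate: 1 plus the number of
# branching constructs found in the parsed source. This is an
# illustrative approximation, not a full McCabe implementation.
BRANCH_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler,
                ast.BoolOp, ast.IfExp)

def cyclomatic_complexity(source: str) -> int:
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, BRANCH_NODES)
                   for node in ast.walk(tree))

simple = "def f(x):\n    return x + 1\n"
branchy = "def g(x):\n    if x > 0:\n        return x\n    return -x\n"

print(cyclomatic_complexity(simple))   # 1 — straight-line code
print(cyclomatic_complexity(branchy))  # 2 — one branch point
```

The point of the sketch is the asymmetry: a metric like this is mechanical to compute, while there is no comparable number for how much care went into the design.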