Does God Always Want to Heal Us of Our Sicknesses?
What does the Bible say about that?
The Book of Genesis affirms that everything the Lord created was good and beautiful. God has always wanted us to live a life of optimal well-being.
So His original plan, when He destined us to become His sons and daughters through Christ, was for us to live in divine health.