Anyone can write a program with dependent types for primitive functions involving Peano arithmetic. Now try to prove anything useful with non-deterministic floats and 64- and 32-bit integers. Suddenly every addition is a bomb waiting to explode around INT_MAX. Even a + b > a (where a and b are positive) cannot be relied upon.
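To make the a + b > a point concrete, here is a minimal sketch in Python emulating 32-bit two's-complement wraparound (the helper add_i32 is mine, for illustration); the float case fails the same property for a different reason, rounding:

```python
# Claim under test: for positive a and b, a + b > a.
# It fails for fixed-width integers (wraparound) and for floats (rounding).

INT_MAX = 2**31 - 1

def add_i32(a: int, b: int) -> int:
    """Add two integers with 32-bit two's-complement wraparound semantics."""
    s = (a + b) & 0xFFFFFFFF
    # Reinterpret the low 32 bits as a signed value.
    return s - 2**32 if s >= 2**31 else s

# Integer case: both operands positive, yet the sum wraps negative.
a, b = INT_MAX, 1
print(add_i32(a, b))        # -2147483648
print(add_i32(a, b) > a)    # False

# Float case: b is positive, but too small to survive rounding at a's scale.
x, y = 1e16, 1.0
print(x + y > x)            # False: 1e16 + 1.0 rounds back to 1e16
```

So any proof about real machine arithmetic has to carry overflow and rounding side conditions everywhere, which is exactly where the cost blows up.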
For some reason, the problem that proofs for real-world applications would be astronomically expensive is NEVER mentioned in articles like this. I wonder why.