This sounds like every data science project ever. It starts with a neural network that's promised to hit 90% accuracy within six months. Two years later, it's 70% accurate 50% of the time. Then you drop it and write a custom algorithm to do the task, while the data science team complains about the data being impure.
In all seriousness, it would be interesting to know the domain context and how the requirements or stack were altered for the rewrite.