.NET Core at the Center of the C# Renaissance
A quick outpouring of my feelings after reading Ian Cooper's analysis of the future of C# and .NET.
I spent my morning reading through the article. I could not just scan it quickly; I had to stop after nearly every paragraph, compare it with my own feelings, and sometimes read it again. Hard facts are harder to digest when you have emotional ties to the topic.
But why do I care? Nowadays I am more involved in methodology- and architecture-related topics. Behavior-driven development is the same on every platform. I could probably dive into any other platform relatively quickly if it were necessary. Is this still the fear of change?
I hope it is not. I hope, rather, that the years I have spent with .NET and with the people working on .NET make me feel responsible not to leave the sinking ship. I don’t think (and never thought) that C# is the best programming language ever, or that .NET is the best platform. But this is what I’ve got to make better, like the rose of the Little Prince. This is what Ian means when he says “We have the ability to create a .NET Renaissance”.
.NET Core is at the center of attention
As Ian's article makes clear, a .NET Renaissance that makes .NET more prominent on the server side stands on two pillars:
- A community that is more welcoming
- .NET Core, which provides a suitable platform for modern server-side development (*nix, CLI, fast, lower entry barrier, etc.)
Ideally, we would have the combination of both: a more welcoming community working on a better .NET Core.
.NET Core is open-source, sure. And the process has improved a lot compared with classic .NET platform development, where we simply “got” things from Microsoft without much community involvement. But .NET Core, as a community platform, is still not welcoming enough, and I see three key reasons for this:
- Poor communication of goals and status
- Focusing too much on a weird kind of “backwards compatibility” (not being brave enough)
- Not taking enough advantage of “outsourcing” work to external OSS libraries
I would like to share a few thoughts on these three points.
Poor communication of goals and status
It would be interesting to poll active .NET developers about the status of .NET Core, asking questions like “Has .NET Core been released already?”, “Has ASP.NET Core been released already?”, “Does ASP.NET Core work exclusively on .NET Core?”, or “Is .NET Core the next version of .NET?”
I am pretty sure that the majority of the folks would answer these wrong. And that is because there is no clear communication on these topics. It would have been necessary to publish a simple and clear statement in the news: “.NET Core is DONE. Use it!” Instead, we got releases of .NET Core with the “tooling” around it still in preview. Whatever that means.
It is hard to see the big picture.
But no wonder: the dependency chain across .NET Core, ASP.NET Core, .NET Standard, and the tooling for all of these is so complex that it is impossible to communicate a simple message about them.
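To give a feel for the moving parts, here is a minimal sketch of what a multi-targeted library project looks like in the new SDK-style format; the project is hypothetical and the target framework monikers and conditional reference are illustrative, not taken from any concrete code base:

```xml
<!-- Hypothetical class library that wants to work "everywhere":
     it targets .NET Standard for new consumers and classic .NET
     for old ones, while the build itself depends on the SDK
     tooling that shipped separately (and long stayed in preview). -->
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFrameworks>netstandard1.6;net45</TargetFrameworks>
  </PropertyGroup>
  <ItemGroup Condition="'$(TargetFramework)' == 'net45'">
    <!-- Classic .NET needs explicit framework assembly references -->
    <Reference Include="System.Net.Http" />
  </ItemGroup>
</Project>
```

Every semicolon in that `TargetFrameworks` list multiplies the compatibility matrix an author has to understand and test, which is exactly why the simple message is so hard to formulate.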
Focusing too much on a weird kind of “backwards compatibility”
About a year ago I started to work on an article. Since I wanted to explore the .NET Core space in order to define the strategy for the .NET Core version of SpecFlow, I thought I would share my findings. I spent several weeks doing the research and writing the article, which grew to 8,000 words. It became so long because I had to explain all the diverse goals and dependencies I mentioned. Finally, I felt I saw the point. I saw the strategy. I saw the reasoning behind the rewrite and all the fundamental changes. But then suddenly a new update was out. And then another one, and another one.
I gave up updating my article (before even publishing it). Not (only) because it would have taken a huge amount of time, but because almost all the conceptual changes were steps backwards: decisions reversed in favor of better backwards compatibility, keeping the legacy we wanted to move away from.
And this is a weird kind of backwards compatibility: no one ever expected classic .NET code to run “as-is” on .NET Core. It looked more like these compatibility changes were introduced to better support existing knowledge, so that code bases could be ported more easily.
Do you still remember Ian’s article? He concluded that the decline we see in .NET relates more to new projects than to existing ones. I believe the classic .NET Framework is “good enough” to carry the existing projects forward. Trying to fulfill their hypothetical needs instead of focusing on new projects was, in my opinion, a mistake.
This is particularly true for ASP.NET Core, which is supposed to be the primary framework on top of .NET Core for server-side projects. I don’t know how much effort has been wasted on making it work on both .NET Core and the classic .NET Framework, but I am sure this duality made it extremely hard to see and communicate its goals clearly. And for the sake of what? For the sake of being able to run it on a .NET platform that is certainly not efficient enough for long-term server-side use. Ehh.
Not enough reliance on external OSS libraries
Having everything included in the .NET Core framework is nice, as it provides higher consistency. But I think that to hit the market earlier, test different approaches, and give projects more options, a successful platform should rely strongly on external (open-source) frameworks, libraries, and tools.
.NET Core has taken a huge step in this direction compared with classic .NET, but not a big enough one. Yes, it uses Newtonsoft’s Json.NET library, and there are probably a couple more similar stories, but the mentality is still the same. I was happy to see xUnit become the de facto .NET Core standard for unit testing. Every Microsoft employee I met told me they used xUnit, but somehow I missed the announcement that “.NET Core (Microsoft) will not provide a default unit test runner, because xUnit and NUnit are just good enough.” Instead, they are now working on the MSTest .NET Core port, which will throw the “market” out of balance again: enterprises will have to choose between the perfectly good xUnit/NUnit and the “official” MSTest. And we all know what they will choose…
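For what it’s worth, xUnit really is “good enough”: a complete test file needs no base class and no ceremony. The class, method, and values below are made up for illustration:

```csharp
using Xunit;

// A minimal xUnit test: the [Fact] attribute alone marks a
// method as a test; Assert.Equal(expected, actual) checks it.
public class CalculatorTests
{
    [Fact]
    public void Adding_two_and_two_gives_four()
    {
        Assert.Equal(4, 2 + 2);
    }
}
```

There is simply no functional gap here that an “official” framework would need to fill.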
You might say this is not a big problem, because xUnit will always have enough users. That is certainly (I hope) true. But think about other potential open-source library authors who see these patterns. They will ask themselves: “Should I invest a lot of time, effort, and energy into something that might suddenly be replaced by a Microsoft one?” And considering that this “suddenly” happens exactly when the project becomes successful and relevant, the answer will more likely be “No” than it should be.
But the constant changes, and the fact that it is so hard to reach the “DONE” stamp, also make it harder for OSS projects to support .NET Core.
I work as an independent trainer/coach for BDD, SpecFlow, and test automation. When I started my business 1.5 years ago, my goal was to finance it through training and coaching so that I could work more on the open-source part of SpecFlow. .NET Core support would be one important area to improve. But when I do have time to work on SpecFlow, I have to face the question: “Should I spend hours discovering what has changed since I last dealt with .NET Core? Should I install all the new updates (with breaking changes), even a new Visual Studio release candidate, just to get started (knowing that the other contributors might not have it installed anyway)? Or shall I just pick something else I can improve right now?” What do you think my answer is?
So heads up, it’s high time to think positively. I will try to find some constructive next steps to make this Renaissance happen. It is not easy, for sure. We all need to help.
Somehow, we have to make .NET Core more welcoming. Without that, it will take much longer to get it widely accepted.
In the meantime, we might miss the window of opportunity for .NET to enter the modern server-side market. So whatever we do, we have to do it quickly.
Whatever it is, I am in.
Gaspar Nagy is the creator of SpecFlow, working as a trainer & coach. Check out his public SpecFlow courses or request an in-house private course for your team. He is a BDD addict, and as such he edits a monthly newsletter about interesting articles, videos and news related to BDD, SpecFlow and Cucumber.