Macs are Better for Coding, Sort Of

It seems like every company is jumping on the MacBook Pro bandwagon these days, and it is easy to see why. Macs provide a POSIX-compliant environment without the myriad of distro and desktop options. Linux is great, but who wants to spend time troubleshooting a DNS issue with an unsupported VPN client when the Mac version just works? Of course, Windows is almost always supported, but dealing with virtualization and poor language implementations is a pain when Macs are similar enough to production. Wait, is that a benefit or not?

I recently started a job at a company that had also ridden the Mac wave, at least until shortly before I was hired, when they switched back to Windows. It wasn't terribly surprising, since the company had been a .NET shop. I was the lowly DevOps new hire stuck with the furthest thing from server automation: Windows 10. Sure, it has Ubuntu, but when you're targeting CentOS you might as well be coding on a Mac. That's when I realized what the problem was. I was happy to hear Windows was finally getting a POSIX environment, but there is no replacement for a good production-like environment. I finally put my finger on why my coworkers would have trouble in production: we had a culture of coding locally across different systems. Ansible was the biggest offender. We would spin up a Vagrant instance that included everything except a control VM for Ansible, then run it locally on our own machines across different OSes and Ansible versions, and we ran into bugs as a result.
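One way to catch that kind of drift early is to refuse to run automation when a tool's local version doesn't match what production runs. This is only a sketch: the pinned version "2.9.27" and the ansible example are illustrative, not our actual setup.

```shell
#!/bin/sh
# Sketch: fail fast when a tool's version drifts from the one pinned
# to production. The pin "2.9.27" and the ansible example are made up.
PINNED="2.9.27"

check_pin() {
    # $1 = tool name, $2 = locally reported version, $3 = pinned version
    if [ "$2" != "$3" ]; then
        echo "ERROR: $1 is $2, production runs $3" >&2
        return 1
    fi
    echo "$1 matches production ($3)"
}

# In a real setup you would capture the live version first, e.g.:
#   local_ver=$(ansible --version | head -n1 | awk '{print $2}')
check_pin ansible "2.9.27" "$PINNED"
```

Wiring a check like this into the playbook wrapper means a coworker on a different OS with a newer Ansible gets an error message instead of a subtly different run.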

Test production in production-like systems.

It sounds obvious, and people do live by it, but everyone gets complacent and nobody wants to spin up an entire VM to test a Bash script. It's human nature to find the path of least resistance. I mean, it's Bash. It's probably the most portable tool outside of the Windows ecosystem. Why shouldn't it work? That is, until you take into account that the Mac your coworker uses doesn't ship the GNU tools, so grep arguments are drastically different, or that production is missing git arguments because its version is one release too old. A good test environment pins versions to production, runs the same distribution, and has an equivalent configuration. If you use LDAP for auth in production, spin up an AD server in your test environment with junk users and passwords.
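The grep problem is a good concrete example. macOS ships BSD grep, while most Linux distributions ship GNU grep, and some flags exist only on one of them. A minimal sketch of a defensive script, assuming only POSIX sh and a grep on PATH:

```shell
#!/bin/sh
# Sketch: guard a script against BSD vs GNU grep differences.
# GNU grep identifies itself as "GNU grep" in --version output;
# BSD grep does not, so we branch on that.
if grep --version 2>/dev/null | grep -q 'GNU grep'; then
    echo "GNU grep detected: GNU-only flags like -P may be available"
else
    echo "non-GNU grep: stick to POSIX options"
fi

# The portable route is to use only POSIX-specified options
# (-E, -c, -v, -i, -l, ...), which behave the same on both.
printf 'foo\nbar\n' | grep -E -c '^f'   # count lines starting with f
```

Sticking to the POSIX option set sidesteps the whole class of "works on my Mac" bugs, but the only way to be sure is still running the script on the same distribution production uses.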

That’s not to say we can’t test locally, but we should always have a production-like environment nearby as a sanity check. Even something as trivial as unit tests can be affected by low-level system differences. It’s surprising how even getting a list of files can cause cascading bugs and false negatives due to patch versions of libraries. When I was given a Windows box, I thought I was handed a bunk environment to work in. On the contrary, it forced me to create a valid test environment and evangelize it to the rest of the team.