Plates Need Spinning, Most Broken & Biggest Bang For Buck?
Yesterday was more-or-less about systems-checks on a major site’s https switch-over, and about starting my first real next-step thinking, now that the post-one-year-here integrity pangs are hitting me. It’s often the same tension: what you have to do for today versus what you know you should be doing for tomorrow. At some point during the daily grind, it becomes obvious what you should really be doing to solve a bigger problem, one whose solution is more valuable to the people who pay your paycheck than the things you’re doing today. All the better, you can continue to deliver everything you do today, if only they buy into some little investment in you to take it to the next step, because usually all it takes is time.
And so, gauging the ambition-level of my next step is important now. Not only the ambition-level, but also the selection of my tools and hardware/software/service “stack”. I am fending off obsolescence in the most bad-ass way I can imagine, one that is genuinely good for me and good for my employers, that addresses the integrity issues, and that helps me actually do the right things for the right reasons, which is what being a Peter Drucker-style modern information and knowledge worker is all about. And here’s the first big trick, which in my eyes is almost bigger than Python.
Be comfortable in almost any modern Linux “server”. By this, I mean get used to those directory locations that are common to most any modern Unix or Linux. Generally know what and where they are, and why. Know a few commands like “cd” and “ls” so you can get around and look at things. Master a text editor that will always be there. This usually means vim, which will also let you basically get by with vi, which will almost certainly be there if vim isn’t. Don’t require a graphical desktop and a mouse or touchscreen to be productive. Get by over a serial connection with an old-school terminal program, if it gets you to a Unix/Linux terminal session, where you’ll be immediately comfortable.
It underlies everything. Information Tech underlies everything. Unix/Linux underlies almost all modern popular, mainstream tech, from your Apple Watch to China’s fastest super-computer to the $5 Raspberry Pi computer, and everything shy of the next computer evolution, which will be more Neural Network, more talking to an every-OS-is-unique sorta buddy, than it is individual Unix/Linux machines.
But until then, know the Unix/Linux machines. They’ll probably be used in everything that’s not ridiculously custom or cost-reduced into a specialized micro-controller. And even then, specialized micro-controllers are going to become more and more like a tiny PC of today, so that anyone doing new work for them can already automatically take advantage of them without a year of learning some custom system. Think of what we have today becoming smaller and cheaper, and merely processor nodes in some larger network. Datacenters in a shoebox, with several layers of off-site redundancy not only being possible, but cheap, secure and wise… well, maybe just cheap.
Those who know how to wire this stuff up and control it… well, robot armies and that whole thing. Miracle materials will be discovered and brought into our everyday lives… every day. Energy will gradually become free, and getting off the grid and forging your own way in this world, perhaps with a truly mobile home, are all in-our-lifetime possibilities.
I teach this to my daughter. I draw very clear lines between probably never in her lifetime (portals) and absolutely within her lifetime (human cloning, convincingly smart machines, etc.). I love it. Not only do I get to continue to forge my own tools and navigate my own way through the new SEO frontier, which I consider fun and full of several embedded worthwhile missions (evangelizing Python, vim, etc.), I get to talk to both you and my daughter about these things for differing reasons, always fun and useful to myself (introspection), which only helps me to navigate my next step better.
Plates to spin? Check my automated reports. Check the post-https-switchover JIRA items for high priority follow-up emails. Check that site’s performance in Google Analytics to ensure no fires are brewing. Make sure the re-submission as a news source takes place. Anything else?
What’s most broken? Although I collect the GA/GSC data every day, I don’t post-process it for insights. I also have a particular high-priority report that requires this same GSC work. Not having that round of work, which is almost the whole backbone of my “next” system, is what’s most broken. And it is exactly the same problem as my reports not necessarily running successfully every day in their Debian-style /etc/cron.[daily|hourly] directories, with home-spun bash-based logging and no notifications when something breaks. I can fix this problem while keeping my systems-independence, being my own devops person, and perchance racking up a few more skills and SEO insights along the way. Being a developer is good in SEO.
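As a concrete starting point, here is a minimal sketch of what that cron wrapper could look like in Python instead of home-spun bash: run the job, log the outcome, and actually notify somebody when it breaks. Everything here (the job names, the `notify` stub) is my own placeholder, not anything that exists in my systems yet:

```python
import logging
import subprocess
from datetime import datetime

# Add filename="cron-jobs.log" here to log to a file instead of stderr.
logging.basicConfig(
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(message)s",
)


def notify(name, result):
    """Placeholder alert hook: wire this to smtplib, a webhook, whatever."""
    print(f"ALERT: job {name!r} broke (exit {result.returncode})")


def run_job(name, argv):
    """Run one scheduled job, log the outcome, and notify on failure."""
    started = datetime.now()
    result = subprocess.run(argv, capture_output=True, text=True)
    if result.returncode == 0:
        logging.info("%s succeeded in %s", name, datetime.now() - started)
    else:
        logging.error("%s failed (exit %s): %s", name, result.returncode,
                      result.stderr.strip())
        notify(name, result)
    return result.returncode
```

Drop a call to run_job() into whatever scheduler wins, and the bash-based logging problem and the no-notifications problem disappear together.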
Where’s the biggest bang for the buck? That daily processing of GSC data into insights, for sure! Replace the old Tableau dashboard with something you envision as the next-generation HitTail type of thing, deployable in-house and able to be baked in piecemeal to the already impressive home-grown systems that exist here.
Pulling data is good in SEO.
And my next step, post-processing data to extract silver-platter insights, is also good.
Being your own devops person in this whole process, capable of working independently (though using various services, of course) is most excellent.
And visualizing it all in an easy fashion, in Bokeh under Python under Jupyter Notebook, is good. Being able to migrate the stuff that should become automated into… into… well, not cron if it might become complex jobs, that’s for sure. Know how to do that, but then… well, that’s part of today’s job, isn’t it? The stuff I REALLY want to look at today: Celery or Schedule?
No real reason to do Celery, I think, just in order to “stay compatible” with our group’s systems. But getting rid of the brittleness of the cron scheduling thing without having to back-end it with a database would be nice too.
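And that’s exactly what the Schedule library is: everything stays in one Python process, no broker, no database, and something like `schedule.every().day.at("06:30").do(job)` plus a run_pending() loop is roughly the entire API. To show why that idea is so small and easy to reason about, here is a stdlib-only sketch of the same mechanism; the class and method names are mine, not the Schedule library’s:

```python
import time


class RecurringJob:
    """A tiny in-process recurring job, in the spirit of the Schedule library."""

    def __init__(self, interval_seconds, func):
        self.interval = interval_seconds
        self.func = func
        # First run is one full interval from now.
        self.next_run = time.monotonic() + interval_seconds

    def run_pending(self, now=None):
        """Run the job if it is due; return True if it ran, else False."""
        now = time.monotonic() if now is None else now
        if now >= self.next_run:
            self.func()
            self.next_run = now + self.interval
            return True
        return False


# A real main loop would look like:
#   while True:
#       job.run_pending()
#       time.sleep(1)
```

The trade-off versus cron: nothing persists across restarts, so the process has to be long-running and kept alive by something (systemd, supervisor, take your pick). But that is exactly the systems-independence I’m after, and no database required.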