Defying Limits, or, What Was Once Meant for Poignancy

I’ve always loathed Twitter. So that moment a little while ago made for a pretty good day.

Now, to be fair, Twitter’s been used for a lot of good. Plenty of people have been helped through it: the quick dissemination of Amber Alerts, certain news reports, ad hoc safety checks (e.g. after the relatively recent Paris attacks), and so on. This is to say nothing of the numerous tweets that have moved me intellectually and philosophically, piqued my curiosity, pushed me toward further research, and so forth. Unfortunately, such content makes up maybe 1% (a generous percentage, I should add) of everything therein.

See, what was once a platform that could be seen as championing poignancy and conciseness soon devolved into a radioactive, carcinogenic pit of racism (and every other type of -ism), containing exabytes’ worth of useless opinions, hashtag journalism, hashtag activism, and a slew of other things typical of my generation (as well as, tellingly, of older and younger ones; we’re all guilty of some kind of stupidity), to say nothing of the vast amounts of misinformation spread thanks to the seeming death of journalistic integrity (including, but not limited to, a lack of fact-checking and due diligence). Antivaxxers may have existed before the dawn of the tweet, but fuck me three ways from Tuesday if they haven’t used the platform to proselytize their utter bilge. Flat Earthers are inexplicably regaining some sort of traction. Flat Earthers. An untold number of tweets are about as useful and intellectually stimulating as the average news article’s comments.

And we know what a cesspool those are.

I took a course a few years back, an additional qualification on top of my base teaching degree, titled Librarianship, Part 1, and part of the course was figuring out how to use social media platforms to educate, to stimulate learning and the acquisition of information. I made two accounts, one for the course and another for my own personal use; both are pretty inactive now. Sifting through the tweets of various news outlets, educational organizations, foundations, etc. made me realize that a) news aggregators like Feedly were better for the former, and b) of the organizations I subscribed to for educational purposes, I found myself using less than 25% of what I’d have to parse through. Which, fine. b’s not really all that much of a problem. My realization was rooted in a simple point: I’m not exactly sure how something like Twitter can effectively be used (in a simple way, mind you) for education. After all, we like simplicity.

Still, I want to take a moment to defend Twitter as a platform based on a number of accounts I’ve seen, one specifically: it documented World War 2 in real time, looking at events big and small, and posted them as the days went on. The project ran for a lengthy stretch (I’d like to say it’s still going, but it may very well have been 6+ years since I’ve had a chance to see it; bonus points and years if the account included the Second Sino-Japanese War, but I doubt it) and served as an excellent way to see how one of the most notorious, well-known wars in the Western world would look unfolding over the course of 6 or so years. It’d be hard to qualify myself as having grown up in a time of war when you juxtapose the Gulf War, the Balkan Wars, Iraq, Afghanistan, and a slew of other wars, both bloody and filled with war crimes, against something we’ve spent some 60 years glorifying and immortalizing; seriously, by now, everyone’s killed at least one Nazi in their lives, real or digital. I’d always found that account brilliant, as it filled in so many of the little gaps that’ve formed between 1939 and 1945.

Besides, any chance I get to talk about Free France versus Vichy France, or the importance of the French Revolution, I’ll take it.

I only take a moment to defend it because there has been some good. That’s why I’m not very vocal about my Twittercisms. There’s been good. But also, less than good.

Twitter crippled language skills. It led to the rise of emoji (I have my qualms with emoji, though they amount to little more than personal preference: frankly, I’m not down with trying to divine what symbols mean in context and depth, as though I’m reading an alethiometer. I prefer the precision of language, though the irony of that statement doesn’t escape me). It became another platform for widespread abuse in a number of forms. And, well, for marketing.

That last one’s a huge bias, I’ll freely admit. I don’t particularly like marketing. I don’t believe people need most of the bullshit they’re fooled into thinking they do. I don’t believe in the concept of selling an idea, or being sold on how I should feel. Not everything needs to be a commodity.

I’d go on about calling the pound sign, or number sign, a hashtag. I’d also go on about how whole generations don’t know it’s a symbol that represents a number. But I won’t.

I didn’t like word counts when writing essays in university, and I’ve never liked character limits on spurts of thought since. While many might be proud they can easily spew out 140 characters without having to painstakingly count each and every one of ’em, I can’t, nor will I ever care to. The things we say don’t need limits. They don’t need to be digestible. And for someone working in a field where one of the current mantras is stimulating critical thinking and expression, such limits can be damaging to how one states something.

Thus, cue the celebratory flugelhorns.

Maybe I’m being unfair. Maybe I’ve simply never liked the fact that masturbatory egotism’s become mainstream. That so many people who use Twitter never learned the lesson that while their experience and existence are valid, they’re not valuable. That everyone may be unique, but they’re sure as hell not special.

Maybe I’m being too critical. Then again, a little criticality goes a long way.

All I know is that things are about to get a bit more interesting.