Rethinking the Current SEO Keyword or Keyphrase Framework
In recent discussions of Search Engine Optimization (SEO) frameworks, a controversial issue has been whether the keyword or keyphrase optimization structure is consistent with natural human language. On the one hand, those more optimistic about the tool argue that it brings clarity to computer-to-human interactions. From this perspective, since such tools optimize reach and visibility, resulting in better customer and user engagement, it would be contradictory to assume humans do not relate to the information in natural ways of communicating. Therefore, they insist the framework engages users.
On the other hand, others argue that the SEO keyword framework promotes redundancy by introducing unnecessary disruption to the natural flow of human language. Some among them see a trend in which online content begins to look and feel the same. In the words of Onskul from SEMrush (August 18, 2021), one of the main proponents of this view, “content across the web has started to look and sound the same” (para. 1). Defenders of the current framework respond that, though this sameness is a challenge, it does not undercut the overwhelming merits of the current SEO keyword structure; moreover, they argue, some tools help reduce the tendency.
In sum, the issue is whether the current SEO keyword framework is counterproductive to human communication on the web, or whether the redundancy of the current structure undermines natural human language, changing human language as we know it. Another way to frame the issue is to ask whether the natural language processing (NLP) behind computer-mediated SEO content respects natural human language. How robust are the query structures in handling the complexity of natural human language?
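To make the question about query structures concrete, consider a minimal, hypothetical sketch in Python. It contrasts rigid exact-phrase matching, which a keyword-centric framework encourages, with a looser token-and-synonym overlap score that tolerates natural paraphrase. The functions, synonym table, and sample document are all invented for illustration and do not represent any real search engine's algorithm.

```python
def exact_match(query: str, document: str) -> bool:
    """Match only if the literal keyphrase appears verbatim in the document."""
    return query.lower() in document.lower()

def token_overlap(query: str, document: str, synonyms: dict) -> float:
    """Score by the fraction of query tokens found in the document,
    counting any listed synonym of a token as a match."""
    doc_tokens = set(document.lower().split())
    hits = 0
    for token in query.lower().split():
        candidates = {token} | synonyms.get(token, set())
        if candidates & doc_tokens:
            hits += 1
    return hits / max(len(query.split()), 1)

# Toy synonym table, purely illustrative.
SYNONYMS = {"cheap": {"affordable", "inexpensive"}, "laptops": {"notebooks"}}

doc = "We review affordable notebooks for students."
print(exact_match("cheap laptops", doc))              # False: literal phrase absent
print(token_overlap("cheap laptops", doc, SYNONYMS))  # 1.0: both tokens matched via synonyms
```

A writer who paraphrases naturally ("affordable notebooks") is invisible to the exact matcher but fully visible to the overlap scorer, which is the gap a more human-centered framework would need to close.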
My view is that the current SEO keyword or keyphrase framework needs a better, human-centered replacement to equip digital content scholars, marketers, and communicators for more robust SEO practices. In addition, it must include improved query structures that respect the semantic and semiotic complexities of language.
Though I concede that such a change would involve a radical redesign of the NLP and data structures of search engines, and may lead to temporarily higher costs for advertising, marketing, and optimization tools, I still maintain that in the long run businesses and society would be better for it. For example, we would preserve the richness of natural human language; otherwise, a crucial aspect of culture would be shaped in ways determined solely by computers.
Some might object that computer-mediated interactions must, after all, respect the logic of the hardware and software tools for efficient delivery of information. Google anticipated this challenge too, hence its 2018 BERT project, which seeks to improve its software's NLP by leveraging advances in AI.
Others, who seem to see society along the lines of quantifiable information or data, might strongly oppose an alternate framework because it would disrupt the current data structure. I would reply that disruptive technologies have consistently improved content and communication online, including the current practice, which is less than two decades old. We cannot settle for the technologies of the past while there are possibilities for improvement.
The issue is important because unless we properly develop a keyword and keyphrase structure that incorporates the various contexts that resonate with natural human language, we will end up with a dystopia in the digital ecosystem. Such a structure would pose a greater risk of disrupting the common-sense rules of human language.