CREATING CONTENT IS IRRELEVANT FOR THE MOST PART FOR SEO

Robert James Gabriel
Robert James Gabriel’s Blog
7 min read · Mar 9, 2017
Note: I gave this talk in NYC and in Austin, Texas. If you have a copy of it, that would be great.

Creating content is irrelevant for the most part when it comes to SEO. When people think of SEO, they believe it involves targeting keywords and text on a website, setting your site up to be crawled by search engines, and making it number one on the web.

That’s sort of true. Websites have evolved and become more complex, and so have search engines’ algorithms for deciding how to rank your website.

I am going to say this: a talented or senior front-end engineer knows that there are many browsers. They have different bugs. They implement the same features in a variety of ways. You do not control the upgrade cycle of search engine algorithms or how the browser handles code. An excellent example of this is caniuse.com. It shows the wide range of compatibility issues a frontend developer runs into. When developing the backend, you can be fairly sure it is going to keep working 99% of the time, whereas frontend development can change at the drop of a hat.
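
Because you cannot assume what the client supports, one practical habit is feature detection: check for a capability before using it, and fall back gracefully. Here is a minimal sketch in plain JavaScript; the lazy-image use case and the img[data-src] markup convention are illustrative assumptions on my part, not a fixed recipe.

```javascript
// Feature-detect before using a browser API, instead of assuming
// support. The lazy-image use case and the img[data-src] markup
// convention here are illustrative assumptions.
function lazyLoadImages() {
  const images = document.querySelectorAll('img[data-src]');

  if ('IntersectionObserver' in window) {
    // Supported: only load each image as it scrolls into view.
    const observer = new IntersectionObserver((entries) => {
      entries.forEach((entry) => {
        if (entry.isIntersecting) {
          entry.target.src = entry.target.dataset.src;
          observer.unobserve(entry.target);
        }
      });
    });
    images.forEach((img) => observer.observe(img));
  } else {
    // Not supported: fall back to loading everything up front.
    images.forEach((img) => { img.src = img.dataset.src; });
  }
}

document.addEventListener('DOMContentLoaded', lazyLoadImages);
```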

The rate of mobile phone use on the web is reaching the millions now, which means there are many different types of inputs, screen sizes, display settings, and customization options a frontend developer has to watch out for (e.g. mouse, keyboard, touch, large screens, mobile, tablet, retina displays, screen readers, assistive technology, enlarged text, changed color settings, switching off CSS).

There are a few things that, in my opinion, show the difference between a junior and a senior front-end engineer: SEO, accessibility, and usability.

Search Engines and SEO

SEO is straight-out based on whispers and hints. There is no sure way to get your website first on Google anymore. There used to be. Search engines have changed for the better over time and become more complex. Google, for example, uses over 200 different factors when it comes to ranking.

As you can see, Google’s mission for search is to give you answers before you need them. They do all this within 1/8th of a second.

You want the answer, not trillions of webpages. Algorithms are computer programs that look for clues to give you back exactly what you want.

Some of these experiments come in all shapes and forms, from small projects to more complex areas of development. There are many components to the search process and the results page, and search engines are constantly updating their technologies and systems to deliver better results. Many of these changes involve exciting innovations, such as the Knowledge Graph. This list of projects provides a glimpse into the many different aspects of search.

Answers

Displays immediate answers and information for things such as the weather, sports scores and quick facts.

Autocomplete

Predicts what you might be searching for. This includes understanding terms with more than one meaning.

Books

Finds results out of millions of books, including previews and text, from libraries and publishers worldwide.

Freshness

Shows the latest news and information. This includes gathering timely results when you’re searching specific dates.

Google Instant

Displays immediate results as you type.

Images

Shows you image-based results with thumbnails so you can decide which page to visit from just a glance.

Indexing

Uses systems for collecting and storing documents on the web.

Knowledge Graph

Provides results based on a database of real world people, places, things, and the connections between them.

Mobile

Includes improvements designed specifically for mobile devices, such as tablets and smartphones.

News

Includes results from online newspapers and blogs from around the world.

Query Understanding

Gets to the deeper meaning of the words you type.

Refinements

Provides features like “Advanced Search,” related searches, and other search tools, all of which help you fine-tune your search.

SafeSearch

Reduces the amount of adult web pages, images, and videos in your results.

Search Methods

Creates new ways to search, including “search by image” and “voice search.”

Site & Page Quality

Uses a set of signals to determine how trustworthy, reputable, or authoritative a source is. (One of these signals is PageRank, one of Google’s first algorithms, which looks at links between pages to determine their relevance; see the sketch after this list.)

Snippets

Shows small previews of information, such as a page’s title and short descriptive text, about each search result.

Spelling

Identifies and corrects possible spelling errors and provides alternatives.

Synonyms

Recognizes words with similar meanings.

Translation and Internationalization

Tailors results based on your language and country.

Universal Search

Blends relevant content, such as images, news, maps, videos, and your personal content, into a single unified search results page.

User Context

Provides more relevant results based on geographic region, Web History, and other factors.

Videos

Shows video-based results with thumbnails so you can quickly decide which video to watch.
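
To make the PageRank idea mentioned under “Site & Page Quality” concrete, here is a heavily simplified power-iteration sketch. The tiny link graph, the damping factor, and the iteration count are all made-up illustrations; Google’s production signal is far more sophisticated than this toy model.

```javascript
// A heavily simplified PageRank, computed by power iteration.
// The link graph and parameters are illustrative assumptions.
const links = {
  a: ['b', 'c'], // page "a" links to pages "b" and "c"
  b: ['c'],
  c: ['a'],
};

function pageRank(graph, damping, iterations) {
  const pages = Object.keys(graph);
  const n = pages.length;

  // Start with an even rank for every page.
  let rank = {};
  pages.forEach((p) => { rank[p] = 1 / n; });

  for (let i = 0; i < iterations; i++) {
    // The (1 - damping) / n term models a surfer jumping to a random page.
    const next = {};
    pages.forEach((p) => { next[p] = (1 - damping) / n; });

    // Each page passes its rank, split evenly, to the pages it links to.
    pages.forEach((page) => {
      const out = graph[page];
      out.forEach((target) => {
        next[target] += damping * (rank[page] / out.length);
      });
    });

    rank = next;
  }
  return rank; // higher rank = more "authoritative" in this toy model
}

console.log(pageRank(links, 0.85, 20));
```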

Google’s algorithms are constantly changing. These changes begin as ideas in the minds of their engineers. They take these ideas, run experiments, analyze the results, tweak them, and run them again and again.

All of this starts to show the many aspects involved in developing a website. Because in my mind, the line between a website and a web app is now gone. Some think I am mad for thinking this. One of the major points search engines consider when indexing is the quality of the site. That includes the code. All of it.

You need to ensure that your content can be accessed without JavaScript, so the bot can scrape your content. The same goes for accessibility and usability, because the level of acceptable performance is subjective from engine to engine. I like to think of it as if the bot were a dyslexic reader trying to make sense of a messy page. It is best to keep things short and straight to the point, with fewer errors.
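
As a minimal sketch of that idea, assume the real content is server-rendered plain HTML, and JavaScript only layers enhancements on top. The article selector and the “back to top” enhancement below are hypothetical examples, not a prescription.

```javascript
// Progressive enhancement sketch. The article content is assumed to
// be server-rendered plain HTML (<article>…</article>), so crawlers
// and users without JavaScript still get everything. JavaScript only
// layers extras on top; nothing essential depends on it.
document.addEventListener('DOMContentLoaded', () => {
  const article = document.querySelector('article');
  if (!article) return; // no content found: fail silently, page still works

  // Hypothetical enhancement: add a smooth "back to top" link.
  const topLink = document.createElement('a');
  topLink.href = '#';
  topLink.textContent = 'Back to top';
  topLink.addEventListener('click', (event) => {
    event.preventDefault();
    window.scrollTo({ top: 0, behavior: 'smooth' });
  });
  article.appendChild(topLink);
});
```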

CODE

“Hey, it is just a website, I can use procedural JavaScript code, and it will not matter.” WRONG. It will. So much more work is required client-side now that we are past the days of getting away with procedural JavaScript code. The line between web apps and websites is gone. Websites should be treated as software.

Side note: I think some sites are too reliant on jQuery as a whole.

Between social logins, API calls, analytics, tracking software, and JavaScript effects (the list goes on), you can very easily end up with bloated and slow JavaScript.

Architecting your solution is a necessity. Be it object-oriented or modular, JavaScript built around an MVC pattern is gaining increasing traction, simply because testing is more easily done with tools like QUnit. Correctly setting up your JavaScript will give you a blazing-fast site and less risk of a knock-on bubble effect. As stated, that means a better quality site, which means better ranking factors. JavaScript has a steep learning curve to get right.
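
As a sketch of what that structure can look like, here is a small self-contained module with a matching QUnit test. The Analytics module, its API, and the test are hypothetical examples I made up for illustration, not code from any particular site.

```javascript
// analytics.js — a small, self-contained module (revealing module
// pattern) instead of procedural code scattered across the page.
// The module name and its API are hypothetical examples.
const Analytics = (() => {
  const events = [];

  const track = (name, data) => {
    events.push({ name, data, at: Date.now() });
  };

  const count = () => events.length;

  // Only the public API leaks out; `events` stays private.
  return { track, count };
})();

// analytics.test.js — because the module is isolated, QUnit can test
// it without spinning up the rest of the site.
QUnit.test('track() records an event', (assert) => {
  Analytics.track('page_view', { path: '/' });
  assert.equal(Analytics.count(), 1, 'one event was recorded');
});
```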

Another factor that leads on from JavaScript design is CSS. CSS, no matter how you build it, is painful, be it with Stylus, LESS, or SASS. It gets messy once your site gets big enough. It’s very static. You hold a lot in your head and have to develop strategies for regression testing effectively, so you do not go around in circles checking whether you have broken something. Another important thing here is the feeling of immediate loading: minify your CSS and HTML, and inline your critical CSS, to give a good experience.

Before the browser can render content it must process all the style and layout information for the current page. As a result, the browser will block rendering until external stylesheets are downloaded and processed, which may require multiple roundtrips and delay the time to first render.
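
One common way around that blocking behavior (a sketch, and only one of several approaches): inline the critical above-the-fold rules in a style tag in the head, then attach the full stylesheet from JavaScript after the first paint. The /css/main.css path below is a hypothetical placeholder.

```javascript
// Load the full stylesheet without blocking the first render.
// Critical above-the-fold rules are assumed to be inlined in a
// <style> tag in the <head>; "/css/main.css" is a hypothetical path.
function loadStylesheet(href) {
  const link = document.createElement('link');
  link.rel = 'stylesheet';
  link.href = href;
  document.head.appendChild(link);
}

// Defer the full stylesheet until the page has had a chance to paint.
window.addEventListener('load', () => loadStylesheet('/css/main.css'));
```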

Final word

Behind your simple page of results is a complex system, carefully crafted and tested, to support more than one hundred billion searches each month.

This is even more true when it comes to the frontend engineers creating the website or app. Front-end development is fragmenting.

I believe 80% of SEO improvements can be achieved by having a rock-solid front-end engineer. Just a few years ago, that role was a template developer with some CSS. Now it is JavaScript, and even some backend languages, as the frontend and backend merge, e.g., Node.js.

From my experience, being good at SEO is an indicator of how skilled you are as a whole as a frontend engineer. It shows problem-solving, intimate knowledge of core web technologies rather than products (JavaScript rather than jQuery, for example), and solid knowledge of web performance, best practices, scalability, usability, accessibility, SEO practices, and how algorithms work, from a low level to a high level.

So creating content is kind of irrelevant, yet still important, for SEO. Just having a skilled frontend engineer will sort out 80% of SEO problems, and that is a fact. A well-crafted site should not run into any problems when being indexed.
