SEO from UX perspective — Part 2: relevant content

Marlena Reinke
Published in Jung von Matt TECH
4 min read · Apr 13, 2023

During a website relaunch, SEO should be part of the project from the very beginning, with the most important basics already defined in the concept phase. In the first part, I described how we can use tools to create a new page structure without risking a loss of ranking. This is the second of three parts: SEO from UX perspective — Part 2: relevant content.

Using keywords correctly, avoiding spam texts and thin content

Once we have an idea of the new structure and know which content belongs on which page, we can start filling in the individual pages. It is important to organize by topics and user intent, not by keywords alone. Keywords are the search terms users enter into a search engine.

The goal of our website is to provide our users with relevant content. Google tries to understand the intention behind a search query in order to present the best-matching results. The better our content meets the user’s search intent, the more likely we are to achieve a top position on Google.

So keywords are the base of SEO, but they should be part of texts that offer substance and added value, not part of spam texts or “thin content”. Such texts often offer no added value, consist of obvious statements and do not make the user any smarter. They are often written only to mention relevant keywords as often as possible. Google has recognized this type of text since the Panda update in 2011, and pages with such texts lose visibility and ranking positions.
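As a rough illustration of what keyword stuffing looks like in numbers, here is a minimal density check. This is only a sketch with a made-up 5% threshold; Google’s actual signals are far more sophisticated than counting words.

```python
import re
from collections import Counter

def keyword_density(text: str, keyword: str) -> float:
    """Share of words in `text` that match `keyword` (case-insensitive)."""
    words = re.findall(r"[a-zA-Z\u00e4\u00f6\u00fc\u00c4\u00d6\u00dc\u00df]+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

def looks_stuffed(text: str, keyword: str, threshold: float = 0.05) -> bool:
    """Flag texts where a single keyword makes up more than ~5% of all words.

    The threshold is an arbitrary illustration, not an official value."""
    return keyword_density(text, keyword) > threshold

spam = "Buy shoes. Cheap shoes. Best shoes. Shoes shoes shoes."
print(looks_stuffed(spam, "shoes"))  # a blatant stuffing example
```

In the spam example, “shoes” makes up two thirds of all words, which no text written for humans would do.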

To use keywords wisely, we use tools that provide us with related keywords. Examples are the Google Keyword Planner, HyperSuggest and Semrush. In this beginner guide, HubSpot shows how exactly to work with keywords, names other (free) tools and explains how to use them.

Preventing duplicated content

Another thing we should avoid is duplicated content. This applies to content within a single page, so that it is not flagged as spam by Google, but most importantly it applies across pages.

Duplicate content makes it difficult for search engines like Google to index the correct page. Because of similar or identical content on different pages, the affected content ranks worse or is even filtered out of the results. To avoid ranking problems caused by duplicate content, each indexed page needs sufficient “unique content”: content created exclusively for that page and found only there.

Sometimes duplicate or very similar content is created and can therefore be reached under several URLs. This is hard to avoid in online stores, for example when a product is displayed in different colors. Google crawls both versions but indexes only one of them, and we cannot control which URL Google chooses. Because only that one page is indexed, the link equity of the other URLs is lost, which can lower our ranking.

To avoid this and to mark duplicate content, there is the so-called “canonical tag”. Our developers place it in the HTML document of the duplicate page, where it points to the standard URL, the “canonical URL”. With this we define which page Google should index.
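In the HTML, the canonical tag is a `<link rel="canonical">` element in the page’s `<head>`. The URLs below are placeholders; in the online-store example, the color variant would point to the product’s standard URL:

```html
<!-- In the <head> of the duplicate page, e.g. the red color variant: -->
<link rel="canonical" href="https://www.example.com/shoes/runner-x/" />
```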

To detect duplicate content, we can also use tools like Screaming Frog here.
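As a minimal sketch of what such a duplicate check does under the hood, we can compare page texts pairwise for similarity. The page texts and the 90% threshold are made up for illustration; real crawlers like Screaming Frog compare much more than raw body text.

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Ratio between 0.0 (completely different) and 1.0 (identical)."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def find_near_duplicates(pages: dict[str, str], threshold: float = 0.9):
    """Return URL pairs whose texts are at least `threshold` similar."""
    urls = list(pages)
    pairs = []
    for i, u in enumerate(urls):
        for v in urls[i + 1:]:
            if similarity(pages[u], pages[v]) >= threshold:
                pairs.append((u, v))
    return pairs

# Hypothetical store pages: two color variants share almost all their text.
pages = {
    "/shoes/runner-x-red": "Runner X, a lightweight running shoe, now in red.",
    "/shoes/runner-x-blue": "Runner X, a lightweight running shoe, now in blue.",
    "/about": "We are a small store founded in 2020.",
}
print(find_near_duplicates(pages))
```

The two color variants are flagged as near-duplicates, which is exactly the case the canonical tag is meant to resolve.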

Illustration: Marlena Reinke, animation: Anja Lindner

Competitor comparison

In order to make our content more relevant than that of our competitors, we can compare ourselves to them. To do this, we identify topics that the competition covers not at all or only superficially, but that users search for, and include them in our content. The keyword analysis shows us where content for certain keywords is missing. Throughout, we should keep the goal of the page and the needs of the target group in mind.
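At its core, this gap analysis is a set difference: topics users search for and competitors cover, minus the topics our own content already handles. A toy sketch with made-up keyword sets:

```python
# All keyword sets are invented for illustration; in practice they come
# from a keyword tool such as the Google Keyword Planner or Semrush.
our_keywords = {"running shoes", "trail shoes"}
competitor_keywords = {"running shoes", "trail shoes", "shoe care", "insoles"}
searched_by_users = {"shoe care", "insoles", "shoe sizing guide"}

# Topics users search for that competitors cover but we do not yet:
content_gap = (competitor_keywords & searched_by_users) - our_keywords
print(sorted(content_gap))  # ['insoles', 'shoe care']
```

Each keyword in the gap is a candidate topic, to be weighed against the goal of the page and the needs of the target group.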

In addition, we can stand out with well-structured information in our content. Emotions can also be conveyed and put to use. Images and videos help users understand content, lighten up complex topics and generate further interest, making it more likely that users keep clicking through our site. Google recognizes this too and may reward us with a better ranking position, because a good user experience also counts for Google.

Conclusion

This process is not easy, as it requires time and experience. It is best to share this work with experienced copywriters. In an agency especially, you are constantly dealing with new topics. We cannot be experts in every field, so we should regularly consult with our clients to have the texts checked for correctness. Together with copywriters, we can review the new texts against the points mentioned above to create a solid base. In the third and final part, I explain the importance of a good user experience.

Here you can read the German version: https://medium.com/jung-von-matt-tech/seo-aus-ux-sicht-teil-2-relevanter-content-8b1acaddb250

Jung von Matt/TECH is a data-driven and technology-focused boutique within the Jung von Matt group. Jung von Matt is — in terms of awards for both creativity and efficiency — the most successful advertising agency group in German-speaking countries. We help our clients and partners to be digital pioneers by using technology plus data to develop digital platforms, products and services that create momentum.
