August 4, 2021

How to fix keyword stuffing

Relevant keywords, used in a sensible and natural way, are vital to generate the right kind of traffic to your site.

However, many sites go overboard, using them in a clearly unnatural way. These sites are likely to suffer in Google rankings as Google's algorithm has clamped down on keyword stuffing.

They may also find that users will bounce off the page quickly if the content isn't up to scratch. This is another bad indicator to the search engines.

Keyword stuffing explained

Keyword stuffing, also known as spamdexing, is a primitive attempt to increase your site's ranking. It involves packing a page full of keywords and phrases in an unnatural way, with little regard for how useful the content actually is for the user, or how it affects the reading experience.

Once upon a time, this was a very popular tactic, and it did work! However, crawlers have improved rapidly and can now detect unnatural keyword stuffing.

Nowadays, if search engines detect signs of abuse, your site is likely to be severely penalized, deindexed, or simply drop out of the rankings.

Not only this, but keyword stuffing offers a negative experience for the user, and is unlikely to:

  1. Keep the user on the page for long (which increases your bounce rate).
  2. Result in any conversions or clicks on your CTAs.

User experience is incredibly important to search engines like Google. Without a good experience, users will begin to look elsewhere, and that is not what the search engines want!

Examples of keyword stuffing

  • Repeating keywords or key phrases too often, in a manner that is not natural.
  • Mentioning geographic locations (towns, states, etc.) too often while trying to rank locally.
  • Stuffing keywords at the end of normal content.
  • Making the foreground and background color the same so that the text is invisible to the human eye.
  • Positioning an image in front of a block of keywords to hide them.
  • Setting the font size to zero, so that the text is invisible.
  • Inserting repetitive words in HTML comments and tags, such as image ALT attributes.
  • Using keywords that do not match the page's topic.

How to fix keyword stuffing

Firstly, you may need to establish if you have examples of keyword stuffing on your site.

For this, our SEO audit > Over optimization tool is ideal.

This will indicate any instances on your website of content being overly optimized (e.g. where keyword stuffing has occurred).

The only solution then is to revisit the page or article, strip out any offending content and start again (making sure it obeys all the rules of good SEO content). There is no shortcut or magic fix.

Observing good keyword density

We tend to express keyword density as a percentage of the page’s total word count.

Most experts recommend that you keep your keyword to below 3% of the page's total word count. So, in an article of 1,000 words, use your keyword 30 times at most.

Anything over this and you risk a penalty.
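The 3% calculation above can be sketched in a few lines of Python. This is illustrative only: the regex-based word split is a simplifying assumption, and real SEO tools tokenize text far more carefully.

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Percentage of the page's total word count taken up by
    occurrences of the keyword or key phrase."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    # A multi-word phrase counts once per occurrence.
    occurrences = len(re.findall(re.escape(keyword.lower()), text.lower()))
    return occurrences / len(words) * 100

article = "Rich roast coffee is smooth. Our rich roast coffee beans are fresh."
print(round(keyword_density(article, "rich roast coffee"), 1))  # prints 16.7
```

Here the phrase appears twice in twelve words, for a density of roughly 16.7% — well above the 3% guideline.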

We’d always recommend adjusting your keyword density based on what the top-performing pages for your target term are using. This is a more context-sensitive, and likely more accurate, measure of what Google wants your pages to include.

Forget density, write good content that flows!

Although we have just discussed keyword density, treat it as a ceiling rather than a target. Your main aim should be to write natural, good-quality content.

Once this is done you can then tweak to ensure you have enough iterations of your selected keywords.

Assign just one primary keyword or phrase to every page

Every page should have a primary focus, and this primary keyword or phrase should represent it. Of course, you can also have secondary keywords to help back this up.

The most important location for your primary keyword or phrase

The following locations are the most important for telling the search engines what your content is about:

  • The page's URL.
  • The page title (the <title> tag).
  • The meta description and other meta tags.
  • The page's headline: <h1>.
  • Subheadings: <h2> and <h3>.
  • Image descriptions and/or captions, and the image filename.
  • Image ALT text.

Note: you do not have to always include your primary keyword in all of these.

It is most important in the URL, page title and main headings such as <h1>.

Just make sure it seems natural wherever it is placed.
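As a quick way to audit the locations above, you can parse a page with Python's standard html.parser. This is a minimal sketch: the sample page and the keyword_locations helper are hypothetical, and a production crawler would handle many more edge cases (nested tags, multiple <h1>s, character entities).

```python
from html.parser import HTMLParser

class KeywordTagCheck(HTMLParser):
    """Collects the text of <title> and <h1>, plus the meta description."""
    def __init__(self):
        super().__init__()
        self._current = None
        self.title = ""
        self.h1 = ""
        self.meta_description = ""

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("title", "h1"):
            self._current = tag
        elif tag == "meta" and attrs.get("name") == "description":
            self.meta_description += attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

    def handle_data(self, data):
        if self._current == "title":
            self.title += data
        elif self._current == "h1":
            self.h1 += data

def keyword_locations(html: str, keyword: str) -> dict:
    """Report whether the keyword appears in each key location."""
    parser = KeywordTagCheck()
    parser.feed(html)
    kw = keyword.lower()
    return {
        "title": kw in parser.title.lower(),
        "h1": kw in parser.h1.lower(),
        "meta_description": kw in parser.meta_description.lower(),
    }

page = """<html><head><title>Rich Roast Coffee | Example</title>
<meta name="description" content="Buy rich roast coffee online."></head>
<body><h1>Our Rich Roast Coffee</h1></body></html>"""
print(keyword_locations(page, "rich roast coffee"))
```

For this sample page, the check reports the keyword present in all three locations.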

Select up to four secondary keywords

Secondary keywords should enhance the content, and have semantic relevance to the primary keyword. Yet again, you must ensure your secondary keywords are not overused.

The 3% or less rule should still be followed.

If you are curious about how to find the best keywords for your content, check out our content optimizer. It analyzes your site and your competitors, and then gives you an indication of which keywords you should be targeting.

Consider variants and long-tail keywords

Many keyword terms that include 'buy' or 'best' tend to be massively oversaturated, meaning high competition and low reward. Google's algorithm is highly developed and will recognize synonyms of such words.

For example, instead of “buy”, you could use “get”, “obtain”, or “acquire”.

These keywords will be less popular and therefore less saturated.

Long-tail keywords are also very powerful, especially on Amazon listings. Basically, you wrap your primary keyword inside a short phrase. This phrase may have a relatively low volume of searches but can carry strong purchasing intent.

For example:

A primary keyword such as "rich roast coffee" can be made into a long-tail keyword phrase such as:

"Buy the best rich roast coffee in Chicago"

or

"What makes dark rich roast coffee so flavorsome?"

This way you add value to your primary keyword and incorporate good secondary keywords at the same time. In fact, nowadays, nearly half of all searches use more than one word.

Why LSI (latent semantic indexing) can make a big difference

As we know, Google wants to give the user the best experience. That means it's vital that it sends them to good websites that correspond to the user's search term. This is where LSI comes into play.

LSI means using words and terms in your copy that you would expect to see when reading good information about your search term.

For example:

A piece of content with a primary keyword of "architecture" might also contain related words such as "building", "architect", "constructed", "design", "steel", "modern", "glass", "timber", and so on.

LSI can improve both SEO and the user experience, and is also likely to occur when writing in depth on a topic anyway.
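One simple way to sanity-check this is to count how many related terms appear in your copy at all. The sketch below uses a naive substring check and is offered purely as an illustration; real semantic analysis goes far beyond this.

```python
def lsi_coverage(text: str, related_terms: list[str]) -> float:
    """Fraction of the related terms that appear anywhere in the copy."""
    lowered = text.lower()
    hits = [term for term in related_terms if term.lower() in lowered]
    return len(hits) / len(related_terms) if related_terms else 0.0

terms = ["building", "architect", "design", "steel", "glass", "timber"]
copy = "The architect chose a steel and glass design for the new building."
print(lsi_coverage(copy, terms))  # 5 of the 6 terms are present
```

A low score suggests the copy may be too thin on the topic; a naturally written, in-depth article usually covers most related terms without any deliberate effort.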

Summary: best use of keywords

Consistent keyword research and good writing are the pillars you should be striving for.

Using primary and secondary keywords, along with LSI terms, at the right density is key to creating content that Google and the user will love.

You can use our specialized content optimizer for recommendations on how and what keywords to implement to improve your site's success.

Good content and successful keyword optimization are key to giving you that edge on your competition. 
