November 8, 2021

Keyword common density interval

What does this message mean in the content optimizer: "It was not possible to calculate the recommendations because not all keywords have a common density interval. It is necessary to remove some of the keywords of this page and update the recommendations"?

Essentially, all keywords have a certain density on landing pages.

There are several sites in the TOP10 of the SERPs that we must discard from our recommendations, as they may be there due to other factors, for example, Wikipedia or YouTube.

For example, say we are writing an article for a blog, and one of the pages in the TOP10 for the keywords we are using is from YouTube with only 50 words. It makes no sense to include this page in our recommendations. If we were to replicate this type of page, with only 50 words, we would never see our page in the TOP10, because the video is ranked by other factors and is added to the TOP10 of the SERPs to give variety to the content offered.

And there's a second problem. Imagine we are optimizing our page for an extreme value of keyword density. If, after one day, the site that had this extreme keyword density drops out of the TOP10, the range of word density for sites in the TOP10 shrinks, leading the system to ask you to redo your page yet again.

Therefore, it is much easier to remove the extreme values from our recommendations.
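As a rough illustration of this trimming step, here is a minimal Python sketch that drops extreme density values from a TOP10 sample using Tukey's interquartile-range fence. The densities and the fence rule are assumptions for illustration, not Labrika's actual algorithm.

```python
def trim_extremes(densities, k=1.5):
    """Discard values outside the interquartile range widened by
    factor k (the classic Tukey fence for outliers)."""
    ordered = sorted(densities)
    n = len(ordered)
    q1 = ordered[n // 4]          # rough first quartile
    q3 = ordered[(3 * n) // 4]    # rough third quartile
    iqr = q3 - q1
    low, high = q1 - k * iqr, q3 + k * iqr
    return [d for d in densities if low <= d <= high]

# Hypothetical densities (%) of one keyword across ten ranking pages;
# the 50-word video page produces the extreme 9.0 value.
top10 = [1.8, 2.1, 2.0, 1.9, 2.3, 2.2, 1.7, 2.0, 2.4, 9.0]
print(trim_extremes(top10))  # the 9.0 outlier is dropped
```

With the outlier gone, the remaining values describe the typical page far better than any range that had to stretch to include 9.0.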

Here's a simplified graph that shows how many sites in the TOP10 contain a given density of this word.

When we need to place several key phrases on one page, we must find the overall confidence interval for each word within these key phrases.

For example, if we want to place the phrases "mobile phones" and "mobile traffic" on one page, then we need to find the overall confidence interval for the word "mobile", which is included in both key phrases.

If the texts for these words on the pages in the TOP10 are similar, then an overall confidence interval can be found:

If the texts for these words on the pages in the TOP10 are very different, then there will be no overall confidence interval:

In this case, it is undesirable to use the average value, since that density will not be enough for one key phrase and will be too high for another. Meaning there's a risk of being lowered in the SERP for over-optimization.

Remember that we are calculating what density we need to use for a word because this word is part of several key phrases. And if the density doesn't fit, then one of the phrases on the landing page will have less chance of getting into the TOP10 of the SERP.
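The idea of a "common interval" above can be sketched as a simple interval intersection. The interval bounds below are illustrative assumptions, not measured values from any SERP.

```python
def common_interval(intervals):
    """Return the overlap of (low, high) density intervals,
    or None if they share no common range."""
    low = max(lo for lo, _ in intervals)
    high = min(hi for _, hi in intervals)
    return (low, high) if low <= high else None

# "mobile" as part of "mobile phones" and "mobile traffic":
print(common_interval([(1.0, 2.5), (1.8, 3.0)]))  # (1.8, 2.5), overlap exists
print(common_interval([(1.0, 1.5), (2.5, 3.0)]))  # None, no common interval
```

When the function returns None, there is no single density that satisfies both phrases, which is exactly the situation the error message describes.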

There are several reasons that can lead to this situation:

  1. The content on the landing pages for the different key phrases in this group where the error occurred is different, and you should separate these key phrases across different landing pages.
  2. You manually excluded competitors that were very similar for the key phrases on this page, and because of this, the link indicating that these key phrases can be placed on the same landing page has disappeared. You can try including the competitors that were previously excluded.

Finally, an explanation of why you should never use averages. Below is a graph of the distribution of keyword density across sites that are often found in the TOP10:

To provide variety, the search engine often adds sites with different types of content in order to best satisfy the user. For example, an overview site may be added to the search results alongside online stores.

This is often less noticeable to a user but makes a big difference when analyzing the numbers to get an average. Let's say there are 9 blog sites in the search results and 1 site with a huge price list that has a high keyword density. This will shift the average away from the majority group, skewing the results.
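A quick numeric sketch of that skew, with invented densities: nine blog-style pages sit near 2%, while one price-list page sits at 12%. The single outlier pulls the mean well away from the typical page, while the median stays with the majority group.

```python
from statistics import mean, median

# Hypothetical keyword densities (%) for a TOP10:
# nine blog pages plus one price-list page at 12%.
densities = [2.0, 1.8, 2.2, 1.9, 2.1, 2.0, 1.7, 2.3, 2.0, 12.0]

print(round(mean(densities), 2))  # 3.0, pulled up by the single outlier
print(median(densities))          # 2.0, unaffected by the outlier
```

This is why the optimizer works with a trimmed interval rather than a plain average: one atypical page should not set the target for the other nine.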

It is also important to note that a page doesn't always need to be made exactly in accordance with what we see in the TOP10.

Firstly, there are categories of sites that lag behind others in terms of the quality of features and content offered. In cases like this, it is more useful to improve the site rather than follow what others have done, in order to get ahead of the competition.

Secondly, if there is a lot of competition, doing something different may be the only thing to set your site apart. However, at the same time, you must clearly understand how the site should be produced, to give you a chance of reaching that TOP10 in Google.





Full SEO Audit

  • Probable Affiliates Check;
  • Text Quality Optimization Test;
  • Check for Plagiarism (unoriginal texts);
  • Snippets in Google;
  • Number of internal links to landing pages;
  • Number of duplicate links on the page;
  • Links to other sites;
  • Over-spamming in the text;
  • Over-spamming in META tags and H1 (+3 factors);
  • Excessive use of bold type;
  • Multiple use of the same word in the sentence;
  • Multiple use of bigrams in the text and META tags;
  • Multiple use of trigrams in the text and META tags;
  • Excessive use of headers;
  • Skinny pages (with small text);
  • Pages without outgoing internal links;
  • Check landing page relevance;
  • Pages closed from indexing;
  • TITLE = H1;
  • H1 = H2, H3, H4;
  • TITLE duplicates;
  • DESCRIPTION duplicates;
  • Not filled TITLE, DESCRIPTION (+2);
  • Number of indexed pages in Google (+2);
  • Pages closed from indexing in Robots, noindex, nofollow, rel = canonical (+4);
  • Landing pages in the sitemap.xml;
  • Non-indexed landing pages;
  • Landing pages URLs history;
  • Adult content;
  • Swear words and profanity.


  • Export your reports to XLS;
  • Import your key phrases, cluster analysis and landing page URLs from CSV format;
  • Printed version of the site audit in DOCX;
  • Guest access to audit;
  • Generate sitemap.xml with duplicate pages and pages closed from indexing;
  • Labrika highlights texts that are used for snippets.

Technical audit

  • Errors 403, 404;
  • Errors 500, 503, 504;
  • Not Responding pages;
  • Critical HTML errors;
  • W3C HTML Validator;
  • Multiple redirects;
  • Lost images;
  • Lost JS;
  • Lost CSS;
  • Lost files;
  • Multiple TITLE tags;
  • Multiple DESCRIPTION tags;
  • Multiple KEYWORDS tags;
  • Multiple H1 tags;
  • Pages with rel = "canonical";
  • Common Duplicate Content Issues: www. vs non-www. and http vs https versions of URLs;
  • Correct 404 status code header;
  • Duplicate pages;
  • Mobile HTML optimization;
  • HTML size optimization;
  • Page speed time;
  • Large pages;
  • 3 types of Sitemap.xml errors (+3);
  • 26 types of Robots.txt errors (+26);
  • Tag Length: TITLE, DESCRIPTION, H1 (+3);
  • SSL Certificate Checker (+7);
  • Check if the Domain or IP is Blacklisted;
  • Pages with program's error messages;
  • Check a website response from User-agent;
  • Test the availability of your website from locations worldwide;
  • Test the website for Cloaking;
  • Test if some search engine is blocked by the website;
  • Check a website response from mobile.

Recommendations for text optimization

  • Keyword clustering;
  • Check landing page relevance;
  • Find correct landing page;
  • Find the optimal level of the page;
  • Recommendations for text optimization;
  • Optimal text length;
  • Keyword in the main text (+2);
  • Keyword in TITLE (+2);
  • Keyword in DESCRIPTION (+2);
  • Keyword in H1 (+2);
  • Latent semantics (LSI) on the page;
  • Number of relevant pages on the site;
  • TF-IDF calculation for text in BODY, TITLE, DESCRIPTION, H1 (+4);
  • Estimate the level of the page optimization.

Keyword characteristics

  • Number of main pages in TOP10;
  • A list of relevant landing pages;
  • Recommended keyword depth;
  • Latent semantics (LSI).

User metrics

  • Google Analytics;
  • Drawing charts;
  • % of Bounce Rates;
  • View depth of the site;
  • Average session time;
  • Number of visitors;
  • Mobile devices: traffic, bounce rates, visit time (+4);
  • Visits from sources with traffic and bounce rates (+2);
  • Information on the pages with traffic and level of bounce rates (+2);
  • Visits from cities with traffic and bounce rates (+2);
  • Visits from search engines with traffic and bounce rates (+2);
  • Key phrases with bounce rates and traffic from search results (+2);
  • List of all search requests that people used to find your site for one-year period;
  • Pages without traffic.

Analysis of competitors' websites

  • List of competitors;
  • Snippets of competitors;
  • Labrika generates recommendations for texts based on the analysis of competitors' websites and a Machine Learning algorithm.

Check your search rankings

  • Site positions in Google around the world (+1);
  • All Locations & Languages;
  • Country, city or region levels;
  • More than 100k locations;
  • High-precision analysis;
  • Check search positions in several regions at the same time;
  • Monitor the dynamics of search rankings (+1);
  • Available position archive;
  • Download position lists in XLS format for selected period;
  • Desktop and mobile rankings;
  • Top 50, 100 or 200 results.

Domain information

  • Domain Age;
  • Domain payment plan expiration date;
  • Website hosting;
  • Hosting region;
  • IP of the site;
  • The number of sites with the same IP;
  • NS Records;
  • Favicon.
