Website Availability Test
Website availability analysis is one of the most important checks. Loss of normal access to a website happens rarely, but in some cases it can be a symptom of serious problems that may cause the website to be dropped completely from search results. This type of analysis also helps to detect the signs of certain kinds of hacking and malware on the website.
The report contains several sections:
Variants of domain spelling
In some cases, redirection is set up for some URLs. If there are several spelling variants of the website name, it is recommended to redirect all of them to a single one.
Ideally, the website should support the HTTPS protocol, and all possible variants of the name should redirect to a single canonical address. If the website is accessible through different URL variants without redirection (for example, both as http://www.example.com and as https://www.example.com), many problems may result: some pages of the website in a search engine's index will have www and some won't, and one search engine might index the website with http:// while another indexes it with https://. To avoid such chaos, set up the canonical domain redirect (domain gluing) correctly in the .htaccess file.
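As an illustration, here is a minimal .htaccess sketch, assuming an Apache server with mod_rewrite enabled and https://www.example.com as the chosen canonical address (adapt the host names to your own domain):

```apache
RewriteEngine On

# Send every plain-HTTP request to the HTTPS address
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]

# Send the bare domain to the www variant
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

The 301 status tells search engines the move is permanent, so the index gradually converges on the single canonical URL.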
Accessibility testing with User-Agent
A user agent is an application that communicates over a network protocol. The term usually refers to applications that access websites, such as browsers and search engine bots.
If the website deliberately blocks access or returns an error for one of the user agents, you should find out whether such behavior is intended and why it happens. Otherwise, you may lose large numbers of visitors and prospective customers without being aware of it.
When the response returned to a search engine differs from the one shown to normal browsers, it may indicate cloaking on the website. If the difference is significant, the website is likely to be downgraded in search results because of cloaking, even if it was done unintentionally or with good intentions, so the results need to be analyzed.
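This kind of check can be sketched in a few lines of Python: fetch the same page while presenting different User-Agent strings and measure how similar the responses are. The URL and the User-Agent strings below are placeholders, not the exact ones any particular tool uses.

```python
import difflib
import urllib.request

# Example User-Agent strings: a desktop browser and a search bot
USER_AGENTS = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; "
                 "+http://www.google.com/bot.html)",
}

def fetch(url: str, user_agent: str) -> str:
    """Fetch a page while presenting the given User-Agent header."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

def similarity(a: str, b: str) -> float:
    """Return a 0..1 ratio of how similar two response bodies are."""
    return difflib.SequenceMatcher(None, a, b).ratio()

# Usage (requires network access):
#   browser_html = fetch("https://www.example.com/", USER_AGENTS["browser"])
#   bot_html     = fetch("https://www.example.com/", USER_AGENTS["googlebot"])
#   print(similarity(browser_html, bot_html))  # a low ratio suggests cloaking
```

Minor differences (timestamps, rotating banners) are normal; a low similarity ratio between the browser and bot versions is what warrants a closer look.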
Suspicion of cloaking
Cloaking is an attempt to manipulate search engines by showing search bots different information than users see on the same page. All search engines penalize cloaking by downgrading the website in the search results, up to total exclusion.
The cause may be incorrect software settings or website hacking. For example, Labrika once found malicious code on a website that generated links to another resource in such a way that only search engines could see them. The perpetrators got links to their website and were able to hide this fact for a long time, while the hacked website was steadily downgraded in the search results because of the cloaking. If you suspect cloaking, open the saved result of the request and compare it with the normal state of the page. If malicious code has been inserted, contact specialists to clean the website. If it is a feature of your own software, find the reason for such behavior and correct it.
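Comparing the saved bot response with the normal page can be done with an ordinary text diff. The sketch below, using Python's standard difflib, extracts the lines that appear only in the version served to the bot; the toy page contents and the spam.example link are invented for illustration.

```python
import difflib

def hidden_lines(bot_html: str, browser_html: str) -> list[str]:
    """Return lines present only in the response served to the search bot.

    Links or scripts that show up here but are absent from the normal
    page are candidates for injected (cloaked) content.
    """
    diff = difflib.unified_diff(
        browser_html.splitlines(), bot_html.splitlines(), lineterm="")
    return [line[1:] for line in diff
            if line.startswith("+") and not line.startswith("+++")]

# Toy example: the bot version contains one extra injected link
browser_page = "<html>\n<body>\n<p>Welcome</p>\n</body>\n</html>"
bot_page = ('<html>\n<body>\n<p>Welcome</p>\n'
            '<a href="http://spam.example/">hidden</a>\n</body>\n</html>')
```

Running `hidden_lines(bot_page, browser_page)` on the toy pages isolates the injected link, which is exactly the kind of fragment worth investigating.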
Different response for mobile browsers
A different response for mobile browsers may be observed if there is a separate mobile version of the website, or if the website has been hacked and contains a virus or a redirect to a malicious website. The second scenario is rare but causes serious damage to the ranking and reputation of the website. So if you know there is no separate mobile version, you should analyze the result: open the content of the server response and check it for miners, malicious scripts, or injected content. Hackers often use this attack vector because it delays detection; mobile devices usually run no antivirus software, and website owners rarely check the mobile version every day. Labrika finds traces of such hacking on dozens of websites every month.
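A first-pass manual check can be automated as a simple pattern scan over the page source fetched with a mobile User-Agent. The patterns and the User-Agent string below are illustrative examples of common injected payloads (in-browser miners, forced redirects), not an exhaustive or authoritative list.

```python
import re
import urllib.request

# Example patterns often seen in injected mobile-only payloads
SUSPICIOUS = [
    r"coinhive|cryptonight|miner\.js",    # in-browser mining scripts
    r"window\.location\s*=",              # JavaScript redirect
    r'http-equiv=["\']refresh["\']',      # meta-refresh redirect
]

MOBILE_UA = ("Mozilla/5.0 (iPhone; CPU iPhone OS 16_0 like Mac OS X) "
             "AppleWebKit/605.1.15 (KHTML, like Gecko) Mobile/15E148")

def fetch_mobile(url: str) -> str:
    """Fetch a page as a mobile browser would."""
    req = urllib.request.Request(url, headers={"User-Agent": MOBILE_UA})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

def suspicious_matches(html: str) -> list[str]:
    """Return the suspicious fragments found in the page source."""
    hits = []
    for pattern in SUSPICIOUS:
        hits.extend(re.findall(pattern, html, flags=re.IGNORECASE))
    return hits

# Usage (requires network access):
#   print(suspicious_matches(fetch_mobile("https://www.example.com/")))
```

An empty result does not prove the page is clean, but any hit is a strong signal that the mobile response deserves a manual review.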
Testing accessibility from different countries
The report shows analysis results of accessibility from other countries.
Sometimes traffic from other countries is blocked to mitigate threats during a DDoS attack. However, if such blocking is enabled permanently, it may cause some loss of traffic, because users may be unable to work with the website while traveling or when connecting via proxies or VPNs located in those countries.
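You can reproduce such a check yourself by routing a request through a proxy located in the country of interest. The sketch below uses Python's standard urllib; the proxy address is a placeholder from the TEST-NET range, not a real proxy, so you would substitute one you actually control.

```python
import urllib.request

def opener_via_proxy(proxy_url: str) -> urllib.request.OpenerDirector:
    """Build an opener that routes requests through the given proxy,
    e.g. one located in the country you want to test access from."""
    handler = urllib.request.ProxyHandler(
        {"http": proxy_url, "https": proxy_url})
    return urllib.request.build_opener(handler)

# Usage (requires a working proxy in the target country; the address
# below is a documentation placeholder):
#   opener = opener_via_proxy("http://203.0.113.10:3128")
#   resp = opener.open("https://www.example.com/", timeout=10)
#   print(resp.status)
```

A timeout or an error status seen only through proxies of a particular country is the symptom of country-level blocking described above.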