7 Reasons Why Your Website Rankings are Not Stable

Many site owners and webmasters would like their site to stay at the top of the rankings forever once it gets there. Instead, the opposite often happens: for some reason, the site's rankings are not stable.

Sometimes the drop is sharp, and sometimes the drawdown is slow but steady. In this article, you will learn the 7 most common reasons, as well as what to do if your site's positions have dropped.

1. The impact of hosting

There can be two types of problems with hosting:

  1. Low uptime
  2. Slow website loading

Low uptime is when your site is intermittently unavailable due to server-side issues. If a search robot visits the site during one of these outages and finds it unreachable, Google may begin to exclude your pages from its index.

If the uptime is very low, all pages may be dropped from the index. To prevent this from happening, it is very important to host your site with a reliable provider whose uptime is 99.99% or higher.
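To put those percentages in perspective, this short Python sketch converts an uptime guarantee into the maximum downtime it permits per month (the 30-day month is a simplifying assumption):

```python
# Convert an uptime guarantee into the maximum allowed downtime per month.
MINUTES_PER_MONTH = 30 * 24 * 60  # assume a 30-day month: 43,200 minutes

def allowed_downtime_minutes(uptime_percent: float) -> float:
    """Return the downtime (minutes per month) an uptime guarantee permits."""
    return MINUTES_PER_MONTH * (1 - uptime_percent / 100)

for guarantee in (99.0, 99.9, 99.99):
    print(f"{guarantee}% uptime -> "
          f"{allowed_downtime_minutes(guarantee):.1f} min of downtime/month")
```

As the numbers show, the jump from 99% to 99.99% is the difference between hours of monthly downtime and a few minutes.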

Slow loading can be either the fault of the hosting or a problem on your side. Search engines want to provide maximum comfort to their users, so it is difficult to get into the TOP and even more so to stay there if your site takes a long time to load.

What to do:

Find out your hosting's uptime. You can find this information on the hosting provider's own site and in independent ratings and reviews, or you can measure it yourself using the Uptime Robot service.

Also, check your website loading speed using services such as GTmetrix and PageSpeed Insights.
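For a quick first look before reaching for those services, you can time a raw fetch yourself. This is only a rough server-response measurement (tools like PageSpeed Insights also measure rendering); the helper below takes any fetch callable, so the `example.com` usage is purely illustrative:

```python
import time

def measure_load_time(fetch) -> float:
    """Time how long a fetch callable takes, in seconds."""
    start = time.perf_counter()
    fetch()
    return time.perf_counter() - start

# Usage (requires network access), e.g. with the standard library:
#   from urllib.request import urlopen
#   seconds = measure_load_time(lambda: urlopen("https://example.com").read())
```

Because the function accepts any callable, you can also use it to compare the cost of individual resources (HTML, a large image, an API call) in isolation.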

2. Dropping pages from the index

Loss of pages from the index can lead not only to the loss of organic traffic but also to a drop in positions on the rest of the pages.

The most important thing here is to make a list of all the pages on the site and check whether they are in the search engines' index. Since this is a rather laborious process, you can start by simply checking how many of the site's pages Google has indexed.

What to do:

If there are significantly fewer pages in the search engine index than there should be, you should check the pages individually.

If some pages have dropped out of the index and you have eliminated the cause, request their re-indexing as soon as possible.
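The list of pages that should be indexed can usually be taken from your sitemap. A minimal sketch of extracting those URLs (the inline sitemap here is a made-up example):

```python
# Extract page URLs from a sitemap.xml so they can be checked against the index.
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(sitemap_xml: str) -> list[str]:
    """Return all <loc> URLs listed in a sitemap document."""
    root = ET.fromstring(sitemap_xml)
    return [loc.text.strip() for loc in root.iter(f"{SITEMAP_NS}loc")]

example = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/blog/</loc></url>
</urlset>"""

print(sitemap_urls(example))
```

With this list in hand, you can compare it against the indexed-pages report in Google Search Console to spot what has dropped out.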

3. Aggressive ads or viruses and link spam

Sites with aggressive ads have begun to rank much worse than they used to. If you have ads on your site, the first thing you should look at is the space they take up.

If ads take up most of the first screen, then it is quite possible that this is the main reason for the drop in your site's positions. Search engines want the user to start getting a solution to their problem on the first screen, not to see an advertisement.

The source of your problems may also be paid archives, clickers, and similar schemes. Viruses are also easily detected by search engines. If you have doubts about your site's security, now is the time to check it for viruses and vulnerabilities.

Link manipulation penalties are the most common, probably because webmasters still believe that buying links is enough for successful website promotion. You should periodically check your backlinks and their anchor texts for spam and quality.

What to do:

You can perform diagnostics using specialised services and neutralise the influence of bad links through Google's Disavow Tool.

4. Deteriorating content quality

Over time, the content on your site loses its relevance and completeness. It is believed that an average article has a lifespan of 3-4 years, after which it begins to lose relevance, and with it its positions. That said, much depends on the topic and the quality of the article itself.

What to do:

If you have posts on your site with years in their titles (2016, 2017, 2018), then you should at least update the titles to the current year. Better yet, update the content itself if you want to regain your previous positions in the search results.

5. Global changes on the site

Often, major website updates lead to a noticeable drop in traffic from search engines. If you changed:

  • Sitemap settings
  • Rules in robots.txt file
  • Design
  • Design theme
  • URL structure
  • Text content

then this is exactly what can cause a decrease in positions. Changes to the sitemap or robots.txt file can lead to dropping out or poor indexing of important pages that drive traffic.
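Robots.txt mistakes in particular are easy to catch before deploying: Python's standard `urllib.robotparser` can replay your new rules against a list of important pages. The domain and Disallow rules below are illustrative:

```python
# Check whether a robots.txt revision accidentally blocks important pages.
from urllib.robotparser import RobotFileParser

rules = """
User-agent: *
Disallow: /admin/
Disallow: /blog/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# /blog/ is blocked here, which would be a costly mistake for a content site.
for path in ("/", "/blog/seo-tips", "/admin/login"):
    allowed = parser.can_fetch("*", f"https://example.com{path}")
    print(f"{path}: {'allowed' if allowed else 'BLOCKED'}")
```

Running a check like this against every traffic-driving URL after each robots.txt change takes seconds and can prevent weeks of lost indexing.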

Web design agency experts say that an abrupt change of design can have bad consequences. Big changes to the site's content can worsen its text and behavioural factors, which will of course negatively affect its positions.

Changing the URLs of existing pages is one of the most dangerous and common causes of a drawdown. Even if everything is done correctly and page-by-page 301 redirects from the old addresses to the new ones are set up, it takes time for the search engine to restore the previous positions. Sometimes you have to wait weeks or months, and sometimes the positions never return to their previous level at all.
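One concrete pitfall when setting up those redirects is chains (old URL → interim URL → final URL): each old address should 301-redirect straight to its final destination. A sketch of flattening a redirect map before deploying it (the paths are made up):

```python
# Flatten a redirect map so every old URL points straight at its final target,
# eliminating chains like /old-page -> /interim-page -> /new-page.

def flatten_redirects(redirects: dict[str, str]) -> dict[str, str]:
    """Return a map in which every old URL points to its final target."""
    flattened = {}
    for old in redirects:
        target, seen = redirects[old], {old}
        # Follow the chain until it leaves the map (guard against cycles).
        while target in redirects and target not in seen:
            seen.add(target)
            target = redirects[target]
        flattened[old] = target
    return flattened

redirects = {
    "/old-page": "/interim-page",   # chain: should point straight to /new-page
    "/interim-page": "/new-page",
}
print(flatten_redirects(redirects))
```

The same idea works regardless of where the redirects live (nginx config, .htaccess, a CMS plugin): generate the old→new mapping, flatten it, and deploy only direct redirects.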

What to do:

List all changes made to the site along with dates. Compare this list and the date when the site’s positions began to decline. With a high degree of probability, you will find a match.

Depending on the changes made, choose the best solution. Sometimes you need to roll everything back, and sometimes you just need to wait.

6. Hitting search engine filters

The imposition of sanctions by search engines is the first thing that comes to mind when a site’s rankings drop. And in fact, this is one of the most common problems. Sanctions may be imposed due to:

  • Text spam (low-quality text, keyword stuffing)
  • Link spam (buying low-quality links, selling links)
  • Behavioural spam (artificially inflating behavioural factors)
  • Advertising spam
  • Fraudulent spam (mobile redirects, identity theft, phishing)

What to do:

If you suspect that the site contains any of the above, it’s time to audit the site.

7. Natural displacement by competitors

Being crowded out by competitors is quite natural and logical. Analyse your competitors and, more broadly, the search results in which your site's positions have dropped.

It is quite possible that you will find increased competition or the capture of the TOP by large sites. For example, in many commercial niches, aggregators and the search engines' own services have appeared in the TOP and pushed out the old-timers.

What to do:

If you see that competition in the TOP has increased, you should approach the promotion of your resource more carefully and thoughtfully.

A step-by-step list of actions

If at this point you are still wondering what to do when your site loses positions, follow this step-by-step list of actions:

Determine the scale of the drawdown:

  • Did the whole site go down?
  • Did a specific page sink?
  • Did a specific query drop?

Localise the problem:

  • Which queries dropped in position?
  • In which search engine?
  • By how many positions?
  • On what date did the drawdown start?

After answering these questions, make a list of possible problems. Keep narrowing it down until you find the true cause. Feel free to contact search engine support.

Why does the same site rank differently in different browsers?

The fact is that your site's positions are not static; they change depending on who enters the search query and from where.

This is called SERP personalisation. Its goal is to better understand the user's needs and solve their exact problem.

For example, a person enters the query “Napoleon” into a search. Personalisation of search results takes into account the user’s browser search history. If the person previously searched for everything related to the French leader, then the results will contain more sites about Napoleon Bonaparte.

If the person had previously searched for everything related to cooking, then most of the results may be recipes for Napoleon cake.

Thus, a site's position depends directly on who enters the query, from which browser (its history is used), and from where (country, city).

As for the browser, the following data can be used to generate personalised SERPs:

  • History of recent requests
  • Most visited sites
  • Social connections (Google+, now defunct)

An interesting conclusion follows from this: track your site's positions through specialised services, not by hand.


The same site may rank differently for the same queries at different times (month, week, day, even hour), even if no changes were made to the site during that period. Nevertheless, keep monitoring your site and controlling its content. Your goal is not only to reach the top but to stay there.
