It was a usual day.
You opened your Google Analytics account to check how your blog has been performing lately.
But today, things aren’t usual.
There are fewer sessions, lower page views, and organic traffic has been consistently low. You start cursing Google for showing your site less love and lowering your SERP ranks.
Out of nowhere, your blog receives fewer hits, fewer page views, and less income. Well, that’s what typically happens when you face a Google rankings drop.
After reading multiple threads from bloggers stressing over their rankings dropping, I decided to write a whole article about it, explaining everything one needs to know to recover.
In this guide, you’ll learn:
- What is SEO Ranking Drop
- How to detect Google Rankings Drop
- 10 Reasons you’re losing Google rankings
- How to recover lost Google rankings
Let’s get started.
What is SEO Ranking Drop?
Simply put, a Google ranking drop is a situation where your pages continuously struggle to rank and eventually get pushed to the second or third page of the results.
It means there’s something wrong with your site, and it needs to be fixed ASAP.
If you’re new to SEO, here’s a guide to DIY SEO you should be reading.
How to Detect An SEO Ranking Drop?
There are several hints that point to a drop in rankings:
- A decrease in organic traffic, i.e., traffic from search engines
- Low revenue and traffic stats
- Your pages are getting kicked out of the first page on SERP
- You’re facing issues with indexing new articles
- Your ranking tracker shows a sudden drop
You may observe one or more of these signs, which basically signal that something on your site is broken and unattractive to Google.
But what’s broken? Read on to find out.
10 Reasons Behind Google Rankings Drop and How to Recover
1. You were hit by Google penalty
It’s heartbreaking when you receive an email saying your site doesn’t comply with one or more guidelines and, as a result, your pages get demoted.
You can check if your site has been hit by browsing Search Console > Select your property > Search traffic > Manual actions.
There you can see if any active action has been taken on your site by Google.
To fix this issue…
The only way to save your site from a Google penalty is to submit a reconsideration request. The option is available on the same Manual Actions page.
For a successful reconsideration, you need to provide proof that you’ve worked on the said issue and that it has been fixed. You also need to explain why it happened and whether it was your mistake.
Being honest is the key here. After you submit the request, someone from Google will visit your site and verify that it complies with Google’s guidelines.
2. Your website was hacked
Let’s just hope this isn’t the cause of your Google rankings drop. Getting hacked not only affects your rankings; all your data and information is at stake as well.
If your site is flagged as hacked by Google, you’ll see a warning stating “This site may be hacked” below your domain name on the SERP.
There are numerous reasons why a website gets hacked, the most common being nulled themes or plugins, easy-to-guess login credentials, and malicious code.
Once in, hackers can insert links to low-quality sites or malicious scripts, which leads to a rankings drop.
Fortunately, Google understands this problem, given that sites get hacked every day.
To smooth out the process, Google has created a troubleshooter to help you determine whether your site is hacked and how to fix it.
Once you have dealt with the problem and fixed it, you can request a review from Google.
3. Bunch of on-page errors
Google’s primary focus is to improve the user experience. And a site full of on-page errors can’t provide it.
A few such errors may not affect the rankings, but when you have many of them, consistently, it becomes a serious issue.
The various on-page SEO elements you need to focus on are:
Content Quality: Content is king, and you can’t afford to ignore it. The quality of your content should be top-notch if you want to retain your rankings. Focus on grammar, readability, and sentence structure while creating content for your site.
Also, regularly update the older articles with the latest information.
Page Headings: Each page should have only one H1 heading, used for the title alone. If a page has more than one, it may cause a problem. You should also maintain a proper heading structure for better understanding.
Keyword Optimization: Gone are the days when you could just stuff an article with one keyword and get ranked for it. Google’s artificial intelligence, aka RankBrain, is much smarter now, and keyword stuffing and over-optimization may lead to a penalty.
Ads: Adding too many ads to a single page hurts the user experience and irritates visitors into leaving the site right away.
404 Errors: These occur when bots can’t find pages on your site. You may have deleted a page or changed its URL, so anyone landing on the old address sees a 404 error screen.
An increase in 404 pages gives a wrong impression and often leads to a Google rankings drop.
Bounce Rate: In the most straightforward words, it’s the rate at which users hit the back button after visiting your page from search results.
A high bounce rate indicates that your page isn’t a good match for the query and that people don’t like it. Eventually, it gets pushed to the inner pages.
Ideally, a page visit of 2 minutes is considered good, while 5 minutes is excellent. So, try to make your visitors stick around for at least 2 minutes. Learn more about bounce rate.
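As a rough illustration of the metric (my own sketch, not Google Analytics’ exact computation), bounce rate is simply single-page sessions divided by total sessions:

```python
def bounce_rate(single_page_sessions, total_sessions):
    """Percentage of sessions where the visitor left after viewing one page."""
    if total_sessions == 0:
        return 0.0
    return 100 * single_page_sessions / total_sessions

# 380 of 1,000 visitors bounced:
print(bounce_rate(380, 1000))  # 38.0
```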
Loading Time: Again, user experience is what matters here. A page that takes more than 3 seconds to load can turn readers away, leaving you with a high bounce rate. Switching to a better host like Cloudways can improve site speed by 458%.
There are more on-page factors, but adequately managing these should do the job.
4. Your website is down
If you’re on cheap web hosting that often struggles to stay online, you can face this issue.
You may also be a busy person who treats the site as a weekend project. In that case, if your site goes down at the start of the week and stays down for days, it gives a wrong impression, and bots may think your website is dead.
As a result, your Google rankings will drop until you fix the issue and bots reindex the pages.
In case of a hack or server overload, the same situation may arise. You can check whether the site is down for only you or for everyone using this site.
To fix the issue…
The best option is to use a better web host like SiteGround, which offers guaranteed 99% uptime. It’ll save you half the hassle, and the support team will help you fix any technical issues ASAP.
Additionally, use a downtime notifier to get alerted when your site goes down and to keep track of outages. This way, you can fix the issue or contact your web hosting company at the earliest.
If that’s not enough, use a CDN service like Cloudflare. Then, even if the site goes down, your visitors can view a cached copy of it.
5. You gained low-quality backlinks
The quality of links pointing to your site is an important SEO factor. Although Google may not admit it directly, the right quality links can boost your SERP rankings.
In simplest terms, backlinks work like a voting system. When you get a backlink (i.e., a vote) from a reputed source, your value increases. Google starts giving importance to your site.
On the other hand, backlinks from low-quality sources indicate bad practices and cause trust issues.
A sudden increase in low-quality, unnatural links decreases your authority, which in turn affects rankings.
How to fix it?
To fix it, you first need to keep track of your links. Tools like SEMrush and Ahrefs can be used to keep track of links and detect bad-quality links.
Once you’ve identified the links you don’t want to have, there are 2 ways to get rid of them –
- Get it removed from the site: You can email the website owner and request him/her to remove the link.
- Ask Google to ignore it: If the above option didn’t help you, use Google’s disavow tool to disavow the low-quality links. I recently used it to remove many directory backlinks that were hurting my Google rankings.
Additionally, you should take care of the niche relevancy of each link. Backlinks from high-authority sites that don’t relate to your niche aren’t going to help you much.
6. You lost quality backlinks
Gaining new links isn’t always the issue. The backlinks you’ve built over time also need to stick around.
In a study, Brian Dean identified that pages ranking in top positions have more niche-relevant, authoritative referring domains.
For strange reasons, you may lose a backlink from a reputable site, which affects your overall site authority.
Since Google was giving your site importance due to that link, you’ll soon see a dip in rankings if it isn’t fixed after the link’s removal.
SEMrush, again, can be used to keep track of lost backlinks. To avoid such situations, first analyze the reason behind the link’s removal, then ask the webmaster to add it back. Or build another link from a site of similar quality.
7. Changes in Website Structure
Changing the website theme, code, permalinks, etc. is a common practice among bloggers. But it can lead to unexpected problems that cause a rankings drop.
The highest chances of facing these website-structure issues come while rebranding your site.
You or your team may delete pages, change URLs, update internal links, or make changes to the navigation while rebranding. One or several of these reasons combined will lead to lost rankings.
But recovery from such an issue is pretty straightforward. All you have to do is restore the backup taken before the rebranding.
It’s a good idea to use a staging environment for rebranding and thoroughly test your site before making your changes live. Cloud hosts like Cloudways offer staging environments and free backup plans.
Also, use 301 redirects to notify Google about changed page URLs.
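To confirm your redirects actually work, you can request each old URL without following redirects and check that it answers 301 with the right Location header. A small stdlib-only sketch (the URLs in the usage comment are placeholders):

```python
import urllib.request
import urllib.error

class NoRedirect(urllib.request.HTTPRedirectHandler):
    # Refuse to follow redirects so we can inspect the raw status code.
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

def redirect_of(url, timeout=10):
    """Return (status, Location header) for url, or (None, None) if unreachable."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        resp = opener.open(url, timeout=timeout)
        return resp.status, None                  # no redirect happened
    except urllib.error.HTTPError as e:
        return e.code, e.headers.get("Location")  # 3xx surfaces here as an "error"
    except urllib.error.URLError:
        return None, None

# Hypothetical usage:
# status, target = redirect_of("https://yoursite.com/old-url/")
# assert status == 301 and target == "https://yoursite.com/new-url/"
```

Run it over your list of old URLs after any permalink change; a 404 or a 302 where you expected a 301 is a problem worth fixing.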
8. Issues with Robots.txt file
The robots.txt file is basically a text file meant for search engine bots and crawlers. It contains instructions, or rules, that define how to crawl the pages on a website.
The two rules used for user-agents (crawlers) are “allow” and “disallow.”
Although it’s always recommended to stay away from editing the robots.txt file, you or some plugin may have mistakenly updated it and changed the way Google’s bots crawl your site.
With a disallow rule in place, bots can’t access the affected pages, and eventually, your site will lose its rankings.
You can see your robots.txt file by browsing to: www.yoursite.com/robots.txt
See if any search engine bots are being disallowed, and if that’s the case, remove the specific rule.
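You can also sanity-check your rules offline with Python’s built-in robots.txt parser. The rules below are a hypothetical WordPress-style example (note that Python’s parser applies the first matching rule, so the Allow line goes first):

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt, similar to common WordPress setups
rules = """\
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Regular posts stay crawlable...
print(rp.can_fetch("Googlebot", "https://yoursite.com/my-post/"))   # True
# ...while the admin area is off-limits.
print(rp.can_fetch("Googlebot", "https://yoursite.com/wp-admin/"))  # False
```

Paste in your live file and check the URLs you care about; if an important page prints False, you’ve found your culprit.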
Note: If you aren’t sure about what you’re doing, consult someone experienced rather than doing more harm. At the least, contact me for further help.
Tip: If you just installed WordPress and are facing issues with indexing, it’s possible that you’ve discouraged search engines from accessing your site.
Log in to your WordPress dashboard and go to Settings > Reading > Search Engine Visibility. Make sure the box is unticked and save the changes.
9. Your competitors are better than you
In this scenario, you won’t observe a considerable drop in rankings, just a small one of 1 or 2 positions.
You’ll also see your competitor(s) taking the top spots, with Google somehow favoring their articles over yours.
To recover your previous rankings, you first need to understand why the competitor’s page outranked yours, using the following approach:
- Content Quality: Analyze their content and the topics covered, and compare them with yours. If possible, compare an earlier version of their content using archive.org to see what they did to outrank your page.
- Backlinks: Your competitors may have earned links from reputable sites, which makes Google rank their article over yours (assuming a similar level of content quality).
- Interlinking: Did the competition change their internal linking structure for better crawling?
- User experience: Do they have better Dwell time, bounce rate, and click-through rate? Analyze why.
Tools like SEMrush can help compare your site with competitors and steal their working strategies. Additionally, use these tools to spy on your competitors.
10. Google Updated Their Algorithm
In the early days (up to around 2015), Google algorithm changes were rolled out quickly and were announced, and the SEO community would analyze them to understand the reasons behind the losers and winners.
But with the introduction of machine learning, these changes have become more sophisticated and roll out slowly, which creates confusion and less understanding of what each update was all about.
If the rankings drop doesn’t match any of the above situations, a Google algorithm update is likely the cause.
There are 9 major Google algorithm updates, each targeted at different factors of SEO.
- Panda: Duplicate, plagiarized content, and thin content
- Penguin: Low-quality links
- Fred: Too many ads
- Pirate: Pirated and copyright-infringing content
- Hummingbird: Keyword stuffing and low-quality content
- Pigeon: Poor on-page and off-page SEO
- RankBrain: Content Relevancy
- Mobile: Mobile version of the site
- Possum: Location-based (local) results
Websites to keep track of Google Algorithm Updates:
- Google Inside Search
- Matt Cutts
- Moz – Google Algorithm Change History
- SearchEngineJournal – History of Google Algorithm
- SEMRush Sensor
Seeing a sudden Google rankings drop can be a painful experience when you’ve worked hard to achieve those rankings and don’t want to leave the spot.
Irrespective of the reason, you can *almost* always recover your rankings with the right techniques and a proper plan. I hope the above article helps you do the same and face a sudden Google rankings drop with confidence.
Do leave a comment if you have questions or suggestions for me. Also, make sure to share this article with your friends and followers.