Why Did My Site's
Google Ranking Drop?
Hardly an hour passes without some poor, frustrated website owner posting a message in
Google's Webmaster Help Forum
asking why their website dropped in Google's rankings when it still ranks high
in the other search engines. Often it's a plea for help from a non-technical webmaster whose site had
been near the top of the results for his best keyword phrase for quite some time when suddenly,
and without his making any changes to his site, it is no longer to be found, and he cannot see
any reason why. So I've collected here a list of the most common reasons why a website's search
engine rankings fall, with suggested methods of recovery.
Reasons Why Your Site Dropped in Google Ranking
There are many reasons why a site's rankings can fall; the list below covers the most
common. Start here, but once you've taken care of any of the
ranking issues I describe, be sure to check Google's
Guidelines For Webmasters to be sure you're in compliance. Pay particular attention to the Quality
Guidelines, because that section contains the rules that will incur a penalty if you violate
them. The other search engines all have similar guidelines that you need to follow.
- Change in the Search Engine Ranking Algorithm: Search engines are constantly
changing the methods they use to rank websites in order to improve the general
quality of their search results and to weed out the sites that are resorting to
various tricks (i.e. so-called "black hat" optimization techniques) to
artificially boost their rankings. For the past few years, Google has been making significant
changes to their algorithm on a frequent basis. While Bing does tend to make less frequent and
more subtle changes than Google, they don't stand still for long either. The point is that search engines
do change their methods and you can't rely on your rankings to remain unchanged forever. A drop in
ranking can happen gradually, or it can be sudden when their methods change in a way that particularly
affects your website. When your site's rankings fall and you believe that you're abiding by the rules, your
only recourse is to reexamine your site in light of the Guidelines and work to make your site better and stronger.
- Loss of PageRank/Link Popularity: One or more links to your site that had been
providing a significant amount of PageRank to your site have been removed/deleted,
moved to a new unranked page, or the PageRank of the originating site has dropped (for
similar reasons). Websites with low to modest PageRank often obtain the bulk of their link
strength from a small number of strong links. So the loss of even one of them can have a
major impact on their rankings. In recent years, Google has begun to aggressively
discount or completely negate the PageRank value of links from low quality pages/websites,
and links they consider to be unnatural, such as text link ads, gratuitous press releases,
forum signature links, blog comments, and listings in low quality directories. The loss of
the value of such artificial links can cause a significant ranking drop, and excessive posting
of such links can lead to penalties. Artificial links are links that the webmaster generated
himself, as opposed to "editorial links," which are links that are posted by other
webmasters for the benefit of their users. See the Google Panda and
Penguin Updates information for details.
- Malware or Hacking: If Google detects malware on your pages, it will warn
users who click on your pages in the search results before sending them to your site.
This is not really a drop in ranking, but the effect is just as bad since very few
users will proceed past such a warning. See my article on
Removing Malware for help
on getting the Malware Warning removed and for removing the malware itself. Google has some good
advice for repairing a hacked site, too. You can also check your site with Google's Safe Browsing
Diagnostic Tool.
- Penalty: The search engines are getting very aggressive about violations of
their guidelines and are quick to punish some transgressions. Some of the most common
causes of penalties include:
- Hidden Text: Hidden text is an old trick that can remain undetected by the
search engines for a long time, but is usually discovered at some point. Don't
hide text by making it the same color as the page background. Google sniffs that
out very easily these days and can get you a significant penalty. You can
use CSS methods for drop-down menus or to keep content invisible until a user requests it (by
mouseovers or clicks). As long as there is a legitimate reason for using CSS in this
manner and there is a clear way for users to see the content, you won't be penalized. Just don't
try to stuff keywords on a page in a way that only the search engines will ever see. You'll
get burned eventually.
- Text Link Ads: While all of the major search engines prohibit buying and selling links, Google has
been the most aggressive in penalizing link sellers. If you're selling links or
advertising that can pass PageRank, your site may have its PageRank score reduced, or it may have its ability to
pass on PageRank removed. In either case, that can mean your website's internal pages will
no longer rank as well as they had before. If you're a link buyer, it can mean that you've
wasted your money on links that are no longer helping your rankings. If you have paid
links or advertising on your site, make sure that all of the links have the
rel="nofollow" attribute in the anchor (<a>) tag in the HTML code, or that the
links point to a redirection script on your site that is blocked in your robots.txt file.
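As a sketch, a properly marked paid link looks like this (the URL and anchor text are placeholders):

```html
<!-- A paid or advertising link; rel="nofollow" tells search engines
     not to pass PageRank through it. URL and text are placeholders. -->
<a href="http://example.com/advertiser-page" rel="nofollow">Advertiser Name</a>
```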
- Thin Content/Low Quality Pages: Google's Guidelines prohibit sites that exist mainly to serve
advertising. You can have AdSense ads or affiliate links if your site contains a high proportion of original
high-quality content. It's only websites that have a large amount of content copied from elsewhere, and websites
with little or no information-rich, original content for the user that will be penalized. Ecommerce websites
should avoid copying product descriptions from the manufacturer or other e-commerce sites
like Amazon, and create their own unique descriptions for each product they offer for sale. In January of 2012,
Google announced that they would also reduce the rankings of pages where there is a large amount of advertising
"above the fold" that makes the user have to scroll down the page in order to see the
content. This includes AdSense ads. Overall, Google now reduces the rankings of sites with a large number of pages that have a high
ratio of advertising to content. It's all intended to improve the user's experience in the search
results. See my note on the Panda Update for more details.
- Multiple Domain Names: Having multiple domain names
pointing to the same site is a common mistake that new webmasters make, thinking
that it will lead to better rankings when just the opposite is true. While this practice
doesn't trigger an actual penalty, the search engines' methods for filtering out duplicate
content can damage your rankings. See "Duplicate Content" below.
- Domain Farms: This is the practice of creating many websites and heavily cross-linking
them all to each other. It's not uncommon for people or companies to own and operate many
websites, but if they are heavily linked to each other ("cross-linked") Google may consider it to be a
"link scheme" you created solely to increase your rankings. If you own multiple domains,
it's alright to create a handful of links between them. You can also post individual cross-links within
the content of one of your sites pointing to another site you own when it's appropriate to the
content. But don't cross-link all of them on every page. All things in moderation!
- Linking to penalized, or so-called "bad neighborhood" sites: This is another
mistake that a novice might make without knowing that he may be doing something
in violation of the search engines' guidelines. Make sure that any website you link
to is one that you would be happy to have your users visit and that has a majority
of its pages included in Google's index. In general, a couple of links to a bad neighborhood won't
hurt you unless you also have other quality issues, but you should always be careful in
posting links on your site. If you need to link to a site you think might be of questionable quality, the
best practice is to add the rel="nofollow" attribute to the anchor (<a>) tag.
- Cinderella Story or "Honeymoon" Effect: If your site is less than 6 months old, you may have been
getting an artificial boost in your rankings from Google to help your site be found by users.
But that extra ranking power for new sites doesn't last forever. You'll seem to be flying
high one day and not to be seen at all the next. Again, this usually comes down to low
link popularity since it takes time and effort to build quality links to a
new website. A continuous link building program is your best insurance against
falling rankings. See my Building Links article for
some good advice.
- Excessive Link Exchanges: Yes, I know that I recommend link exchanges for new
websites and I also know that Google discourages the practice. But although their guidelines are
ambiguous, their actions are easier to interpret. Limit your link exchanges to just a handful of related, good-quality
sites. A couple of dozen is all you need to get your site on the right track, not hundreds
or thousands. Don't do massive exchanges through automated programs that create link exchange
directories on your site. Don't make posts in online forums or comment in blogs just to get
links. Such artificial links (i.e. links you create yourself) don't improve your rankings and are likely to reduce
Google's level of trust for your site. Trust is a term you will run into more and more as you
study Search Engine Optimization. Both Google and Bing use that term in their suggestions for
webmasters. A high proportion of artificial links can really hurt your site's rankings now. See my note
on the Penguin Update for more details on bad links.
- Canonicalization Problems: My personal favorite because, without changing a
thing on your site, you can fall victim to this problem in Google. All it takes
is someone linking to your site with the wrong version of your URL and you can
start to have some problems. This is mostly a problem for newer websites that
haven't firmly established themselves in Google. It means that Google has indexed
pages from your site with more than one version of your URLs. It can be two versions
of your domain name (i.e. 'www.example.com' and simply 'example.com'), or it can
be the correct domain name, but using both HTTP and HTTPS. You can also have problems
if you set a rel="canonical" tag to the wrong URL. I can personally attest to
how easy it is to make that mistake. Fortunately, you can recover easily from this issue.
See my Website Canonicalization Repair article for details.
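As a sketch, the rel="canonical" tag goes in the <head> of every variant of a page and names the one preferred URL (the URL below is a placeholder):

```html
<!-- Placed in the <head> of every variant of the page (www and non-www,
     HTTP and HTTPS); all variants should name the same preferred URL.
     The URL is a placeholder. -->
<link rel="canonical" href="https://www.example.com/page.html">
```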
- Broken links: If one of your internal pages is critical to your site's
success - either for its ability to draw traffic by its high search engine ranking or
because it's a critical navigation page - making a typographical error in a link can
mean a major section of your site is suddenly disconnected from the rest of your site
and therefore vulnerable to being dropped by the search engines. Normally, a critical
page will have several links, but novices will sometimes fall victim to this error.
The cure is to make a habit of regularly using a link checker like the free utility
Xenu Link Sleuth,
or the W3C's Link Checker.
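If you're curious what such a tool does under the hood, here is a minimal Python sketch that extracts the link targets from a page's HTML using only the standard library. It's an illustration, not a replacement for Xenu or the W3C checker, which also fetch each URL and report its HTTP status:

```python
# Minimal sketch of a link extractor: collect every href from a page's
# HTML. A real checker would then request each URL and flag errors.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href value of every <a> tag encountered."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html_text):
    parser = LinkExtractor()
    parser.feed(html_text)
    return parser.links

# Example: the typo in the second link ("prodcuts") is exactly the kind
# of error that can orphan a whole section of a site.
page = '<a href="/about.html">About</a> <a href="/prodcuts/">Products</a>'
print(extract_links(page))  # ['/about.html', '/prodcuts/']
```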
- Server Problems: If Google has difficulty accessing your site, if it's slow
to respond or responds with an error code for a sustained period, it can lead to
problems. Search engines are very tolerant of short periods when a site may be unavailable
for maintenance or other issues, but if the problems persist over many days it can
impact your rankings. If you know your site will be down for maintenance, you should
set your server to respond with code 503, which informs the search engines that you're
aware of the situation and they should try again later. Other server problems include
chains of redirects that take too many steps to reach the final page. No request to your site
should have to pass through more than two redirection steps.
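As a sketch of the 503 approach on an Apache server (assuming mod_rewrite and mod_headers are enabled; the file name is a placeholder):

```apache
# .htaccess sketch for planned downtime: answer every request with a 503
# and a Retry-After header so crawlers know to come back later.
ErrorDocument 503 /maintenance.html
Header always set Retry-After "3600"
RewriteEngine On
RewriteCond %{REQUEST_URI} !=/maintenance.html
RewriteRule ^ - [R=503,L]
```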
- robots.txt Issues: Your robots.txt file may be blocking the search engines from
crawling some or all of your pages. Review your robots.txt file to make sure it blocks only what
you intend. The robots.txt testing tool in Google's Webmaster Tools console will let you test your robots.txt file to
make sure it only blocks the URLs that you want it to. The Webmaster Tools console also
shows a list of URLs Google found were blocked by your robots.txt file in the Crawl Errors report.
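For reference, a minimal robots.txt that blocks a single directory looks like this; note that a bare "Disallow: /" line in its place would block your entire site:

```
# Allow all crawlers everywhere except the /private/ directory.
# A bare "Disallow: /" here would block the whole site from crawling.
User-agent: *
Disallow: /private/
```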
- Duplicate Content: You should do your best to prevent the same content from being
available through more than one URL on your website or anyone else's. When Google finds duplicate pages,
it tries to select the "best" or "canonical" version and devalues all of the
copies. But as a webmaster, you don't get to choose which copy they select as best. This problem crops
up quite often when people have blogs or shopping carts on their sites, or have separate sites for
different countries and/or languages. If your website uses a popular
blog or a shopping cart script, be sure to look for the latest version of your blog software and any plug-ins
that might be available to help. Another source of duplicate content is other sites copying your
website. This is particularly annoying since you obviously had nothing to do with creating the
problem. It's a good idea to use services like CopyScape
to check for other sites copying your pages. They have both free and paid services available. If you
find someone copying your site, contact them and demand they remove it. If they don't comply, contact
their hosting service and tell them what happened. It can also be helpful to file a
DMCA Removal Request with Google.
Google offers webmasters substantial help through their Webmaster
Tools. You can get a detailed analysis of your site's status in Google there. But for websites that
have incurred a manual penalty, there is a form called the "Reconsideration Request" form that lets you
tell Google that you have repaired any violations you found and ask that any penalties be removed. While
Google's automatic systems will usually remove a penalty over time if a website has been brought into compliance, filing
a Reconsideration Request can speed the process along by several weeks or months. Note that a Reconsideration
Request will only help if a manual penalty has been applied to your website; manual penalties are those imposed
for violating Google's Quality Guidelines. Google considers other issues that have the same ranking effect as a
penalty to be a part of their algorithm, and they rarely take any direct action in those cases.
Google Panda Algorithm
On February 24, 2011 Google released a major update to its ranking methods, popularly
referred to as "Farmer" or "Panda" since the purpose was to target so-called "content farms"
which are large sites filled with low-quality content. The update also targets sites that have an excessive
ratio of advertising to content and sites that have a high level of duplicate or 'scraped' content. This is one of the
rare circumstances in which a Google ranking factor works on a site-wide level, as opposed to an individual web page. If your site's rankings dropped
significantly recently, you should read the article on Search Engine Land by Googler Vanessa Fox,
Traffic Dropped With Farmer/Panda Update that suggests ways to analyze why your rankings dropped and how
to restore them.
Generally, you want to keep your pages rich with fresh, useful, original content. Websites that
syndicate, republish, or repackage content from other sites are likely to have their rankings drop significantly.
The same holds for sites with a large number of pages that consist mostly of advertising and other template
content. If you have low-quality pages that simply can't be improved, but are still important to your
site for users, you should either add a robots <meta> tag set to "noindex" to them, or block them
in your robots.txt file. Since this is a sitewide ranking factor that Google recalculates only
periodically, it may be quite some time after you improve your site's quality before you see
any increase in your rankings.
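The "noindex" option mentioned above is a one-line tag placed in the <head> of each low-quality page:

```html
<!-- Keeps this page out of the search index while still allowing
     crawlers to follow its links. -->
<meta name="robots" content="noindex">
```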
April 26, 2012 Update: Google announced two separate changes to their ranking
methods this past week. On April 19, they updated their Panda algorithm, and on April 24, they added a
separate algorithm change targeting website SPAM. These are separate changes that look at different
aspects of a website. The Panda update is a refinement of their site-wide assessment of the quality
of your pages. If your rankings changed significantly on the 19th or 20th, then the new version of
Panda is the likely reason.
Google Penguin Update
Google has also made a second major change in its algorithms. This change has been officially named "Penguin" and
you can get a sense of how it works in the Inside
Search article on Penguin by Google's "distinguished engineer" Matt Cutts. In that
article, he examines some egregious examples of webspam with overt and deliberate attempts to scam
the search engines. It's not the infamous "over-optimization penalty" that he had discussed
at an industry conference earlier in 2012, apparently, but it might well be a step along that path. Issues
like unnatural keyword usage and irrelevant and artificial links were highlighted in the article, but it's not clear
yet just what Penguin specifically targets. But the message is clear: Google wants websites that are designed to
deliver quality content for users to rank best, and websites that try to beat its algorithms should not rank at all.
One aspect of Penguin that is well-established is its targeting of artificial links, such
as comment SPAM, article SPAM, paid links, forum signature links, and other webmaster-generated links. If you're
trying to diagnose a drop in your rankings, check the "Links To Your Site" tool in the "Search
Traffic" menu in the Google Webmaster Tools console. One thing to look for is large numbers of links coming from single
domains. These are often so-called "scraper" sites that just copy content or automatically generate
content that includes links. And since they're low-quality sites, Google targets these links in its ranking systems.
Look for any links you created that might appear to be SPAM. Once you identify these potentially harmful links,
contact the webmasters of those sites and ask them to remove the links. If/When that fails, you can use Google's
Link Disavowal Tool to request that Google
ignore them. Removing such bad links has helped some websites to escape from Penguin issues. In the end, you want
the vast majority of your links to be natural (i.e. "editorial") links from high-quality websites.
This page was last updated on July 25, 2014