Is there any reason to get a massive decrease on indexed pages?
-
Hi,
I'm helping with SEO for a large e-commerce site in LatAm, and over the last few months our search traffic has dropped and the number of indexed pages has decreased dramatically.
The site had over 2 million indexed pages (which was far too many, since we believe around 10K would be more than enough to cover the roughly 6K SKUs), but that number has fallen to fewer than 3K in under two months.
I've also noticed that most of the results the site still appears for are .pdf or .doc files, not actual HTML content on the website.
I've checked the following:
- Robots.txt (there is no block; you can see that in the image as well)
- Webmaster Tools Penalties
- Duplicate content
I don't know where else to look. Can anyone help?
Thanks in advance!
-
It could be, if they made the switch improperly and Google isn't transferring link equity or can't find the new pages. I like checking services like Archive.org to back up my ideas, but you should probably reach out directly to your client and ask what they did in April.
Hope this helps!
Kristina
-
Gazzerman,
Thank you so much for your reply. I see you're a great detective!
I'm aware of many of those issues, and I'm fighting with the IT team to get some of those changes done ASAP.
I'll be checking everything you've mentioned to make sure these things get fixed.
I'll let you know what happens!
Thank you,
Mat
-
OK, I think I found some of your issues!
There are a few major ones. I hope you don't mind, but I did some detective work and figured out what your site was from the code screenshots you posted in another reply below.
OK, so the first MAJOR ISSUE is that you have TWO BODY tags in your HTML! You need to check the code on all pages/templates and get it fixed ASAP. Archive.org shows this has been a problem on your site since the redesign in Oct 2013. Google pays a lot more attention to signals like this now and doesn't look favourably on them.
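If you want to confirm the duplicate body tag across your templates, here's a minimal Python sketch (the URLs are placeholders, and it assumes the requests library). It checks the raw source rather than a parsed DOM, because HTML parsers silently repair broken markup:

```python
import re
import requests

# One URL per template type on the site (hypothetical placeholders).
urls = [
    "http://example.com/",
    "http://example.com/category/widgets",
    "http://example.com/product/12345",
]

for url in urls:
    html = requests.get(url, timeout=10).text
    # Count opening <body> tags in the raw source. A regex is used on
    # purpose: a parser would "fix" the duplicate tag and hide it.
    body_tags = len(re.findall(r"<body[\s>]", html, flags=re.IGNORECASE))
    if body_tags != 1:
        print(f"{url}: found {body_tags} <body> tags")
```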
My concern is that when you do a site: query in Google, your homepage does not show; it should be the first result. I see that you have 15,100 pages indexed.
When I search Google for your website, "sitename.com.ar", there is no title or description, and even the sitelinks are broken: only one shows, and it says "Untitled".
NEXT: all your pages have the same description and keywords tags! What's going on there? If you can't make them unique, then get rid of them; they are doing more damage than good, and Google hates that stuff. I have had pages not show in the SERPs because they had duplicate description tags. This happened to five of my pages recently that I somehow missed. But this is on every product page of yours, if not every page on the site.
Also, I would remove the keywords meta tag; it's pointless and will do you more harm than good.
The paths to your JavaScript and CSS files have two slashes (//) where there should only be one.
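Here's a rough sketch of how you could audit all three template issues at once (duplicate descriptions, the keywords tag, and doubled slashes in asset paths) across a sample of pages. It assumes requests and BeautifulSoup are installed, and the URLs are placeholders:

```python
from collections import defaultdict
from urllib.parse import urlparse

import requests
from bs4 import BeautifulSoup

# Sample of pages to audit (hypothetical placeholders).
urls = [
    "http://example.com/",
    "http://example.com/product/12345",
    "http://example.com/product/67890",
]

descriptions = defaultdict(list)  # description text -> pages using it

for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

    # Group pages by meta description so sitewide duplicates stand out.
    tag = soup.find("meta", attrs={"name": "description"})
    if tag and tag.get("content"):
        descriptions[tag["content"].strip()].append(url)

    # Flag the keywords meta tag for removal.
    if soup.find("meta", attrs={"name": "keywords"}):
        print(f"{url}: still has a meta keywords tag")

    # Flag script/CSS paths with a doubled slash, e.g. /assets//main.js.
    for asset in soup.find_all(["script", "link"]):
        path = urlparse(asset.get("src") or asset.get("href") or "").path
        if "//" in path:
            print(f"{url}: doubled slash in {path}")

for text, pages in descriptions.items():
    if len(pages) > 1:
        print(f"{len(pages)} pages share the same description: {text[:60]!r}")
```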
I would start with those changes, do a fetch in Google WMT, and give it a few days; the homepage should reindex in a very short time frame, maybe minutes.
After that we can take another look at the site. These are very basic issues that a large brand site like this should never get wrong, and you can see the damage they have caused. Lucky for you that you have only just started there and can be the knight in shining armour who sorts it all out!
-
Hi Kristina,
Thank you so much for your complete answer. It's great to have such comprehensive feedback!
Here I'm attaching a screenshot of organic traffic since Nov 2013. You can see there was a huge decline in visits between March and April 2014.
I'm still checking, but it seems that all the pages lost traffic, not just some of them. The only page that still seems to be getting traffic is the homepage.
I was checking and it seems the site has moved from https to http. I'm not sure about this, but when I look at it on Archive.org I see some differences in the source: in the previous version some links pointed to https, and now they don't. (I'm attaching a screenshot as well.)
I'll try the sitemap approach, but do you think there might be a penalty, or is it maybe just that they've moved from https to http?
Thank you so much!
Matías
-
Hi Gazzerman1,
Thank you so much for your feedback. I'll be checking that. Unfortunately I can't publicly disclose the URL.
Best regards,
Matías
-
Hi there,
You've got a lot of things going on here. I'm worried that Google thinks your site is exceptionally spammy or has malware, since it's showing your .pdf and .doc files but not as many of your HTML pages. But I can't make any definitive statements because I'd need a lot more information. If I were you, I would:
1. Dig into Google Analytics to get a better idea of what exactly has lost organic traffic, and when.
   - Look at the trend:
     - Is it a slow decline or a sharp drop? A sharp drop typically means a penalty, or that the company accidentally deindexed its pages, or something equally horrifying. A slow decline could be an algorithm change (at least the ones I've seen take their time to really take effect) or a new competitor.
     - When did it start? That way you can pinpoint what changes happened around then.
     - Is it a steady decline or has it been jumping up and down? If it's a steady decline, it's probably one thing you're looking for; if traffic is erratic, there could be a bunch of problems.
   - Find out which pages are losing traffic.
     - _Are all pages on the site losing traffic?_ If they are, it's probably an algorithm change or a sitewide change made by the webmasters. If it's specific pages, it could be that there's now better competition for those key terms.
     - Which pages are losing the most traffic? When did their traffic start to drop? Is it different from the rest of the site? You may need to investigate at a deeper level here. Individual pages could have new competition, you may have accidentally changed the site structure and reduced internal links to them, and/or they may have lost external links.
2. Get an accurate list of every unique page on your site to clear up duplicate indexation and find out if anything _isn't_ in Google's index.
   - Add a canonical tag to all pages on your site pointing to the version of the URL you want in Google's index. It's a good way to make sure that parameters and other accidental variations don't lead to duplicate content.
   - Divide that list into site sections and create separate XML sitemaps by section; that way you have a better idea of what is and isn't indexed in Google Webmaster Tools (see the sketch at the end of this answer). I got this idea from this guy (who is also my boss) and I swear by it.
3. Based on traffic in Google Analytics, pick out pages that used to get a lot of traffic and now get none. Search for them in Google with a site:[url] search to see if Google still has them indexed.
After that, if you're still stumped, definitely come back here with a little more info - we'll all be happy to help!
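To make the sitemap-splitting idea concrete, here's a minimal Python sketch that buckets a URL list by its first path segment and writes one sitemap file per bucket. The URLs and the resulting section names are placeholders, not your real site structure:

```python
from urllib.parse import urlparse
from xml.sax.saxutils import escape

# Full list of unique, canonical URLs (placeholder data).
all_urls = [
    "http://example.com/product/widget-a",
    "http://example.com/product/widget-b",
    "http://example.com/category/widgets",
    "http://example.com/blog/launch-post",
]

# Bucket URLs by their first path segment, e.g. "product", "category".
sections = {}
for url in all_urls:
    segment = urlparse(url).path.strip("/").split("/")[0] or "root"
    sections.setdefault(segment, []).append(url)

# Write one sitemap per section so GWT reports indexation per bucket.
for section, urls in sections.items():
    with open(f"sitemap-{section}.xml", "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for url in urls:
            f.write(f"  <url><loc>{escape(url)}</loc></url>\n")
        f.write("</urlset>\n")
```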
Kristina
-
I'm not sure if this will solve the problem, but thank you so much for your reply.
I've checked both tools and they're great, but I can't find a solution yet.
Best regards,
Matías
-
Panda ran recently; does the drop coincide with that date? Normally there is a shake-up in the week leading up to it.
I suspect that someone may have made changes before you got there and they have not given you the full picture.
Was there any issue with the server during that time, with page speeds slow or pages not loading? WMT should give you some indication of the decline, including how dramatic the drops were and what kind of pages were dropped.
FYI, you are better off removing the meta descriptions from the pages with duplication than keeping them there, but the best course of action is of course to rewrite them.
If you care to share the URL we can all take a look and see if we can spot anything.
-
Try checking it out with Bing/Yahoo and Yandex. See what type of pages they were, then go to Archive.org to check them.
If it's indeed thin pages, or pages generated by parameters/searches/sorting, then that is usually an automatic algorithmic penalty; usually Panda.
You might also have some good pages with good links, so be sure to check for 404s and 301 good pages as needed. It's a really tricky situation.
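For the 404 sweep, a quick hedged sketch: it assumes you can pull a list of historical URLs from old sitemaps, Archive.org, or top landing pages in Google Analytics (the URLs below are placeholders), and it uses the requests library:

```python
import requests

# Historical URLs worth checking, e.g. pulled from old sitemaps,
# Archive.org, or Google Analytics landing pages (placeholders).
old_urls = [
    "http://example.com/product/discontinued-widget",
    "http://example.com/category/old-category",
]

for url in old_urls:
    # HEAD keeps the sweep fast; allow_redirects=False shows raw status.
    resp = requests.head(url, allow_redirects=False, timeout=10)
    if resp.status_code == 404:
        print(f"404 (candidate for a 301 if it had links/traffic): {url}")
    elif resp.status_code in (301, 302):
        print(f"{resp.status_code} -> {resp.headers.get('Location')}: {url}")
```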
You also have to see if there's trouble with internal links and inbound links, just in case there's a Penguin issue mixed in (I've seen sites with both manual and algorithmic penalties at once, wtf).
Try out tools like
http://www.barracuda-digital.co.uk/panguin-tool/
http://fruition.net/google-penalty-checker-tool/
to see when your drops really started. You should be able to identify your algo penalty at least and go from there.
-
Hi Dennis,
Thank you for your help. I'm not sure what those pages were, since I only started managing the account a few days ago. I suspect most were pages generated by different parameters, but I've adjusted that in Webmaster Tools.
The site also had an "Expiration" meta tag dated 1997, which we removed a few days ago since I thought it might be the problem, but the number of indexed pages continued to decrease even afterwards.
I've checked http/https, and the site (at least the indexable content) is all http, so there shouldn't be an issue there.
One of the issues I've asked to be fixed is that most pages have no meta description at all, and the ones that do have a generic one that is exactly the same on every single page. I'm aware that this is a problem, but I don't think it explains the main issue.
Do you know how I can find out if there was an algorithmic penalty? I've checked GWT, but I can only see manual penalties there.
Thank you so much!
Matías
-
Hi
That's a lot of indexed pages. What were the bulk of those pages? Categories? Products? Random tags or thin pages?
Did you switch to https recently? Are all versions included in Webmaster Tools, with a preferred version set? How about their sitemaps?
It's possible that it's Panda. It's also possible that it's a technical issue, like a switch to https, improper redirects, etc.
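One quick way to test the https/http theory is to request the old https URLs and inspect the redirect chains. A minimal sketch, assuming the requests library and with example.com standing in for the real domain:

```python
import requests

pages = ["/", "/category/widgets", "/product/12345"]  # placeholder paths

for path in pages:
    url = f"https://example.com{path}"
    try:
        resp = requests.get(url, timeout=10, allow_redirects=True)
    except requests.exceptions.SSLError:
        # A broken certificate on the old https URLs is itself a finding.
        print(f"{url}: SSL error")
        continue
    # Anything other than a single 301 hop to the http URL (or a direct
    # 200 on https) is worth investigating: 302s, chains, or errors can
    # all leak link equity during a protocol switch.
    hops = [r.status_code for r in resp.history] + [resp.status_code]
    print(f"{url}: {' -> '.join(map(str, hops))} (final: {resp.url})")
```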
I just find it weird to see a big site like that lose that much indexed content (assuming they were good pages) without incurring an algorithmic penalty or a technical problem.