Is there any reason for a massive decrease in indexed pages?
-
Hi,
I'm helping with SEO for a large e-commerce site in LatAm, and one thing we've experienced over the last few months is that our search traffic has dropped and the number of indexed pages has decreased dramatically.
The site had over 2 million indexed pages (which was way too many, since we believe around 10K would be more than enough to cover the roughly 6K SKUs), but that number has fallen to less than 3K in under two months.
I've also noticed that most of the results in which the site still appears are .pdf or .doc files rather than actual content on the website.
I've checked the following:
- Robots.txt (there is no block; you can see that in the image as well)
- Webmaster Tools penalties
- Duplicate content
I don't know where else to look. Can anyone help?
Thanks in advance!
-
It could be: if they made the switch improperly, Google may not be transferring link equity or may not be able to find the new pages. I like checking services like Archive.org to get backing for my ideas, but I think you should probably reach out directly to your client and ask about their activities in April.
Hope this helps!
Kristina
-
Gazzerman,
Thank you so much for your reply. I see you're a great detective!
I'm aware of many of those issues, and I'm fighting with the IT team to get some of those changes done ASAP.
I'll be checking everything you've mentioned to make sure these things get fixed.
I'll let you know what happens!
Thank you,
Mat
-
OK, I think I've found some of your issues!
There are a few major ones. I hope you don't mind, but I did some detective work and figured out what your site was from the code screen grabs you posted in another reply below.
OK, so the first MAJOR ISSUE is that you have TWO BODY tags in your HTML! You need to check your code on all pages/templates and get it fixed ASAP. Archive.org shows that this has been a problem on your site since you did a redesign in Oct 2013. Google is looking at signals like this a lot more now and does not treat them favourably.
My concern is that when you do a site: query in Google your homepage does not show; it should be the first result. I see that you have 15,100 pages indexed.
When I search Google for your website, "sitename.com.ar", there is no title or description, and even the sitelinks are broken, showing just one and saying "untitled".
NEXT, all your pages have the same description and keywords tags! What's going on there? If you can't make them unique, then get rid of them; they are doing more damage than good, and Google hates that stuff. I have had pages not show in the SERPs because they had duplicate description tags; this happened to five of my pages recently that I somehow missed. But this is on every product page of yours, if not every page on the site.
Also, I would remove the keywords meta tag entirely; it's pointless and will do you more harm than good.
The paths to your JavaScript and CSS files have two slashes (//) where there should only be one.
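To pull those template fixes together, here is a minimal sketch of what the head and body of each product page could look like afterwards; the title, description, and file paths are placeholders, not your actual markup:

```html
<!DOCTYPE html>
<html lang="es">
<head>
  <meta charset="utf-8">
  <!-- A unique title and description written for this specific product (placeholder text) -->
  <title>Blue Widget Model 123 | Example Store</title>
  <meta name="description" content="Blue Widget Model 123: specs, price and delivery options at Example Store.">
  <!-- No meta keywords tag at all -->
  <!-- Asset paths use a single slash, not // -->
  <link rel="stylesheet" href="/css/site.css">
  <script src="/js/site.js"></script>
</head>
<!-- Exactly one opening and one closing body tag in the rendered page -->
<body>
  ...page content...
</body>
</html>
```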
I would start with those changes, then do a fetch in Google WMT and give it a few days; the homepage should reindex in a very short time frame, maybe minutes.
After that we can take another look at the site. These are very basic issues that a large brand site like this should never get wrong, and you can see the damage they have caused. Lucky for you that you have only just started there and can be the knight in shining armour who sorts it all out.
-
Hi Kristina,
Thank you so much for your complete answer. It's great to have such comprehensive feedback!
Here I'm copying a screenshot of the organic traffic since Nov. 2013. You can see that there was a huge decline in visits between March and April 2014.
I'm checking, but it seems that all the pages lost traffic and not only some of them. The only page that still seems to be getting traffic is the homepage.
I was also checking and it seems that the site has moved from https to http. I'm not sure about it, but when I look at it on Archive.org I see some differences in the source: in the previous version some links were pointing to https and now they are not. (I'm attaching a screenshot as well.)
I'll try the sitemap idea, but do you think there might be a penalty, or is it maybe just that they've moved from https to http?
Thank you so much!
Matías
-
Hi Gazzerman1,
Thank you so much for your feedback. I'll be checking that. Unfortunately I can't publicly disclose the URL.
Best regards,
Matías
-
Hi there,
You've got a lot of things going on here. I'm worried that Google thinks your site is exceptionally spammy or has malware, since it's showing your .pdf and .doc files but not as many of your HTML pages. But I can't make any definitive statements because I'd need a lot more information. If I were you, I would:
- Dig into Google Analytics to get a better idea of what exactly has lost organic traffic, and when.
  - Look at the trend:
    - Is it a slow decline or a sharp drop? A sharp drop typically means a penalty, or that the company accidentally deindexed their pages, or something equally horrifying. A slow decline could be an algorithm change (at least the ones I've seen take their time to really take effect) or a new competitor.
    - When did it start? That way you can pinpoint what changes happened around then.
    - Is it a steady decline or has it been jumping up and down? If it's a steady decline, it's probably one thing you're looking for; if traffic is erratic, there could be a bunch of problems.
  - Find out which pages are losing traffic.
    - Are all pages on the site losing traffic? If they are, it's probably an algorithm change or a change that was made by webmasters sitewide. If it's specific pages, it could be that there's now better competition for those key terms.
    - Which pages are losing the most traffic? When did their traffic start to drop? Is it different from the rest of the site? You may need to investigate at a deeper level here. Individual pages could have new competition, you may have accidentally changed site structure and lessened internal links to them, and/or they may have lost some external links.
- Get an accurate list of every unique page on your site to clear up duplicate indexation and find out if anything isn't in Google's index.
  - Add a canonical to all pages on your site pointing to the version of the URL you want in Google's index. It's a good way to make sure that parameters and other accidental variations don't lead to duplicate content. (There's a sketch of what this looks like just after this list.)
  - Divide that list into site sections and create separate XML sitemaps by section. That way you have a better idea of what is indexed and what isn't in Google Webmaster Tools. I got this idea from this guy (who is also my boss) and I swear by it. (See the sitemap index sketch below.)
  - Based on traffic in Google Analytics, pick out pages that used to get a lot of traffic and now get none. Search for them in Google with a site:[url] search to see if Google's got them indexed.
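To make the canonical point concrete, here's a minimal sketch, assuming a placeholder domain and product URL rather than the client's real ones: every parameterised or otherwise duplicated variation of a URL carries the same canonical, pointing at the clean version you want indexed.

```html
<!-- Served on /products/blue-widget?sort=price&ref=campaign and any other
     variation of the page, as well as on the clean URL itself -->
<link rel="canonical" href="http://www.example.com/products/blue-widget/">
```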
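And here's a rough sketch of the sitemaps-by-section idea, using hypothetical section names: a sitemap index file references one XML sitemap per site section, so Webmaster Tools can report submitted vs. indexed counts for each section separately.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- sitemap_index.xml: one child sitemap per section; the section names are examples only -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>http://www.example.com/sitemap-products.xml</loc></sitemap>
  <sitemap><loc>http://www.example.com/sitemap-categories.xml</loc></sitemap>
  <sitemap><loc>http://www.example.com/sitemap-static-pages.xml</loc></sitemap>
</sitemapindex>
```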
After that, if you're still stumped, definitely come back here with a little more info - we'll all be happy to help!
Kristina
-
I'm not sure if this will solve the problem, but thank you so much for your reply.
I've checked with both tools and they're great. But I can't find a solution yet.
Best regards,
Matías
-
Panda has run recently; does the drop coincide with that date? Normally there is a shake-up in the week leading up to it.
I suspect that someone may have made changes before you got there and they have not given you the full picture.
Was there any issue with the server during that time, with page speeds slow or pages not loading? WMT should give you some indication of the decline, even how dramatic the drops were and what kind of pages were dropped.
FYI, you are better off removing the meta descriptions from the pages that have duplication than keeping them there, but the best course of action is of course to rewrite them.
If you care to share the URL we can all take a look and see if we can spot anything.
-
Try checking it out with Bing/Yahoo and Yandex. See what type of pages they were, then go to Archive.org to check them.
If it's indeed thin pages or pages generated by parameters/searches/sorting, then that is usually an automatic algorithmic penalty, most often Panda.
You might also have some good pages with good links, so be sure to check for 404s and 301-redirect good pages as needed. It's a really tricky situation.
You also have to see if there's trouble with internal links and inbound links, just in case there's a mix with Penguin (I've seen sites with manual and algorithmic penalties for both).
Try out tools like
http://www.barracuda-digital.co.uk/panguin-tool/
http://fruition.net/google-penalty-checker-tool/
to see when your drops really started. You should be able to identify your algo penalty at least and go from there.
-
Hi Dennis,
Thank you for your help. I'm not sure what those pages were, since I only started managing the account a few days ago. I suspect that most were pages generated by different parameters, but I've adjusted that in Webmaster Tools.
The site also had an "Expiration" meta tag dated 1997, which we removed a few days ago since I thought that might be the problem, but the number of indexed pages continued to decrease even after removing that tag.
I've checked the http/https and the site (at least the indexable content) is all http, so there shouldn't be an issue there.
One of the issues I've asked to have fixed on the site is that most pages have no meta description, and the ones that do use a generic one that is exactly the same on every single page. I'm aware that this is a problem, but I don't think it can explain the main issue.
Do you know how I can find out whether there was an algorithmic penalty? I've checked in GWT, but I can only check for manual penalties there.
Thank you so much!
Matías
-
Hi
That's a lot of indexed pages. What were the bulk of those pages? Categories? Products? Random tags or thin pages?
Did you switch to https recently? Are all versions included in webmaster tools? With a preferred version? How about their sitemaps?
It's possible that it's Panda. It's also possible that it's something technical, like a switch to https, improper redirects, etc.
I just find it weird to see a big site like that lose that much indexed content (assuming they were good pages) without incurring an algorithmic penalty or a technical problem.