Is there any reason for a massive decrease in indexed pages?
-
Hi,
I'm helping with SEO for a large e-commerce site in LatAm, and over the last few months our search traffic has dropped and the number of indexed pages has decreased dramatically.
The site had over 2 million indexed pages (which was far too many, since we believe around 10K would be more than enough to cover the roughly 6K SKUs), but that number has dropped to fewer than 3K in less than two months.
I've also noticed that most of the results the site still appears for are .pdf or .doc files rather than actual pages on the website.
I've checked the following:
- Robots.txt (there is no block; you can see that in the image as well)
- Webmaster Tools Penalties
- Duplicated content
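For the robots check in particular, here's a rough way to verify it programmatically (just a sketch using Python's standard-library robotparser; the domain and paths are placeholders):

```python
# Rough sketch: confirm robots.txt isn't blocking Googlebot from key URLs.
# Uses only the Python standard library; the domain and paths are placeholders.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()

sample_urls = [
    "https://www.example.com/",
    "https://www.example.com/category/some-category",
    "https://www.example.com/product/some-sku",
]

for url in sample_urls:
    status = "allowed" if rp.can_fetch("Googlebot", url) else "BLOCKED"
    print(f"{status:>7}  {url}")
```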
I don't know where else to look. Can anyone help?
Thanks in advance!
-
It could be, if they made the switch improperly and Google isn't transferring link equity or can't find new pages. I like checking on services like Archive.org to get backing for my ideas, but I think that you should probably reach out directly to your client and ask about their activities in April.
Hope this helps!
Kristina
-
Gazzerman,
Thank you so much for your reply. I see you're a great detective!
I'm aware of many of those issues, and I'm fighting with the IT team to get some of those changes done ASAP.
I'll be checking everything you've mentioned to make sure these things get fixed.
I'll let you know what happens!
Thank you,
Mat
-
OK, I think I found some of your issues!
There are a few major ones. I hope you don't mind, but I did some detective work and found out what your site was from the code screenshots you posted in another reply below.
OK, so the first MAJOR ISSUE is that you have TWO BODY tags in your HTML! You need to check your code on all pages/templates and get it fixed ASAP. Archive.org shows that this has been a problem on your site since you did a redesign in Oct 2013. Google is looking at signals like this a lot more now and does not look favourably on them.
My concern is that when you do a site: query in Google, your homepage does not show; it should be the first result. I see that you have 15,100 pages indexed.
When I search for your website "sitename.com.ar" in Google, there is no title or description, and even the sitelinks are broken: only one shows, and it says "untitled".
NEXT, all your pages have the same description and keyword tags! What's going on there? If you can't make them unique, then get rid of them; they are doing more damage than good, and Google hates that stuff. I have had pages not show in the SERPs because they had duplicate description tags. This happened to five of my pages recently that I somehow missed. But this is on every product page of yours, if not every page on the site.
Also, I would remove the keywords meta tag; it's pointless and will do you more harm than good.
The paths to your JavaScript and CSS files have two slashes (//) where there should only be one.
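If it helps, here's a rough script that flags all three of these across a list of pages. It's only a sketch: it assumes the requests and beautifulsoup4 packages are installed, and the URLs are placeholders you'd swap for real ones.

```python
# Rough audit sketch: flags multiple <body> tags, duplicated meta descriptions,
# and double-slash asset paths. Assumes `pip install requests beautifulsoup4`;
# the URLs are placeholders.
import requests
from bs4 import BeautifulSoup
from collections import defaultdict

urls = [
    "https://www.example.com/",
    "https://www.example.com/product/one",
    "https://www.example.com/product/two",
]

descriptions = defaultdict(list)

for url in urls:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    # 1. More than one <body> tag is invalid HTML.
    body_count = html.lower().count("<body")
    if body_count != 1:
        print(f"{url}: found {body_count} <body> tags")

    # 2. Collect meta descriptions so duplicates can be reported afterwards.
    desc = soup.find("meta", attrs={"name": "description"})
    if desc and desc.get("content"):
        descriptions[desc["content"].strip()].append(url)

    # 3. Script/CSS paths with an accidental double slash ("/js//app.js").
    for tag, attr in (("script", "src"), ("link", "href")):
        for el in soup.find_all(tag):
            path = el.get(attr, "")
            without_scheme = path.split("://", 1)[-1].lstrip("/")
            if "//" in without_scheme:
                print(f"{url}: double slash in asset path: {path}")

print("\nMeta descriptions shared by more than one page:")
for text, pages in descriptions.items():
    if len(pages) > 1:
        print(f"  '{text[:60]}' appears on {len(pages)} pages")
```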
I would start with those changes, do a Fetch as Google in Google WMT, and give it a few days; the homepage should reindex in a very short time frame, maybe minutes.
After that we can take another look at the site. These are very basic issues that a large brand site like this should never get wrong, and you can see the damage they have caused. Lucky for you that you have only just started there and can be the knight in shining armour who sorts it all out!
-
Hi Kristina,
Thank you so much for your complete answer. It's great to have such comprehensive feedback!
Here I'm attaching a screenshot of the organic traffic since Nov. 2013. You can see that there was a huge decline in visits between March and April 2014.
I'm still checking, but it seems that all the pages lost traffic, not only some of them. The only page that still seems to be getting traffic is the homepage.
It also seems that the site has moved from https to http. I'm not sure about it, but when I check it on Archive.org I see some differences in the source: in the previous version some links were pointing to https, and now they are not. (I'm attaching a screenshot as well.)
I'll try the sitemap thing, but do you think there might be a penalty, or is it maybe just that they've moved from https to http?
Thank you so much!
Matías
-
Hi Gazzerman1,
Thank you so much for your feedback. I'll be checking that. Unfortunately I can't publicly disclose the URL.
Best regards,
Matías
-
Hi there,
You've got a lot of things going on here. I'm worried that Google thinks your site is exceptionally spammy or has malware, since it's showing your .pdf and .doc files but not as many of your HTML pages. But I can't make any definitive statements because I'd need a lot more information. If I were you, I would:
- Dig into Google Analytics to get a better idea of what exactly has lost organic traffic, and when.
  - Look at the trend:
    - Is it a slow decline or a sharp drop? A sharp drop typically means a penalty, or that the company accidentally deindexed their pages, or something equally horrifying. A slow decline could be an algorithm change (at least the ones I've seen take their time to really take effect) or a new competitor.
    - When did it start? That way you can pinpoint what changes happened around then.
    - Is it a steady decline or has it been jumping up and down? If it's a steady decline, it's probably one thing you're looking for; if traffic is erratic, there could be a bunch of problems.
  - Find out which pages are losing traffic.
    - Are all pages on the site losing traffic? If they are, it's probably an algorithm change or a change the webmasters made sitewide. If it's specific pages, it could be that there's now better competition for those key terms.
    - Which pages are losing the most traffic? When did their traffic start to drop? Is it different from the rest of the site? You may need to investigate at a deeper level here. Individual pages could have new competition, you may have accidentally changed site structure and lessened internal links to them, and/or they may have lost some external links.
- Get an accurate list of every unique page on your site to clear up duplicate indexation and find out if anything isn't in Google's index.
  - Add a canonical to all pages on your site pointing to the version of the URL you want in Google's index. It's a good way to make sure that parameters and other accidental variations don't lead to duplicate content.
  - Divide that list into site sections and create separate XML sitemaps by section. That way you have a better idea of what is and isn't indexed in Google Webmaster Tools. I got this idea from this guy (who is also my boss) and I swear by it. (A rough sketch of how you could script this step and the canonical check follows the list.)
  - Based on traffic in Google Analytics, pick out pages that used to get a lot of traffic and now get none. Search for them in Google with a site:[url] search to see if Google has them indexed.
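To make those last two bullets concrete, here's a rough sketch of how you could script the canonical check and the per-section sitemaps. It assumes requests and beautifulsoup4 are installed, and the URLs and section names are placeholders, not your real structure.

```python
# Rough sketch: check each URL's canonical tag and write one XML sitemap per
# site section. Assumes requests + beautifulsoup4; URLs/sections are placeholders.
import requests
from bs4 import BeautifulSoup
from collections import defaultdict
from xml.sax.saxutils import escape

# Your master list of unique URLs, from the crawl/export in the step above.
urls = [
    "https://www.example.com/category/widgets",
    "https://www.example.com/product/sku-123",
    "https://www.example.com/product/sku-456",
]

sections = defaultdict(list)

for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    canonical = soup.find("link", rel="canonical")
    href = canonical.get("href") if canonical else None
    if href != url:
        print(f"Canonical mismatch on {url}: {href}")

    # Group by the first path segment ("category", "product", ...).
    parts = url.split("/")
    sections[parts[3] if len(parts) > 3 else "root"].append(url)

# One sitemap file per section makes per-section indexation visible in GWT.
for section, section_urls in sections.items():
    entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in section_urls)
    sitemap = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>\n"
    )
    with open(f"sitemap-{section}.xml", "w") as f:
        f.write(sitemap)
    print(f"Wrote sitemap-{section}.xml with {len(section_urls)} URLs")
```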
After that, if you're still stumped, definitely come back here with a little more info - we'll all be happy to help!
Kristina
-
I'm not sure if this will solve the problem, but thank you so much for your reply.
I've checked with both tools and they're great. But I can't find a solution yet.
Best regards,
Matías
-
Panda has run recently; does the drop coincide with that date? Normally there is a shake-up in the week leading up to it.
I suspect that someone may have made changes before you got there and they have not given you the full picture.
Was there any issue with the server during that time, with page speeds slow or pages not loading? WMT should give you some indication of the decline, even how dramatic the drops were and what kind of pages were dropped.
FYI, you are better off removing the meta descriptions from the pages that have duplication than keeping them there, but the best course of action is of course to rewrite them.
If you care to share the URL we can all take a look and see if we can spot anything.
-
Try checking it out with Bing/Yahoo and Yandex. See what type of pages they were, then go to Archive.org to check them.
If it's indeed thin pages, or pages generated by parameters/searches/sorting, then that is usually an automatic algorithmic penalty, usually Panda.
You might also have some good pages with good links, so be sure to check for 404s and 301 good pages as needed. It's a really tricky situation.
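Here's a rough way to script that 404 check against a list of pages that used to get traffic (just a sketch; it assumes the requests package is installed and the URLs are placeholders):

```python
# Rough sketch: check the current status of pages that used to get organic
# traffic (export that list from Analytics). Assumes the requests package;
# URLs are placeholders.
import requests

old_pages = [
    "https://www.example.com/old-category/",
    "https://www.example.com/product/discontinued-sku",
]

for url in old_pages:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    if resp.status_code == 404:
        print(f"404  {url}  -> needs a 301 to the closest live page")
    elif resp.history:  # one or more redirects happened
        hops = " -> ".join(str(r.status_code) for r in resp.history)
        print(f"{hops}  {url}  ends at {resp.url} ({resp.status_code})")
    else:
        print(f"{resp.status_code}  {url}")
```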
You also have to see if there's trouble with internal links and inbound links (just in case there's a mix with Penguin; I've seen sites with manual and algorithmic penalties for both, wtf).
Try out tools like
http://www.barracuda-digital.co.uk/panguin-tool/
http://fruition.net/google-penalty-checker-tool/
to see when your drops really started. You should at least be able to identify which algorithm update is involved and go from there.
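If you want to do the same thing by hand, the idea is simply to overlay known update dates on your organic traffic export. A rough sketch follows; it assumes pandas and matplotlib are installed, and the update names and dates are placeholders you'd fill in from a published algorithm-change history.

```python
# Rough sketch: plot daily organic sessions (CSV export from Analytics with
# 'date' and 'sessions' columns) with vertical lines at known update dates.
# Assumes pandas + matplotlib; the update names/dates are placeholders to be
# filled in from a published algorithm-change history.
import pandas as pd
import matplotlib.pyplot as plt

traffic = pd.read_csv("organic_sessions.csv", parse_dates=["date"])

update_dates = {
    "Update A (placeholder)": "2014-03-24",
    "Update B (placeholder)": "2014-05-20",
}

fig, ax = plt.subplots(figsize=(12, 5))
ax.plot(traffic["date"], traffic["sessions"])

for name, day in update_dates.items():
    ax.axvline(pd.Timestamp(day), linestyle="--", alpha=0.7)
    ax.annotate(name, (pd.Timestamp(day), traffic["sessions"].max()),
                rotation=90, va="top", fontsize=8)

ax.set_title("Organic sessions vs. known algorithm updates")
plt.tight_layout()
plt.show()
```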
-
Hi Dennis,
Thank you for your help. I'm not sure what those pages were, since I started managing the account only a few days ago. I suspect that most were pages generated by different parameters, but I've adjusted that in Webmaster Tools.
Also, the site had an "Expiration" meta tag dated 1997, which we removed a few days ago since I thought that might be the problem, but the indexed pages have continued to decrease even after removing that tag.
I've checked the http/https and the site (at least the indexable content) is all http, so there shouldn't be an issue there.
One of the issues I've asked to be fixed on the site is that there are no meta descriptions on most pages, and on the pages that do have one, it's a generic description that is exactly the same on every single page. I'm aware this is a problem, but I don't think it can explain the main issue.
Do you know how I can find out if there was an algorithmic penalty? I've checked in GWT, and there I can only see manual penalties.
Thank you so much!
Matías
-
Hi
That's a lot of indexed pages. What were the bulk of those pages? Categories? Products? Random tags or thin pages?
Did you switch to https recently? Are all versions included in webmaster tools? With a preferred version? How about their sitemaps?
It's possible that it's Panda. It's also possible that it's something technical, like a switch to https, improper redirects, etc.
I just find it weird to see a big site like that lose that much indexed content (assuming they were good pages) without incurring an algorithmic penalty or a technical problem.
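On the https/redirect side, a quick way to see how each version of the homepage resolves is something like this rough sketch (it assumes the requests package is installed, and the domain is a placeholder):

```python
# Rough sketch: see how each protocol/host variant of the homepage resolves.
# A clean setup 301s three of these to the single preferred version.
# Assumes the requests package; the domain is a placeholder.
import requests

variants = [
    "http://example.com/",
    "http://www.example.com/",
    "https://example.com/",
    "https://www.example.com/",
]

for start in variants:
    try:
        resp = requests.get(start, allow_redirects=True, timeout=10)
        chain = " -> ".join(f"{r.status_code} {r.url}" for r in resp.history)
        final = f"{resp.status_code} {resp.url}"
        print(f"{start}\n    {chain + ' -> ' if chain else ''}{final}")
    except requests.RequestException as exc:
        print(f"{start}\n    failed: {exc}")
```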