Index or No Index (Panda Issue)
-
Hi,
I believe our website has been penalized by the Panda update. We have over 9,000 pages, and around 4,000 of those are currently indexed. I believe that more than half of the indexed pages have thin content. Should we stop indexing those pages until we have quality page content? That would leave us with very few pages indexed by Google (roughly 1,000 of our 9,000 pages have quality content). I am worried that we would hurt our organic traffic more by noindexing those pages than by leaving them indexed for Google to read. Any help would be greatly appreciated.
Thanks,
Jim Rodriguez
-
Firstly, please don't assume that you've been hit by Panda. Find out. Indexation count is generally not a good basis for assuming a penalty.
- Was there a traffic drop around the date of a known Panda update? Check this list: https://moz.com/google-algorithm-change. If the date of the traffic drop lines up with an update, you might have a Panda problem. Otherwise it could easily be something else.
- How many links does your site have? Google indexes and crawls based on your authority. It's one area where it doesn't really matter where the links go: just having more links seems to increase the amount your site is crawled. Obviously the links should be non-spammy.
- Do you have a site map? Are you linking to all of these pages? It could be an architecture issue unrelated to penalty.
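The first check above, lining up your traffic drop against known update dates, is easy to do by hand, but here is a minimal sketch of the idea in Python. The dates listed are a small illustrative sample; consult the full Moz change history for the real list.

```python
from datetime import date

# A few known Panda-related update dates (illustrative sample only;
# see https://moz.com/google-algorithm-change for the full history).
PANDA_DATES = [
    date(2011, 2, 23),   # Panda 1.0
    date(2012, 9, 27),   # Panda 20
    date(2014, 5, 20),   # Panda 4.0
]

def near_known_update(drop_date, window_days=7):
    """Return the known update dates within `window_days` of a traffic drop."""
    return [d for d in PANDA_DATES
            if abs((drop_date - d).days) <= window_days]

matches = near_known_update(date(2012, 9, 25))
print(matches)  # this drop lines up with Panda 20
```

If the function returns an empty list for your drop date, look for causes other than Panda (architecture, lost links, tracking changes) before you start noindexing anything.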
If it is a Panda issue: generally I think people take the wrong approach to Panda. It's NOT a matter of page count. I run sites with hundreds of thousands of indexed URLs, all useful pages with relatively few links, and have no problems. It's a matter of usefulness. So you can decrease your Panda risk by cutting out useless pages, or you can increase the usefulness of those pages.
When consulting I had good luck helping people recover from penalties, and with Panda I'd go through a whole process of figuring out what the user wanted (surveys, interviews, user testing, click maps, etc.), looking at what the competition was doing through that lens, and then re-ordering pages, adjusting layout, adding content, and improving functionality toward that end.
Hope that helps.
-
Every case is different; what works for someone else may not work for you. It depends on the content you consider thin. Unless it has caused a penalty, I would leave it indexed and focus on writing more quality content.
-
I think this is a critical issue: you have thin content on most of your pages. If Googlebot can access those pages, you may not recover from Panda until quality content has been added across all of them and re-indexed by Google, which can take a very long time.
Adding noindex only tells Google not to index the pages; Googlebot can still crawl them and read the thin content, so noindex alone may not get you a recovery.
So my advice is to either remove the thin content from your pages, add quality content as fast as you can, and ask Google to re-index the new content (using the Fetch option in Google Search Console), which is what I recommend; or add both noindex and nofollow to the thin pages, which I don't recommend, because you may lose a huge amount of traffic and (though I'm not sure about this) may still not recover from Panda.
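To make the noindex mechanics concrete: the directive is just a meta tag in the page head, and it controls indexing, not crawling. Here is a hypothetical sketch of a template helper that emits it only for thin pages; the function name and the 300-word cutoff are illustrative assumptions, not a Google rule.

```python
# Hypothetical helper: emit a robots noindex directive for thin pages only.
# The 300-word threshold is an illustrative assumption, not a Google rule.

NOINDEX_TAG = '<meta name="robots" content="noindex">'

def robots_tag(word_count, threshold=300):
    """Return a noindex meta tag for thin pages, or an empty string."""
    return NOINDEX_TAG if word_count < threshold else ""

print(robots_tag(120))  # thin page: emits the noindex tag
print(robots_tag(800))  # substantial page: emits nothing, stays indexable
```

A template would call something like this when rendering the `<head>`, so that pages automatically return to the index as their content grows past the threshold.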
-
Hi Jim,
From my own experience with Panda-impacted sites, I've seen good results from applying meta robots "noindex" to URLs with thin content. The trick is finding the right pages to noindex. Be diligent in your analytics up front!
We had a large site (~800K URLs), with a large amount of content we suspected would look "thin" to Panda (~30%). We applied the noindex to pages that didn't meet our threshold value for content, and watched the traffic slowly drop as Google re-crawled the pages and honored the noindex.
It turned out that our front-end analytics hadn't captured just how much long-tail traffic the noindexed URLs were getting. We lost too much traffic. After about 3 weeks, we essentially reset the noindex threshold to bring some of those pages back into the index and earning traffic, which had a meaningful impact on our monetization.
So my recommendation is to do rigorous web analytics up front, decide how much traffic you can afford to lose (you will lose some), and begin the process of setting your noindex thresholds. It takes a few tries.
Especially if you value the earning potential of your site over the long term, I would be much more inclined to noindex deeply up front. As long as your business can survive on the traffic generated by those 1000 pages, noindex the rest, and begin a long-term plan for improving content on the other 8000 pages.
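The threshold-setting exercise described above can be sketched in a few lines. This is a hypothetical illustration, assuming you have exported per-URL word counts and monthly sessions from your analytics; the URLs and numbers are made up.

```python
# Hypothetical analytics export: (url, word_count, monthly_sessions).
pages = [
    ("/phones/model-a",    90,   40),
    ("/phones/model-b",   150,   10),
    ("/guides/unlocking", 1200, 900),
    ("/phones/model-c",    60,    5),
]

def traffic_lost(pages, threshold):
    """Total sessions on pages that would be noindexed at this word-count threshold."""
    return sum(sessions for _, wc, sessions in pages if wc < threshold)

# Try a few thresholds to see what each would cost in sessions per month.
for threshold in (100, 200, 500):
    print(f"noindex below {threshold} words: lose {traffic_lost(pages, threshold)} sessions")
```

Running a table like this before applying any noindex tags tells you, for each candidate threshold, roughly how much long-tail traffic you are agreeing to give up, which is exactly the number the answer above wishes it had known in advance.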