Hit With Panda, How Should I Block Pages?
-
Hello!
I believe I've been hit with Panda. I have a large e-commerce site with literally thousands of pages, but I'm working on adding custom content daily. Should I block pages that have duplicated copy, where a product/artist/team name is dynamically inserted? Will this help with my huge ranking drop? If so, once it's done, should I send a reconsideration request to Google, or will recovery happen automatically? I believe this is an algorithmic penalty and not a manual one, as I have not received any messages in Webmaster Tools.
Any help would be greatly appreciated!!
Thank You!
-
Hello Ben!
Yes, still indexed. I've never really ranked very well, but I had a lot of long-tail searches that came up, and most of them landed on the duplicate-content pages. With thousands of pages, how can I noindex the duplicate content?
Thank you
-
Are you still indexed? If so, there's no need to submit a reconsideration request.
Noindex your duplicate content.
Develop new links to your new quality content. Guest blog, do some non-profit work in the community, etc.
If you're still indexed, you should regain some rankings quickly. However, after a slap you're going to have to show Google you've changed your ways over the course of a couple of months to get back to where you were.
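For reference, the usual way to noindex a page is a robots meta tag in its head section; on a site with thousands of pages you'd normally emit it from the page template based on a per-page flag rather than editing pages by hand. A minimal sketch (generic markup, not specific to any platform):

```html
<!-- Add to the <head> of each duplicate page. -->
<!-- "noindex" asks search engines to drop the page from the index; -->
<!-- "follow" still lets crawlers follow the links on it. -->
<meta name="robots" content="noindex, follow">
```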
-
I'm almost positive that it's Panda. It's a large e-commerce site with thousands of pages, and traffic fell off in 1-2 days to almost nothing. I have a few hundred pages with custom content; everything else is pretty much the same, with artist names dynamically inserted. Everything is white hat, so that's the only thing I can see that could be causing the problem. I'm not sure exactly what to do now. Do I have to contact Google about this, or just fix it and wait?
Thank You
-
Hi,
If you are positive that the Panda update is the reason for your loss of rankings, then you can use a canonical tag to let the search engines know which page to credit. However, your drop in rankings could come from multiple scenarios.
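A canonical tag goes in the head of each near-duplicate page and points at the version you want credited. A minimal sketch, with a hypothetical URL:

```html
<!-- On each artist/product variant page, point search engines -->
<!-- at the one page that should receive the ranking credit. -->
<link rel="canonical" href="http://www.example.com/artists/example-artist/" />
```

Note that rel=canonical is a hint rather than a directive, so Google may ignore it if the pages differ too much from the canonical target.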
Related Questions
-
Blocking websites.
We have noticed an abundance of requests for some free colour cards on our website; however, these are coming from a Russian freebie website. How can I block this website so that the link from that forum doesn't work?
Content Development | TeamacPaints
-
Similar product pages
I have 27,000 products on my website, each shown on its own separate webpage. Google indexes almost all of them (+- 25,000), but the SEOmoz report flags them as duplicate content. Indeed, most of each page is identical; only the description and price of the product change, which is no more than 2% of the total content of the page. At the bottom of the product page the alternatives for that product are shown, mainly other colors. So within the same family of products, which can have 50 products, the site creates 50 webpages showing the product and its family. That's why nearly everything on the page is identical within this family of products. My guess is that, since Google has indexed them all, I should not worry about duplicate content. Is my guess correct? Thanks for a quick answer. Rik
Content Development | noordhout
-
Correcting Duplicate Page Title Problems for a Blog
EDITED: To just focus on the issue at hand. I am trying to figure out the SEO rules instead of just working on the content. Please bear with me. I am adept technically; I just do not know the rules of the SEO process or even some of the terminology, so I'm trying to attack problems one at a time. Today's problem: **Duplicate Page Titles**. We evidently have thousands of duplicate page titles. We are using Joomla 2.5 & EasyBlog, and our sitemap is automated from XML Sitemap. EasyBlog takes the title of the site and uses it as the name of the summary pages. We post 5 blog items per page and all the names are the same: http://www.OursiteName.com/?start=5 (Page Title = Site Name), http://www.OursiteName.com/?start=10 (Page Title = Site Name). A similar thing happens when sorting by author, category, etc. Basically, non-duplicate pages are looking like duplicates. What is the best practice/approach? Using robots.txt or the XML sitemap to tell Google not to crawl these pages? Writing a script or editing the EasyBlog code to fix the 2000 duplicate page titles? Other thoughts?
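If you go the robots.txt route for those ?start= summary pages, a minimal sketch might look like the following (the pattern is hypothetical; Googlebot supports * wildcards in Disallow rules):

```text
# robots.txt fragment: stop crawling of the paginated
# ?start= URLs that all share the same page title.
User-agent: *
Disallow: /*?start=
```

One caveat: robots.txt only blocks crawling, not indexing, so already-indexed URLs can linger in the index. A noindex meta tag on those pages is often the cleaner fix for duplicate-title warnings.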
Content Development | Romana
-
How can I resolve a duplicate page issue?
I have an attached report that shows duplicate content for a blog page, and I'm not sure how to resolve the issue. The blog/website is hosted on wordpress.org; maybe it's something to do with having to add categories or tags. Can anyone help please? SDtXT.png
Content Development | lindsayjhopkins
-
How many words should be placed on a home page, category pages, and product pages?
To optimize content for a website, how many words should be provided for a home page, category page and a product page?
Content Development | gallreddy
-
Block Low Quality Pages?
What are your thoughts on blocking (in robots.txt) and/or noindexing low-quality pages to defend against Panda, assuming you can't remove, redirect, or add quality content to it? Also, assume there are no external links pointing to these low-quality pages, no social shares, and zero incoming organic traffic. Has anyone had experience with this as a solution to Panda?
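If editing the page templates isn't practical, one way to noindex a whole directory of low-quality pages is an X-Robots-Tag response header set at the server level. A sketch for Apache with mod_headers enabled (the directory layout is hypothetical):

```apache
# .htaccess placed inside the low-quality directory:
# every response from it carries a noindex instruction,
# with no template changes required.
<IfModule mod_headers.c>
  Header set X-Robots-Tag "noindex, follow"
</IfModule>
```

Note that for the header to take effect, the pages must not also be blocked in robots.txt, since Google has to crawl a URL to see its response headers.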
Content Development | poolguy
-
I have a page where you can download a PDF of the material - should I exclude the PDF from the search engines?
In my niche, there is a controversial research article that is very popular. I am writing a rebuttal to this article, giving another point of view. My article has the potential to be really good link bait for my site. The original article is often printed out to be shown to professionals in my niche, and my hope is that people will do the same with mine. So, I plan to have a PDF version of my article available on my page. The article that is visible on my site (i.e. non-PDF) will be a graphic-rich article that is easy for the reader to go through. The PDF will have all of the same text, but it won't have as many graphics; it will look more like a scientific research article. So, should I exclude the PDF from search engines so that it isn't duplicate content? Or does that even matter, seeing as it is a duplicate of my own content? I want people to link to the main article, not the PDF. Any tips would be greatly appreciated!
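Since a PDF has no head section for a robots meta tag, the usual way to keep it out of the index is an X-Robots-Tag response header. A sketch for Apache with mod_headers enabled (the match pattern is generic):

```apache
# Apache config / .htaccess fragment: serve a noindex header
# with every PDF so only the HTML article can rank.
<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex"
</FilesMatch>
```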
Content Development | MarieHaynes
-
How to fight the Panda/Farmer update?
I have been suffering majorly from the Google algorithm change, as I lost more than 50% of the traffic on my largest site. Since then I have been focusing on rewriting pages and adding new ones. New pages are all high quality, up to 2,000 words each, and the improved pages used to be thin-content pages, rewritten to about 1,000 words. All content is 100% unique. I have noticed Google still has the old pages cached, dated to more than a month back, despite the new pages (linking to some of the old ones) being indexed. Anyway, I am pretty much desperate by now and could really use some advice on how to fight this. FYI, I have some budget available and a writer on standby. Thanks, Giorgio
Content Development | VisualSense