10,000 New Pages of New Content - Should I Block in Robots.txt?
-
I'm almost ready to launch a redesign of a client's website. The new site has over 10,000 new product pages, which contain unique product descriptions but share some similar text with other product pages throughout the site.
An example of the page similarities would be the following two products:
-
Brown leather 2 seat sofa
-
Brown leather 4 seat corner sofa
Obviously, the products are different, but the pages feature very similar terms and phrases.
I'm worried that the Panda update will mean that these pages are sandboxed and/or penalised.
Would you block the new pages? Add them gradually? What would you recommend in this situation?
-
Consider reversing your thinking from "what will I lose to Panda?" to "what can I do to make this site kick ass?"
Reach for opportunity, extend yourself.
If this were my site, I would get a writer on those product descriptions to make them unquestionably unique, beef them up, add salesmanship, and optimize them for search. This will give you substantive unique content that converts better, pulls more long-tail traffic, and moves you out of competition with other sites that do the bare minimum.
Sure, it will cost money but in the long run it could bring back a huge return.
My only caution: if you make this investment in writing, you need to do it on a site that can pull reasonable traffic. If you do this on a site that has no links, it will not do you much good. It is part of a marketing plan, not a single item on a "to do" list.
Related Questions
-
Best practice for disallowing URLs with Robots.txt
Hi Everybody, We are currently trying to tidy up the crawl errors that appear when we crawl the site. On first viewing we were very worried, to say the least: 17,000+. But after looking closer at the report, we found that the majority of these errors were caused by bad URLs featuring:
Currency - for example: "directory/currency/switch/currency/GBP/uenc/aHR0cDovL2NlbnR1cnlzYWZldHkuY29tL3dvcmt3ZWFyP3ByaWNlPTUwLSZzdGFuZGFyZHM9NzEx/"
Color - for example: "?color=91"
Price - for example: "?price=650-700"
Order - for example: "?dir=desc&order=most_popular"
Page - for example: "?p=1&standards=704"
Login - for example: "customer/account/login/referer/aHR0cDovL2NlbnR1cnlzYWZldHkuY29tL2NhdGFsb2cvcHJvZHVjdC92aWV3L2lkLzQ1ODczLyNyZXZpZXctZm9ybQ,,/"
My question, as a novice at working with robots.txt, is: what would be the best practice for disallowing URLs like these from being crawled? Any advice would be appreciated!
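A minimal robots.txt sketch for this kind of parameter cleanup - the patterns below are inferred from the example URLs above, and the * wildcard is supported by Google and Bing but is not part of the original robots.txt standard:

User-agent: *
# Block the currency-switch and login handler paths
Disallow: /*/currency/switch/
Disallow: /customer/account/login/
# Block faceted/query-string variants (color, price, sort order, pagination)
Disallow: /*?*color=
Disallow: /*?*price=
Disallow: /*?*dir=
Disallow: /*?*order=
Disallow: /*?*p=

Test patterns like these in the robots.txt tester in Google Webmaster Tools before deploying, since a greedy wildcard can easily block more than intended (for example, /*?*p= also matches any other parameter whose name ends in "p").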
Intermediate & Advanced SEO | centurysafety
-
Ecommerce SEO: Shared content on product pages
Hi Guys, I am wondering what the best practices are for avoiding duplicate content on product pages that have shared content. For example, say I have three different product pages, one for each of the following: Verizon iPhone 5 16GB, AT&T iPhone 5 16GB, AT&T iPhone 5 32GB. Each product is, for the most part, the same (all are iPhone 5s); the only differences lie in the carrier and the storage capacity. I want to write product descriptions for each page to target a variety of different keywords, but I don't want to get penalized for duplicate content. Does anybody have experience with the SEO best practices for product pages that share content like this? Thank you!
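One common pattern for variant pages like this, as a sketch - assuming a single parent "iPhone 5" page exists to consolidate the variants (the URL here is hypothetical):

<link rel="canonical" href="http://www.example.com/iphone-5/" />

Placed in the <head> of each carrier/capacity variant, this consolidates the pages in the index; the trade-off is that the individual variants then won't rank separately for carrier- and capacity-specific terms, so unique descriptions remain the stronger option where the writing budget allows.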
Intermediate & Advanced SEO | Cody_West
-
Robots.txt Help
I need help creating a robots.txt file. Please let me know what to add to the file. Any real or working example?
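A minimal working example, with placeholder paths - adjust the Disallow lines to the directories you actually want kept out of the crawl (a file with no Disallow rules allows everything):

User-agent: *
# Keep crawlers out of these (placeholder) directories
Disallow: /admin/
Disallow: /cgi-bin/
# Point crawlers at the XML sitemap (hypothetical URL)
Sitemap: http://www.example.com/sitemap.xml

The file must be served as plain text at the root of the domain, i.e. http://www.example.com/robots.txt.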
Intermediate & Advanced SEO | Michael.Leonard
-
Help with Robots.txt On a Shared Root
Hi, I posted a similar question last week asking about subdomains, but a couple of complications have arisen. Two different websites I am looking after share the same root domain, which means they will have to share the same robots.txt. Does anybody have suggestions for separating the two in the same file without complications? It's a tricky one. Thank you in advance.
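Since only one robots.txt can exist per host, the usual approach is to namespace the rules by each site's directory. A sketch, assuming the two sites live under hypothetical /site-a/ and /site-b/ paths:

User-agent: *
# Rules for the first site
Disallow: /site-a/checkout/
Disallow: /site-a/search/
# Rules for the second site
Disallow: /site-b/admin/

If the two sites are actually on separate subdomains rather than directories, each subdomain serves its own robots.txt and the problem disappears.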
Intermediate & Advanced SEO | Whittie
-
How should I go about repairing 400,000 404 error pages?
My thinking is to make a list of the most-linked-to and most-trafficked error pages and just redirect those, but I don't know how to get all that data, because I can't even download all the error pages from Webmaster Tools, and even then, how would I get backlink data except by checking each link manually? Are there any detailed step-by-step instructions on this that I missed in my Googling? Thanks for reading!
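Once the priority list exists (Webmaster Tools only exposes a sample of the errors, so filtering by links and traffic first is the right instinct; a backlink tool such as Open Site Explorer can supply the link data for the survivors), the redirects themselves are simple. A sketch in Apache .htaccess syntax, with placeholder paths:

# One-off 301 for a high-value dead page
Redirect 301 /old-page.html http://www.example.com/new-page/
# Pattern redirect for a whole retired section
RedirectMatch 301 ^/old-category/(.*)$ http://www.example.com/new-category/$1

For the long tail of 404s that have no links or traffic, letting them return 404 (or 410) is generally fine; Google drops them over time.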
Intermediate & Advanced SEO | DA2013
-
Should all pages on a site be included in either your sitemap or robots.txt?
I don't have a specific scenario here, but I'm curious, as I fairly often come across sites that have, for example, 20,000 pages but only 1,000 in their sitemap. If only 1,000 of their URLs are ones they want included in their sitemap and indexed, should the others be excluded using robots.txt or a page-level exclusion? Is there a point to having pages that are included in neither, leaving it up to Google to decide?
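For reference, the page-level exclusion mentioned here is typically the robots meta tag, which - unlike a robots.txt Disallow - lets the page be crawled but keeps it out of the index while still letting link equity flow through it:

<meta name="robots" content="noindex, follow" />

Pages left out of all three (sitemap, robots.txt, and meta tag) are simply discovered and judged by Google on their own merits; omission from the sitemap is a weak hint, not an exclusion.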
Intermediate & Advanced SEO | RossFruin
-
Why are these results being shown as blocked by robots.txt?
If you perform this search, you'll see all m. results are blocked by robots.txt: http://goo.gl/PRrlI, but when I reviewed the robots.txt file: http://goo.gl/Hly28, I didn't see anything specifying to block crawlers from these pages. Any ideas why these are showing as blocked?
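One thing worth ruling out: robots.txt is per-host, so the file governing m. results is the one served at the m. subdomain, not the www one. If http://m.example.com/robots.txt (hypothetical host) looks like the sketch below, that would fully explain the blocked results:

User-agent: *
Disallow: /

A server misconfiguration that returns the wrong file, or a 5xx error, for the subdomain's robots.txt can produce the same symptom.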
Intermediate & Advanced SEO | nicole.healthline
-
What is the best way to optimize/set up a teaser "coming soon" page for a new product launch?
Within the context of a physical product launch, what are some ideas for creating a /coming-soon page that "teases" the launch? Ideally I'd like to optimize the page around the product, but the client wants to try to build consumer anticipation without giving too many details away. Any thoughts?
Intermediate & Advanced SEO | GSI