The way you have to look at it is...
Best-case scenario: write completely original content for every one of your pages, and receive the highest ranking from search engines.
- OR -
Use existing content provided by manufacturers and never reach your full potential in the SERPs. That's just the way it is.
I know exactly what you're saying, though, don't get me wrong... writing unique content for thousands of pages can be a pain, especially if you're adding new ones on a regular basis. I've just given you the pros and cons of your situation.
If you could ever find the time to write a unique write-up for every product and get that out of the way, you'd reach a point where you're only adding a new product every week or so (even daily isn't that bad in terms of maintaining a website, really). Then you'd be laughing: you'd see a massive difference in the SERPs because your content would be 100% unique, and people would start scraping your site for theirs.
-
You are partially correct, and poor content is bad too. You would need to spend a lot of time building a system that generates substantial unique content, but it could be time well spent.
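To make that concrete, here's a minimal sketch of what such a system might look like, assuming your product database already stores structured attributes per SKU. Every field name and template here is hypothetical; the idea is just to rotate your own attribute data through varied sentence patterns instead of reprinting the manufacturer's feed.

```python
import random

# Hypothetical product record -- most ecommerce databases already
# store structured attributes like this for every SKU.
product = {
    "name": "Stanley 16 oz Claw Hammer",
    "brand": "Stanley",
    "category": "hammers",
    "weight": "16 oz",
    "feature": "shock-absorbing grip",
}

# A pool of sentence templates; rotating them keeps product pages
# from sharing identical sentence structure.
TEMPLATES = [
    "The {name} is a {weight} addition to {brand}'s line of {category}.",
    "Built by {brand}, the {name} weighs in at {weight} and features a {feature}.",
    "If you're shopping for {category}, the {name}'s {feature} sets it apart.",
]

def generate_description(product, sentences=2):
    """Assemble a short description by filling randomly chosen
    templates with the product's own attributes."""
    chosen = random.sample(TEMPLATES, k=sentences)
    return " ".join(template.format(**product) for template in chosen)

print(generate_description(product))
```

Copy generated this way is still a stopgap (hand-written reviews and buying advice will always beat it), but it's a big step up from duplicating a manufacturer's feed verbatim.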
-
For someone starting out, this is really annoying.
Wow! You got great advice. Fantastic advice.
I think that you should reread it several times and hope that your competitor is not reading this thread.
Ressler gave you some of the best advice that you will get.
-
Other options:
Use reviews on your product pages (I'd suggest adding Schema markup; http://www.schema-creator.org can generate it; see the sketch after this list)
Hire college kids looking for a few bucks
Hire freelancers
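On the Schema suggestion above: review data is usually embedded in the product page using schema.org's Product and AggregateRating types, either as microdata (which the linked tool generates) or as JSON-LD. Here's a rough sketch of building the JSON-LD variant; the product name and rating numbers are made up for illustration:

```python
import json

def review_schema(name, rating_value, review_count):
    """Build a schema.org Product + AggregateRating JSON-LD script
    tag, ready to drop into a product page's HTML."""
    data = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": name,
        "aggregateRating": {
            "@type": "AggregateRating",
            "ratingValue": str(rating_value),
            "reviewCount": str(review_count),
        },
    }
    return ('<script type="application/ld+json">\n'
            + json.dumps(data, indent=2)
            + "\n</script>")

# Hypothetical product and ratings pulled from your review system.
print(review_schema("Stanley 16 oz Claw Hammer", 4.6, 112))
```

Either way, the win is the same: customer reviews give every product page a steady stream of unique text you didn't have to write yourself.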
-
The best solution would be to work within your Content Management System to serve the best possible Title and H1 tag to your customers (see the sketch below), and then use an auto-generator to produce the body content. I work with companies that have fewer than 1,000 products, so I don't have a lot of experience with auto-generators, but they should give you a slight benefit over raw manufacturer copy.
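To illustrate the Title/H1 point, here's a rough sketch of the kind of rule you might wire into the CMS template layer. The field names are hypothetical, and in practice you'd tune the patterns per category:

```python
def build_title_and_h1(product, site_name="Example Store"):
    """Derive a descriptive <title> and H1 from attributes the CMS
    already stores, instead of repeating the bare product name."""
    title = f"{product['name']} | {product['brand']} {product['category'].title()} | {site_name}"
    h1 = f"{product['name']} ({product['weight']})"
    return title, h1

# Hypothetical product record; adjust keys to whatever your CMS exposes.
product = {
    "name": "Stanley 16 oz Claw Hammer",
    "brand": "Stanley",
    "category": "hammers",
    "weight": "16 oz",
}
title, h1 = build_title_and_h1(product)
print(title)  # Stanley 16 oz Claw Hammer | Stanley Hammers | Example Store
print(h1)     # Stanley 16 oz Claw Hammer (16 oz)
```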
If I were dealing with that many DVDs, I would build landing pages for each genre, plus some for major actors, actresses, and directors, and go after those niches while I hammered out the issues with my content management system.
-
Yes, Google will knock you for it.
Start with your most popular product, and work your way down. Also, make sure you write the content for the consumer, and not for your company.
Another suggestion I would make is to personalize your website. Brand yourself as an expert within the content, and flag the products you personally recommend. If you have a favorite hammer, make sure people can identify it at a glance.
For reference, check out Gun Dog Supply: http://www.gundogsupply.com/
Related Questions
-
Beta Site Removal best practices
Hi everyone.
We are doing a CMS migration and site redesign with some structural changes. Our temporary beta site (one of the staging environments, and the only one not behind a firewall) started appearing in search. The site got indexed before we added robots.txt, due to a dev error (at that time all pages were index,follow, which is the nature of a beta site at its final stage, mirroring the live site). As a remedy, we implemented robots.txt for the beta version as:
User-Agent: *
Disallow: /
We also removed the beta from search for 90 days and changed all pages to noindex,nofollow. Those blockers will be changed once the beta code gets pushed into production. However, we already have all links redirected (301) from the old site to the new one; this will go into effect once the migration starts (we will go live with the completely redesigned site that is now in beta in a few days). After that, the beta will be deleted completely and become a 404 or 410. So the question is: should we delete the beta site and simply let it 404/410 without any redirects (the site as such existed for only a few days)? What is the best thing to do? We don't want to hurt our SEO equity. Please let me know if you need more clarification. Thank you!
Intermediate & Advanced SEO | bgvsiteadmin
-
Does removing a large portion of content hurt overall website organic visibility?
Hi everyone, I am wondering if there are any negative SEO effects of removing mass amounts of content, specifically in the situation I am about to describe. We have a website that is being converted to WordPress; however, one section containing a large portion of content (31 pages) has not been transferred over yet. We are very eager to launch the new WordPress website for lead-generation purposes and will gradually re-implement the content over time. According to Google Analytics, these pages have generated only a small number of organic entrances (~7) in the last year, and they do not have any backlinks. I would like to know whether this would have an overall negative SEO impact on the website even if we 301 these pages, serve a "coming soon" page, 410, or 404 them. My gut feeling is no, but I would like to make sure I am not missing anything. Thanks, Moz community!
Intermediate & Advanced SEO | Snaptech_Marketing
-
When removing a product page from an ecommerce site, what's best practice?
What is the best practice for removing a product page from an ecommerce site, if a 301 is not available and the page has already been crawled by the search engine?
A. Block it out in the robots.txt
B. Let it 404
Intermediate & Advanced SEO | Bryan_Loconto
-
Impact of simplifying website and removing 80% of site's content
We're thinking of simplifying our website, which has grown to a very large size, by removing all the content that hardly ever gets visited. The plan is to remove this content and make changes over time in small chunks so that we can monitor the impact on SEO. My gut feeling is that this is okay as long as we redirect old pages and make sure the pages we remove aren't getting any traffic. From my research online, it seems that more content is not necessarily a good thing if that content is ineffective, and that simplifying a site can improve conversions and usability. Could I get people's thoughts on this, please? Are there any risks we should look out for, or any alternatives to this approach? At the moment I'm struggling to combine the needs of SEO with making the website more effective.
Intermediate & Advanced SEO | RG_SEO
-
Moving Part of a Website to a Subdomain to Remove Panda Penalty?
I have lots of news on my website, and unlike other types of content, news posts quickly become obsolete and get a high bounce rate. I have reason to think that the news on my website might be partly responsible for a Panda penalty, but I'm not sure. There are over 400 news posts on the blog from the last 4 years, so that's still a lot of content. I was thinking of isolating the news articles on a subdomain (news.mywebsite.com). If the news plays a part in the Panda penalty, would that remove it from the main domain?
Intermediate & Advanced SEO | sbrault74
-
Removing Content 301 vs 410 question
Hello, I was hoping to get the SEOmoz community's advice on how to remove content most effectively from a large website. I just read a very thought-provoking thread in which Dr. Pete and Kerry22 answered a question about how to cut content in order to recover from Panda (http://www.seomoz.org/q/panda-recovery-what-is-the-best-way-to-shrink-your-index-and-make-google-aware). Kerry22 mentioned a process in which 410s would be totally visible to Googlebot so that it would easily recognize the removal of content. The conversation implied that it is not just important to remove the content, but also to give Google the ability to recrawl it and confirm the content was indeed removed (as opposed to just recrawling the site and not finding the content anywhere). This made a lot of sense to me and also struck a personal chord. Our website was hit by a later Panda refresh back in March 2012, and ever since then we have been aggressive about cutting content and doing what we can to improve user experience. When we cut pages, though, we used a different approach, taking all of the steps below:
1. We cut the pages.
2. We set up permanent 301 redirects for all of them immediately.
3. At the same time, we removed all internal links pointing to these pages (to make sure users didn't stumble upon the removed pages).
When we cut the content pages, we would either delete them or unpublish them, causing them to 404 or 401, but this is probably a moot point since we gave them 301 redirects every time anyway. We thought we could signal to Google that we removed the content while avoiding generating lots of errors that way. I see that this is basically the exact opposite of Dr. Pete's advice, and the opposite of what Kerry22 used in order to get a recovery, and meanwhile here we are still trying to help our site recover. We've been feeling that our site should no longer be under the shadow of Panda. So here is what I'm wondering, and I'd be very appreciative of advice or answers to the following questions:
1. Is it possible that Google still thinks we have this content on our site, and we continue to suffer from Panda because of this? Could there be a residual taint caused by the way we removed it, or is it all water under the bridge at this point because Google would have figured out we removed it (albeit not in a preferred way)?
2. If there's a possibility our former cutting process has caused lasting issues and affected how Google sees us, what can we do now (if anything) to correct the damage we did?
Thank you in advance for your help,
Eric
Intermediate & Advanced SEO | Eric_R
-
Robots.txt & url removal vs. noindex, follow?
When de-indexing pages from Google, what are the pros and cons of each of the two options below?
1. robots.txt plus requesting URL removal from Google Webmaster Tools
2. Using the noindex,follow meta tag on all doctor profile pages, keeping the URLs in the sitemap file so that Google will recrawl them and find the noindex meta tag, and making sure they're not disallowed by the robots.txt file
Intermediate & Advanced SEO | nicole.healthline