The way you have to look at it is...
Best-case scenario: write completely original content for every one of your pages, and receive the highest ranking from search engines.
- OR -
Use the stock content provided by manufacturers and never reach your full potential in the SERPs. That's just the way it is.
I know exactly what you're saying, don't get me wrong... writing unique content for thousands of pages can be a pain, especially if you're adding new ones on a regular basis. I've just laid out the pros and cons of your situation.
If you could ever find the time to write a unique write-up for every product and get that out of the way, you'd reach a point where you're only adding a new product every week or so (even daily isn't bad in terms of maintaining a website, really). Then you'd be laughing: you'd see a massive difference in the SERPs because your content would be 100% unique, and people would start scraping your site for theirs.
-
You are partially correct. Poor content is bad, too. You would need to spend a lot of time building a system that generates substantial unique content, but it could be time well spent.
-
For someone starting out, this is really annoying.
Wow! You got great advice. Fantastic advice.
I think you should reread it several times and hope that your competitor isn't reading this thread.
Ressler gave you some of the best advice that you will get.
-
Other options:
Use reviews on your product pages (I'd suggest using Schema markup - http://www.schema-creator.org)
Hire college students looking to make a few bucks
Hire freelancers
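To make the Schema suggestion above concrete, here's a minimal sketch (in Python, with a hypothetical product and reviewer) of the kind of schema.org Review data you'd embed in a product page inside a `<script type="application/ld+json">` tag:

```python
import json

def review_jsonld(product_name, rating, author, body):
    """Build a schema.org Review block for a product page.
    The product and reviewer here are made-up examples."""
    return {
        "@context": "https://schema.org",
        "@type": "Review",
        "itemReviewed": {"@type": "Product", "name": product_name},
        "reviewRating": {
            "@type": "Rating",
            "ratingValue": str(rating),
            "bestRating": "5",
        },
        "author": {"@type": "Person", "name": author},
        "reviewBody": body,
    }

# Serialize and drop the result into the page's ld+json script tag
snippet = json.dumps(
    review_jsonld("Acme Claw Hammer", 5, "J. Smith", "Solid grip, well balanced."),
    indent=2,
)
print(snippet)
```

Customer reviews marked up this way are genuinely unique text per product, which is exactly what thin manufacturer descriptions lack.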
-
The best solution would be to work within your Content Management System to provide the best possible title and H1 tag for your customers, and then use an auto-generator to produce the body content. I work with companies that have fewer than 1,000 products, so I don't have a lot of experience with auto-generators, but they should give you a slight benefit.
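As a rough sketch of that title/H1 idea: pull structured attributes out of the CMS and assemble a distinct title and H1 per product instead of reusing the manufacturer's text. The attribute names and store name below are hypothetical:

```python
def build_title_and_h1(product):
    """Assemble a distinct page <title> and H1 from structured
    product attributes (brand, name, category are example fields)."""
    title = f"{product['brand']} {product['name']} | {product['category']} | ExampleStore"
    h1 = f"{product['brand']} {product['name']}"
    return title, h1

product = {"brand": "Acme", "name": "16 oz Claw Hammer", "category": "Hand Tools"}
title, h1 = build_title_and_h1(product)
```

Even this simple templating guarantees every product page leads with a unique, descriptive title while you work through the body content backlog.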
If I were dealing with that many DVDs, I would make landing pages for each genre, plus some for major actors, actresses, and directors, and go after those niches while I hammered out the issues with my content management system.
-
Yes, Google will knock you for it.
Start with your most popular product, and work your way down. Also, make sure you write the content for the consumer, and not for your company.
Another suggestion: personalize your website. Brand yourself as an expert within the content and mark the products you recommend. If you have a favorite hammer, make sure people can quickly identify it.
For reference, check out Gun Dog Supply: http://www.gundogsupply.com/