I'm Pulling My Hair Out! - Duplicate Content Issue on 3 Sites
-
Hi,
I'm an SEO intern trying to solve a duplicate content issue on three wine retailer sites. I've read up on Moz blog posts and other helpful articles that are full of advice on how to fix duplicate content. However, I've tried using canonical tags for the duplicates and redirects for the expiring pages on these sites, and it hasn't fixed the duplicate content problem.
My Moz report indicated that we have thousands of duplicate content pages. I understand it's a common problem among e-commerce sites, and the way we create landing pages and generate dynamic search results pages conflicts with our SEO progress. Sometimes we'll create a landing page with the same URL as an older landing page that expired. Unfortunately, I can't get around this problem, since this is how customer marketing and recruitment manage their offers and landing pages. Would it be best to nofollow these expired pages or redirect them?
Also, I tried using self-referencing canonical tags, as well as canonical tags that point to the higher-authority page, on our search results pages. Even though that worked for some pages on the site, it didn't work for a lot of the other search results pages. Is there something we can do to these search results pages that will let Google understand that they are original pages?
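For reference, here's the kind of self-referencing canonical tag I've been adding to the search results pages (the domain and query are made-up examples, not our real URLs):

```html
<!-- In the <head> of a search results page, e.g. example.com/search?q=merlot -->
<!-- Self-referencing canonical: tells Google this URL is the preferred version -->
<link rel="canonical" href="https://www.example.com/search?q=merlot" />
```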
There are a lot of factors I can't change, and I'm concerned that the three sites won't rank as well and will drive traffic that won't convert. I understand that Google won't penalize sites for duplicate content unless it's spammy. So if I can't fix these errors -- since the company I work for conducts business in a way that means we'll never run out of duplicate content -- is it worth moving on to other SEO priorities like keyword research and on-/off-page optimization? Or should we really concentrate on fixing these technical issues before doing anything else?
I'm curious to know what you think.
Thanks!
-
Hey there,
Regarding the tech issues: if Google has any difficulty crawling your site (duplicates included), it can't reach the content and links you have there. Therefore, it's crucial to solve the tech issues first, so Google can crawl your site as smoothly as possible and see your content.
Also, any time a page expires, 301-redirect it to the most similar live page.
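As a minimal sketch, assuming an Apache server with .htaccess enabled (the paths here are hypothetical examples, not your actual URLs), such a redirect could look like:

```apache
# .htaccess — permanently (301) redirect an expired landing page
# to its closest live equivalent (paths are hypothetical examples)
Redirect 301 /offers/spring-wine-sale https://www.example.com/offers/current-wine-sale
```

On nginx or another server the directive differs, but the principle is the same: one permanent redirect per expired URL, pointing to the nearest relevant live page rather than the homepage.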
Feel free to shoot other questions. Cheers, Martin