I'm Pulling My Hair Out! - Duplicate Content Issue on 3 Sites
-
Hi,
I'm an SEO intern trying to solve a duplicate content issue on three wine retailer sites. I've read the Moz blog posts and other helpful articles packed with advice on how to fix duplicate content. However, I've tried canonical tags for duplicates and redirects for expiring pages on these sites, and it hasn't fixed the duplicate content problem.
My Moz report indicated that we have thousands of duplicate content pages. I understand it's a common problem among e-commerce sites, and the way we create landing pages and use dynamic search results pages tends to conflict with our SEO progress. Sometimes we'll create landing pages with the same URLs as older landing pages that expired. Unfortunately, I can't work around this problem, since this is how customer marketing and recruitment manage their offers and landing pages. Would it be best to nofollow these expired pages or redirect them?
I also tried self-referencing canonical tags, and canonical tags pointing to the higher-authority page, on search results pages. Even though this worked for some pages on the site, it didn't work for many of the other search results pages. Is there something we can do to these search results pages that will let Google understand that they are original pages?
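For reference, a self-referencing canonical is just one `<link>` element in the page's `<head>`; a minimal sketch, with an illustrative URL:

```html
<!-- In the <head> of the search results page itself -->
<!-- Self-referencing canonical: declares this exact URL as the preferred version -->
<link rel="canonical" href="https://www.example-wine-shop.com/search?q=merlot" />
```

Worth remembering that canonicals are hints, not directives: when the tagged pages aren't near-duplicates, Google is free to ignore them, which can explain why they work on some search results pages and not others.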
There are a lot of factors I can't change, and I'm concerned that the three sites won't rank as well and will drive traffic that won't convert. I understand that Google won't penalize sites for duplicate content unless it's spammy. So if I can't fix these errors -- since the company I work for conducts business in a way that means we'll never run out of duplicate content -- is it worth moving on to other SEO priorities like keyword research and on-/off-page optimization? Or should we really concentrate on fixing these technical issues before doing anything else?
I'm curious to know what you think.
Thanks!
-
Hey there,
Regarding the tech issues: if Google has any difficulty crawling your site (duplicates included), it can't reach the content and links you have there. Therefore, it's crucial to solve the tech issues first, to help Google crawl your site as smoothly as possible so it can see your content.
Also, any time a page expires, 301-redirect it to a similar one.
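As a sketch, on an Apache server an expired landing page could be redirected in `.htaccess` like this (the paths are hypothetical; nginx and most CMSes have an equivalent):

```apacheconf
# 301 (permanent) redirect: expired offer page -> closest live equivalent
Redirect 301 /offers/summer-wine-sale /offers/current-wine-sale
```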
Feel free to shoot other questions. Cheers, Martin
Related Questions
-
Google Images search traffic and image thumbnail issues
Hi Moz community! Need a little help with a strange issue we are seeing of late on our project, CareerAddict.com. We have seen a sudden and significant drop in image visibility in Search Console from 27th August onwards. I understand that Google has been updating their filters and other bits in image search, so maybe this could have impacted us? I also noticed that the images mapped to our articles are not the full-featured 700px-wide article images which we provide to Google in the structured data; on many occasions they are instead taking the 450px-wide OG share images. You can see this by searching for "careeraddict.com" in images. Any insight or suggestions are welcome on both of these. I'm interested to understand whether any other webmasters are experiencing similar problems with image visibility in Google. Thanks!
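For context, article images are typically declared to Google through Article structured data in a JSON-LD block; a minimal sketch with illustrative URLs (not CareerAddict's actual markup):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example article headline",
  "image": "https://www.example.com/images/article-hero-700px.jpg"
}
</script>
```

Google treats the declared `image` as a strong hint rather than a guarantee, so it may still pick a different image (such as the og:image) for thumbnails.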
Algorithm Updates | dqmedia
Would there be any benefit to creating multiple pages of the same content to target different titles?
Obviously, the duplicated pages would be canonicalized, but would there be a way of anchoring a page landing by search term entry? For example: if you have a site that sells cars, you could use this method but have pages for (brand) cars for sale, finance options, best car for a family, how far a (brand) car will go on a full tank, and so on. Then make all the information blocks H2s, but use the same H2s for the duplicated page titles. Then it gets complicated: if someone searches "best car for a family" and the title of the duplicated page is clicked, how would you anchor this user to the section of the page with this information? Could there be a benefit to doing this, or would it just not work?
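On the anchoring part of the question: landing a user on a specific section is normally done with URL fragments pointing at element `id`s; a minimal sketch with hypothetical names:

```html
<!-- Each information block gets a stable id -->
<h2 id="best-family-car">Best car for a family</h2>
<p>Details about family-friendly models...</p>

<!-- Any link (internal or from a search result) can then jump straight to that section -->
<a href="https://www.example.com/brand-cars#best-family-car">Best car for a family</a>
```

Google can surface such sections as "jump to" links in results when the ids and headings are clear, though whether it does so is at its discretion.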
Algorithm Updates | Evosite1
New Website Old Domain - Still Poor Rankings after 1 Year - Tagging & Content the culprit?
I've run a live wedding band in Boston for almost 30 years that used to rank very well in organic search. I was hit by the Panda updates in August of 2014, and rankings literally vanished. I hired an SEO company to rectify the situation and create a new WordPress website, which launched January 15, 2015. I kept my old domain: www.shineband.com. Rankings remained pretty much non-existent. I was then told that 10% of my links were bad. After lots of grunt work, I sent in a disavow request in early June via Google Webmaster Tools. It's now mid October, and rankings have remained pretty much non-existent. Without much experience, I got Moz Pro to help take control of my own SEO and identify some problems (over 60 pages of medium-priority issues: title tag character length and meta description). Some helpful reports by www.siteliner.com and www.feinternational.com both mentioned a duplicate content issue. I had old blog posts from a different domain (now 301 redirecting to the main site) migrated to my new website's internal blog, http://www.shineband.com/best-boston-wedding-band-blog/, as suggested by the SEO company I hired. It appears that by doing that, the older blog posts show as pages in the back end of WordPress with the poor meta and title issues, AS WELL AS probably creating a primary reason for duplicate content issues (with links back to the site). Could this likely be viewed as spamming or an (unofficial) SEO penalty? As SEO companies far and wide try daily to persuade me to hire them to fix my rankings, I can't say I trust much. My plan: put most of the old blog posts into the Trash via WordPress, rather than try to optimize each page (over 60) by adjusting tagging, titles and duplicate content. Nobody really reads a quick post from 2009... I believe this could be beneficial and that those pages are more hurtful than helpful. Is that a bad idea, not knowing if those pages carry much juice? I realize my domain authority is not great.
No grand expectations, but is this a good move? What would be my next step afterwards, some kind of resubmitting of the site? This has been painful, business has fallen, and I can't throw more dough at this. THANK YOU!
Algorithm Updates | Shineband
Duplicate pages in language versions, noindex in sitemap and canonical URLs in sitemap?
Hi SEO experts! We are currently in the midst of reducing our number of duplicate titles in order to optimize our SEO efforts. A lot of the "duplicate titles" come from having several language versions of our site. Therefore, I am wondering: 1. If we start using hreflang annotations to make Google (and others) aware of alternative language versions of a given site/URL, how big a problem will "duplicate titles" then be across our domains/site versions? 2. Is it a problem that we include (many) URLs in our sitemap that point to pages marked with noindex? 3. Are there any problems with having a sitemap that includes pages containing canonical URLs pointing to other pages? Thanks in advance!
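For question 1, the usual mechanism is reciprocal hreflang annotations in each version's `<head>`; a minimal sketch with illustrative URLs:

```html
<!-- On every language version, list all alternates (including the page itself) -->
<link rel="alternate" hreflang="en" href="https://www.example.com/en/page/" />
<link rel="alternate" hreflang="de" href="https://www.example.com/de/page/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```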
Algorithm Updates | TradingFloor.com
Is it better to build a large site that covers many verticals or many sites dedicated to each vertical
Just wondering, from an SEO perspective, is it better to build a large site that covers many verticals, or to build out many sites, one for each vertical?
Algorithm Updates | tlhseo
Regarding site url structure
OK, so there are already some answers to questions similar to this, but mine might be a little more specific. The website is www.bestlifeint.com. Most of our product pages are like this: http://www.bestlifeint.com/products-soy.html, for instance. However, I was trying to help the SEO for certain pages (namely two) via the URLs, and had some success with one page, our Soy Meal Replacement. I changed the URL of this page from www.bestlifeint.com/products-meal to www.bestlifeint.com/Soy-Amazing-Meal-Replacement-with-Omega-3s.html (notice I dropped the /products part of the URL and made it more SEO-friendly). The old URL for this page was something like www.bestlifeint.com/products-meal. The issue is that on this new page and another page I have changed, http://www.bestlifeint.com/Whey-Milk-Alternative.html, I have dropped the "/products" part of the URL even though they are both products. The new Meal Replacement page was ranked around 6th on Google at the beginning of the month and is now around 48th. The new "whey milk" page (http://www.bestlifeint.com/Whey-Milk-Alternative.html) is ranked around 45th for "whey milk", when the old page, "products/wheyrice.html", was ranked around 18th at the beginning of the month. Have I hurt these two pages by not following the www.bestlifeint.com/products... site structure and focusing more on URL SEO? Both NEW pages receive all the link juice inside the website, so they are the current pages (you cannot get to the old pages), and since Google has pretty much dropped the old pages in the search rankings, I have deleted those two pages. Do I just need to wait and see? According to my research we should rank much higher for "whey milk"; we should be on the first page, going by Google's own statements about searchers finding good, relevant material. Any advice moving forward? Thanks, Brian
Algorithm Updates | SammisBest
Accidentally blocked our site for an evening?
Yesterday at about 5pm I switched our site to a new server and accidentally blocked our site from Google for the evening. Our domain is posnation.com and we are ranked in the top 3 for almost all POS-related keywords. When I got in this morning I realized the mistake, went to Google Webmaster Tools, noticed the site was blocked, and used Fetch as Googlebot to correct it. Now the message says: "Check to see that your robots.txt is working as expected. (Any changes you make to the robots.txt content below will not be saved.)"
robots.txt file | Downloaded | Status
http://www.posnation.com/robots.txt | 1 hour ago | 200 (Success)
When you go to Google and type "pos systems" we are still #2, so I assume all is still OK. My question is: will this potentially hurt our rankings, should I be worried, and is there anything else I can do?
Algorithm Updates | POSNation
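For anyone double-checking the same thing, a robots.txt that blocks nothing looks like this (an accidental full block is usually a `Disallow: /` line instead):

```text
# Allow all crawlers to fetch everything
User-agent: *
Disallow:
```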
What is the critical size a content farm has to reach to come under Google's spotlight?
We're looking at building a content farm as an igniter for another site, so there will be some duplicate content. Is this a good or a bad strategy in terms of SEO?
Algorithm Updates | sarenausa