Unavoidable duplicate page
-
Hi,
I have an issue where I need to duplicate content on a new site that I am launching. Visitors to the site need to think that product-x is part of two different services, e.g. domain.com/service1/product-x and domain.com/service2/product-x.
Rewriting the product-x content for each service section is not an option, but perhaps I could get around the problem by making sure only one product-x page is indexed by search engines. What's the best way to do this? Any advice would be appreciated.
Thanks,
Stuart
-
Thanks for your help with this. In the end we decided to go down the nofollow route after discussing your suggestions with the developer.
Stu
-
It's okay, we all just want to give you the best answer we can. It can be tough without specifics to give you actionable advice, but I'll keep trying!
My gut feeling here (and please someone correct me if I'm wrong) is that if there is any appreciable difference in content (as you indicate above), then the dupe content thing may not be a problem. This sort of problem really only exists for pages with multiple URLs (like the same page in more than one subdomain) or old mirrors that were never taken down and such. If you offer similar services with a lot of the same content but slightly different service sections, you shouldn't be flagged for dupes.
Why not skip any tactics for now, get the site launched, remove the older content, and use Moz to run a crawl and see if any further content is flagged? Sounds to me like this is a borderline dupe content issue and may not really be a problem for the engines. I don't think I can really say until the pages are live. Let us know when they are!
-
The content in the 'details' subsections is identical between service 1 and 2, but the details sections aren't identical within their own service sections. Anyone confused yet?! I'm really sorry I can't share any links to the site; it's currently still in development.
Stu
-
A tentative yes: if the subsections are separate "pages", the same approach should work.
However, since you haven't shared the actual URLs, I can't visit the pages to see exactly how appropriate this is. I've optimized thousands of paginated and product pages with this problem, and rel-canonicaling (yes, I made up that verb) them worked best for us. If you'd like a more accurate answer, feel free to share example URLs for me to poke around in.
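To make that concrete, here's a rough sketch based on the example paths you posted, assuming you want the service1 copies to be the indexed versions (swap the href values if you'd rather keep service2). In the <head> of each service2 detail page you would add:
On domain.com/service2/product-x/product-x-details-1:
<link rel="canonical" href="https://domain.com/service1/product-x/product-x-details-1" />
On domain.com/service2/product-x/product-x-details-2:
<link rel="canonical" href="https://domain.com/service1/product-x/product-x-details-2" />
The service1 detail pages can carry a self-referencing canonical or none at all; the important part is that every duplicate points at the one version you want indexed.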
-
That DOES sound cool...
So will that also work for any sub-sections within the product-x section? I probably should have mentioned this before, but product-x has a subset of pages which aren't unique, e.g.:
domain.com/service1/product-x/product-x-details-1
domain.com/service1/product-x/product-x-details-2
domain.com/service2/product-x/product-x-details-1
domain.com/service2/product-x/product-x-details-2
Stu
-
So if you have to FORCE the engines to prioritize one of multiple duplicates, you have a few options. Assuming you want both pages to exist, you can apply a nofollow or noindex rule to one of the duplicates. That's a blunt approach which works well, but it's not really the coolest.
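If you did go the blunt route, it would just be a robots meta tag in the <head> of whichever duplicate you don't want indexed, along these lines (only a sketch, and the choice of which copy to noindex is yours):
<meta name="robots" content="noindex, follow" />
The "noindex, follow" combination keeps that copy out of the index while still letting the engines follow its links.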
The coolest option is to give one a canonical tag. Telling the engines that one of the multiple duplicates is the "canonical" one is exactly what Google recommends (and it has all sorts of neat downstream SEO benefits anyway).
So add a "rel canonical" tag to the "proper" page. Matt lays it out here:https://support.google.com/webmasters/answer/139394?hl=en
Let us know how it goes!
-
Hi Peter,
Thanks for your response. I probably should have said, but I have vastly simplified the description of this problem. Product-x is actually made up of lots of components and sub-pages of its own, which is why I can't rewrite product-x for each service section.
So there will be lots of duplicate pages within each product-x section. The service 1 and 2 sections themselves are unique, however.
Stuart
-
Hi Stuart
Is product-x just one component in each of these services along with other components?
If so, could you not solve this by having content about each service on its own service page, plus a small amount of info about product-x on each of those pages with a 'read more' type link through to the actual product-x page? (A rough sketch of what that teaser might look like is below.)
So you would have:
domain.com/service1
domain.com/service2
both of which point to a single page showing product-x at domain.com/product-x
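Purely as an illustration (the markup and copy here are made up, not taken from your site), the product-x teaser on each service page could be as simple as:
<section class="product-teaser">
  <h2>Product X</h2>
  <p>A short, service-specific intro to product X...</p>
  <a href="https://domain.com/product-x">Read more about product X</a>
</section>
Because both service pages link to the same domain.com/product-x URL, there is only ever one full copy of the product content for the engines to index.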
If that is not possible, then as far as I can see there isn't a way around having duplicate pages.
Peter