URLs: Removing duplicate pages using anchor?
-
I've been working on removing duplicate content on our website. There are tons of pages created based on size, but the content is the same.
The solution was to create a page with 90% static content and 10% dynamic content that changes depending on the size. Users can select the size from a dropdown box.
So instead of 10 URLs, I now have one URL.
- Users can access a specific size by adding an anchor to the end of the URL (?f=size1, ?f=size2)
For example:
Old URLs:
- www.example.com/product-alpha-size1
- www.example.com/product-alpha-size2
- www.example.com/product-alpha-size3
- www.example.com/product-alpha-size4
- www.example.com/product-alpha-size5
New URLs:
- www.example.com/product-alpha-size1
- www.example.com/product-alpha-size1?f=size2
- www.example.com/product-alpha-size1?f=size3
- www.example.com/product-alpha-size1?f=size4
- www.example.com/product-alpha-size1?f=size5
Do search engines read the anchor or drop it? Will the link juice be transferred to just www.example.com/product-alpha-size1?
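Worth noting: `?f=size2` is a query string parameter, not a true anchor. An anchor uses `#` and never leaves the browser, while a query string is part of the URL that servers and crawlers see. A quick sketch with the standard URL API (the product URL is taken from the question; the `#details` fragment is added here just for contrast):

```javascript
// Query parameter vs. anchor (fragment) on the same URL.
const url = new URL("http://www.example.com/product-alpha-size1?f=size2#details");

// The query string is sent to the server; crawlers treat it as part of a
// distinct URL unless told otherwise (canonical tag, parameter settings).
console.log(url.search); // "?f=size2"

// The fragment (the true "anchor") never leaves the browser.
console.log(url.hash); // "#details"
```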
-
Thanks Everett,
- Rel="canonical" is in place, so that's covered
- The URLs with the parameter are only accessible if you want to link directly to a particular size. If you are on the default page and switch sizes from the dropdown, the URL does not change.
- I have left Webmaster Tools to decide what should and shouldn't be crawled. The parameter has been declared there, though.
-
Cyto,
The Google Webmaster Tools parameter handling, in my opinion, is often best left up to Google. In other words, I rarely change it. Instead, I try to fix the issue itself. In your case, here is what I would advise:
Instead of using a parameter in the URL, use cookies or hidden divs to change the content on the page to the different size. Have a look at most major online retailers: you can select a size or color from the drop-down and it never changes the URL.
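A minimal sketch of that pattern (the content strings and function name here are hypothetical, not from the site): every size variant lives on the one page, and the dropdown's change handler simply decides which variant to show, so the URL never changes.

```javascript
// All size variants are part of one page; only ~10% of the content differs.
const sizeContent = {
  size1: "Product Alpha (10 cm)",
  size2: "Product Alpha (20 cm)",
  size3: "Product Alpha (30 cm)",
};

// Called from the dropdown's change handler: pick the variant to display.
// On the real page this would toggle hidden <div>s; here it just returns
// the dynamic portion of the content. Unknown sizes fall back to the default.
function renderSize(selected) {
  return sizeContent[selected] ?? sizeContent.size1;
}
```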
If this is not possible, I recommend the following:
Ensure the rel="canonical" tag on all of those pages references the canonical version (e.g. /product-alpha-size1), which will consolidate link-related metrics like PageRank into that one page.
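In practice that means each size URL carries `<link rel="canonical" href="http://www.example.com/product-alpha-size1">` in its `<head>`. One way to derive that target consistently is to strip the size parameter before emitting the tag; `canonicalFor` below is a hypothetical helper sketched with the standard URL API:

```javascript
// Derive the canonical URL for any size variant by removing the "f"
// parameter, so every variant points at the same canonical page.
function canonicalFor(href) {
  const url = new URL(href);
  url.searchParams.delete("f");
  return url.href;
}

console.log(canonicalFor("http://www.example.com/product-alpha-size1?f=size3"));
// "http://www.example.com/product-alpha-size1"
```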
-
Please say YES
-
Thank you Celilcan2,
- I'll set it up as 'Yes' and 'narrows' the page content.
- What is the perk of doing this, though? Will Google ignore everything after the parameter and focus its value on just the single URL?
-
Go to Google Webmaster Tools:
- On the Dashboard, under Crawl, click URL Parameters.
- Next to the parameter you want, click Edit. (If the parameter isn’t listed, click Add parameter. Note that this tool is case sensitive, so be sure to type your parameter exactly as it appears in your URL.)
- If the parameter doesn't affect the content displayed to the user, select **No** in the "Does this parameter change..." list, and then click Save. If the parameter does affect the display of content, click **Yes: Changes, reorders, or narrows page content**, and then select how you want Google to crawl URLs with this parameter.