URLs: Removing duplicate pages using a URL parameter?
-
I've been working on removing duplicate content on our website. There are tons of pages created based on size but the content is the same.
The solution was to create a single page with 90% static content and 10% dynamic content that changes depending on the size. Users can select the size from a dropdown box.
So instead of 10 URLs, I now have one URL.
- Users can access a specific size by appending a query parameter to the end of the URL (?f=size1, ?f=size2)
For example:
Old URLs:
- www.example.com/product-alpha-size1
- www.example.com/product-alpha-size2
- www.example.com/product-alpha-size3
- www.example.com/product-alpha-size4
- www.example.com/product-alpha-size5
New URLs:
- www.example.com/product-alpha-size1
- www.example.com/product-alpha-size1?f=size2
- www.example.com/product-alpha-size1?f=size3
- www.example.com/product-alpha-size1?f=size4
- www.example.com/product-alpha-size1?f=size5
Do search engines read the parameter or drop it? Will the rank juice be transferred to just www.example.com/product-alpha-size1?
-
Thanks Everett,
- Rel="canonical" is in place, so that's covered
- The urls with the parameter are only accessible if you want to directly access a particular size. If you are on the default page and switch sizes from the dropdown, no URL change is presented.
- I have left webmaster to decide what should be crawled or not. The parameter has been mentioned though.
-
Cyto,
The Google Webmaster Tools parameter handling, in my opinion, is often best left up to Google. In other words, I rarely change it. Instead, I try to fix the issue itself. In your case, here is what I would advise:
Instead of using a parameter in the URL, use cookies or hidden divs to change the content on the page to the selected size. Have a look at most major online retailers: you can select a size or color from the dropdown and it never changes the URL.
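A minimal sketch of the hidden-div approach (the markup, IDs, and class names below are illustrative assumptions, not taken from the actual site): the dropdown only toggles which block is visible, so the URL never changes and every size's content lives on the single page.

```html
<!-- Illustrative sketch: one page holds every size variant; the dropdown toggles visibility. -->
<select id="size-select">
  <option value="size1" selected>Size 1</option>
  <option value="size2">Size 2</option>
  <option value="size3">Size 3</option>
</select>

<!-- The ~10% of content that varies by size lives in one panel per size. -->
<div class="size-panel" id="size1">Details for size 1...</div>
<div class="size-panel" id="size2" hidden>Details for size 2...</div>
<div class="size-panel" id="size3" hidden>Details for size 3...</div>

<script>
  // Show only the panel that matches the selected size; the URL is never touched.
  document.getElementById('size-select').addEventListener('change', function () {
    var selected = this.value;
    document.querySelectorAll('.size-panel').forEach(function (panel) {
      panel.hidden = (panel.id !== selected);
    });
  });
</script>
```

Because all of the size content is in the HTML of one URL, search engines see a single consolidated page rather than near-duplicates.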
If this is not possible, I recommend the following:
Ensure the rel = "canonical" tag on all of those pages references the canonical version (e.g. /product-alpha-size1) which will consolidate the link-related metrics like PageRank into the one page.
-
Please say YES
-
Thank you Celilcan2,
- I'll set it up as 'yes' and it 'narrows' the page
- What is the benefit of doing this, though? Will Google stop counting anything after the parameter as having value and focus on just the single URL?
-
Go to Google Webmaster Tools:
- On the Dashboard, under Crawl, click URL Parameters.
- Next to the parameter you want, click Edit. (If the parameter isn’t listed, click Add parameter. Note that this tool is case sensitive, so be sure to type your parameter exactly as it appears in your URL.)
- If the parameter doesn't affect the content displayed to the user, select **No...** in the Does this parameter change... list, and then click Save. If the parameter does affect the display of content, click **Yes: Changes, reorders, or narrows page content**, and then select how you want Google to crawl URLs with this parameter.