URL Parameter Setting Recommendation - Webmaster Tools, Breadcrumbs & 404s
-
Hi All,
We use a parameter called "breadCrumb" to drive the breadcrumbs on our ecommerce product pages that are categorized in multiple places. For example, our "Blue Widget" product may have the following URLs:
http://www.oursite.com/item3332/blue-widget
http://www.oursite.com/item3332/blue-widget?breadCrumb=BrandTree
http://www.oursite.com/item3332/blue-widget?breadCrumb=CategoryTree1
http://www.oursite.com/item3332/blue-widget?breadCrumb=CategoryTree2
We use a canonical tag pointing back to the base product URL. The parameter only changes the breadcrumbs. Which of the following settings, if any, would you recommend for such a parameter in GWT:
Does this parameter change page content seen by the user? Options: Yes/No
How does this parameter affect page content? Options: Narrows/Specifies/Other
Currently, Google has automatically assigned the parameter as "Yes/Other/Let Googlebot Decide" without notifying us. We noticed a drop in rankings around the suspected time of the assignment.
Lastly, we have a steady flow of discontinued products that we 404. As a result of the breadcrumb parameter, our 404s increase significantly (one for each path). Would 800 404 crawl errors out of 18k products cause a penalty on a young site? We got an "Increase in '404' pages" email from GWT shortly after our rankings seemed to drop.
Thank you for any advice or suggestions! Doug
-
Thanks for the response Anthony! It's greatly appreciated.
> splitting the link equity between these duplicate pages
This is our main fear, given that every listing page link uses this parameter.
We thought about using the "No" option because we agree that the content doesn't really change, but Google says this next to the option: "If many URLs differ only in this parameter, Googlebot will crawl one representative URL." That may end up having the same result? We're not sure.
Maybe the following would force the issue?
Does this parameter change page content seen by the user? >> Yes
How does this parameter affect page content? >> Other
Which URLs with this parameter should Googlebot crawl? >> Every URL
Also, we could try deleting their automatic assignment of the parameter and hope for a different result. Anyway, thanks again for the feedback.
-
I would tell GWT that this parameter Does Not change the content on the page. A single breadcrumb path does not significantly change the content on a page.
Consider a site that sells shoes and uses parameters like fakeshoestore.com/mens?color=black. In that case, the color parameter creates a completely new set of results, Narrowing the entire men's shoe category down to black shoes. That is a new page and you want it treated as such (no canonical tag; answer Yes -> Narrows).
With Google thinking http://www.oursite.com/item3332/blue-widget and http://www.oursite.com/item3332/blue-widget?breadCrumb=BrandTree have different content on them, it may be splitting the link equity between these duplicate pages, allowing neither of them to rank well. Even though you have the canonical tag in place (good job), it is more of a hint to Google as opposed to an absolute directive.
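If it helps to see the idea concretely, here is a minimal sketch (Python, purely illustrative and not your actual templating code) of what the canonical should do: strip the breadCrumb parameter so every variant points back at the bare product URL.

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_link_tag(requested_url: str) -> str:
    """Build the <link rel="canonical"> tag for a product page request."""
    parts = urlsplit(requested_url)
    # Drop the query string entirely: breadCrumb only changes which breadcrumb
    # trail is rendered, not the product content itself.
    bare_product_url = urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))
    return f'<link rel="canonical" href="{bare_product_url}" />'

print(canonical_link_tag(
    "http://www.oursite.com/item3332/blue-widget?breadCrumb=BrandTree"))
# <link rel="canonical" href="http://www.oursite.com/item3332/blue-widget" />
```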
The 404 pages aren't a huge concern. As with all 404s, redirect the ones that receive organic traffic or have external links pointing at them.
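For the discontinued items, the triage could be as simple as the sketch below (Python, with made-up placeholder data; in practice you would pull external links from GWT and organic landing pages from your analytics).

```python
# Hypothetical placeholder sets; in practice these would come from a GWT
# links export and an analytics report of organic landing pages.
urls_with_external_links = {"/item3332/blue-widget"}
urls_with_organic_traffic = {"/item4411/red-widget"}

# Closest replacement page for each discontinued product worth saving.
replacement_for = {
    "/item3332/blue-widget": "/category/widgets",
    "/item4411/red-widget": "/category/widgets",
}

def handle_discontinued(path: str):
    """Return a (status, location) pair for a discontinued product URL."""
    if path in urls_with_external_links or path in urls_with_organic_traffic:
        # Worth saving: 301 to the closest replacement (fall back to the homepage).
        return 301, replacement_for.get(path, "/")
    # No external links and no organic traffic: letting it 404 is fine.
    return 404, None

print(handle_discontinued("/item3332/blue-widget"))  # (301, '/category/widgets')
print(handle_discontinued("/item9999/old-widget"))   # (404, None)
```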
Related Questions
-
Why Is Google Webmaster Tools Pulling Zero Keyword Data?
I just linked a Google Webmaster Tools account to Google Analytics for a client, and Search Engine Optimization reports are showing up in Google Analytics as enabled, but there is zero keyword data, landing page data, etc., in the reports themselves. Has anyone encountered this?
Intermediate & Advanced SEO | yoursearchteam
-
URL Re-Writes & HTTPS: Link juice loss from 301s?
Our URLs are not following a lot of the best practices found here: http://moz.com/blog/11-best-practices-for-urls We have also been waiting to implement HTTPS. I think it might be time to take the plunge on re-writing the URLs and converting to a fully secure site, but I am concerned about ranking dips from the lost link juice from the 301s. Many of our URLs are very old, with a decent amount of quality links. Are we better off leaving as is or taking the plunge?
Intermediate & Advanced SEO | TheDude
-
Complex URL Migration
Hi there, I have three separate questions which are all related. Some brief background: my client has an adventure tourism company that takes predominantly North American customers on adventure tours to three separate destinations: New Zealand, South America and the Himalayas. They previously had these sites on their own URLs, which had the destination in the URL (eg: sitenewzealand.com). Two of the three URLs had good age and lots of incoming links. This time last year a new web company was brought in and convinced them to pull all three sites onto a single domain and to put the sites under subfolders (eg: site.com/new-zealand). They built a brand new site for them on a Joomla platform. Unfortunately the new sites have not performed and halved the previous call-to-action rates. Organic traffic was not adversely affected by this change, but it hasn't grown either. I have been overhauling these new sites with a project team and we have managed to keep the new design (and the CMS) in place while making usability/marketing changes that have brought the conversion rate nearly back to where it originally was. We have recently made programmatic changes to the Joomla system to push the separate destination sites back onto their original URLs. My first question is around whether this was technically a good idea.
Question 1: Does our logic below add up, or is it flawed? The reasons we decided to migrate the sites back onto their old URLs were: With the majority of searches containing the actual destination (eg: "New Zealand"), we assumed that, all other things being equal, a domain like www.sitenewzealand.com is likely to attract a higher click-through rate than www.site.com/new-zealand. Having "newzealand" in the actual URL would provide a rankings boost for target keyword phrases containing "new zealand". We also wanted to create the consumer perception that we are specialists in each of the destinations we service, rather than having a single site which positions us as a "multi-destination" global travel company. Two of the old sites had solid incoming links, and very few new links have been acquired for the single domain over the past 12 months. It was also assumed that with the sites on their own domains, each site's theme would be completely destination specific, rather than a single site with multiple destinations diluting that relevance, which should also help us rank better for the destination-specific search phrases (which account for 95% of all target keyword phrases). The downsides of this approach were that we would be splitting content onto three sites instead of one, with a presumed drop in overall authority, and the disruption that a relatively complex domain migration could cause. Opinions on the logic we adopted for splitting these domains out would be highly appreciated.
Question 2: We migrated the folder-based destination-specific sites back onto their old domains at the start of March. We were careful to thoroughly prepare the htaccess file to ensure we covered all the new redirects needed and to point the old redirects directly at the new pages. The structure and content of each site remained the same across the destination-specific folders (eg: site.com/new-zealand/hiking became sitenewzealand.com/hiking). To achieve this splitting out of sites while keeping the single instance of Joomla, we wrote custom code to dynamically rewrite the URLs. This worked as designed. Unfortunately, Joomla had a component that dynamically creates the Google sitemaps, and since it had not had any code changes it got confused and started serving up a heap of URLs that never previously existed. This resulted in each site having 1,000-2,000 404s. It took us three weeks to work this out and put a fix in place. We are now down to zero 404s for each site in GWT and have proper Google sitemaps submitted (all done 3 days ago). In the meantime, our organic rankings and traffic began to decline around 5 days after the migration, and after 10 days had dropped from around 700 daily visitors to around 300. Traffic has remained at that level for the past 2 weeks with no sign of recovery. Now that we have fixed the 404s and submitted accurate sitemaps to Google, how long do you think it will take to start seeing an upward trend again, and how long is it likely to take to get back to pre-migration levels of organic traffic (if at all)?
Question 3: The owner of the company is understandably nervous about the overall situation and is wishing right now that we had never made the migration. If we decided to roll back to what we previously had, would we likely cause further recovery delays, or would traffic come back to previous levels reasonably quickly? A huge thanks to everyone for reading what is quite a technical and lengthy post, and a big thank you in advance for any answers. Kind regards, Conrad
Intermediate & Advanced SEO | activenz
-
Will a canonical tag on parameter URLs remove those URLs from the index and preserve link juice?
My website has 43,000 pages indexed by Google. Almost all of these pages are URLs that have parameters in them, creating duplicate content. I have external links pointing to those URLs that have parameters in them. If I add the canonical tag to these parameter URLs, will that remove those pages from the Google index, or do I need to do something more to remove those pages from the index? Ex: www.website.com/boats/show/tuna-fishing/?TID=shkfsvdi_dc%ficol (has link pointing here)
www.website.com/boats/show/tuna-fishing/ (canonical URL) Thanks for your help. Rob
Intermediate & Advanced SEO | partnerf
-
Canonical URL issue
My site https://ladydecosmetic.com is showing duplicate page title and duplicate page content errors in the SEOmoz crawl. I have downloaded the error report CSV and checked it. From the report, the URL below contains duplicate page content:
https://www.ladydecosmetic.com/unik-colours-lipstick-caribbean-peach-o-27-item-162&category_id=40&brands=66&click=brnd
The other duplicate URLs per the report are:
https://www.ladydecosmetic.com/unik-colours-lipstick-plum-red-o-14-item-157&category_id=40&click=colorsu&brands=66 https://www.ladydecosmetic.com/unik-colours-lipstick-plum-red-o-14-item-157&category_id=40 https://www.ladydecosmetic.com/unik-colours-lipstick-plum-red-o-14-item-157&category_id=40&brands=66&click=brnd
But on every one of these URLs (all 4) I have set a canonical URL pointing to the original page, which exists (not a 404): https://www.ladydecosmetic.com/unik-colours-lipstick-caribbean-peach-o-27-item-162&category_id=0
So why are these issues still showing as duplicate page content? Please give me an answer ASAP.
Intermediate & Advanced SEO | trixmediainc
-
Rewriting URL
I'm doing a major URL rewrite on our site to make the URLs more SEO friendly, as well as more comfortable and intuitive for our users. Our site has a lot of indexed pages, over 250k, so it will take Google a while to reindex everything. I was thinking that when Googlebot encounters the new URLs, it will probably figure out that they're duplicate content with the old URLs, at least until it recrawls the old URLs and gets the 301s directing it to the new URLs. This will probably lower the ranking of every page being crawled. Am I right to assume this is what will happen? Or is it fine as long as the old URLs get 301 redirects? If it is indeed a problem, what's the best solution? rel="canonical" on every single page, maybe? Another approach? Thank you.
Intermediate & Advanced SEO | corwin
-
Forwarding Empty URLs to Homepage for SEO & Old Backlink Salvaging - Is there any value or risk?
Our company owns about 30 URLs that we aren't currently using. Is there any SEO value to be gained by forwarding these content-less URLs to our homepage if they aren't currently indexed by Google? Some of these sites were previously in use at low traffic volumes by companies who licensed use of our brand and URL. After parting ways a year or longer ago, no 301 redirection was done to save the link juice, so it's long gone at this point. However, there may be some sites on the net that are still linking to various pages on those URLs. What would be the best course of action to salvage any value from these URLs until they are in use again as full websites? Insights would be greatly appreciated! Cheers, Justin
Intermediate & Advanced SEO | grayline
-
Should I Use City Name in URL?
We're having a website designed for a car dealership and are deciding what attributes to use in the URLs. Should I include the city name in the URL, and does that help for SEO purposes? Other ideas of what to research or try are appreciated too. Thanks 🙂
Intermediate & Advanced SEO | kylesuss