Handling long URLs and overly-dynamic URLs on an eCommerce site
-
Hello Forum,
I've been optimizing an eCommerce site and our SEOmoz crawls are favorable for the most part, except for long URLs and overly-dynamic URLs. These issues stem from two URL types: Layered navigation (faceted search) and non-Google internal search results. I outline the issues for each below.
We use an SEO-friendly URL structure for our product category pages, but once bots start "clicking" our layered navigation options, all the parameters are appended to our SEO-friendly URLs, causing the SEOmoz crawl warnings.
Layered Navigation:
SEO-friendly category page: oursite.com/shop/meditation-cushions.html
Effect of layered navigation: oursite.com/shop/meditation-cushions.html?bolster_material_quality=414&bolsters_appearance=206&color=12&dir=asc&height=291&order=name
As you can see, the parameters include product attributes and page sorts. I should note that all pages generated by these parameters use the rel="canonical" element to point back to the SEO-friendly URL. We have also set up Google Webmaster Tools to handle these parameters.
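For reference, the canonical element on one of the filtered pages looks roughly like this (just a sketch, using the example URL above):
    <link rel="canonical" href="http://oursite.com/shop/meditation-cushions.html" />
It sits in the <head> of the parameterized version, e.g. the ?bolster_material_quality=414 page.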
Internal Search Function:
Our URLs start off simple: oursite.com/catalogsearch/result/?q=brown. Then the bot clicks all the layered navigation options, yielding oursite.com/catalogsearch/result/index/?appearance=54&cat=67&clothing_material=83&color=12&product_color=559&q=brown. Also, all search results are set to noindex,follow.
My question is: Should we worry about these overly-dynamic and long URL warnings? We have set up canonical elements, "noindex,follow" directives, and configured Webmaster Tools to handle our parameters. If these are a concern, how would you resolve them?
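(For completeness: the noindex,follow on the search results is just the standard robots meta tag, roughly
    <meta name="robots" content="noindex,follow" />
on every /catalogsearch/ result page. The exact mechanism on our end may differ slightly, but that's the idea.)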
-
I see this thread is from last year, so I am hoping that between then and now you have determined an answer and would be able to advise. I am having the same issue with our consumer site.
-
If you make them friendly, it will shorten them:
x=y can become y
But if, having done that, they are still too long, I would ignore them, as they are noindexed.
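For example, on Apache (which Magento usually runs on) a rewrite along these lines turns a query-string filter into a path segment. This is only a sketch; the "color" parameter is taken from your example URLs:
    # hypothetical sketch: expose the color filter as a path segment
    RewriteEngine On
    RewriteRule ^shop/meditation-cushions/color-([0-9]+)/?$ /shop/meditation-cushions.html?color=$1 [L,QSA]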
-
There is another company handling the server side of things. All I know is that we're using PHP and MySQL for Magento.
Even if we did a friendly URL rewrite, wouldn't we still get long URLs? We would just have each parameter become words separated by slashes, e.g.:
/shop/meditation-cushions.html/high-quality/patterned/green/10inches/sortedbyname/
I suppose these URLs are shorter. Is something like this better?
-
Marc
The crawl warnings are those found in SEOmoz's crawl diagnostics: "Overly-Dynamic URL" and "Long URL." These are not duplicate content issues and the URLs resolve properly.
I just want to make sure we're not getting dinged for having URLs that are too long. If we are, what are some ways to go about shortening them?
-Aaron
-
What kind of "crawl warnings" are we talking about here? Duplicate content? Do the URLs resolve properly when the additional parameters are appended to the SEO-friendly URLs?
"I should note that all pages generated by these parameters use the element to point back to the SEO-friendly URL We have also set up Google's Webmaster Tools to handle these parameters."
Keep in mind, using canonical tags is like setting up 301 redirects on all those pages. Some people don't know that, so I thought I'd just throw it out there. So, if any of those additional pages with the host of parameters contain unique/different content from the SEO-friendly versions, using canonical tags is not a good move, as those pages will get no attention from search engines that respect the canonical tag.
For example, do not use a canonical tag on a 'Page 2' to point back to page 1. Each page will contain different information/products/whatever, and you want search engines (SE) to see and index those pages, regardless of what the URL looks like (as long as it works and your Title/META/H1-H6 tags are all in order to reflect the different content on each page).
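In other words, a 'Page 2' should keep a self-referencing canonical rather than pointing at page 1, something like the sketch below (the ?p=2 parameter is just an illustration, not taken from your site):
    <link rel="canonical" href="http://oursite.com/shop/meditation-cushions.html?p=2" />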
I'm not sure I'm following your concern 100%, so I hope I was on the right path with what I said. Can you be more specific as to what your concern is with the "overly-dynamic and long URL warnings", and I'll be happy to help you out some more.
- Marc
-
The easy fix is the canonical, yet Bing suggests not using the canonical on the true page, only on the duplicates. Best if you can handle that in code, but it's not a big worry if you can't.
Faceted navigation is a big problem, with no easy answers.
What sort of server are you using? On a Windows server it is very easy to set up friendly URLs for your dynamic URLs.