[Advice] Dealing with an immense URL structure full of canonicals under budget & time constraints
-
Good day to you Mozers,
I have a website that sells a certain product online; once bought, the product is delivered to a point of sale (PoS) where the client's car gets serviced.
This website has a shop, products and informational pages that are duplicated for every physical PoS. The organizational decision was that every PoS should have its own little site that it could manage and modify.
Examples are:
- Every PoS can have a different price on its products
- Some PoS offer more services than others, but the content on these service pages doesn't change.
I end up with over a million URLs that are, supposedly, all handled with canonical tags pointing to their respective main pages. I say "supposedly" because verifying the logic they used behind the canonicals is proving to be a headache, but I know the tag is there and I've seen a lot of these pages using it.
e.g. (each PoS URL points its canonical at the main version):
- https://mysite.com/shop/ <-- https://mysite.com/pointofsale-b/shop/
- https://mysite.com/shop/productA <-- https://mysite.com/pointofsale-b/shop/productA
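So on each PoS page, the head should in theory contain something like this (a simplified sketch of what I expect to find, not something I've verified on every page):

<!-- on https://mysite.com/pointofsale-b/shop/productA -->
<link rel="canonical" href="https://mysite.com/shop/productA" />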
The problem is that over a million URLs are being crawled, when in reality probably less than a tenth of them have any organic traffic potential.
Question is:
For the products, I know I should tell them to put the URLs as close to the root as possible and dynamically change the price according to the PoS the end user chooses, or even redirect all the PoS shops to the main one and only use that.
I need a short-term solution to test/show whether it is worth investing in development to correct all these useless duplicate pages. Should I use robots.txt to block off the parts of the site I do not want Google wasting its time on?
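If I went the robots.txt route, I picture something as simple as this (just a sketch, assuming every PoS section shares the /pointofsale- prefix from the examples above):

User-agent: *
# Keep crawlers out of the duplicated point-of-sale copies of the shop
Disallow: /pointofsale-

I'm aware robots.txt only stops crawling and won't remove URLs that are already in the index, which is part of why I'm unsure it's the right short-term test.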
I am worried about indexation, accessibility, and crawl budget being wasted.
Thank you in advance,
-
Hey Chris!
Thanks a lot for your time. I did send you a PM the day after your original post; I will send you another :).
Thanks a lot for your additional advice. You're right that managing the client's expectations is crucial. You're making some valid points and I will have to think carefully about how I approach this whole situation.
Charles,
-
Hey Charles,
No problem; I've been out of the office most of the past week, so I'm trying to catch up on a few of these now, sorry! I don't recall seeing any PMs either.
"I feel weird recommending shaving off 3/4 of a site they put a lot of money into."
That's perfectly normal and I'd have the same reservations. If you do decide to go ahead with it, though (and I'm absolutely not looking to push you into a decision either way, just providing the info), you can highlight the fact that paying a lot of money for a website doesn't make it inherently good. If those extra pages provide no unique value, then they're just a hindrance to the long-term goal of earning a return from that site via organic traffic.
It's a conversation we have semi-regularly with new clients. They think that because they just spent $20k on a new site, making changes to it is silly and a waste of the money they invested in the first place. "Sure, it's broken, but it was expensive"... I don't think search engines or users really care how much it cost.
"... in the eyes of the client, it may come off as bold."
It certainly is bold and, don't be fooled, there is a reasonable chance their rankings will get worse before they get better. In some cases when we perform a cleanup like this, we'll see a brief drop before a steady improvement.
This doesn't happen all the time by any means; in fact, we did a smaller-scale version of this last week for two new clients, and both had already started moving up over the weekend with no drop in rankings beforehand. It's really just about managing expectations and pitching the long-term benefit over the short-term fear.
Just be very careful in the way you project-manage it: be meticulous about updating internal links, and 301-redirect any pages that have external links pointing to them as well. You want to end up with a clean, efficient and crawlable website that retains as much value as possible.
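To make that concrete: if you do end up folding the PoS copies back into the main sections, the redirects can usually be pattern-based rather than a million individual rules. A rough sketch (assuming an Apache setup and the /pointofsale-x/ pattern from your examples; adjust for whatever your stack actually is):

# Hypothetical .htaccess sketch: permanently redirect every PoS copy
# of a URL to its main-site equivalent.
RewriteEngine On
RewriteRule ^pointofsale-[^/]+/(.*)$ /$1 [R=301,L]

That way /pointofsale-b/shop/productA lands on /shop/productA, and any external links pointing at the PoS copies keep passing value instead of being lost.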
"You understand many sets of eyes are directed at them and there is a lot to gain."
Also a very valid concern!
I'm probably not telling you anything you don't already know anyhow, so don't think I'm trying to lecture you on how to do your job; I'm just sharing my knowledge and anecdotal evidence from similar situations.
-
Hey Chris!
Thanks for that lengthy response. It is very much appreciated, as is your offer to help. Let me check with some people to see if I can share the company's name.
[EDIT] Sent you a private msg.
One of the reasons I want to test the waters first is, to be really honest, that I feel weird recommending shaving off 3/4 of a site they put a lot of money into. I guess it comes down to reassuring them that these changes will be positive, but in the eyes of the client, it may come off as bold.
Another thing is that it is an international business with different teams for different countries. Out of more than 20 countries, they are the only ones trying to sell the product online. You understand many sets of eyes are directed at them and there is a lot to gain.
-
Hi Charles,
That's a tough one! I definitely see the motivation to test the waters here before you go spending time on it, but it will likely take less time than you think, and the user experience will be significantly better once you're done, so I'd expect your time/dev investment to be viable either way.
I suppose you could block certain sections via robots.txt and wait to measure the results, but I'd be more inclined to throw on the gloves and get elbow deep!
You've already mentioned the issues the current structure causes, so you're clearly aware of them, which is great. With those in mind, focus on the user experience. What are visitors looking for on your site? How would they expect to find it? Can they find the solution in as few clicks as practical?
Rand did a Whiteboard Friday recently on Cleaning Up the Cruft, which was a great overview of the broader areas where you can often trim your site back down to size. For me, anyway, the aim is to have as few pages on the site as practical. If a page, category, tag, etc. doesn't need to exist, just remove it!
It's hard to give specific advice here without seeing your site, but chances are that if you were to sit down and physically map out your website, you'd find a lot of redundancy that, once fixed, would cut your million pages down to a significantly more manageable number. A recent example of this for us was a client who had a bunch of redundant blog categories and tags, as well as multiple versions of some URLs due to poor internal linking. We cut their total URL count from over 300 to just 78, and that alone was enough to significantly improve their search visibility.
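If it helps, one quick way to do that mapping is to export your crawl to a plain list of URLs and group them by their first path segment; a hypothetical sketch (not tied to any particular crawler) could be as short as this:

from collections import Counter
from urllib.parse import urlparse
import sys

# Count crawled URLs by their first path segment to see which sections
# account for the bulk of the million URLs.
counts = Counter()
with open(sys.argv[1]) as crawl_file:
    for line in crawl_file:
        url = line.strip()
        if not url:
            continue
        first_segment = urlparse(url).path.strip("/").split("/")[0] or "(root)"
        counts[first_segment] += 1

for segment, count in counts.most_common(20):
    print(f"{count:>8}  /{segment}/")

A breakdown like that usually makes it very obvious which sections (the PoS duplicates, in your case) are inflating the URL count without adding unique value.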
I'd be happy to take a closer look at this one if you're willing to share your URL, though I understand if you're not. Either way, the best place to start here will be reviewing your site structure and seeing if it truly makes sense.