[Advice] Dealing with an immense URL structure full of canonicals under budget & time constraints
-
Good day to you Mozers,
I have a website that sells a certain product online; once bought, the product is delivered to a point of sale (PoS) where the client's car gets serviced.
This website has a shop, products, and informational pages that are duplicated by the number of physical PoS. The organizational decision was that every PoS would have its own little site that could be managed and modified independently.
Examples are:
- Every PoS could have a different price on its products
- Some of them have more services available and some fewer, but the content on these service pages doesn't change.
I have over a million URLs that are, supposedly, all treated with canonical tags pointing to their respective main pages. I say "supposedly" because verifying the logic they used behind the canonicals is proving to be a headache, but I have seen a lot of these pages using the tag.
e.g.:
- https://mysite.com/shop/ <-- https://mysite.com/pointofsale-b/shop
- https://mysite.com/shop/productA <-- https://mysite.com/pointofsale-b/shop/productA
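Since verifying the canonical logic at scale is the headache here, a quick script can check what each crawled URL actually declares. Below is a minimal sketch using only the Python standard library; the URLs are taken from the examples above, and in practice you would feed it HTML fetched for each URL in a crawl export:

```python
# Minimal sketch: extract the rel="canonical" href from a page's HTML
# using only the standard library. The sample page below is hypothetical.
from html.parser import HTMLParser


class CanonicalParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        # <link rel="canonical" href="..."> is what we're after
        if tag == "link":
            attrs = dict(attrs)
            if attrs.get("rel") == "canonical":
                self.canonical = attrs.get("href")


def extract_canonical(html: str):
    """Return the declared canonical URL, or None if the page has none."""
    parser = CanonicalParser()
    parser.feed(html)
    return parser.canonical


# Compare a crawled PoS page against the canonical you expect it to declare.
page = '<html><head><link rel="canonical" href="https://mysite.com/shop/productA"></head></html>'
print(extract_canonical(page))  # https://mysite.com/shop/productA
```

Running this over the crawl and diffing declared canonicals against the expected main-page URLs would tell you quickly whether the logic is consistent, without auditing pages by hand.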
The problem is that over a million URLs are being crawled, when really fewer than a tenth of them have any organic traffic potential.
Question is:
For products, I know I should tell them to put the URL as close to the root as possible and dynamically change the price according to the PoS the end user chooses. Or even redirect all shops to the main one and use only that. I need a short-term solution to test/show whether it is worth investing in development to correct all these useless duplicate pages. Should I use robots.txt and block off the parts of the site I don't want Google to waste its time on?
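If you do go the robots.txt route as a short-term test, the rule itself is short, and it's worth sanity-checking before deploying. A minimal sketch using Python's standard library parser; the `/pointofsale-` prefix is hypothetical, based on the example URLs above:

```python
# Sketch: a robots.txt rule blocking the hypothetical per-PoS sections,
# verified with the standard library's parser before deploying.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /pointofsale-
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# Disallow matches by prefix, so every /pointofsale-*/... URL is blocked
# while the main shop stays crawlable.
print(rp.can_fetch("*", "https://mysite.com/pointofsale-b/shop"))  # False
print(rp.can_fetch("*", "https://mysite.com/shop/"))               # True
```

One caveat: Disallow only stops crawling. URLs that are already indexed can stay in the index, so this is a crawl-budget lever rather than a de-indexation tool.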
I am worried about indexation, accessibility, and wasted crawl budget.
Thank you in advance,
-
Hey Chris!
Thanks a lot for your time. I did send you a PM the day after your original post, I will send you another :).
Thanks a lot for your additional advice. You're right about managing the client's expectations; it's crucial. You raise some valid points and I will have to ponder how I approach this whole situation.
Charles,
-
Hey Charles,
No problem, I've been out of the office most of the past week so I'm trying to catch up on a few of these now, sorry! I don't recall seeing any PMs either.
I feel weird recommending they shave off 3/4 of a site they put a lot of money into.
That's perfectly normal and I'd have the same reservations. If you do decide to go ahead with it though (and I'm absolutely not looking to push you into a decision either way, just providing the info) you can highlight the fact that paying a lot of money for a website doesn't make it inherently good. If those extra pages are providing no unique value then they're just a hindrance to their long-term goal of earning a return from that site via organic traffic.
It's a conversation we have semi-regularly with new clients. They think that because they just spent $20k on a new site, making changes to it is silly and a waste of the money they invested in the first place. "Sure it's broken but it was expensive"... I don't think search engines or users really care how much it cost.
in the eyes of the client, it may come off as bold.
It certainly is bold and don't be fooled, there is a reasonable chance their rankings will get worse before they get better. In some cases when we perform a cleanup like this we'll see a brief drop before a steady improvement.
This doesn't happen every time by any means; in fact, we did a smaller-scale version of this last week for two new clients and both already started moving ahead over the weekend without any prior drop in rankings. It's really just about managing expectations and pitching the long-term benefit over the short-term fear.
Just be very careful in the way you project-manage it - be meticulous with updating internal links and 301 any pages that have external links pointing to them as well. You want to end up with a clean, efficient and crawlable website that retains as much value as possible.
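For the 301 side of that cleanup, a single rewrite rule can collapse the per-PoS shop URLs into the main shop. This is a hypothetical sketch for Apache's mod_rewrite, assuming the URL pattern from the question; adjust the pattern to the site's real structure and server:

```apache
# Hypothetical sketch (.htaccess, mod_rewrite): 301 every per-PoS shop
# URL to its equivalent under the single main shop.
RewriteEngine On
# /pointofsale-<anything>/shop/productA  ->  /shop/productA
RewriteRule ^pointofsale-[^/]+/shop(/.*)?$ /shop$1 [R=301,L]
```

A blanket rule like this only retains value if the target pages genuinely cover the same intent, which is why updating internal links alongside the redirects matters so much.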
You understand many sets of eyes are directed at them and there is a lot to gain.
Also a very valid concern!
I'm probably not telling you anything you don't already know anyhow so don't think I'm trying to lecture you on how to do your job, just sharing my knowledge and anecdotal evidence on similar things.
-
Hey Chris!
Thanks for that lengthy response. It is very much appreciated, and so is your offer to help. Let me check with some people to see if I can share the company's name.
[EDIT] Sent you a private msg. One of the reasons I want to test the waters is, to be really honest, that I feel weird recommending they shave off 3/4 of a site they put a lot of money into. I guess it comes down to reassuring them that these changes will be positive, but in the eyes of the client, it may come off as bold.
Another thing is that it is an international business with different teams for different countries. Across more than 20 countries, they are the only ones trying to sell their product online. You understand many sets of eyes are directed at them and there is a lot to gain.
-
Hi Charles,
That's a tough one! I definitely see the motivation to test the waters before you spend time on it, but it will likely take less time than you think, and the user experience will be significantly better once you're done, so either way your time/dev investment would likely be viable.
I suppose you could block certain sections via robots.txt and wait to measure the results, but I'd be more inclined to throw on the gloves and get elbow deep!
You've already mentioned the issues the current structure causes so you are aware of them which is great. With those in mind, focus on the user experience. What is it they're looking for on your site? How would they expect to find it? Can they find the solution with as few clicks as practical?
Rand did a Whiteboard Friday recently on Cleaning up the Cruft, which was a great overview of the broader areas where you can often trim your site back down to size. For me anyway, the aim is to have as few pages on the site as practical. If a page, category, tag, etc. doesn't need to exist, just remove it!
It's hard to say or to give specific advice here without seeing your site but chances are if you were to sit down and physically map out your website you'd find a lot of redundancy that, once fixed, would cut your million pages down to a significantly more manageable number. A recent example of this for us was a client who had a bunch of redundant blog categories and tags as well as multiple versions of some URLs due to poor internal linking. We cut their total URL volume from over 300 to just 78 and that alone was enough to significantly improve their search visibility.
I'd be happy to take a closer look at this one if you're willing to share your URL, though I understand if you're not. Either way, the best place to start here will be reviewing your site structure and seeing if it truly makes sense.