[Advice] Dealing with an immense URL structure full of canonicals, with budget & time constraints
-
Good day to you Mozers,
I have a website that sells a certain product online; once bought, the product is delivered to a point of sale (PoS) where the client's car gets serviced.
The website has a shop, product pages and informational pages, all of which are duplicated once for every physical PoS. The organizational decision was that every PoS was supposed to have its own little site that it could manage and modify.
Examples are:
- Every PoS can have a different price for its products
- Some PoS offer more services than others, but the content on these service pages doesn't change.
This leaves me with over a million URLs that are, supposedly, all canonicalized to their respective main pages. I say "supposedly" because verifying the logic they used behind the canonicals is proving to be a headache, but I have seen a lot of these pages using the tag.
e.g.:
- https://mysite.com/shop/ <-- https://mysite.com/pointofsale-b/shop
- https://mysite.com/shop/productA <-- https://mysite.com/pointofsale-b/shop/productA
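To make that concrete, the tag on a PoS page is meant to look roughly like this (simplified example URLs):

    <!-- On https://mysite.com/pointofsale-b/shop/productA (hypothetical URL) -->
    <link rel="canonical" href="https://mysite.com/shop/productA" />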
The problem is that I have over a million URLs being crawled, when in reality probably less than a tenth of them have any organic traffic potential.
Question is:
For products, I know I should tell them to put the URL as close to the root as possible and dynamically change the price according to the PoS the end-user chooses, or even redirect all the shops to the main one and use only that one. I need a short-term solution to test/show whether it is worth investing in development to correct all these useless duplicate pages. Should I use robots.txt to block off the parts of the site I don't want Google to waste its time on?
I am worried about indexation, accessibility, and crawl budget being wasted.
Thank you in advance,
-
Hey Chris!
Thanks a lot for your time. I did send you a PM the day after your original post, I will send you another :).
Thanks a lot for your additional advice. You're right that managing the client's expectations is crucial. You're making some valid points and I will have to think about how I approach this whole situation.
Charles,
-
Hey Charles,
No problem, I've been out of the office most of the past week so I'm trying to catch up on a few of these now, sorry! I don't recall seeing any PMs either.
I feel weird recommending that they shave off 3/4 of a site they put a lot of money into.
That's perfectly normal and I'd have the same reservations. If you do decide to go ahead with it though (and I'm absolutely not looking to push you into a decision either way, just providing the info) you can highlight the fact that paying a lot of money for a website doesn't make it inherently good. If those extra pages are providing no unique value then they're just a hindrance to their long-term goal of earning a return from that site via organic traffic.
It's a conversation we have semi-regularly with new clients. They think that because they just spent $20k on a new site, making changes to it is silly and a waste of the money they invested in the first place. "Sure it's broken, but it was expensive"... I don't think search engines or users really care how much it cost.
in the eyes of the client, it may come off as bold.
It certainly is bold, and don't be fooled: there is a reasonable chance their rankings will get worse before they get better. In some cases when we perform a cleanup like this we'll see a brief drop before a steady improvement.
This doesn't happen every time by any means; in fact, we did a smaller-scale version of this last week for two new clients and both have already started moving up over the weekend without any prior drop in rankings. It's really just about managing expectations and pitching the long-term benefit over the short-term fear.
Just be very careful in the way you project-manage it - be meticulous with updating internal links and 301 any pages that have external links pointing to them as well. You want to end up with a clean, efficient and crawlable website that retains as much value as possible.
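To give a rough idea of the kind of rule I mean (hypothetical folder names, and assuming an Apache setup with mod_rewrite; the equivalent can be done in nginx or at the application level), consolidating the PoS shop URLs into the main ones could look something like this in .htaccess:

    # Minimal sketch: 301 every /pointofsale-*/shop/... URL to its main /shop/... equivalent
    RewriteEngine On
    RewriteRule ^pointofsale-[^/]+/shop/(.*)$ /shop/$1 [R=301,L]

Whatever mechanism you use, make sure each old URL lands on its closest equivalent rather than bulk-redirecting everything to the homepage.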
You understand many sets of eyes are on them and there is a lot to gain.
Also a very valid concern!
I'm probably not telling you anything you don't already know anyhow so don't think I'm trying to lecture you on how to do your job, just sharing my knowledge and anecdotal evidence on similar things.
-
Hey Chris!
Thanks for that lengthy response. It is very much appreciated and so is your offer of help. Let me check with some people to see if I can share the company's name.
[EDIT] Sent you a private msg.
One of the reasons I want to test the waters is, to be really honest, that I feel weird recommending that they shave off 3/4 of a site they put a lot of money into. I guess it comes down to reassuring them that these changes will be positive, but in the eyes of the client, it may come off as bold.
Another thing is that it's an international business with different teams for different countries. Of the more than 20 countries, they are the only team trying to sell the product online. You understand many sets of eyes are on them and there is a lot to gain.
-
Hi Charles,
That's a tough one! I definitely see the motivation to test the waters here before you go spending time on it, but the fix will likely take less time than you think, and the user experience will be significantly better once you're done, so either way I'd expect your time/dev investment to be worthwhile.
I suppose you could block certain sections via robots.txt and wait to measure the results, but I'd be more inclined to throw on the gloves and get elbow deep!
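That said, if you do want the quick test first, a minimal sketch of the robots.txt approach, assuming the PoS sections all live under a /pointofsale-*/ style folder (hypothetical paths, so adjust the pattern to your real structure), would be something like:

    User-agent: *
    # Hypothetical pattern: keep crawlers out of the per-PoS duplicate sections
    Disallow: /pointofsale-

The usual caveat, and part of why I'd rather fix the structure itself: URLs blocked in robots.txt can still sit in the index, and Google can't see the canonical tags on pages it isn't allowed to crawl, so treat this as a stop-gap rather than the fix.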
You've already mentioned the issues the current structure causes so you are aware of them which is great. With those in mind, focus on the user experience. What is it they're looking for on your site? How would they expect to find it? Can they find the solution with as few clicks as practical?
Rand did a Whiteboard Friday recently on Cleaning Up the Cruft, which is a great overview of the broader areas where you can often trim your site back down to size. For me anyway, the aim is to have as few pages on the site as practical. If a page, category, tag etc. doesn't need to exist, just remove it!
It's hard to say or to give specific advice here without seeing your site but chances are if you were to sit down and physically map out your website you'd find a lot of redundancy that, once fixed, would cut your million pages down to a significantly more manageable number. A recent example of this for us was a client who had a bunch of redundant blog categories and tags as well as multiple versions of some URLs due to poor internal linking. We cut their total URL volume from over 300 to just 78 and that alone was enough to significantly improve their search visibility.
I'd be happy to take a closer look at this one if you're willing to share your URL, though I understand if you're not. Either way, the best place to start here will be reviewing your site structure and seeing if it truly makes sense.