Change of URLs: "little by little" VS "all at once"
-
Hi guys,
We're planning to change the URL structure of our product pages (to make them more SEO friendly), and it's obviously something very sensitive given the 301 redirects we'll have to handle carefully...
I have a doubt about Mister Google: if we roll out that change slowly (area by area, to minimize the risk of problems in case of a bad 301 redirect), would we lose rankings in the search engine? (I'm wondering whether Google might consider our website not "coherent" because the product pages would not all share the same URL structure for some time.)
Thanks for your kind opinion
-
Hi Nakul,
Maybe the initial post was not explicit enough: we will obviously redirect (301) all the old URLs. And to make sure we don't mess up the redirects, we want to roll out the new product URLs little by little, product area by product area.
That means that during this "transition" period, some product URLs will have the old structure and others will have the new one (both structures are given in this thread), and the question is: does Google care about the consistency of product-page URLs within the same website?
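For illustration, a phased rollout like this is easier to trust if each migrated area is verified automatically: every old URL should answer with a single 301 pointing at its new counterpart. A minimal Python sketch of such a check (the domain and URLs are hypothetical placeholders, and it assumes the third-party requests library):

```python
import requests

# Hypothetical spot check for one migrated product area: every old URL
# should answer with a single 301 pointing at its new counterpart.
REDIRECTS = {
    "https://www.example.com/12/34567/acme-4k-tv":
        "https://www.example.com/televisions/acme-4k-tv-price-p34567_12",
    # ... one entry per product in the area being migrated
}

for old_url, expected in REDIRECTS.items():
    # Ask for the headers only and do not follow the redirect, so we can
    # inspect the status code and Location header of the first response.
    resp = requests.head(old_url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    ok = resp.status_code == 301 and location == expected
    print(f"{'OK' if ok else 'FAIL'}  {old_url} -> {resp.status_code} {location}")
```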
-
Will the old URLs continue to work, or will they redirect? If you can share the URL here in public or via PM, that might help.
-
Hi Nakul,
A product can't be in more than one category on our website so that won't be a problem.
-
Hi Keri,
Yes, the second one will be the new URL. It's the word "price" that will be in the URL, not its value. We are a price comparison website, so the keyword "price" is core for us.
-
I agree with Keri. You don't want to do that. Also, what happens if a product is in multiple categories?
Would you then have multiple URLs for the same product? Would you use a canonical tag?
-
Is the second URL your new URL? You're including your price in your URL? What happens if your price changes?
-
Hi Nakul,
Our domain is quite strong; we are talking about more than 450K product pages.
Here is an example of the URL change we'll make:
domain/[category ID]/[product ID]/[product name]
-> domain/[category name]/[product name]-price-p[product ID]_[category ID]
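For illustration, a minimal sketch of how each old product path could be mapped onto the new structure, for example to generate the 301 redirect map for one area at a time. The category lookup, IDs, and slugs below are hypothetical placeholders:

```python
import re

# Hypothetical category lookup; in practice this would come from the
# product database that already knows each category's display name.
CATEGORY_SLUGS = {"12": "televisions"}

# Old structure: /[category ID]/[product ID]/[product name]
OLD_URL = re.compile(r"^/(?P<cat_id>\d+)/(?P<prod_id>\d+)/(?P<name>[^/]+)/?$")

def new_url(old_path: str) -> str:
    """Map an old-style product path onto the new structure described above."""
    match = OLD_URL.match(old_path)
    if match is None:
        raise ValueError(f"not an old-style product URL: {old_path}")
    cat_id = match.group("cat_id")
    prod_id = match.group("prod_id")
    name = match.group("name")
    # New structure: /[category name]/[product name]-price-p[product ID]_[category ID]
    return f"/{CATEGORY_SLUGS[cat_id]}/{name}-price-p{prod_id}_{cat_id}"

print(new_url("/12/34567/acme-4k-tv"))
# -> /televisions/acme-4k-tv-price-p34567_12
```

A map generated this way can then be exported to whatever redirect mechanism the web server uses, one 301 per product.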
-
Pedro
How strong is your domain/website? Can you give examples of what you are doing? How many product pages are you talking about?
Related Questions
-
SEO Implications of firewalls that block "foreign connections"
Hello! A client's IT security team has firewalls on the site with GEO blocking enabled. This is to prevent foreign connections to applications as part of contractual agreements with their own clients. Does anyone have any experience with workarounds for this? Thank you!
Intermediate & Advanced SEO | SimpleSearch
-
Why is rel="canonical" pointing at a URL with parameters bad?
Context: Our website has a large number of crawl issues stemming from duplicate page content (source: Moz). According to an SEO firm which recently audited our website, some of these crawl issues are due to URL parameter usage. They have recommended that we "make sure every page has a Rel Canonical tag that points to the non-parameter version of that URL… parameters should never appear in Canonical tags." Here's an example page where we have parameters in our canonical tag: http://www.chasing-fireflies.com/costumes-dress-up/womens-costumes/ carries <link rel="canonical" href="http://www.chasing-fireflies.com/costumes-dress-up/womens-costumes/?pageSize=0&pageSizeBottom=0" />. Our website runs on IBM WebSphere v7.
Questions: Why is it important that the rel canonical tag points to a non-parameter URL? What is the extent of the negative impact from having rel canonicals pointing to URLs that include parameters? Any advice for correcting this? Thanks for any help!
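For illustration, the non-parameter version of a URL is typically just the URL with its query string (and fragment) stripped. A minimal Python sketch of that, using only the standard library and the example URL above:

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_without_params(url: str) -> str:
    """Return the URL with its query string and fragment removed,
    i.e. the non-parameter form a canonical tag would normally point to."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))

print(canonical_without_params(
    "http://www.chasing-fireflies.com/costumes-dress-up/womens-costumes/"
    "?pageSize=0&pageSizeBottom=0"
))
# -> http://www.chasing-fireflies.com/costumes-dress-up/womens-costumes/
```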
Intermediate & Advanced SEO | Solid_Gold
-
Using rel="nofollow" when link has an exact match anchor but the link does add value for the user
Hi all, I am wondering what people's thoughts are on using rel="nofollow" for a link on a page like this: http://askgramps.org/9203/a-bushel-of-wheat-great-value-than-bushel-of-gold The anchor text is "Brigham Young" and the title of the page it points to is "Brigham Young"; that page goes into more detail on who he is. So it is an exact match. And as we know, if this page has too much exact match anchor text it is likely to be considered "over-optimized". I guess one of my questions is: how much exact match or partial match anchor text is too much? I have heard ratios tossed around like, for every 10 links, 7 of them should not be targeted at all while 3 out of the 10 would be okay. I know it's all about being natural and creating value, but using exact match or partial match anchors can definitely create value, as they are almost always highly relevant. One reason that prompted my question is I have heard that this is something Penguin 3.0 is really going to look at. On the example URL I gave, I want to keep that particular link as is because I think it does add value to the user experience, but I used rel="nofollow" so it doesn't pass PageRank. Does anyone see a problem with doing this and/or have a different idea? An important detail is that both sites are owned by the same organization. Thanks
Intermediate & Advanced SEO | ThridHour
-
If I own a .com url and also have the same url with .net, .info, .org, will I want to point them to the .com IP address?
I have a domain, for example, mydomain.com and I purchased mydomain.net, mydomain.info, and mydomain.org. Should I point the host @ to the IP where the .com is hosted in wpengine? I am not doing anything with the .org, .info, .net domains. I simply purchased them to prevent competitors from buying the domains.
Intermediate & Advanced SEO | djlittman
-
Sub Domains vs. Persistent URLs
I've always been under the assumption that when building a micro-site it was better, from an SEO perspective, to use a true path URL (e.g. yourcompany.com/microsite) as opposed to a subdomain (microsite.yourcompany.com). Can you still generate significant SEO gains from a subdomain if you were forced to use it, provided the primary (e.g. yourcompany.com) had a lot of link clout/authority? Meaning, if I had to go the subdomain route, would it be the end of the world?
Intermediate & Advanced SEO | VERBInteractive
-
Request a 3rd party to change the URL to your site or 302 redirect it?
Hi, My new site design for small hope went live about 3 months ago and I changed the URL structure. I'm kind of new to the SEO game, but I've done quite a bit of research, successfully implemented all the 301s I need and tested them and everything is working fine. But I'm having a little trouble with this propagating through Google et al. Most of the advice is to sit tight and wait but I want to be a little more proactive. So even though the 301s are in place, can I ALSO send emails to our partners and request link changes? Would Google find this suspicious behaviour and penalise us? Thanks, Adam
Intermediate & Advanced SEO | NaescentAdam
-
Yoast meta description in ' ' instead of " " problem
Hi guys, this is really strange. I am using Yoast SEO for WordPress on two sites. On both sites I am seeing meta name='description' instead of meta name="description", and this is probably why Google is not reading it correctly; many other link submission sites, which read your meta data automatically, say the site is blocked. How do I fix this? Thanks
Intermediate & Advanced SEO | SamBuck
-
Robots.txt: Link Juice vs. Crawl Budget vs. Content 'Depth'
I run a quality vertical search engine. About 6 months ago we had a problem with our sitemaps, which resulted in most of our pages getting tossed out of Google's index. As part of the response, we put a bunch of robots.txt restrictions in place in our search results to prevent Google from crawling through pagination links and other parameter based variants of our results (sort order, etc). The idea was to 'preserve crawl budget' in order to speed the rate at which Google could get our millions of pages back in the index by focusing attention/resources on the right pages. The pages are back in the index now (and have been for a while), and the restrictions have stayed in place since that time. But, in doing a little SEOMoz reading this morning, I came to wonder whether that approach may now be harming us... http://www.seomoz.org/blog/restricting-robot-access-for-improved-seo
http://www.seomoz.org/blog/serious-robotstxt-misuse-high-impact-solutions
Specifically, I'm concerned that a) we're blocking the flow of link juice and that b) by preventing Google from crawling the full depth of our search results (i.e. pages >1), we may be making our site wrongfully look 'thin'. With respect to b), we've been hit by Panda and have been implementing plenty of changes to improve engagement, eliminate inadvertently low quality pages, etc., but we have yet to find 'the fix'... Thoughts? Kurus
Intermediate & Advanced SEO | kurus
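For illustration, one way to sanity-check robots.txt rules like the ones described above is to run candidate URLs through a parser before relying on them. A minimal Python sketch using the standard library; the rules and URLs are hypothetical, and note that this parser only does simple prefix matching, not wildcards:

```python
import urllib.robotparser

# Hypothetical rules of the kind described above: allow the first page of
# search results but keep crawlers out of deeper pagination.
rules = [
    "User-agent: *",
    "Disallow: /search/page/",
]

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules)

for url in [
    "http://example.com/search/widgets/",           # page 1 of results
    "http://example.com/search/page/2/?q=widgets",  # deeper pagination
]:
    verdict = "allowed" if rp.can_fetch("Googlebot", url) else "blocked"
    print(f"{verdict:8} {url}")
```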