URL Re-Writes & HTTPS: Link juice loss from 301s?
-
Our URLs are not following a lot of the best practices found here: http://moz.com/blog/11-best-practices-for-urls
We have also been waiting to implement HTTPS.
I think it might be time to take the plunge on re-writing the URLs and converting to a fully secure site, but I am concerned about ranking dips from link juice lost to the 301s. Many of our URLs are very old, with a decent number of quality links.
Are we better off leaving as is or taking the plunge?
-
Thanks all... much appreciated!
Looking at the examples below, does anyone think this move could result in a negative effect?
**From:** http://www.xyzwidgets.com/widgets/commercial-widgets/small_blue_widget.htm
**To:** https://www.xyzwidgets.com/small-blue-widget
**From:** http://www.xyzwidgets.com/info/videos/general/what-are-widgets.htm
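For reference, a minimal Python sketch of the rewrite pattern those examples imply (the target for the second URL is an assumption that follows the same pattern as the first); the resulting map is what you'd feed into the server's 301 rules:

```python
# Minimal sketch of the URL rewrite pattern above. The second target URL is
# an assumption following the same pattern as the first example.
import re
from urllib.parse import urlparse

def new_url(old_url: str) -> str:
    """Old nested .htm URL -> flat, hyphenated HTTPS URL."""
    path = urlparse(old_url).path            # /widgets/commercial-widgets/small_blue_widget.htm
    slug = path.rstrip("/").split("/")[-1]   # small_blue_widget.htm
    slug = re.sub(r"\.html?$", "", slug)     # drop the file extension
    return "https://www.xyzwidgets.com/" + slug.replace("_", "-").lower()

old_urls = [
    "http://www.xyzwidgets.com/widgets/commercial-widgets/small_blue_widget.htm",
    "http://www.xyzwidgets.com/info/videos/general/what-are-widgets.htm",
]
redirect_map = {old: new_url(old) for old in old_urls}
for old, new in redirect_map.items():
    print(f"301: {old} -> {new}")
```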
-
If you're going to be updating your URLs for best practices, I would incorporate the conversion to HTTPS as well - do it all in one shot, as you've said.
Just ensure you're implementing the 301 redirects properly; not doing so can have disastrous results.
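As a sanity check after launch, something like this rough sketch (assuming the Python `requests` library, using the hypothetical example URLs from the question) can confirm each old URL reaches its new HTTPS page in a single 301 hop:

```python
# Rough post-launch check: each old URL should reach its new HTTPS page via
# exactly one 301 hop. Assumes the `requests` library; the URLs are the
# hypothetical examples from the question.
import requests

def check_redirect(old_url: str, expected_final: str) -> None:
    resp = requests.get(old_url, allow_redirects=True, timeout=10)
    hops = [(r.status_code, r.url) for r in resp.history]
    assert len(hops) == 1, f"expected one hop, got a chain: {hops}"
    assert hops[0][0] == 301, f"expected a 301, got {hops[0][0]}"
    assert resp.url == expected_final and resp.status_code == 200

check_redirect(
    "http://www.xyzwidgets.com/widgets/commercial-widgets/small_blue_widget.htm",
    "https://www.xyzwidgets.com/small-blue-widget",
)
```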
-
In addition to what Robert just said: if you add a 301 now to clean up the URL format, and later add a second 301 to move to HTTPS, you will be chaining redirect to redirect and losing that little bit of link juice twice.
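To illustrate the point, here's a hypothetical sketch using Flask (in practice this would usually live in the web server's or load balancer's rewrite rules): each old URL maps straight to its final HTTPS location in one lookup, with no intermediate redirect.

```python
# Hypothetical sketch of single-hop redirects using Flask; real sites would
# normally do this in server/load-balancer rewrite rules instead.
from flask import Flask, redirect

app = Flask(__name__)

# One lookup, one hop: old path -> final canonical HTTPS URL.
REDIRECTS = {
    "/widgets/commercial-widgets/small_blue_widget.htm":
        "https://www.xyzwidgets.com/small-blue-widget",
}

@app.route("/<path:old_path>")
def legacy_redirect(old_path):
    target = REDIRECTS.get("/" + old_path)
    if target:
        return redirect(target, code=301)  # no http -> new-path -> https chain
    return "Not found", 404
```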
-
The only downside to that approach is that if there turns out to be no benefit to moving to HTTPS, you have wasted time (if that was your only reason for doing so). However, if you are using 301s either way, you may as well move to HTTPS - it won't hurt you and it might help you.
-
My thinking is that the potential increase in CTR in the SERPs can have a greater effect than any potential harm from the 301s.
I notice many of you are still waiting for the jury to be a bit more conclusive on whether to move to HTTPS. However, if I'm redirecting all pages using Moz's best practices anyway, shouldn't I just take the HTTPS plunge at the same time? Is there any reason not to?
-
301s of any kind can result in a slight decrease in "link juice" moving forward, although it can be hard to determine exactly how much (not a large amount, relatively speaking). That being said, like Massimiliano, I haven't personally seen this cause problems in my own work.
The HTTP/HTTPS debate is still ongoing, and as Ray said, it might be best to adopt a "wait and see" strategy.
Above all, you have pointed out that your URLs do not follow the best practices in the link you shared - it is likely that new URLs combined with 301 redirects to HTTPS will not hurt your rankings and may in fact help you. As Ray stated, it comes down to cost and whether you think the potential ranking gains are worth the time, effort, and money you will spend making it happen.
-
In my experience the power of a proper URL, with the right keywords in the right places, is so great that I wouldn't wait a second before fixing them.
Again, based on my experience, I have never noticed a decrease in ranking due to a 301.
I recently moved three websites from HTTP to HTTPS and didn't notice any decrease in ranking I could associate with the redirects.
Of course, since we work daily on improving rankings, it is hard to distinguish a small decrease due to the 301s from the general improvement.
-
The ranking benefit of moving from HTTP to HTTPS is still unclear. Many SEOs are holding off on the conversion to see what its impact, hopefully measurable, may end up being.
Moz has a great post on HTTPS necessities and practices here: http://moz.com/blog/seo-tips-https-ssl
If it is going to be an intense project (one that costs enough money to make you question its worth), I would hold off until more is known about HTTPS as a ranking factor. If the conversion is easy, then I would implement it now and reap any benefits that come from HTTPS.
Related Questions
-
Google WMT/Search Console: Thousands of "links to your site" even with only one backlink from a website
Hi, I can see in my Search Console that a website is reported as giving thousands of links to my site, when there is really only one backlink from one of their pages to our page. Why is this happening? Here is a screenshot: http://imgur.com/a/VleUf
Intermediate & Advanced SEO | vtmoz
-
Old URLs that have 301s to 404s not being de-indexed.
We have a scenario on a domain that recently moved to enforcing SSL. If a page is requested over non-SSL (http), the server automatically redirects to the SSL (https) URL using a good old-fashioned 301. This is great, except for any page that no longer exists, in which case you get a 301 going to a 404. Here's what I mean.
Case 1 - Good page: http://domain.com/goodpage -> 301 -> https://domain.com/goodpage -> 200
Case 2 - Bad page that no longer exists: http://domain.com/badpage -> 301 -> https://domain.com/badpage -> 404
Google is correctly re-indexing all the "good" pages and displaying search results going directly to the https version. But Google is stubbornly hanging on to all the "bad" pages and serving up the original URL (http://domain.com/badpage) unless we submit a removal request. There are hundreds of these pages, and this is starting to suck. Note: the load balancer does the SSL enforcement, not the CMS, so we can't detect a 404 and serve it up first; the CMS does the 404'ing. Any ideas on the best way to approach this problem? Or any idea why Google is holding on to all the old "bad" pages that no longer exist, given that we've clearly indicated with 301s that no one is home at the old address? (A quick audit sketch follows below.)
Intermediate & Advanced SEO | boxclever
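For a scenario like this, a rough audit sketch (assuming the Python `requests` library; the URLs are the placeholders from the question) can list every old URL whose forced-SSL 301 ends at a 404, so those can be handled before the redirect instead:

```python
# Rough sketch: flag old URLs whose forced-SSL 301 lands on a 404, so they
# can be served a 404/410 directly rather than redirected first. Assumes
# `requests`; the URLs are the placeholders from the question above.
import requests

old_urls = [
    "http://domain.com/goodpage",
    "http://domain.com/badpage",
]

for url in old_urls:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    if resp.history and resp.status_code == 404:
        print(f"301 -> 404 chain: {url} ends at {resp.url}")
```
-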
Pros & Cons of Switching Your Main Domain to Mask Links & Combat EMDs
Hello Mozzers, I'd love to receive some advice for a client of mine, and any insights you may have regarding the pros and cons of changing your main domain to mask links. Within a competitive niche there are about 4 different sites that routinely rank 1-4. Our site crushes the other three on just about all metrics, except we have a high volume of nofollow links, and our site remains at #4. Our site is much older, so we have significantly more links than these smaller sites, including pre-Penguin-penalty spammy links (like blog comments that make up 50+ nofollow links from 1 comment per domain). Obviously we are attempting to remove any toxic links and disavow, however the blog-comment nofollow links skew our anchor text ratio pretty intensely, and we are worried that we aren't going to make a dent in removing this type of link. Disavowing them alone hasn't worked, so if we are unable to remove the bulk of these poor-quality links (nofollow, off-topic anchor text, etc.), we are considering 301 redirecting the current domain to a new one. We've seen success with this in a couple of scenarios, but wanted to see other insights as to whether masking links with a 301 could send fresh signals and positively affect rankings. Also wanted to mention: 2 of the 3 competitors that outrank us have EMDs for the primary keywords. Appreciate your time, insights, and advice on this matter.
Intermediate & Advanced SEO | Leadhub
-
Intra-linking to pages with a different canonical URL?
Hello Moz Community! I'm hoping to get some advice around intra-linking practices and the benefits when a page being linked to has a different canonical tag than its own URL. Confused? Allow me to elaborate.
Background: an ecommerce company is trying to increase its organic rankings for key, broad terms in the cycling industry. It is trying to rank its category pages for a main term. To help this, the company is focusing on increasing the quality of its intra-linking structure (the links and anchor texts that point to other pages within the site).
Example goal: have its Road Cassettes category page rank for 'Road Cassettes'.
The company's main 'cassettes' category page is here: /Components/Drivetrain/Cassettes/ and the company uses filtered-navigation logic to drill down into road cassettes specifically: /Components/Drivetrain/Cassettes/?page_no=1&fq=ATR_RoadBiking:True
SEOs are instructed to include occasional links back to this page, with SEO-friendly anchor text, to help strengthen its authority for the main term.
The issue/question:
Main category URL: /Components/Drivetrain/Cassettes/
Road Cassettes category URL: /Components/Drivetrain/Cassettes/?page_no=1&fq=ATR_RoadBiking:True
Road Cassettes canonical URL: /Components/Drivetrain/Cassettes/
The canonical URL of the filtered Road Cassettes category is its main category URL. Will the company be able to effectively rank its Road Cassettes category URL for 'Road Cassettes' if the canonical URL is the main category? Should the canonical URL not be the main category? Or will increasing the intra-linking to the Road Cassettes URL help the main category URL rank for 'Road Cassettes' by passing all its authority? (A quick check sketch follows below.)
Intermediate & Advanced SEO | Ray-pp
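As a quick check for a setup like this, a sketch (assuming `requests` and `beautifulsoup4`; the domain is a placeholder, since the question only gives paths) can confirm which URL the filtered page actually canonicalizes to before investing in internal links:

```python
# Sketch: fetch a filtered category page and report its canonical target.
# Assumes `requests` and `beautifulsoup4`; example.com is a placeholder.
import requests
from bs4 import BeautifulSoup

url = "https://example.com/Components/Drivetrain/Cassettes/?page_no=1&fq=ATR_RoadBiking:True"
html = requests.get(url, timeout=10).text
canonical = BeautifulSoup(html, "html.parser").find("link", rel="canonical")
print(canonical["href"] if canonical else "no canonical tag found")
```
-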
Dealing with Penguin: Changing URL instead of removing links
I have some links pointing to categories from article directories, web directories, and a few blogs. We are talking about 20-30 links in total. They are less than 5% of the links to my site (counting unique domains). I either haven't been able to make contact with the webmasters, or they are asking for money to remove the links. If I simply rename the URL (for example, changing mysite.com/t-shirt.html to mysite.com/tshirts.html), will that resolve any Penguin issues? The link will forward to the homepage, since that page no longer exists. I really want to avoid using the disavow tool if possible. I appreciate the feedback. If you have actually done this, please share your experience.
Intermediate & Advanced SEO | inhouseseo
-
Redirect ruined domain to new domain without passing link juice
A new client has a domain which has been hammered by bad links, algorithm updates, etc., and it's basically on its arse because of the previous SEO guys. They have various domains for their business (brand.com, brand.co.uk) and want to use a fresh domain and take it from there. Their current domain is brand.com (the ruined one). They're not bothered about the rankings for brand.com, but they want to redirect brand.com to brand.co.uk so that previous clients can find them easily. Would a 302 redirect work for this? I don't want to set up a 301 redirect, as I don't want any of the crappy links pointing across. Thanks!
Intermediate & Advanced SEO | jasonwdexter
-
Home Page Link Juice Dilution
I have worked to build out a keyword-targeted library of over 700 guides of approx. 800 words each. They are specifically targeted at actionable verticals and contain 3 strategically placed CTAs in each article. So far, I have only managed to get a low number of uniques per day to this section of the website. This website's external backlinks largely point at the home page. Furthermore, the home page has a footer link to 10,000 SEO-crawlable user-generated profiles. These profiles have little potential for conversion and offer little value. Given the above, I was hoping that someone could help me with the following questions:
Is it possible that home page link juice is becoming diluted as a result of the 10,000 user profiles being live on the site?
If so, can a "nofollow" on the home page footer link to the user profiles prevent the juice from transferring?
Overall, I would like to redirect this PR5 domain's link juice to these guides, where they will have a much higher conversion rate.
Intermediate & Advanced SEO | TQContent
-
Robots.txt: Link Juice vs. Crawl Budget vs. Content 'Depth'
I run a quality vertical search engine. About 6 months ago we had a problem with our sitemaps, which resulted in most of our pages getting tossed out of Google's index. As part of the response, we put a bunch of robots.txt restrictions in place on our search results to prevent Google from crawling through pagination links and other parameter-based variants of our results (sort order, etc.). The idea was to 'preserve crawl budget' in order to speed the rate at which Google could get our millions of pages back in the index, by focusing attention/resources on the right pages. The pages are back in the index now (and have been for a while), and the restrictions have stayed in place since that time. But, in doing a little SEOmoz reading this morning, I came to wonder whether that approach may now be harming us:
http://www.seomoz.org/blog/restricting-robot-access-for-improved-seo
http://www.seomoz.org/blog/serious-robotstxt-misuse-high-impact-solutions
Specifically, I'm concerned that a) we're blocking the flow of link juice, and that b) by preventing Google from crawling the full depth of our search results (i.e. pages >1), we may be making our site wrongfully look 'thin'. With respect to b), we've been hit by Panda and have been implementing plenty of changes to improve engagement, eliminate inadvertently low-quality pages, etc., but we have yet to find 'the fix'... Thoughts? (A quick robots.txt testing sketch follows below.)
Kurus
Intermediate & Advanced SEO | kurus
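As a starting point for a question like this, a short sketch (Python standard library only; example.com and the query URLs stand in for the real site) can show exactly which result-page variants the current robots.txt blocks for Googlebot:

```python
# Sketch: test which search-result URL variants robots.txt currently blocks
# for Googlebot. Standard library only; example.com is a placeholder.
from urllib import robotparser

rp = robotparser.RobotFileParser("https://example.com/robots.txt")
rp.read()  # fetch and parse the live robots.txt

for url in [
    "https://example.com/search?q=widgets",
    "https://example.com/search?q=widgets&page=2",
    "https://example.com/search?q=widgets&sort=price",
]:
    print(url, "->", "allowed" if rp.can_fetch("Googlebot", url) else "blocked")
```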