URL Re-Writes & HTTPS: Link juice loss from 301s?
-
Our URLs do not follow many of the best practices found here: http://moz.com/blog/11-best-practices-for-urls
We have also been waiting to implement HTTPS.
I think it might be time to take the plunge on rewriting the URLs and converting to a fully secure site, but I am concerned about ranking dips from link juice lost through the 301s. Many of our URLs are very old, with a decent number of quality links.
Are we better off leaving as is or taking the plunge?
-
Thanks all...Much appreciated!
Looking at the examples below, does anyone think this move could result in a negative effect?
**From:** http://www.xyzwidgets.com/widgets/commercial-widgets/small_blue_widget.htm
**To:** https://www.xyzwidgets.com/small-blue-widget
**From:** http://www.xyzwidgets.com/info/videos/general/what-are-widgets.htm
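For illustration, the rewrite pattern implied by the first example above can be sketched as a small function. The helper name and the hard-coded host are hypothetical, and the exact rules would come from your own URL scheme; this just shows the three changes in that example: drop the category folders, swap underscores for hyphens, and strip the .htm extension.

```python
from urllib.parse import urlparse

def new_url(old_url):
    """Map a legacy URL to a best-practice HTTPS equivalent
    (hypothetical sketch): keep only the last path segment,
    strip the .htm extension, and replace underscores with hyphens."""
    path = urlparse(old_url).path
    slug = path.rsplit("/", 1)[-1]          # last path segment only
    if slug.endswith(".htm"):
        slug = slug[:-len(".htm")]
    slug = slug.replace("_", "-")
    return "https://www.xyzwidgets.com/" + slug
```

A mapping like this, generated for every old URL, is what the 301 rules (in .htaccess, nginx config, or wherever the redirects live) would ultimately encode.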
-
If you're going to update your URLs to follow best practices, I would incorporate the conversion to HTTPS as well - do it all in one shot, as you've said.
Just ensure you're implementing 301 redirects properly. Not doing so can have disastrous results.
-
In addition to what Robert just said: if you add a 301 now to format the URLs properly, and later add a second 301 to move to HTTPS, you will be chaining redirect to redirect, losing that little bit of link juice twice.
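The chain warned about above can be avoided by resolving every old URL straight to its final HTTPS destination before the redirects go live. A minimal sketch, assuming the redirect table is a simple source-to-destination dictionary (the function and the sample URLs are hypothetical, not the poster's actual redirect map):

```python
def collapse_chains(redirects):
    """Given a {source: destination} redirect map, point every source
    at its final destination so each 301 is a single hop and no
    chains (old URL -> new format -> HTTPS) remain."""
    def final(url):
        seen = set()  # guard against accidental redirect loops
        while url in redirects and url not in seen:
            seen.add(url)
            url = redirects[url]
        return url
    return {src: final(dst) for src, dst in redirects.items()}

# A format fix accidentally chained into an HTTPS move:
chained = {
    "http://example.com/small_blue_widget.htm": "http://example.com/small-blue-widget",
    "http://example.com/small-blue-widget": "https://example.com/small-blue-widget",
}
flat = collapse_chains(chained)  # every source now 301s in one hop
```

Running the collapsed map through whatever generates your rewrite rules means visitors and crawlers only ever see one 301, not two.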
-
The only downside to that approach is that if there turns out to be no benefit to moving to HTTPS, you have wasted time (if that was your only reason for doing so). However, if you are using 301s either way, you may as well move to HTTPS - it won't hurt you and it might help you.
-
My thinking is that the potential increase in CTR in the SERPs could have a greater effect than any potential harm from the 301s.
I notice many of you are still waiting for the jury to be a bit more conclusive on whether to move to HTTPS. However, if I'm redirecting all pages per Moz's best practices anyway, shouldn't I just take the HTTPS plunge at the same time? Is there any reason not to?
-
301s of any kind can result in a slight decrease in "link juice" moving forward, although it can be hard to determine exactly how much (not a large amount, relatively speaking). That being said, like Massimiliano, I haven't personally seen ranking losses from this scenario in my work.
The HTTP/HTTPS debate is still ongoing and, as Ray said, it might be best to adopt a "wait and see" strategy.
You have pointed out that your URLs do not follow the best practices in the linked post - it is likely that new URLs combined with 301 redirects to HTTPS will not hurt your rankings and may in fact help them. As Ray stated, it comes down to cost and whether you think the potential ranking gains are worth the time, effort, and money you will spend making it happen.
-
In my experience, the power of a proper URL, with the right keywords in the right places, is so great that I wouldn't wait a second to fix them.
Again, based on my experience, I have never noticed a decrease in ranking due to a 301.
I recently moved three websites from HTTP to HTTPS and didn't notice any decrease in ranking I could associate with the redirect.
Of course, since we work daily on improving rankings, it is hard to distinguish a small decrease due to a 301 from the general improvement.
-
The ranking benefit of HTTPS over HTTP is still unclear. Many SEOs are holding off on the conversion to see what its impact, hopefully a measurable one, ends up being.
Moz has a great post on Https necessities and practices here: http://moz.com/blog/seo-tips-https-ssl
If it is going to be an intense project (i.e., it costs an amount of money that makes you question its worth), I would hold off until more is known about HTTPS as a ranking factor. If the conversion is easy, then I would implement it now and reap any benefits that come from HTTPS.