Moz was unable to crawl your site? Redirect Loop issue
-
Moz was unable to crawl my site on Jul 25, 2017. I am getting this message for my site: "unable to access your homepage due to a redirect loop."
The site is working fine and was last crawled on 22nd July, so I am not sure why this issue is coming up.
When I checked the website with a Chrome extension, it says: "The server has previously indicated this domain should always be accessed via HTTPS (HSTS protocol). Chrome has cached this internally, and did not connect to any server for this redirect. Chrome reports this redirect as a 307 Internal Redirect; however, this probably would have been a 301 Permanent Redirect originally. You can verify this by clearing your browser cache and visiting the original URL again."
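If anyone wants to see what the server itself returns without clearing the browser cache, a minimal Python sketch along these lines would do it (this assumes the third-party `requests` library is installed; it is not anything Moz uses, just a way to bypass Chrome's cached HSTS redirect):

```python
import requests

# Ask the server directly; outside the browser, Chrome's cached HSTS 307 can't interfere.
# allow_redirects=False returns the first response instead of following the whole chain.
response = requests.get("http://www.kuzyklaw.com/", allow_redirects=False, timeout=10)

print(response.status_code)              # what the server really sends (301, 302, ...)
print(response.headers.get("Location"))  # where the redirect points next
```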
Not sure if this is the actual issue. The site was migrated to HTTPS just 5 days ago, so maybe it will resolve automatically. Can anybody from the Moz team help me with this?
-
Hi, just checking if anyone figured out the issue with this?
-
Yes, this is actually very confusing.
I don't know which metrics Moz uses when crawling websites and generating the issues report. A few days back, when I posted this question, I also had this issue on my website, and now it has cleared automatically.
Nothing is alarming in Google Webmaster Tools either, and when I checked manually everything seems fine, so I can't say much about this issue.
If you find a solution, please let me know so that I can work on it too.
Thanks anyway.
-
Hey guys,
I'm getting the same redirect loop error today for one of my client's sites. We have not changed anything on the site recently, and it worked perfectly in our Moz Pro campaign for several weeks, so what's happened? The same thing happened to another client of mine a week ago: their site crawled perfectly in Moz for weeks, but all of a sudden Moz could not crawl it because of a redirect loop issue.
The site is http://aprilrandlelaw.com/
FULL ERROR MESSAGE
We were unable to access your homepage due to a redirect loop, which prevented us from crawling the rest of your site. Your homepage is likely redirecting to itself. Because we can only crawl if we find unique pages, the redirect on your homepage is stopping us from crawling past that page. It is possible that other browsers and search engines are encountering this problem and aborting their sessions as well. We recommend eliminating any unnecessary, circular or indefinite redirects on your homepage. Also, make sure your site is not mandating cookies, which can cause circular redirects and make crawling more difficult. Typically errors like this should be investigated and fixed by the site webmaster.
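For anyone who wants to reproduce what the crawler is likely seeing, here is a rough Python sketch (again assuming the `requests` library; the URL is just the client site mentioned above) that follows redirects one hop at a time and stops if a URL repeats, which is exactly the circular-redirect condition the error describes:

```python
import requests

def trace_redirects(url, max_hops=10):
    """Follow redirects one hop at a time; stop and warn if a URL repeats (a loop)."""
    seen = set()
    for _ in range(max_hops):
        if url in seen:
            print(f"Redirect loop detected at: {url}")
            return
        seen.add(url)
        response = requests.get(url, allow_redirects=False, timeout=10)
        print(response.status_code, url)
        location = response.headers.get("Location")
        if location is None:
            print("Final destination reached, no loop.")
            return
        # Location can be relative, so resolve it against the current URL.
        url = requests.compat.urljoin(url, location)
    print("Gave up after too many hops.")

trace_redirects("http://aprilrandlelaw.com/")
```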
-
Hi Rahul. You can run your site through Google's PageSpeed Insights tool and also see the redirects there. PageSpeed also reports many other things that could be fixed; I have found optimizing images and minification to be pretty simple, and PageSpeed will provide the optimized files for download.
Best!
-
Thanks Chris for your reply.
No, not a plugin.
We used a 301, but the redirect path in Chrome is showing a 302 on the www version and then a 301 to https://kuzyklaw.com.
Maybe this is why we are getting this issue; I will check with my developers to fix it. Thanks anyway.
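As a quick way to hand the developers something concrete, a short Python sketch like this (assuming the `requests` library) prints the status code and Location header for each of the four common entry points, so the 302 on the www version should show up immediately:

```python
import requests

# The four common entry points for the domain discussed in this thread.
variants = [
    "http://kuzyklaw.com/",
    "http://www.kuzyklaw.com/",
    "https://kuzyklaw.com/",
    "https://www.kuzyklaw.com/",
]

for url in variants:
    response = requests.get(url, allow_redirects=False, timeout=10)
    location = response.headers.get("Location", "(no redirect)")
    print(f"{url} -> {response.status_code} {location}")
```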
-
Hi. Hopefully the Moz team will respond, but I noticed that if I type "http://www.kuzyklaw.com" I get a 301 to "https://www.kuzyklaw.com" and then another 301 to "https://kuzyklaw.com/".
One too many redirects, I think. I noticed you are on WordPress. Did you use a plugin for the HTTPS change?