Why would our server return a 301 status code when Googlebot visits from one IP, but a 200 from a different IP?
-
I have begun a daily process of analyzing a site's Web server log files and have noticed something that seems odd. There are several IP addresses from which Googlebot crawls that our server returns a 301 status code for every request, consistently, day after day. In nearly all cases, these are not URLs that should 301. When Googlebot visits from other IP addresses, the exact same pages are returned with a 200 status code.
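For reference, this kind of per-IP tally can be reproduced with a short script along the following lines; it assumes a standard combined-format access log, and the log path is a placeholder:

import re
from collections import defaultdict

# Combined log format: ip - - [date] "request" status size "referer" "user-agent"
LINE = re.compile(r'^(\S+) \S+ \S+ \[.*?\] "(.*?)" (\d{3}) \S+ "(.*?)" "(.*?)"')

counts = defaultdict(lambda: defaultdict(int))  # ip -> status code -> number of hits

with open("access.log") as log:                 # placeholder path
    for line in log:
        m = LINE.match(line)
        if not m:
            continue
        ip, request, status, referer, agent = m.groups()
        if "Googlebot" in agent:
            counts[ip][status] += 1

for ip, statuses in sorted(counts.items()):
    print(ip, dict(statuses))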
Is this normal? If so, why? If not, why not?
I am concerned that our server returning an inaccurate status code is interfering with the site being effectively crawled as quickly and as often as it might be if this weren't happening.
Thanks guys!
-
Howdie,
Yes, I believe we got this sorted out. Interestingly, it wasn't any of the suggestions made here causing the 301 status code responses. I posted a thread in Google Webmaster Tools Forum regarding the issue and received a response that I am 99.5% sure is the correct answer.
Here is a link to that thread for future readers' reference: https://productforums.google.com/forum/#!mydiscussions/webmasters/zOCDAVudxNo
I believe the underlying issue has to do with incorrect handling of a redirect for this domain: ccisound.com
I am currently pursuing getting it corrected with our IT Director. Once the remedy is in place, I should know right away if it solves the issue I am seeing in the server logs. I'll post back here once I am 100% certain that was the issue.
Thanks all! This has been an interesting one for me!
-
Hi Dana, have you definitively sorted this out?
-
They are pretty detailed. I'll send you yesterday's in a zip file so you can take a look. I'm certain they have everything you need. Thanks, Eric!
-
Right, a DNS manager could do a redirect, but that would not be visible in the web server log. It would only be visible in whatever is managing the DNS.
-
It depends on what kind of DNS manager you are using. A redirect via the DNS provider can still be possible.
In my experience, DNS management software can redirect users with 301 or 302 headers, depending on what settings you have. If your DNS manager has a security protocol along with redirect rules, it could be causing the issue.
-
The request headers will also show whether the user has any cookies set, and which ones. It looks like that is how your server determines whether to serve the client the desktop or the mobile version.
-
How detailed are your log files? Can you see the user-agent (browser name)? Maybe you could ask your IT department to log request headers. If that would make the log files too big, they could probably do it only for the 'problem' IPs, or only for requests where the web server returns a 301. I'll take a look if you like. My email is in my profile.
Best,
-Eric
-
Thanks so much Eric. Yes, I was thinking the mobile version of our site might be related to what I'm seeing too. However, as far as I know we don't 301 redirect anything from the main site to the mobile site. In fact, users can switch to the mobile site from a desktop browser by clicking "Mobile Site" in the footer and then browse the mobile version of the site on a desktop. All of the URLs are identical.
Just out of curiosity I browsed to the mobile version of our site, grabbed a URL and plugged it into "Fetch as Googlebot" in GWT. For all options, including desktop and the three mobile options, a status code of 200 was returned.
-
The problem can't be related to DNS. If the problem was related to DNS, the request would never make it to your server, and you would never see anything related to the request in your log files.
Because you can see it in your log file, it is definitely happening on your own webserver (not some external problem).
The requesting IP is probably not the problem, but it could be if your server automatically adds to a banned list any IP that requests more than X pages in Y time; your server might think it is seeing a DOS (denial of service) attack. But if your server were set up to do this, your IT guys would probably know about it. This isn't something that is normally enabled 'out of the box'; someone would need to intentionally activate behavior like that.
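For illustration only, that kind of automatic banning is usually a sliding-window counter along these lines (the thresholds here are made-up numbers, not anything from your server):

import time
from collections import defaultdict, deque

WINDOW_SECONDS = 60    # the "Y time" window - made-up value
MAX_REQUESTS = 100     # the "X pages" threshold - made-up value

recent = defaultdict(deque)   # ip -> timestamps of recent requests
banned = set()

def record_request(ip):
    """Record a hit from this IP and ban it if it exceeds the threshold."""
    now = time.time()
    hits = recent[ip]
    hits.append(now)
    while hits and now - hits[0] > WINDOW_SECONDS:
        hits.popleft()
    if len(hits) > MAX_REQUESTS:
        banned.add(ip)
    return ip in banned   # True means the server would refuse or redirect this request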
More likely, there is another common denominator besides the requesting IP. My guess is that it's the user-agent string (the browser or device the visitor is using).
Taking a quick look at what I think is your site, you have a mobile version available. Google, of course, is interested in what your site looks like to a mobile browser, and will send a 'fake' user-agent string pretending to be one (a cell phone, a tablet, etc.). If your server sees that request and tries to automatically redirect the browser to the mobile version of the site, then you have your 301 code (which in this case is exactly what you intended, so you're all set!).
There are probably a few other cases that could cause a 301 for just some IPs, but this is the only one that comes to mind at the moment.
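One way to test the mobile user-agent theory is to request the same URL with a desktop user-agent string and with something like Googlebot's smartphone string, without following redirects, and compare the raw status codes. A rough sketch using the requests library; the URL is a placeholder and the mobile string is only an approximation of what Googlebot sends:

import requests

URL = "http://www.example.com/some-page/"   # placeholder - use one of the URLs from the logs

USER_AGENTS = {
    "desktop": "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36",
    "googlebot-smartphone": ("Mozilla/5.0 (iPhone; CPU iPhone OS 6_0 like Mac OS X) "
                             "AppleWebKit/536.26 (KHTML, like Gecko) Mobile/10A5376e "
                             "(compatible; Googlebot/2.1; +http://www.google.com/bot.html)"),
}

for name, ua in USER_AGENTS.items():
    r = requests.get(URL, headers={"User-Agent": ua}, allow_redirects=False)
    # A 301 here, with a Location header pointing at the mobile site, would confirm the theory
    print(name, r.status_code, r.headers.get("Location", "-"))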
Good Luck!
-
Here is the response from my IT Director regarding the possibility that this is being done by our DNS manager:
"I do not believe so. Our DNS does translation of human readable names to IP address. It has nothing to do with the status being returned to a browser, and even if it did it could not write to the log file."
Is this accurate? I understand that the DNS cannot write to the log file, but if the DNS can flag a request to receive a certain status code from the server, then this scenario would still be a possibility.
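For reference, name resolution and the HTTP response are two separate steps: DNS only turns the hostname into an IP address, and the status code is decided afterwards by whatever web server answers at that address. A small sketch of the two steps, with a placeholder hostname:

import socket
import http.client

HOST = "www.example.com"   # placeholder hostname

# Step 1: DNS - name to IP address. No HTTP status codes exist at this stage.
ip = socket.gethostbyname(HOST)
print("DNS resolves", HOST, "to", ip)

# Step 2: HTTP - the web server at that address decides whether to answer 200, 301, etc.
conn = http.client.HTTPConnection(HOST)
conn.request("GET", "/")
response = conn.getresponse()
print("Web server answers", response.status, response.reason)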
-
According to our IT Director we have no spam filters, no mod_security module, absolutely nothing on our server to prevent it from being crawled by bot, human or spider from any IP address, including black-listed IPs.
To me, other than the obvious point (having no security at all is probably not a good idea), that means the 301 status codes are being returned because of a problem with the server setup.
I do have server logs that I'd be willing to share privately with anyone who's willing to take a gander. Don't worry, I won't send you a month's worth. 1-2 days should be plenty.
In the meantime I am going to dive in and take a look further. It's entirely possible that IPs from Google are not the only ones receiving nothing but 301 status codes in response to requests.
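For anyone running the same check, one quick way to find every client IP that only ever receives 301s, Google or not, is something along these lines (assuming a standard combined-format access log; the path is a placeholder):

import re
from collections import defaultdict

LINE = re.compile(r'^(\S+) .*?" (\d{3}) ')   # client IP and status code from a combined-format line

seen = defaultdict(set)                       # ip -> set of status codes returned to that IP

with open("access.log") as log:               # placeholder path
    for line in log:
        m = LINE.match(line)
        if m:
            ip, status = m.groups()
            seen[ip].add(status)

only_301 = sorted(ip for ip, statuses in seen.items() if statuses == {"301"})
print("IPs that only ever receive 301s:", only_301)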
-
Thanks William. Good suggestion. I am on it! I'll post back here once I know more.
-
I would not be surprised if this was done by your DNS. If you use a DNS manager, they could possibly redirect certain users or IPs based on patterns of visits.
I suggest finding out more about any server configurations from the admin and seeing who they use as a DNS provider or manager.
-
Excellent thoughts! Yes, they are consistently the same IP addresses every time. There are several producing the same phenomenon, so I looked at this one 66.249.79.174
According to what I can find online this is definitely Google and the data center is located in Mountain View, California. We are a USA company, so it seems unlikely that it is a country issue. It could be that this IP (and the others like it) are inadvertently being blocked by a spam filter.
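For what it's worth, the reliable way to confirm an IP really belongs to Googlebot is a reverse DNS lookup followed by a forward lookup, rather than a published IP list. A small sketch:

import socket

ip = "66.249.79.174"

# Reverse lookup: genuine Googlebot IPs resolve to a googlebot.com or google.com hostname
host, _, _ = socket.gethostbyaddr(ip)
is_google_host = host.endswith(".googlebot.com") or host.endswith(".google.com")

# Forward lookup: the hostname should resolve back to the same IP address
forward_ip = socket.gethostbyname(host)

print(host, "verified" if is_google_host and forward_ip == ip else "not verified")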
It doesn't matter the day or time, every time Googlebot attempts to crawl from this IP address our server returns 301 status codes for every request, with no exceptions.
I am thinking I need to request a list of IP addresses being blocked by the server's spam filter. I am not a server administrator...would this be something reasonable for me to ask the people who set it up?
Is returning a 301 status code the best way to handle a bot attempting to disguise itself as Googlebot? I would think setting the server up to respond with a 304 would be better? (Sorry, that's kind of a follow-up "side" question.)
Let me know your thoughts and I'm going to go see if I can find out more about the spam filter.
-
Where are the 301s taking Googlebot on those IP addresses? And are they the same IP addresses every time? Have you narrowed those IP addresses down to any particular datacenter/country? It could be possible there is some configuration with your server that treats IP addresses differently depending on the country... it could also be that the IP addresses getting the 301s are known blacklisted spam IP addresses but are masking themselves as Googlebot so your server's blacklist software is keeping them out. It's really hard to say without looking into the data myself but I'm definitely interested in what you find out.
Related Questions
-
301 Redirect in breadcrumb. How bad is it?
Hi all, How bad is it to have a link in the breadcrumb that 301 redirects? We had to create some hidden category pages in our ecommerce platform, BigCommerce, to create a display on our category pages in a certain format. While the category page was set to not visible in the BigCommerce admin, the URL still showed in the live site breadcrumb. So, we set a 301 redirect on it so it didn't produce a 404. However, we have lost a lot of SEO ground over the past few months. Could this be why? Is it bad to have a 301 redirect in the breadcrumb?
Intermediate & Advanced SEO | oceanstorm0
-
Different content on the same URL depending on the IP address of the visitor
Hi! Does anybody have any experience of the SEO impact of changing the content of a page depending on the IP address of the visitor? This would be text content as well as meta information, all happening on the same URL. Many thanks.
Intermediate & Advanced SEO | Schoellerallibert0
-
Google Is Indexing my 301 Redirects to Other sites
Long story, but I now have a few links from my site 301 redirecting to YouTube videos or eCommerce stores. They carry a considerable amount of traffic that I benefit from, so I can't take them down, and that traffic comes from other websites; basically I have backlinks from places I don't own pointing to my redirect URLs (e.g. http://example.com/redirect). My problem is that Google is indexing them and doesn't let them go. I have tried blocking that URL in robots.txt, but Google is still indexing it uncrawled. I have also tried allowing Google to crawl it and adding noindex via robots.txt, and I have tried removing it from GWT, but it pops back again after a few days. Any ideas? Thanks!
Intermediate & Advanced SEO | cuarto7150
-
Too many backlinks from one domain?
I've been in the process of creating a tourism-based website for the state of Kansas. I'm a photographer for the state, and have added a nice little side income to my day job as a web designer by selling prints from Kansas (along with my travels elsewhere). I'm still in the process of developing it, but it's at least at a point where I need to really start thinking about the SEO factor of the number of backlinks I have from it going back to my main photography website. The Kansas site is at http://www.kansasisbeautiful.com and my photography website is http://www.mickeyshannon.com. This tourism website will serve a number of purposes: To promote the state and show people it's not just a flat, boring place. To help promote my photography. The entire site is powered by my photography. To sell a book I'm planning to publish later this year/early next year of Kansas images. To help increase sales of photography prints of my work. What I'm worried about is the number of backlinks I have going from the Kansas site to my photography site. Not to mention every image is hosted on my photography domain (no need to upload to two domains when one can serve the same purpose). I'm currently linking back to my site on most pages via a little "Like the Photos? Buy a print" link in the top right corner. In addition, when users get to the website map, all photo listings click back to a page on my photography site where they can purchase prints. And the main navigation also has a link for "Photos" that takes them to my Kansas photo galleries on my photography website as well. The question I have: Is it really bad SEO-wise to have anywhere from 1 to 10+ backlinks on every page from one domain (kansasisbeautiful.com) linking back to mickeyshannon.com? Would I be better served moving all of the content from kansasisbeautiful into a subdirectory on my photography site (mickeyshannon.com/kansas/) and redirecting the entire domain there? I haven't actually launched this website yet, so I'm trying to make the right call before pushing it to the public. Any advice would be appreciated!
Intermediate & Advanced SEO | msphoto0
-
Question about moving content from one site to another without a 301
I could use a second opinion about moving content from some inactive sites to my main site. Once upon a time, we had a handful of geotargeted websites set up targeting various cities that we serve. This was in addition to our main site, which was mostly targeted to our primary office and ranked great for those keywords. Our main site has plenty of authority, has been around for ages, etc. We built out these geo-targeted sites with some good landing pages and kept them active with regularly scheduled blog posts which were unique and either interesting or helpful. Although we had a little success with these, we eventually saw the light and realized that our main site was strong enough to rank for these cities as well, which made life a whole lot easier, not to mention a lot less spammy. We've got some good content on these other sites that I'd like to use on our main site, especially the blog posts. Now that I've got it through my head that there's no such thing as a duplicate content penalty, I understand that I could just start moving this content over so long as I put a 301 redirect in place where the content used to be on these old sites. Which leads me to my question. Our SEO was careful not to have these other websites pointing to our main site to avoid looking like we were trying to do something shady from a link building perspective. His concern is that these redirects would undermine that effort and having a bunch of redirects from a half dozen sites could end up hurting us somehow. Do you think that is the case? What he is suggesting we do is remove all of the content that we'd like to use and use Webmaster Tools to request that this content be removed from the index. Then, after the sites have been recrawled, we'll check for ourselves to confirm they've been removed and proceed with using the content however we'd like. Thoughts?
Intermediate & Advanced SEO | LeeAbrahamson0
-
Duplicate content on sites from different countries
Hi, we have a client who currently has a lot of duplicate content with their UK and US website. Both websites are geographically targeted (via google webmaster tools) to their specific location and have the appropriate local domain extension. Is having duplicate content a major issue, since they are in two different countries and geographic regions of the world? Any statement from Google about this? Regards, Bill
Intermediate & Advanced SEO | MBASydney0
-
How do I go about changing a 302 redirect to a 301?
Hello Friends! Thanks for viewing my question. OK, my question today is: how do I go about changing a 302 redirect to a 301? I understand the benefits of doing this as far as link juice and how the search engines view the two redirects. I want to know where I would start to do this. Thank you in advance for any help or suggestions!
Intermediate & Advanced SEO | FrontlineMobility0
-
Googlebot HTTP 204 Status Code Handling?
If a user runs a search that returns no results, and the server returns a 204 (No Content), will Googlebot treat that as the rough equivalent of a 404 or a noindex? If not, then it seems one would want to noindex the page to avoid low quality penalties, but that might require more back and forth with the server, which isn't ideal. Kurus
Intermediate & Advanced SEO | kurus0