Why would our server return a 301 status code when Googlebot visits from one IP, but a 200 from a different IP?
-
I have begun a daily process of analyzing a site's Web server log files and have noticed something that seems odd. There are several IP addresses from which Googlebot crawls for which our server returns a 301 status code on every request, consistently, day after day. In nearly all cases, these are not URLs that should 301. When Googlebot visits from other IP addresses, the exact same pages are returned with a 200 status code.
Is this normal? If so, why? If not, why not?
I am concerned that our server returning an inaccurate status code is interfering with the site being crawled as quickly and as often as it otherwise would be.
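For reference, here's roughly the kind of per-IP breakdown I've been pulling from the logs - a minimal sketch that assumes a standard Apache/Nginx combined log format and a placeholder file name (our actual format and paths may differ):

```python
import re
from collections import Counter, defaultdict

# Matches a standard Apache/Nginx "combined" log line; adjust if the
# server logs a different layout.
LINE = re.compile(
    r'^(?P<ip>\S+) \S+ \S+ \[[^\]]+\] "(?P<request>[^"]*)" '
    r'(?P<status>\d{3}) \S+ "(?P<referer>[^"]*)" "(?P<agent>[^"]*)"'
)

status_by_ip = defaultdict(Counter)

with open("access.log") as log:  # placeholder file name
    for line in log:
        m = LINE.match(line)
        if not m:
            continue
        if "Googlebot" not in m.group("agent"):
            continue  # only requests that claim to be Googlebot
        status_by_ip[m.group("ip")][m.group("status")] += 1

# Print each crawling IP with its mix of status codes, most 301s first.
for ip, counts in sorted(status_by_ip.items(), key=lambda kv: -kv[1]["301"]):
    print(ip, dict(counts))
```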
Thanks guys!
-
Howdie,
Yes, I believe we got this sorted out. Interestingly, it wasn't any of the suggestions made here that was causing the 301 responses. I posted a thread in the Google Webmaster Tools forum about the issue and received a response that I am 99.5% sure is the correct answer.
Here is a link to that thread for future readers' reference: https://productforums.google.com/forum/#!mydiscussions/webmasters/zOCDAVudxNo
I believe the underlying issue has to do with incorrect handling of a redirect for this domain: ccisound.com
I am currently working with our IT Director to get it corrected. Once the remedy is in place, I should know right away whether it solves the issue I am seeing in the server logs. I'll post back here once I am 100% certain that was the issue.
Thanks all! This has been an interesting one for me!
-
Hi Dana, have you definitively sorted this out?
-
They are pretty detailed. I'll send you yesterday's in a zip file so you can take a look. I'm certain they have everything needed. Thanks, Eric!
-
Right, a DNS manager could do a redirect, but that would not be visible in the web server log. It would only be visible in whatever is managing the DNS.
-
It depends on what kind of DNS manager you are using. A redirect via DNS can still be possible.
In my experience, DNS management software can redirect users with 301 or 302 responses depending on your settings. If your DNS manager has a security protocol along with redirect rules, it could be causing the issue.
-
The request headers will also show whether, and which, cookies the user has set. It looks like that is how your server determines whether to serve the client the desktop or the mobile version.
-
How detailed are your log files? Can you see the user-agent (browser name)? Maybe you could ask your IT department to log request headers. If that would make the log files too big, they could probably do it only for the 'problem' IPs, or only for requests where the webserver returns a 301. I'll take a look if you like. Email is in my profile.
Best,
-Eric
-
Thanks so much, Eric. Yes, I was thinking the mobile version of our site might be related to what I'm seeing too. However, I am not aware that we 301 redirect anything from the main site to the mobile site. In fact, users can switch to the mobile site on desktop by clicking "Mobile Site" in the footer and then browse the mobile version from a desktop browser. All of the URLs are identical.
Just out of curiosity, I browsed to the mobile version of our site, grabbed a URL, and plugged it into "Fetch as Googlebot" in GWT. For all options, including desktop and the three mobile options, a status code of 200 was returned.
-
The problem can't be related to DNS. If it were, the request would never make it to your server, and you would never see anything related to it in your log files.
Because you can see it in your log file, it is definitely happening on your own webserver (not some external problem).
The requesting IP is probably not the problem, but it could be if your server automatically adds any IP that requests more than X pages in Y time to a banned list - your server might think it is seeing a DoS (denial of service) attack. But if your server were set up to do this, your IT team would probably know about it; this isn't something that is normally enabled out of the box - someone would need to intentionally turn that behavior on.
More likely, there is another common denominator besides the requesting IP... I would guess it's the user-agent string (the browser or device the visitor is using).
Taking a quick look at what I think is your site, you have a mobile version available. Google is of course interested in what your site looks like to a mobile browser, and will send a user-agent string pretending to be one (a cell phone, a tablet, etc.). If your server sees that request and automatically redirects the browser to the mobile version of the site, then you have your 301 - which in that case is exactly what you intended, so you're all set!
There are probably a few other cases that could cause a 301 for just some IPs, but this is the only one that comes to mind at the moment.
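If you want to test that theory quickly, you could request one of the affected URLs with a few different user-agent strings and compare the responses without following redirects. A rough sketch - the URL and the user-agent strings below are only placeholders, not what Google actually sends:

```python
import requests

URL = "http://www.example.com/some-page/"  # substitute a URL that 301s in the logs

# Placeholder user-agent strings for comparison purposes.
USER_AGENTS = {
    "desktop": "Mozilla/5.0 (Windows NT 6.1; Win64; x64)",
    "mobile": "Mozilla/5.0 (iPhone; CPU iPhone OS 7_0 like Mac OS X) AppleWebKit/537.51.1",
    "claims-googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

for name, ua in USER_AGENTS.items():
    # allow_redirects=False so we see the 301/302 itself rather than the final page.
    resp = requests.get(URL, headers={"User-Agent": ua}, allow_redirects=False, timeout=10)
    print(f"{name:17} -> {resp.status_code} {resp.headers.get('Location', '')}")
```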
Good Luck!
-
Here is the response from my IT Director regarding the possibility that this is being done by our DNS manager:
"I do not believe so. Our DNS does translation of human readable names to IP address. It has nothing to do with the status being returned to a browser, and even if it did it could not write to the log file."
Is this accurate? I understand that the DNS cannot write to the log file, but if the DNS can flag a request to receive a certain status code from the server, then this scenario would still be a possibility.
-
According to our IT Director, we have no spam filters, no mod_security module - absolutely nothing on our server to prevent it from being crawled by any bot, human, or spider from any IP address, including blacklisted IPs.
To me, other than the obvious (having no security at all is probably not a good idea), that means the 301 status codes are being returned because of a problem with the server setup.
I do have server logs that I'd be willing to share privately with anyone who's willing to take a gander. Don't worry, I won't send you a month's worth. 1-2 days should be plenty.
In the meantime I am going to dive in and take a look further. It's entirely possible that IPs from Google are not the only ones receiving nothing but 301 status codes in response to requests.
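Roughly what I'm planning to look for is any client IP whose requests only ever get 301s, whatever the user agent - a small sketch, again assuming a combined log format and a placeholder file name:

```python
import re
from collections import Counter, defaultdict

# The first quoted field in a combined-format line is the request; the
# status code follows it.
LINE = re.compile(r'^(?P<ip>\S+) .*? "[A-Z]+ [^"]*" (?P<status>\d{3}) ')

status_by_ip = defaultdict(Counter)
with open("access.log") as log:  # placeholder file name
    for line in log:
        m = LINE.match(line)
        if m:
            status_by_ip[m.group("ip")][m.group("status")] += 1

# IPs that never received anything except a 301, regardless of user agent.
for ip, counts in status_by_ip.items():
    if set(counts) == {"301"}:
        print(ip, sum(counts.values()), "requests, all 301")
```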
-
Thanks William. Good suggestion. I am on it! I'll post back here once I know more.
-
I would not be surprised if this were being done by your DNS manager. If you use one, it could possibly be redirecting certain users or IPs based on patterns of visits.
I suggest finding out more about any server configurations from the admin and seeing who they use as a DNS provider or manager.
-
Excellent thoughts! Yes, they are consistently the same IP addresses every time. There are several producing the same phenomenon, so I looked at this one: 66.249.79.174.
According to what I can find online, this is definitely Google, and the data center is located in Mountain View, California. We are a US company, so it seems unlikely to be a country issue. It could be that this IP (and the others like it) are inadvertently being blocked by a spam filter.
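Incidentally, one way to double-check that an IP like this really belongs to Googlebot (rather than something spoofing the user agent) is the reverse-then-forward DNS lookup Google recommends: resolve the IP to a hostname, confirm it ends in googlebot.com or google.com, then resolve that hostname back and make sure it maps to the same IP. A small sketch of that check:

```python
import socket

def is_real_googlebot(ip):
    """Verify an IP via reverse DNS, then a forward lookup of that hostname."""
    try:
        host = socket.gethostbyaddr(ip)[0]
    except socket.herror:
        return False
    if not host.endswith((".googlebot.com", ".google.com")):
        return False  # reverse DNS doesn't point at Google at all
    try:
        # Forward-confirm: the hostname must resolve back to the same IP.
        return ip in socket.gethostbyname_ex(host)[2]
    except socket.gaierror:
        return False

print(is_real_googlebot("66.249.79.174"))  # the IP from the logs above
```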
It doesn't matter the day or time, every time Googlebot attempts to crawl from this IP address our server returns 301 status codes for every request, with no exceptions.
I am thinking I need to request a list of IP addresses being blocked by the server's spam filter. I am not a server administrator...would this be something reasonable for me to ask the people who set it up?
Is returning a 301 status code the best way to handle a bot attempting to disguise itself as Googlebot? I would think setting the server up to respond with a 304 would be better? (Sorry, that's kind of a follow-up "side" question.)
Let me know your thoughts and I'm going to go see if I can find out more about the spam filter.
-
Where are the 301s taking Googlebot on those IP addresses? And are they the same IP addresses every time? Have you narrowed them down to any particular datacenter or country? It's possible there is some configuration on your server that treats IP addresses differently depending on the country... it could also be that the IPs getting the 301s are known blacklisted spam IPs masking themselves as Googlebot, so your server's blacklist software is keeping them out. It's hard to say without looking at the data myself, but I'm definitely interested in what you find out.