Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
429 Errors?
-
I have over 500,000 429 errors in webmaster tools. Do I need to be concerned about these errors?
-
I am getting the same 429 errors as well. The only change I have made is that I switched my GoDaddy account from Deluxe hosting to a Managed WordPress hosting account.
-
I highly doubt this error would have anything to do with that. I would also recommend cross-checking those rankings with a third-party tool like Authority Labs, or you can look at your average position in Google Webmaster Tools. Moz runs rankings once a week, and sometimes it happens to pick up a temporary fluctuation. So I'd confirm the ranking drop before deciding what to do next.
-
I have the same question. Also, these are the only errors on my site. Moz shows that the main keyword for the site just dropped from #1 to #21 on Google. Does this error have anything to do with it?
-
Sounds like it's probably the same issue wcbuckner describes. If it's a problem in any way, I would contact GoDaddy about it and see what they have to say.
-
This error is shown only in Moz Analytics. It's okay everywhere except here.
-
I just got a lot of these on my Moz report, and I also host with GoDaddy. My issue is: how do we know whether or not this happens when Google crawls our site? I am trying to get the page ranked, and I hope this is not stopping my site from getting ranked.
-
What exactly is happening, the same 429 errors? Does wcbuckner's response explain it for you?
-
I'm facing the same problem with GoDaddy. Can anybody say how to resolve it, please?
-
Having the same issue. I just spoke with GoDaddy, who said it's not a concern: what's happening is that Moz's software is pinging the client's server too many times within a given time period, so GoDaddy's system temporarily blocks Moz's IP, which causes the error. According to this rep, they never block Google's services that hit the server.
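If you want to verify that for yourself rather than take the rep's word for it, you can count 429 responses per user agent in your raw access logs. Below is a rough sketch in Python; it assumes a standard Apache/Nginx combined log format, and the log path is just a placeholder you would swap for wherever your host exposes raw logs:

```python
import re
from collections import Counter

LOG_PATH = "access.log"  # placeholder; point this at your host's raw access log

# Combined log format ends with: "METHOD /path HTTP/1.x" STATUS SIZE "referer" "user-agent"
LINE_RE = re.compile(r'"\S+ \S+ \S+" (\d{3}) \S+ "[^"]*" "([^"]*)"')

counts = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        match = LINE_RE.search(line)
        if not match or match.group(1) != "429":
            continue  # only interested in requests that were rate-limited
        agent = match.group(2).lower()
        if "googlebot" in agent:
            counts["Googlebot"] += 1
        elif "rogerbot" in agent or "dotbot" in agent:
            counts["Moz crawlers"] += 1
        else:
            counts["Other"] += 1

# If Googlebot shows up here at all, the rate limiting is hitting Google too,
# not just Moz's crawler.
print(counts)
```

If the 429s are confined to Moz's crawlers, Google is fetching your pages normally and any ranking worries are a separate question.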
-
Interesting. I hadn't come across a 429 error before either.
I crawled your site once with Screaming Frog at normal speed and got some 429 errors. Those pages are indexed and cached - so there does not seem to be a dire emergency.
I did a second, slower crawl with Screaming Frog and still got a few 429 errors, but not nearly as many. The thing is, even though pages are getting indexed and cached, some pages will throw the 429 error on one crawl and then maybe not the next. So enough is getting through, but it would be better not to have them at all.
From what I can tell, this code is set at the server level, so perhaps you should contact your host to ask about it. Are you on a normal hosting setup, or are you going through something like WP Engine? The number of requests allowed needs to be increased. Or, as Mike said, it could be an included API call that's causing it.
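Just to illustrate the client side (this isn't how Screaming Frog or Moz are actually implemented, only a sketch of the general pattern): a crawler that spaces out its requests and honors Retry-After when it gets a 429 is far less likely to trip these limits. This assumes the third-party requests library and that Retry-After comes back in seconds, as in the spec's example:

```python
import time
import requests  # third-party HTTP client, assumed installed

def fetch_politely(url, base_delay=1.0, max_retries=3):
    """Fetch a URL, backing off whenever the server answers 429 Too Many Requests."""
    response = None
    for attempt in range(max_retries):
        response = requests.get(url, timeout=30)
        if response.status_code != 429:
            return response
        # Honor Retry-After if present (assumed to be seconds); otherwise back off exponentially.
        wait = float(response.headers.get("Retry-After", base_delay * (2 ** attempt)))
        time.sleep(wait)
    return response  # still 429 after all retries; caller decides what to do

urls = ["https://example.com/page-1", "https://example.com/page-2"]  # placeholder URLs
for url in urls:
    page = fetch_politely(url)
    print(url, page.status_code)
    time.sleep(1.0)  # fixed crawl delay between pages keeps the overall request rate low
```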
Hope that helps!
-Dan
-
I found this from the Internet Engineering Task Force (IETF):
"429 Too Many Requests
The 429 status code indicates that the user has sent too many requests in a given amount of time ("rate limiting").
The response representations SHOULD include details explaining the condition, and MAY include a Retry-After header indicating how long to wait before making a new request.
For example:
HTTP/1.1 429 Too Many Requests
Content-Type: text/html
Retry-After: 3600

<html>
   <head>
      <title>Too Many Requests</title>
   </head>
   <body>
      <h1>Too Many Requests</h1>
      <p>I only allow 50 requests per hour to this Web site per
         logged in user. Try again soon.</p>
   </body>
</html>
Note that this specification does not define how the origin server identifies the user, nor how it counts requests. For example, an origin server that is limiting request rates can do so based upon counts of requests on a per-resource basis, across the entire server, or even among a set of servers. Likewise, it might identify the user by its authentication credentials, or a stateful cookie.
Responses with the 429 status code MUST NOT be stored by a cache."
From doing a quick read, it looks like this error would be thrown when API requests are called too quickly... so... yeah?
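To make the "rate limiting" part concrete: on the server side this is typically just a counter per client per time window. Here's a toy sketch of the RFC's own example limit (50 requests per hour per user). It's purely an illustration, not what GoDaddy or any other host actually runs, and it identifies clients by a plain string where a real server would use credentials, a cookie, or an IP:

```python
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 3600   # one hour, matching the RFC example
LIMIT = 50              # 50 requests per hour per client

_history = defaultdict(deque)  # client id -> timestamps of requests inside the window

def check_rate_limit(client_id):
    """Return (allowed, retry_after_seconds) for one incoming request."""
    now = time.time()
    timestamps = _history[client_id]
    # Discard requests that have fallen out of the one-hour window.
    while timestamps and timestamps[0] <= now - WINDOW_SECONDS:
        timestamps.popleft()
    if len(timestamps) >= LIMIT:
        # The oldest request in the window determines when capacity frees up again.
        retry_after = int(timestamps[0] + WINDOW_SECONDS - now) + 1
        return False, retry_after  # caller responds with 429 and a Retry-After header
    timestamps.append(now)
    return True, 0

# The 51st request inside an hour gets refused, with a Retry-After close to 3600.
for _ in range(51):
    allowed, retry_after = check_rate_limit("198.51.100.7")
print(allowed, retry_after)
```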
Sorry I can't be any more helpful.
Mike
-
Here is a screenshot of webmaster errors: http://prntscr.com/zc5p2.
-
Hmmm...
I have never heard of a 429 error, and that error isn't listed by Google Webmaster Tools or W3.org either.
If you mean a 409 error, that means Conflict, "The server encountered a conflict fulfilling the request. The server must include information about the conflict in the response. The server might return this code in response to a PUT request that conflicts with an earlier request, along with a list of differences between the requests."
If you do mean 429, can you provide a screenshot of it?
Thanks,
Mike
Related Questions
-
520 Error from crawl report with Cloudflare
I am getting a lot of 520 Server Errors in crawl reports. I see this is related to Cloudflare. We know 520 is Cloudflare, so maybe the Moz team can change this from "unknown" to "Cloudflare 520". Perhaps the Moz team can also update the "how to fix" section in the reporting, if they have suggestions on how to avoid seeing these in the report, or if there is a real issue that needs to be addressed; at this point I don't know. There must be a solution Moz can provide, like a setting in Cloudflare that will permit Rogerbot if Cloudflare is blocking it because it does not like its behavior. It could also be that Rogerbot is crawling my site on a bad day or at a time when we were deploying a massive site change. If I know when my site will be down, can I pause Rogerbot? I found this: https://developers.cloudflare.com/support/troubleshooting/general-troubleshooting/troubleshooting-crawl-errors/
Technical SEO | | awilliams_kingston0 -
Errors In Search Console
Hi All, I am hoping someone might be able to help with this. Last week one of my sites dropped from mid first page to the bottom of page 1. We had not been link building as such, and it only seems to have affected a single search term and the ranking page (which happens to be the home page). When I was going through everything I went to Search Console, and in crawl errors there are 2 errors that showed up as detected 3 days before the drop. These are: wp-admin/admin-ajax.php showing as response code 400, and xmlrpc.php showing as response code 405. robots.txt is as follows:
user-agent: *
disallow: /wp-admin/
allow: /wp-admin/admin-ajax.php
Any help with what is wrong here and how to fix it would be greatly appreciated. Many Thanks
Technical SEO | | DaleZon0 -
404 Error Pages being picked up as duplicate content
Hi, I recently noticed an increase in duplicate content, but all of the pages are 404 error pages. For instance, Moz site crawl says this page: https://www.allconnect.com/sc-internet/internet.html has 43 duplicates, and all the duplicates are also 404 pages (https://www.allconnect.com/Coxstatic.html for instance is a duplicate of this page). Looking for insight on how to fix this issue: do I add a rel=canonical tag to these 60 error pages that points to the original error page? Thanks!
Technical SEO | | kfallconnect0 -
"5XX (Server Error)" - How can I fix this?
Hey Mozzers! Moz Crawl tells me I am having an issue with my WordPress category: it is returning a 5XX error and I'm not sure why. Can anyone help me determine the issue? Crawl Issues and Notices for: http://www.refusedcarfinance.com/news/category/news We found 1 crawler issue(s) for this page. High Priority Issues 1 5XX (Server Error) 5XX errors (e.g., a 503 Service Unavailable error) are shown when a valid request was made by the client, but the server failed to complete the request. This can indicate a problem with the server, and should be investigated and fixed.
Technical SEO | | RocketStats0 -
:443 - 404 error
I get strange :443 errors in my 404 monitor on Wordpress https://www.compleetverkleed.nl:443/hoed-al-capone-panter-8713647758068-2/
Technical SEO | | Happy-SEO
https://www.compleetverkleed.nl:443/cart/www.compleetverkleed.nl/feestkleding
https://www.compleetverkleed.nl:443/maskers/ I have no idea where these come from :S2 -
Some URLs were not accessible to Googlebot due to an HTTP status error.
Hello, I'm an SEO newbie and some help from the community here would be greatly appreciated. I have submitted the sitemap of my website in Google Webmaster Tools and now I got this warning: "When we tested a sample of the URLs from your Sitemap, we found that some URLs were not accessible to Googlebot due to an HTTP status error. All accessible URLs will still be submitted." How do I fix this? What should I do? Many thanks in advance.
Technical SEO | | GoldenRanking140 -
403 forbidden error website
Hi Mozzers, I have a question about a new website from a new customer, http://www.eindexamensite.nl/. There is a 403 Forbidden error on it, and I can't find what the problem is. I have checked on: http://gsitecrawler.com/tools/Server-Status.aspx
Technical SEO | | MaartenvandenBos
result:
URL=http://www.eindexamensite.nl/ **Result code: 403 (Forbidden / Forbidden)**
When I delete the .htaccess from the server there is a 200 OK :-). So it is in the .htaccess. .htaccess code:
ErrorDocument 404 /error.html
RewriteEngine On
RewriteRule ^home$ / [L]
RewriteRule ^typo3$ - [L]
RewriteRule ^typo3/.*$ - [L]
RewriteRule ^uploads/.*$ - [L]
RewriteRule ^fileadmin/.*$ - [L]
RewriteRule ^typo3conf/.*$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME} !-l
RewriteRule .* index.php
# Start rewrites for static file caching
RewriteRule ^(typo3|typo3temp|typo3conf|t3lib|tslib|fileadmin|uploads|screens|showpic.php)/ - [L]
RewriteRule ^home$ / [L]
# Don't pull *.xml, *.css etc. from the cache
RewriteCond %{REQUEST_FILENAME} !^.*\.xml$
RewriteCond %{REQUEST_FILENAME} !^.*\.css$
RewriteCond %{REQUEST_FILENAME} !^.*\.php$
# Check for Ctrl-Shift reload
RewriteCond %{HTTP:Pragma} !no-cache
RewriteCond %{HTTP:Cache-Control} !no-cache
# NO backend user is logged in
RewriteCond %{HTTP_COOKIE} !be_typo_user [NC]
# NO frontend user is logged in
RewriteCond %{HTTP_COOKIE} !nc_staticfilecache [NC]
# We only redirect GET requests
RewriteCond %{REQUEST_METHOD} GET
# We only redirect URIs without query strings
RewriteCond %{QUERY_STRING} ^$
# We only redirect if a cache file actually exists
RewriteCond %{DOCUMENT_ROOT}/typo3temp/tx_ncstaticfilecache/%{HTTP_HOST}/%{REQUEST_URI}/index.html -f
RewriteRule .* typo3temp/tx_ncstaticfilecache/%{HTTP_HOST}/%{REQUEST_URI}/index.html [L]
# End static file caching
DirectoryIndex index.html
The CMS is TYPO3. Any ideas? Thanks!
Maarten0 -
Why crawl error "title missing or empty" when there is already "title and meta description" in place?
I've been getting 73 "title missing or empty" warnings from the SEOmoz crawl diagnostics. This is weird, as I've installed the Yoast WordPress SEO plugin and all posts do have a title and meta description. But why the results here... can anyone explain what's happening? Thanks!! Here are some of the links that are listed with "title missing, empty". Almost all our blog posts were listed there. http://www.gan4hire.com/blog/2011/are-you-here-for-good/ http://www.gan4hire.com/blog/2011/are-you-socially-awkward/
Technical SEO | | JasonDGreat0