Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
On a dedicated server with multiple IP addresses, how can one address group be slow/time out and all other IP addresses OK?
-
We utilize a dedicated server to host roughly 60 sites. The server is with a company that uses a lady who drives race cars.... About 4 months ago, monitoring alerts told us a group of sites was down, so we checked it out. All were on the same IP address, and the sites on the other IP addresses were still up and functioning well. When we contacted support we were stonewalled at first, but eventually they admitted there was a problem, and it was resolved within about 2 hours. Up until recently we had no further problems.
As part of our ongoing SEO we check page load speed for our clients. A few days ago, a client whose site is hosted by the same company was running very slow (about 8 seconds to load without cache). We ran every check we could and could not find a reason on our end. The client called the host and was told they needed to be on some other type of server (with the host) at a fee increase of roughly $10 per month.
Yesterday, we noticed one group of sites on our server was down and, again, it was one IP address with about 8 sites on it. On chat with support, they kept saying it was our ISP. (We speed tested on multiple computers and measured 22 Mbps down and 9 Mbps up, +/- 2 Mbps.) We ran a trace on the IP address and it went through without a problem on three occasions over about ten minutes. After about 30 minutes the sites were back up.
Here's the twist: we had a couple of people in the building who were on other ISPs try, and the sites came up and loaded on their machines. Does anyone have any idea what the issue is?
-
Agreed, and thanks. Unfortunately, the host provider is anything but a one-man op - it's huge. We're moving to a tier-four farm in Nov/Dec. Major company: in-house phone, email, and chat support, etc.
As to Sha... I don't care if her answer came from Martians, it was one of the best I have seen. (Note to Moz staff... hint, hint)
-
Nah...the cool stuff is courtesy of my Boss whose brain can be kinda scary at times - I'm just soaking up the awesomeness he spreads around
We have this little reciprocal thing that is improving us both (although I don't think he's ever going to hunger for SEO the way I do! But then, that would make him kinda nuts! hehe)
(Since you said "non-server-side guy," I probably should have mentioned that you can basically think of each IP as being tied to a card similar to the network card in your computer.)
That whole owning-and-renting story is pretty common in that world, but it's only a problem if you don't strike someone who knows what they are talking about.
We run our own client servers and I have to admit that I shudder when a client comes to us with an existing account from a couple of specific companies. 8(
No probs, always welcome.
-
@Sha, wow! What an exceptionally thorough and all-around awesome reply!
@Robert, you may have come to this conclusion on your own, but perhaps it's time to consider a new host. You mentioned "they do not have the servers, they just sell the service". I would definitely recommend purchasing service directly from a host and not from a middleman. A true host will often have their own data center and 100+ employees, while a middleman can be a one-man or otherwise small shop, and their knowledge and support can be quite sketchy.
-
Ok, Now I am annoyed.....
Journalist, web dev, writer, good grammar and spelling, and now this... server-side pro... You are good.
This really does seem to make sense to a non-server-side type of guy. I will follow up before we change to another farm. I just found out recently that they do not have the servers; they just sell the service. Thanks again, Sha.
-
Hi Robert,
I think I've picked up on all of the questions here (there's a lot going on!) and have borrowed some awesomeness from my Tech Wizard (Boss) to fill in the exciting bits, so here goes:
I'll start with the easy one first... well actually, none of them are that hard
As part of our ongoing SEO we check page load speed for our clients. A few days ago, a client whose site is hosted by the same company was running very slow (about 8 seconds to load without cache). We ran every check we could and could not find a reason on our end. The client called the host and was told they needed to be on some other type of server (with the host) at a fee increase of roughly $10 per month.
OK, basically the answer to this one is that your client's site was being throttled back by the host because it was using more bandwidth than was allowed under their existing plan. By moving them to the next plan (the extra $10 per month), the problem is resolved and the site speed returns to normal. Throttling it back gets the client to call... 8(
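If you want to reproduce that kind of speed check yourself before calling the host, you can time an uncached fetch with a short script. A minimal sketch in Python (the placeholder URL in the usage comment is illustrative, not the client's actual site):

```python
import time
import urllib.request

def load_time(url, timeout=30):
    """Return the seconds taken to fetch a URL, including the full body."""
    start = time.monotonic()
    req = urllib.request.Request(url, headers={"Cache-Control": "no-cache"})
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        resp.read()  # pull the whole body so transfer time is counted
    return time.monotonic() - start

# Example (placeholder domain): run this a few times; consistently
# multi-second results point at throttling rather than a one-off blip.
# print(load_time("https://www.example.com/"))
```

Running it repeatedly over a few minutes, from more than one connection, helps separate a throttled server from a flaky local link.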
OK, 1 down and 2 to go...
About 4 months ago, monitoring alerts told us a group of sites was down, so we checked it out. All were on the same IP address, and the sites on the other IP addresses were still up and functioning well. When we contacted support we were stonewalled at first, but eventually they admitted there was a problem, and it was resolved within about 2 hours. Up until recently we had no further problems.
and also
Yesterday, we noticed one group of sites on our server was down and, again, it was one IP address with about 8 sites on it.
OK, you know already that there can be up to 8 IPs on a box, and at times something in the network will go bad. There are some variables here as to what is wrong. If you are on a Class C network and one IP goes down, it means the switch or router has gone bad (whether it is a switch or a router is determined by how the host has their hardware set up). If you are on a Class D network and one IP goes down, then the problem is one of three things related to that IP: the card, the port, or the cable connecting the two.
The trick is that the person on the phone needs to realise what they are dealing with and escalate it to get the hardware issue resolved. (A recent interaction with that particular host for one of our clients indicated to me that the realisation part might be a little hit and miss, so it's good to have an understanding of what might be happening if it happens again.)
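The telltale pattern here - every site on one IP down at once while everything else stays up - can be spotted automatically from your own monitoring data before you get on the phone. A rough sketch, assuming you already collect a per-domain up/down status (the function name and data shape are illustrative, not from the original post):

```python
import socket
from collections import defaultdict

def down_ips(site_status):
    """Given {domain: is_up}, resolve each domain and return the set of
    IPs on which *every* monitored site is down - the pattern that points
    at a card, port, cable, or switch problem rather than at the sites
    themselves."""
    by_ip = defaultdict(list)
    for domain, is_up in site_status.items():
        by_ip[socket.gethostbyname(domain)].append(is_up)
    return {ip for ip, states in by_ip.items() if not any(states)}
```

If the returned set is non-empty while other IPs on the same box are healthy, you can open the support conversation with "this looks like a hardware or network fault on IP x.x.x.x" instead of debating your ISP.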
Phew! Nearly there, last of all...
On chat with support, they kept saying it was our ISP. (We speed tested on multiple computers and measured 22 Mbps down and 9 Mbps up, +/- 2 Mbps.) We ran a trace on the IP address and it went through without a problem on three occasions over about ten minutes. After about 30 minutes the sites were back up.
Here's the twist: we had a couple of people in the building who were on other ISPs try, and the sites came up and loaded on their machines. Does anyone have any idea what the issue is?
OK this one is all about DNS caching. That particular host (the one that likes lady racing drivers) has a fail-over system in place. This means that if an IP goes down, the domains on that IP will automatically fail-over to another box.
So, if you have looked at those domains on your machine, the DNS record will be cached. When you go back to check the site, you are still looking at the cached location. The other people in the building are coming to the domain fresh and through a different ISP, so they see those domains because they are back up on the new box.
When the host reps were telling you that it was your ISP, what they really meant was that it had failed over to a new box and you were still seeing the cached DNS location.
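You can test the cached-DNS theory yourself the next time it happens: compare what your machine resolves now against the IP you had before the outage. A minimal sketch (the helper name and the expected-IP argument are illustrative):

```python
import socket

def failed_over(domain, ip_you_expect):
    """True if the domain now resolves somewhere other than the IP you
    had on record - the 'it moved to a new box' situation. Note that
    socket.gethostbyname() asks the OS resolver, so if *that* cache is
    stale you may need to flush it first (e.g. `ipconfig /flushdns` on
    Windows) before trusting the answer."""
    return socket.gethostbyname(domain) != ip_you_expect
```

Checking the same domain from a machine on a different ISP (as the other people in the building effectively did) gives you an independent second opinion on whether the fail-over has happened.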
OK, think I covered it all so....that's all Folks!
Have a great holiday weekend!
Sha