How do we handle sitemaps in robots.txt when multiple domains point to the same physical location?
-
We have www.mysite.net, www.mysite.se, www.mysite.fi, and so on. All of these domains point to the same physical location on our web server, and we swap the text returned to the client depending on which domain was requested.
My problem is this: how do I configure sitemaps in robots.txt when the same robots.txt is served for all of these domains? If I, for instance, put the lines
Sitemap: http://www.mysite.net/sitemapNet.xml
Sitemap: http://www.mysite.net/sitemapSe.xml
in robots.txt, would that result in a cross-submission error?
-
Thanks for your help, René!
-
yup
-
Yes, I mean GWT of course :).
A folder for each site would definitely make some things easier, but it would also mean more work every time we need to republish the site or change its configuration.
Did I understand that Google link correctly: if we have verified ownership in GWT for all the domains involved, then cross-site submission in robots.txt is okay? I guess Google will think it's okay anyway.
-
Actually, Google has the answer right here: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=75712
I always try to do what Google recommends, even though something else might work just as well, just to be on the safe side.
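In other words, once all of the domains are verified in GWT, a shared robots.txt along these lines shouldn't cause a cross-submission problem (sitemapFi.xml is an assumption that simply follows the naming pattern in the question):
Sitemap: http://www.mysite.net/sitemapNet.xml
Sitemap: http://www.mysite.net/sitemapSe.xml
Sitemap: http://www.mysite.net/sitemapFi.xml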
-
You can't submit a sitemap in GA, so I'm guessing you mean GWT.
Whether or not you put it in the robots.txt shouldn't be a problem, since in each sitemap the URLs would look something like this:
Sitemap 1: <url><loc>http://yoursite.com/somepage.html</loc></url>
Sitemap 2: <url><loc>http://yoursite.dk/somepage.html</loc></url>
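For reference, a minimal well-formed sitemap file for one of those hosts would look roughly like this (the page URL is just a placeholder):
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <url> entry per page on this host -->
  <url>
    <loc>http://yoursite.dk/somepage.html</loc>
  </url>
</urlset>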
I see no need to filter which sitemap is shown to the crawler. If your .htaccess is set up to redirect traffic from each TLD (top-level domain, e.g. .dk, .com, etc.) to the correct pages, then the sitemaps shouldn't be a problem.
The best solution would be a web within a web: a folder for each site on the server, with the .htaccess redirecting each domain to the right folder. In each folder you have a robots.txt and a sitemap for that specific site. That way all your problems will be gone in a jiffy; it will be just like managing 3 different sites, even though it isn't.
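A rough mod_rewrite sketch of that folder-per-domain idea might look something like this (the /se/ and /fi/ folder names are assumptions for illustration, not something from this thread):
RewriteEngine On
# Serve the Swedish domain from its own folder, which holds its own robots.txt and sitemap
RewriteCond %{HTTP_HOST} ^(www\.)?mysite\.se$ [NC]
RewriteCond %{REQUEST_URI} !^/se/
RewriteRule ^(.*)$ /se/$1 [L]
# Same idea for the Finnish domain
RewriteCond %{HTTP_HOST} ^(www\.)?mysite\.fi$ [NC]
RewriteCond %{REQUEST_URI} !^/fi/
RewriteRule ^(.*)$ /fi/$1 [L]
# Anything else (www.mysite.net) is served from the default document root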
I am no ninja with .htaccess files, but I understand the technology behind them and know what you can do with them. For a how-to guide, ask Google; that's what I always do when I need to goof around in the .htaccess. I hope that made sense.
-
Thanks for your response, René!
The thing is, we already submit the sitemaps in Google Analytics, but the SEO company we hired wants us to put the sitemaps in robots.txt as well.
The .htaccess idea sounds good, as long as Google or anyone else doesn't think we are making a cross-site submission error (as described here: http://www.sitemaps.org/protocol.php#submit_robots).
-
I see no need to use robots.txt for that; use Google's and Bing's webmaster tools. There you have each domain registered and can submit a sitemap for each of them.
If you want to make sure that your sitemaps are not crawled by a bot for the wrong language, I would set up the .htaccess to test for the entrance domain and redirect to the right file. Any bot enters a site just like a browser, so it has to obey the server; if the server tells it to go somewhere, it will.
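If everything stays in one shared document root instead of separate folders, a small sketch like the following could hand each domain its own robots.txt (the robots-se.txt and robots-fi.txt file names are assumptions for illustration):
RewriteEngine On
# When the Swedish domain asks for robots.txt, serve a Sweden-specific file that lists only the Swedish sitemap
RewriteCond %{HTTP_HOST} ^(www\.)?mysite\.se$ [NC]
RewriteRule ^robots\.txt$ /robots-se.txt [L]
# Same for the Finnish domain
RewriteCond %{HTTP_HOST} ^(www\.)?mysite\.fi$ [NC]
RewriteRule ^robots\.txt$ /robots-fi.txt [L]
# www.mysite.net keeps the default /robots.txt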
The robots.txt can't, by itself, do what you want; the server can, however. But in my opinion, using the Bing and Google webmaster tools should do the trick.