Blocking domestic Googles in robots.txt
-
Hey,
I want to block Google.co.uk from crawling a site but want Google.de to crawl it.
I know how to configure robots.txt to block Google and other engines - is there a way to block certain domestic crawlers?
Any ideas?
Thanks
B
-
Thanks guys for all of the help.
I think we will just implement cross-domain GeoIP redirects to ensure users get the right location and currency.
Cheers
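For anyone considering the same route, here is a rough sketch of what a cross-domain GeoIP redirect can look like, assuming a Python/Flask front end and the MaxMind geoip2 library; the domains, mapping, and database path are placeholders, not the poster's actual setup:

```python
import geoip2.database
import geoip2.errors
from flask import Flask, redirect, request

app = Flask(__name__)

# Placeholder path to a MaxMind GeoLite2 country database file
reader = geoip2.database.Reader("/path/to/GeoLite2-Country.mmdb")

# Placeholder mapping from country code to the matching country site
COUNTRY_SITES = {
    "GB": "www.example.co.uk",
    "DE": "www.example.de",
}

@app.before_request
def geoip_redirect():
    try:
        country = reader.country(request.remote_addr).country.iso_code
    except geoip2.errors.AddressNotFoundError:
        return None  # unknown IP: fall through to the current site
    target_host = COUNTRY_SITES.get(country)
    # Redirect only when the visitor is on the "wrong" country domain
    if target_host and request.host != target_host:
        return redirect("https://" + target_host + request.path, code=302)
    return None
```

One caveat that comes up again further down this page: Googlebot crawls mostly from US IP addresses, so an unconditional GeoIP redirect can hide the country versions from the crawler - hreflang annotations and crawlable country URLs are still needed alongside it.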
-
Are you having the issue of your .de pages ranking in .co.uk instead of your .co.uk pages?
If that's the case then I'd look towards using hreflang, both on-page and in the XML sitemaps. That is going to provide Googlebot with a better view of the country-language targeting for the site.
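To make the on-page half of that concrete, a minimal hreflang sketch (example.com with /uk/ and /de/ paths is a placeholder for the real URL structure). The tags go in the <head> of every country/language version of a page, and every version lists all the alternates, including itself:

```html
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/uk/page/" />
<link rel="alternate" hreflang="de-de" href="https://www.example.com/de/page/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/page/" />
```

The XML sitemap variant carries the same information; there is a sketch of that form further down the page.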
-
Hi, country-specific search engine spiders cannot be blocked using the robots.txt file or any other method. However, you can block certain IP ranges pertaining to certain countries.
Best regards,
Devanur Rafi
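If you do go the IP-range route, it happens at the web server or firewall level rather than in robots.txt. A minimal nginx sketch, with documentation-reserved ranges standing in for whatever country ranges you pull from a GeoIP data source:

```nginx
# Placeholder ranges; real country blocking needs a maintained GeoIP list
location / {
    deny 203.0.113.0/24;
    deny 198.51.100.0/24;
    allow all;
}
```

Bear in mind that blocking by IP also blocks any crawler that happens to crawl from those ranges, which is usually the opposite of what is wanted for SEO.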
-
Hi Gareth,
I don't think this is going to work, as every Google crawler uses the same user agent: Googlebot. What you could do - though I really wouldn't recommend it - is generate the robots.txt dynamically: check whether the requesting IP address is in another country and serve a Disallow rule accordingly. It probably still won't work, as a crawler coming from, say, Germany could also be used to build the UK index.
Also, Google collects the content first and then decides which country's users it should be served to - it is not the case that content gets crawled separately for each specific country.
Hope this helps!
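To make the first point concrete: robots.txt can only target user agents, not countries, and Google uses the same Googlebot user agent no matter which local index (google.co.uk, google.de, etc.) it is building. The finest-grained rule you can write looks like this sketch (the path is a placeholder):

```
# Applies to Googlebot everywhere, not to any single country's index
User-agent: Googlebot
Disallow: /some-section/

# All other crawlers
User-agent: *
Allow: /
```

So robots.txt can keep Googlebot out of the whole site or parts of it, but it cannot keep a page out of google.co.uk while leaving it in google.de - that is what hreflang and geotargeting signals are for.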
Related Questions
-
Advice on the right way to block country-specific users but not block Googlebot - and not be seen to be cloaking. Help please!
Hi, I am working on the SEO of an online gaming platform - a platform that can only be accessed by people in certain countries, where the games and content are legally allowed.
Example: the games are not allowed in the USA, but they are allowed in Canada.
Present situation: when a user from the USA visits the site, they get directed to a restricted-location page with the following message: "RESTRICTED LOCATION - Due to licensing restrictions, we can't currently offer our services in your location. We're working hard to expand our reach, so stay tuned for updates!" Because USA visitors are blocked, Google - which primarily (but not always) crawls from the USA - is also blocked, so the company webpages are not being crawled and indexed.
Objective / what we want to achieve: the website will have multiple region and language locations. Some of these will exist as standalone websites and others will exist as folders on the domain. Examples below:
domain.com/en-ca [English Canada]
domain.com/fr-ca [French Canada]
domain.com/es-mx [Spanish Mexico]
domain.com/pt-br [Portuguese Brazil]
domain.co.in/hi [Hindi India]
If a user from the USA or another restricted location tries to access our site, they should not have access but should get a restricted-access message.
However, we still want Google to be able to access, crawl, and index our pages. How do we do this without being penalised for cloaking, etc.? Would this approach be OK? (Please see below.) We continue as per the present situation, showing visitors from the USA a restricted message.
However, rather than redirecting these visitors to a restricted-location page, we just black out the page and show them a floating message as if it were a modal window.
Googlebot, however, would be allowed to visit and crawl the website. I have also read that it would be good to put paywall schema on each webpage to let Google know that we are not cloaking and that it is a restricted paid page. All public pages would be accessible, but only if the visitor is from a location that is not restricted. Any feedback and direction would be greatly appreciated, as I am new to this angle of SEO. Sincere thanks
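For reference, the "paywall schema" mentioned above is the structured-data pattern Google documents for paywalled/subscription content. A minimal JSON-LD sketch, assuming a hypothetical .restricted-content CSS class wraps the blocked part of the page (whether Google treats geo-restricted content the same way as paid content is exactly the open question here):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "WebPage",
  "isAccessibleForFree": "False",
  "hasPart": {
    "@type": "WebPageElement",
    "isAccessibleForFree": "False",
    "cssSelector": ".restricted-content"
  }
}
</script>
```
-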
Problem getting multilingual posts indexed on Google
Last year in June I decided to make my site multilingual. The domain is: https://www.dailyblogprofits.com/ The main language is English, and I added Portuguese and a few posts in Spanish. What has happened since then? I started losing traffic from Google, and posts in Portuguese are not being indexed. I use the WPML plugin to make it multilingual, and I had Yoast installed. This week I uninstalled Yoast, and when I type "site:dailyblogprofits.com/pt-br" into Google I have started seeing Google indexing images, but still not the missing posts. I have around 145 posts in Portuguese, but Search Console shows only 57 hreflang tags. Any idea what the problem is? I'm willing to pay an SEO expert to resolve this problem for me.
-
How to check a robots.txt file
I want to know how to check that the robots.txt file of a website is working properly. Kindly elaborate on the mechanism for this, please.
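Beyond Google Search Console's robots.txt report, one simple way to sanity-check a live robots.txt is to parse it and ask whether specific URLs are allowed for specific user agents. A small sketch using Python's standard library, with example.com as a placeholder domain:

```python
from urllib.robotparser import RobotFileParser

# Placeholder site; swap in the real domain
rp = RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()

# Ask whether particular crawlers may fetch particular URLs
print(rp.can_fetch("Googlebot", "https://www.example.com/private/page.html"))
print(rp.can_fetch("*", "https://www.example.com/index.html"))
```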
-
Redirect to 'default' or English (/en) version of site?
Hi Moz Community! I'm trying to work through a thorny internationalization issue with the 'default' and English versions of our site. We have an international set-up of: www.domain.com (in English), www.domain.com/en, www.domain.com/en-gb, www.domain.com/fr-fr, www.domain.com/de-de, and so on. All the canonicals and hreflangs are set up, except the English language version is giving me pause. If you visit www.domain.com, all of the internal links on that page (due to the current way our CMS works) point to www.domain.com/en/ versions of the pages. Content is identical between the two versions. The canonical on, say, www.domain.com/en/products points to www.domain.com/products. Feels like we're pulling in two different directions with our internationalization signals. Links go one way, canonical goes another. Three options I can see:
1. Remove the /en/ version of the site. 301 all the /en versions of pages to /. Update the hreflangs to point EN language users to the / version.
2. Redirect the / version of the site to /en. The reverse of the above.
3. Keep both the /en and the / versions, and update the links on the / version. Make it so that visitors to the / version of the site follow links that don't take them to the /en site.
It feels like the /en version of the site is redundant and potentially sending confusing signals to search engines (it's currently a bit of a toss-up as to which version of a page ranks). I'm leaning toward removing the /en version and redirecting to the / version. It would be a big step as currently - due to the internal linking - about 40% of our traffic goes through the /en path. Anything to be aware of? Any recommendations or advice would be much appreciated.
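For what it's worth, the "301 all the /en versions to /" part of option 1 is mechanically simple; a sketch in nginx terms, with a placeholder server configuration (the hreflang and canonical tags would need updating at the same time):

```nginx
# Permanently redirect every /en/ URL to its bare equivalent on /
# (a request to /en itself, without the trailing slash, would need its own rule)
location /en/ {
    rewrite ^/en/(.*)$ /$1 permanent;
}
```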
-
In the U.S., how can I stop the European version of my site from outranking the U.S. version?
I've got a site with two versions – a U.S. version and a European version. Users are directed to the appropriate version through a landing page that asks where they're located; both sites are on the same domain, except one is .com/us and the other is .com/eu. My issue is that for some keywords, the European version is outranking the U.S. version in Google's U.S. SERPs. Not only that, but when Google displays sitelinks in the U.S. SERPs, it's a combination of pages on the European site and the U.S. site. Does anyone know how I can stop the European site from outranking the U.S. site in the U.S.? Or how I can get Google to only display sitelinks for pages on the U.S. site in the U.S. SERPs? Thanks in advance for any light you can shed on this topic!
-
Google.ie returning more and more UK-based results, why?
I have discovered the most infuriating issue with Google Search for Irish users, and it seems to have been getting increasingly worse over the last 2 years or so. This is not only frustrating as a business owner (in fact it could bring a business to its knees), but it is rage-inducing as a consumer.
Google knows the location I am searching from and I'm using google.ie, yet I still get just a small number of Irish websites, usually followed by eBay and Amazon results, then a never-ending list of websites that are based in the United Kingdom. Now, I know the one thing that we all have in common is the use of the English language; however, what we don't have in common is shipping costs. In order to slightly increase the number of Irish-based companies I need to add the phrase 'Ireland' to my search (on google.ie in Ireland), and this makes only a small difference. In fact, oftentimes Google seems to throw in the odd American or Australian site just to really wind me up.
It's completely absurd that Google rarely returns results for .ie websites or Irish-based websites when searching in Ireland. Many UK companies don't ship to Ireland (including many of the eBay and Amazon results). This is killing Irish businesses who have the products and cheaper or free shipping, and many who are working damn hard on their SEO are still being passed up for companies that have nothing to do with our economy. Why oh why is this happening?
-
If I redirect based on IP, will Google still crawl my international sites if I implement hreflang?
We are setting up several international sites. Ideally, we wouldn't set up any redirects, but if we have to (for merchandising reasons, etc.) I'd like to assess what the next best option would be. A secondary option could be to implement the redirects based on IP. However, Google then wouldn't be able to access the content for all the international sites (we're setting up 6 in total) and would only index the .com site. I'm wondering whether the hreflang annotations would still allow Google to find the international sites? If not, that's a lot of content we are not fully benefiting from. Another option could be to treat the Googlebot user agent differently, but this would probably be considered cloaking by the G-Man. If there are any other options, please let me know.
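As a side note on the hreflang half of the question: the XML-sitemap form of the annotation looks roughly like this sketch (placeholder URLs). It only declares which URLs are language/country alternates of each other - Googlebot still has to be able to fetch each URL, so an unconditional IP redirect would still keep those pages out of the index:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://www.example.com/en/page/</loc>
    <xhtml:link rel="alternate" hreflang="en" href="https://www.example.com/en/page/"/>
    <xhtml:link rel="alternate" hreflang="de" href="https://www.example.com/de/page/"/>
  </url>
  <url>
    <loc>https://www.example.com/de/page/</loc>
    <xhtml:link rel="alternate" hreflang="en" href="https://www.example.com/en/page/"/>
    <xhtml:link rel="alternate" hreflang="de" href="https://www.example.com/de/page/"/>
  </url>
</urlset>
```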
-
How to rank in Google for a specific country?
Hi, I've a relatively good ranking for a specific keyword in google.com (English queries (hl=en)), but searching for the same keyword in google.com.br (Brazilian Portuguese (hl=pt-BR)), my rank for that keyword is far worse. The question is: do I need to do something specific to rank in google.com.br (hl=pt-BR)? I'm doing the regular link building: creating some blogs, blogging for 10 days before dropping my links, and creating link wheels the same way. The blogs I create to make links are written in Brazilian Portuguese; also, the blog that I'm trying to rank higher is written in Brazilian Portuguese. Sorry for the English, it's not my native language. Thanks