Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
Advice on the right way to block country-specific users but not block Googlebot - and not be seen to be cloaking. Help please!
-
Hi,
I am working on the SEO of an online gaming platform - a platform that can only be accessed by people in certain countries, where the games and content are legally allowed.
Example: the games are not allowed in the USA, but they are allowed in Canada.
Present Situation:
Presently, when a user from the USA visits the site they are directed to a restricted-location page with the following message:
RESTRICTED LOCATION
Due to licensing restrictions, we can't currently offer our services in your location. We're working hard to expand our reach, so stay tuned for updates!
Because USA visitors are blocked, Google, which primarily (but not always) crawls from the USA, is also blocked, so the company's webpages are not being crawled and indexed.
Objective / What we want to achieve:
The website will have multiple region and language locations. Some of these will exist as standalone websites and others will exist as folders on the domain. Examples below:
domain.com/en-ca [English Canada]
domain.com/fr-ca [French Canada]
domain.com/es-mx [Spanish Mexico]
domain.com/pt-br [Portuguese Brazil]
domain.co.in/hi [Hindi India]
If a user from the USA or another restricted location tries to access our site, they should not have access but should see a restricted-access message.
However, we still want Google to be able to access, crawl, and index our pages.
Can anyone suggest how we do this without being penalised for cloaking, etc.?
Would this approach be ok? (please see below)
We continue as in the present situation, showing visitors from the USA a restricted message.
However, rather than redirecting these visitors to a restricted-location page, we just black out the page and show them a floating message, as if it were a modal window.
Googlebot, meanwhile, would be allowed to visit and crawl the website.
I have also read that it would be good to put paywall schema on each webpage to let Google know that we are not cloaking and it's a restricted paid page. All public pages are accessible, but only if the visitor is from a location that is not restricted.
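A rough sketch of the overlay idea we have in mind (the country list and the detection hook are purely illustrative, not our real implementation; in practice the country would come from a server-side IP lookup, never from user-agent sniffing):

```javascript
// Everyone, including Googlebot, receives the same HTML; for visitors whose
// detected country is restricted, an overlay is layered on top visually.
// The restricted list below is an example, not the real licence map.

const RESTRICTED_COUNTRIES = ['US'];

function isRestricted(countryCode, restricted = RESTRICTED_COUNTRIES) {
  return restricted.includes((countryCode || '').toUpperCase());
}

function showRestrictionOverlay() {
  // The page content stays in the DOM (crawlable); the overlay only covers it.
  const overlay = document.createElement('div');
  overlay.setAttribute('role', 'dialog');
  overlay.style.cssText =
    'position:fixed;inset:0;background:rgba(0,0,0,.85);color:#fff;' +
    'display:flex;align-items:center;justify-content:center;z-index:9999';
  overlay.textContent =
    "Due to licensing restrictions, we can't currently offer our services in your location.";
  document.body.appendChild(overlay);
}

// Example wiring (browser only; window.visitorCountry is a hypothetical
// value injected by the server from its geo-IP lookup):
if (typeof document !== 'undefined' && isRestricted(window.visitorCountry)) {
  showRestrictionOverlay();
}
```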
Any feedback and direction would be greatly appreciated, as I am new to this angle of SEO.
Sincere thanks,
-
To ensure SEO compliance while restricting access in certain countries, follow these three steps; they are critical if you are working on a multinational, multilingual site:
Page Blackout for Restricted Visitors: Instead of redirecting users, blackout the content and display a message. For example, https://fifamobilefc.com/ shows a message to users from restricted countries while allowing Google to crawl the pages.
Implement Paywall Schema: Use paywall schema markup to signal to Google that content is restricted but not cloaked. This helps maintain transparency with search engines.
Geo-Targeting: Employ geo-targeting to identify and present the message to users from restricted countries, while still allowing Google to access the content.
By applying these methods, you can maintain SEO compliance while effectively restricting access to users from certain countries. Regular monitoring via Google Search Console ensures continued adherence to best practices.
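As a reference point, Google's documented paywalled-content markup is built around `isAccessibleForFree` and a `hasPart` CSS selector; a sketch of how it could be generated is below. The `.geo-restricted` selector and the headline are placeholders, and note that a later reply in this thread questions whether paywall markup fits geo-restriction at all.

```javascript
// Builds the JSON-LD object Google documents for paywalled content,
// serialised for embedding in a <script type="application/ld+json"> tag.
// The cssSelector must match the element that holds the gated content.

function buildRestrictionSchema(headline) {
  return {
    '@context': 'https://schema.org',
    '@type': 'WebPage',
    headline,
    isAccessibleForFree: 'False',
    hasPart: {
      '@type': 'WebPageElement',
      isAccessibleForFree: 'False',
      cssSelector: '.geo-restricted', // hypothetical class for the gated section
    },
  };
}

const jsonLd = JSON.stringify(buildRestrictionSchema('Games page'), null, 2);
// Embed in the page <head> as:
// <script type="application/ld+json">...jsonLd...</script>
```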
-
@MarkCanning By blacking out the page for visitors from restricted locations while allowing Googlebot access, you're ensuring compliance without hindering indexing. Implementing paywall schema can further clarify to Google that the restriction is based on licensing rather than cloaking. Just ensure consistent implementation across all restricted pages and adhere to Google's guidelines to avoid any issues.
-
@George_Inoriseo Hi George, I submitted a previous reply on here but can't see it anywhere.
Firstly thank you for your feedback. I have some extra questions.
Let's assume we have a Canadian version of the website and a US human visitor tries to visit that site or any page on it. They should be able to browse to the site, but an overlay would appear, meaning they cannot use the site or proceed any further. The overlay would say the site is restricted in their location. I see other companies doing this. How would Google handle this:
- Could it proceed to crawl the website, or would the JavaScript overlay prevent Googlebot from crawling and indexing?
- If Googlebot were to compare the hash of the page it sees against the hash of what a user sees, would they be the same? I believe a big difference between the hashes is a signal for cloaking, because it shows the information / page size is substantially different.
- Would it be wise to avoid user-agent lookups in the code? Again, I believe this can signal to Google that manipulation is taking place.
I heard from a Google official that paywall schema might not be a great method:
"Paywall markup would not be suited here since there's no log-in or payment that can be done to get access when in the wrong country."
Thanks
-
@George_Inoriseo thanks very much George.
The website will have a .com domain and then subfolders will branch off that for different countries / languages. So the structure would be like this:
domain.com
domain.com/en-ca (English Canada)
domain.com/fr-ca (French Canada)
The company has licenses for certain countries, and in countries where it doesn't have a license to operate (e.g. the USA), users visiting our sites from those countries should not be able to play. So on our Canadian website, if we detect a user is from the USA (where we don't have a license), the user should get a message telling them they can't play. They should be able to visit the site, but the website would sniff the location, black out the page, and tell them that they can't play.
As you suggested, we could have a JavaScript overlay that loads if the user is from the USA. I assume this would only look at the geolocation and not the user agent? Looking up the user agent would be a clear sign we are doing something different for users versus Googlebot, would it not? Would an overlay restrict Googlebot from crawling the site, and because the user is seeing something different to Googlebot, could this be perceived as cloaking?
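A rough sketch of what I mean by geolocation-only detection, with no user-agent checks anywhere (the `lookupCountry` stub, the IP prefix, and the licensed-country list are all made up for illustration):

```javascript
// Every visitor, Googlebot included, gets the same page markup; the only
// thing that varies is a flag derived from IP geolocation. A real site
// would replace lookupCountry with a geo-IP database or a CDN-provided
// country header.

function lookupCountry(ip) {
  // Hypothetical stub: pretend 203.* addresses are US, everything else CA.
  return ip.startsWith('203.') ? 'US' : 'CA';
}

function renderPage(country, licensedCountries = ['CA', 'MX', 'BR', 'IN']) {
  const blocked = !licensedCountries.includes(country);
  // No user-agent branching: content is identical for all visitors,
  // and `blocked` only toggles the client-side overlay.
  return `<html><body data-blocked="${blocked}">...full page content...</body></html>`;
}
```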
I spoke to someone at Google regarding paywall schema and the feeling was this: "paywall markup would not be suited since there is no log-in or payment that can be done to get access when in the wrong country".
Thanks again George.
-
@MarkCanning here is what I would do:
Avoid Redirects for Blocked Regions: Instead of redirecting users from blocked regions to a different page, use a client-side overlay (like a modal window) to display the restricted access message. This method keeps all users on the same URL.
Implement Paywall Schema: Applying the paywall schema is a smart move. It informs Google that your content restrictions are based on user location, not pay-to-access barriers, which helps avoid penalties for cloaking.
Ensure Accessible Content for Googlebot: Allow Googlebot to crawl the original content. Ensure that your site’s robots.txt file permits Googlebot to access the URLs of region-specific pages.
Use hreflang Tags for Multi-Region Sites: For multiple language and region versions, use hreflang tags to help Google understand the geographic and language targeting of your pages. This will also prevent duplicate content issues.
Monitor and Adapt: Keep an eye on Google Search Console to monitor how these changes affect your site's indexing and adjust your strategies as needed.
This strategy should help you manage SEO for restricted content effectively, while staying compliant with Google’s guidelines.
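The hreflang step above could be sketched using the folder structure described in the question (the hostnames and paths come from the question; the self-referencing alternates and the x-default fallback are standard hreflang practice, with domain.com assumed as the default):

```javascript
// Generates the set of hreflang <link> alternates for the region/language
// versions listed in the question. Every version should carry the full set,
// including a self-reference, plus an x-default fallback.

const LOCALES = [
  { hreflang: 'en-CA', url: 'https://domain.com/en-ca/' },
  { hreflang: 'fr-CA', url: 'https://domain.com/fr-ca/' },
  { hreflang: 'es-MX', url: 'https://domain.com/es-mx/' },
  { hreflang: 'pt-BR', url: 'https://domain.com/pt-br/' },
  { hreflang: 'hi-IN', url: 'https://domain.co.in/hi/' },
];

function hreflangTags(locales = LOCALES) {
  const tags = locales.map(
    ({ hreflang, url }) =>
      `<link rel="alternate" hreflang="${hreflang}" href="${url}" />`
  );
  // x-default catches visitors whose language/region has no dedicated version.
  tags.push('<link rel="alternate" hreflang="x-default" href="https://domain.com/" />');
  return tags.join('\n');
}
```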
Best of luck!