Best practice for redirects based on visitors' detected language
-
One of our websites has two languages, English and Italian.
The English pages are available at the root level:
www.site.com/ (English homepage)
www.site.com/page1
www.site.com/page2

The Italian pages are available under the /it/ level:

www.site.com/it (Italian homepage)
www.site.com/it/pagina1
www.site.com/it/pagina2

When an Italian visitor first visits www.site.com we'd like to redirect them to www.site.com/it, but we don't know if that would impact search engine spiders (e.g. GoogleBot) in any way...
Would it be better to do a JavaScript redirect, or an HTTP 3xx redirect? If so, which of the 3xx redirects should we use?
Thank you
-
We've adopted the following solution:
We show the English homepage, but determine the user's preferred language from the Accept-Language header sent by the browser. If our site supports that language, we show a temporary balloon highlighting the link to the localized homepage.
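For illustration, here is a minimal classic-ASP sketch of that approach (assuming IIS/ASP as in the code further down this thread; the variable names and balloon markup are ours, not from the original post):

<%
' Always render the English homepage; only flag supported languages so the
' page can offer a link instead of forcing a redirect.
Dim acceptLang, userLang, showItalianBalloon
acceptLang = LCase(Request.ServerVariables("HTTP_ACCEPT_LANGUAGE")) ' e.g. "it-it,it;q=0.9,en;q=0.8"
userLang = Left(acceptLang, 2) ' primary two-letter language code
showItalianBalloon = (userLang = "it")
%>
<% If showItalianBalloon Then %>
<div class="lang-balloon">Questa pagina è disponibile anche in <a href="/it">italiano</a>.</div>
<% End If %>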
Thank you all for your hints and notes.
-
I would stay away from JavaScript redirects, as they can be considered cloaking. The best thing to do is to detect new visitors (those without your cookie) and send them to a page that lets them choose what language they want. You can then set a cookie so that when they return they are automatically directed to the right site.
By not doing any sneaky JavaScript or IP-based redirects, you allow Google to crawl all the pages of your site, which improves indexing, trust, etc. Also, I would go into Google Webmaster Tools and specify the country your /it pages are targeted at. This will help with international search and trust from Google.
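A rough sketch of that cookie approach, in classic ASP to match the rest of the thread (the cookie name "lang" and the chooser URL are illustrative assumptions, not from the original post):

<%
' First visit: no language cookie yet, so send the user to a chooser page.
Dim chosenLang
chosenLang = Request.Cookies("lang")
If chosenLang = "" Then
  Response.Redirect "/choose-language.asp"
ElseIf chosenLang = "it" Then
  ' Returning visitor who picked Italian last time.
  Response.Redirect "/it"
End If
' Otherwise fall through and serve the English page.

' On the chooser page, persist the selection for roughly a year:
' Response.Cookies("lang") = "it"
' Response.Cookies("lang").Expires = Date + 365
%>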
-
I've done a test with a simple ASP page containing a Response.Redirect:

<% Response.Redirect "test.htm" %>
This is what Fiddler caught:

HTTP/1.1 302 Object moved
Server: Microsoft-IIS/5.1
Date: Thu, 05 May 2011 06:44:10 GMT
X-Powered-By: ASP.NET
Location: test.htm
Content-Length: 121
Content-Type: text/html
Cache-control: private

<title>Object moved</title>
Object Moved
This object may be found <a href="">here</a>.
I don't think a 302 is the best solution here. As specified in the HTTP spec (http://www.w3.org/Protocols/rfc2616/rfc2616-sec10.html), wouldn't we prefer a 307 Temporary Redirect?
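If you do want a 307, note that Response.Redirect in classic ASP always emits the 302 shown above; you would have to write the status line and Location header yourself. A sketch (illustrative, using only standard Response methods):

<%
' Send a 307 Temporary Redirect by hand, since Response.Redirect is hardwired to 302.
Response.Status = "307 Temporary Redirect"
Response.AddHeader "Location", "/it"
Response.End
%>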
Thank you
-
You also asked which 30x redirect to use. I'm also looking for this answer. We currently use an ASP header redirect. I don't think this is best, but I'm not sure a 301 redirect can be used. I'd like to hear from others too.
This is what we have now:
<%
' Read the browser's Accept-Language header, e.g. "fr-FR,fr;q=0.9,en;q=0.8"
lang = Request.ServerVariables("HTTP_ACCEPT_LANGUAGE")
' Keep only the primary two-letter language code
real_lang = Left(lang, 2)
'Response.Write real_lang ' debug line, left commented out

Select Case real_lang
  Case "en"
    Response.Redirect "/en"
  Case "fr"
    Response.Redirect "/fr"
  Case "de"
    Response.Redirect "/ge" ' German section lives at /ge on this site
  Case Else
    Response.Redirect "/en"
End Select
%>
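One note on the 301 question: Response.Redirect always sends a 302, and for a per-user language redirect that is arguably correct, since a 301 can be cached as permanent and would pin later visitors behind the same browser or proxy to one language. If you keep this approach, a small refinement (our suggestion, not in the original code) is to declare that the response varies by browser language:

<%
' Illustrative addition: hint to caches that the redirect target depends on
' the Accept-Language header. Place this before the Response.Redirect calls.
Response.AddHeader "Vary", "Accept-Language"
%>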
-
"They automatically redirect people in the UK who type in www.google.com to www.google.co.uk"

But this is different from changing the language on a visitor. I'm not sure what Google would do if I was in Italy and used my American laptop to visit google.com. I don't think they'd switch me to www.google.it, but maybe someone else knows the answer.

Using the browser language settings has worked well for us.
-
You might want to look into what Google do themselves.
They automatically redirect people in the UK who type in www.google.com to www.google.co.uk.
If it's good enough for Google, it's good enough for us. Just make sure you do not look like you are cloaking.
You need to give users the ability to change language when they are on the website, though. As Vince mentioned, just because a user is visiting the website from Italy does not mean that they are Italian.
-
Hi Damiano,
I do a redirect based on browser language. I'd stay away from IP/location-based redirects. You can have English visitors in Italian locations who would be lost on your Italian pages.
hth,
Vince
-
Hi Damiano,
Matt explains this very well in this video, and it basically answers all your questions.
If you have additional questions, please let me know.