Best way to indicate multiple Lang/Locales for a site in the sitemap
-
So here is a question that may be obvious, but I'm wondering if there is some nuance here that I may be missing.
Question: Consider an ecommerce brand that has multiple sites around the world, all variations of the same thing, just in different languages. Now let's say some of these exist on the normal .com domain while others exist on different ccTLDs. When you build out the XML sitemap for these sites, especially the ones on the other ccTLDs, we want to make sure that using
<loc>http://www.example.co.uk/en_GB/</loc>
<xhtml:link rel="alternate" hreflang="en-AU" href="http://www.example.com.au/en_AU/" />
<xhtml:link rel="alternate" hreflang="en-NZ" href="http://www.example.co.nz/en_NZ/" />
would be the correct way of doing this. I know I have to change this for each different ccTLD, but it just looks weird when you start putting about 10-15 different language/locale variations as alternate links. I guess I am just looking for a bit of reaffirmation that I am doing this right.
Thanks!
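(For context, a rough sketch of how an entry like this sits inside a complete sitemap file, assuming the xhtml namespace is declared on the urlset element and each version also lists a self-referencing alternate; the URLs are the example ones from the question:)

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://www.example.co.uk/en_GB/</loc>
    <xhtml:link rel="alternate" hreflang="en-GB" href="http://www.example.co.uk/en_GB/" />
    <xhtml:link rel="alternate" hreflang="en-AU" href="http://www.example.com.au/en_AU/" />
    <xhtml:link rel="alternate" hreflang="en-NZ" href="http://www.example.co.nz/en_NZ/" />
  </url>
</urlset>

Each localised URL would get its own <url> entry carrying the same full set of alternates, and each ccTLD's sitemap typically lists only the URLs on its own host.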
-
Yes, you are doing the right thing. You may also want to look at including the hreflang tags in the <head> of each page as well.
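(For reference, the equivalent annotations placed in the <head> of each page would look roughly like this, using the same example URLs; every version lists all the alternates, including itself:)

<link rel="alternate" hreflang="en-GB" href="http://www.example.co.uk/en_GB/" />
<link rel="alternate" hreflang="en-AU" href="http://www.example.com.au/en_AU/" />
<link rel="alternate" hreflang="en-NZ" href="http://www.example.co.nz/en_NZ/" />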
-
Maybe the best solution is to use a tool like this one by MediaFlow: http://www.themediaflow.com/resources/tools/href-lang-tool/.
You feed the tool a .csv file and it returns a sitemap.xml with all the hreflang annotations included.
Related Questions
-
Moving to a new site. Should I take old blog posts with me?
Our company website has needed a complete overhaul for some time now, and the new one is almost ready to go live. We also have a separate "news" site that houses around 800 blog posts and news items. (That news site will be thrown away because it's on a completely different domain and causes confusion.) So we have a main site with about 100 decent blog posts and a separate news site with 800 poor posts. I plan on bringing all the main site blog posts over to the new site (both WordPress), but my question is whether or not to bring over the news site posts? All, a handful, none? Another issue is that the news site doesn't have Google Analytics, so I'm not sure if any posts actually generate traffic, but I can see from the main site that we do get some referrals from it. As far as quality of content goes, it's poor. Not sure who wrote it all, but it's mainly text press releases that aren't very interesting. Is it worth bringing over for SEO purposes, or should I simply delete the site and create a mass redirect so all of those pages point to the new website's blog page? Any help is greatly appreciated.
Web Design | | codyfrew0 -
How long should an old site redirecting to a new site remain activated on a server?
Once I switch a site to a new domain (with redirects pointing to the corresponding pages), will I have to keep the old site live forever for those redirects to work, or how long should I wait before I deactivate the old site on our server?
Web Design | | jwanner0 -
301 Redirect all pictures when moving to a new site?
We have 30,000 pictures on our site. Moz occasionally returns 404s on some of them, but Google seems to ignore those. Should I 301 redirect all of those images when we move to a new site layout? I'd appreciate your views!
Web Design | | Discountvc0 -
Sitemap Update Frequency?
Hello, my question today is regarding sitemaps. I'm often confused by this, and because I am a bit obsessive I believe I may be giving myself more work than needed. Basically, my question is: do I need to update and/or regenerate my sitemap every time I make a change to the site? I mean, I must have to if I add a page, correct? And in Google's Webmaster Tools, do I just delete the current sitemap and re-upload a new one for Google to crawl? Is it possible to overdo this? Any sitemap suggestions would be fantastic. I feel like there have been a few weeks where I've updated the sitemap daily and resubmitted it, and I worry that might be hurting my site. Thanks!
Web Design | | jesse-landry0 -
Redirects (301/302) versus errors (404)
I am not able to decide convincingly between using redirects and returning 404 errors. People are giving varied opinions. Here are my cases:
1. Coding errors - we put out a bad link.
a. Some people say redirect to the home page: the user at least has something to do, plus, more importantly, it does not hurt your SEO ranking.
b. Counter-argument: the page isn't there, so return a 404.
2. Product removed - link1 to product1 was out there. We removed product1, so link1 is also gone. It is either sitting in people's bookmarks, or, because of coding errors, we left it hanging around in some places on our site.
Web Design | | proptiger0 -
How will it affect my site if I link to a site with adult content?
We are currently working on creating two sites for a company: one with no adult content, one with adult content. Will it affect the non-adult-content site if I link to the other one, in terms of Google and of being blocked by some internet providers?
Web Design | | MattWheatcroft0 -
Best method to stop crawler access to extra Nav Menu
Our shop site has a 3-tier drop-down mega-menu, so it's easy to find your way to anything from anywhere. It contains about 150 links and probably 300 words of text. We also have a more context-driven single layer of sub-category navigation as well as breadcrumbs on our category pages. You can get to every product and category page without using the drop-down mega-menu. Although the mega-menu is a helpful tool for customers, it means that every single page in our shop has an extra 150 links on it that go to stuff that isn't necessarily related or relevant to the page content. This means that when viewed from the context of a crawler, rather than a nice tree-like crawling structure, we've got more of an unstructured mesh where everything is linked to everything else. I'd like to hide the mega-menu links from being picked up by a crawler, but what's the best way to do this? I can add a nofollow to all mega-menu links, but are the links still registered as page content even if they're not followed? It's a lot of text if nothing else. Another possibility we're considering is to set the mega-menu to only populate with links when its main button is hovered over, so it's not part of the initial page-load content at all. Or we could use a crude yet effective system we have used for some other menus: base-encoding the content inline so it's not readable by a spider. What would you do, and why? Thanks, James
Web Design | | DWJames0 -
XML Sitemap that updates daily/weekly?
Hi, I have a sitemap on my site that updates, but it isn't an XML sitemap. See here: http://www.designerboutique-online.com/sitemap/ I have used some free software to crawl the site and create a sitemap of pages; however, I think that if I were to upload that sitemap, it would be out of date as soon as I listed new products on the site, so I would need to rerun it. Does anyone know how I can get this to refresh daily or weekly? Or any software that can do it? I have a web firm that is willing to do one, but our relationship is at an all-time low and I don't want to hand over £200 for them to do it. Anyone with any ideas or advice? Thanks Will
Web Design | | WillBlackburn0