Two divisions, same parent company, identical websites
-
A client of mine has intentionally built two websites with identical content; both divisions sell the same product. One is an 80-year-old local brand that is well known; the other is a new national brand working to expand. The old and new divisions cannot be marketed as a single company for legal reasons. My life would be simple if the rules for distinguishing between nations could apply, but all I have is city X and the U.S. I understand there is no penalty for duplicate content per se, but I need to say to Google, "If the searcher is in city X, serve content X; if not, serve the U.S. content." Both sites have atrocious DA, and from what GA tells me, the national content appears to have never been served in a SERP in three years. I've been asked to improve visibility for both sites.
-
Hi, Katarina! Thanks for this very thorough response - I'm beginning to see a light at the end of the tunnel. When you say stress the address via directories, you are referring to making sure my external listings and directories are current, consistent, and correct, yes? Just confirming you are not recommending something internal to the site? We are writing out driving directions where possible, and using the Google Maps API to display the location.
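For anyone finding this later, here is a minimal sketch of the kind of map embed we mean, assuming the Google Maps Embed API's "place" mode; the API key, address, and function name are placeholders rather than our real details.

```python
# Minimal sketch: build a Google Maps Embed API iframe for a local office page.
# Assumes the Embed API "place" mode; API_KEY and the address are placeholders.
from urllib.parse import urlencode

API_KEY = "YOUR_MAPS_EMBED_API_KEY"  # placeholder, not a real key

def maps_embed_iframe(address: str, width: int = 600, height: int = 450) -> str:
    """Return an <iframe> snippet that shows `address` on an embedded map."""
    query = urlencode({"key": API_KEY, "q": address})
    src = f"https://www.google.com/maps/embed/v1/place?{query}"
    return (
        f'<iframe width="{width}" height="{height}" style="border:0" '
        f'loading="lazy" allowfullscreen src="{src}"></iframe>'
    )

print(maps_embed_iframe("123 Main St, City X, USA"))  # placeholder address
```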
Also, we won't have unique images for the products - I might be able to do something to edit them differently, but they are the same thing. Will naming them uniquely matter?
For the rest, we are writing, writing, writing! The client had no idea their former developer (yup, they paid someone to do this to them) had done a bad thing. When I first read their GA and Moz data - before we really dove into the content on each page and realized it had literally been pasted from one site to the other - I thought the data had to be wrong, ha!
We're pursuing the suggestion about unique content, and think we have a way to create enough of it to matter. Thanks for taking the time to answer. I will try to post some before and after scores when we are done.
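In case it's useful to anyone comparing two sites the same way, here is a rough sketch of how we could score the text overlap between matching pages using only Python's standard library. The URLs are placeholders, and the crude tag stripping is just an illustration of a before/after metric, not anything Google actually measures.

```python
# Rough sketch: estimate how much text two "matching" pages share, so we can
# track before/after scores as we rewrite. Standard library only.
# The URLs are placeholders; a real run would want better HTML cleanup.
import difflib
import re
from urllib.request import urlopen

def page_text(url: str) -> str:
    """Fetch a page and strip tags very crudely, leaving visible-ish text."""
    html = urlopen(url, timeout=10).read().decode("utf-8", errors="ignore")
    html = re.sub(r"(?s)<(script|style).*?</\1>", " ", html)  # drop scripts/css
    text = re.sub(r"<[^>]+>", " ", html)                       # drop remaining tags
    return re.sub(r"\s+", " ", text).strip().lower()

def duplicate_ratio(url_a: str, url_b: str) -> float:
    """Return a 0-1 similarity score between the visible text of two pages."""
    return difflib.SequenceMatcher(None, page_text(url_a), page_text(url_b)).ratio()

if __name__ == "__main__":
    score = duplicate_ratio(
        "https://www.local-brand-example.com/product-x",      # placeholder URL
        "https://www.national-brand-example.com/product-x",   # placeholder URL
    )
    print(f"Approximate duplicated text: {score:.0%}")
```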
-
Hi,
When you say 2 websites - are they completely different domains? In that case you need to rewrite the content. I cannot see how different images alone would tell Google there isn't another identical website, or a website with 90% duplicated content.
I would suggest the following:
1. Keep the product names the same (unless you are allowed to change them) but make sure your images and descriptions are different.
2. Add completely different testimonials, reviews and case studies
3. Add completely different About us/Meet the team pages
4. Differentiate as much of the content as you can and add extra sections where unique content can be added.
5. Don't replicate your backlinking strategy
6. Based on the areas targeted, find out how effective geo redirects would be (a rough sketch follows below)
7. Stress the address/location targeted via content, directories, G Maps
Simply flood the websites with unique content; change, or at least reword, whatever can be reworded. Keep the percentage of duplicated content as low as you can.
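For point 6, here is a very rough sketch of what a geo redirect could look like. It assumes a Flask front end and a hypothetical lookup_city() helper backed by whatever IP-geolocation service you use, and the site URLs are placeholders - treat it as an illustration only, and be careful not to serve crawlers differently from users.

```python
# Rough illustration of point 6: send visitors near City X to the local site,
# everyone else to the national site. Assumes a Flask app and a hypothetical
# lookup_city(ip) helper backed by whichever IP-geolocation service you use.
# Be careful not to treat crawlers differently from users (cloaking risk).
from flask import Flask, redirect, request

app = Flask(__name__)

LOCAL_SITE = "https://www.local-brand-example.com"        # placeholder
NATIONAL_SITE = "https://www.national-brand-example.com"  # placeholder

def lookup_city(ip: str) -> str:
    """Hypothetical helper: map an IP to a city via your geo-IP provider."""
    raise NotImplementedError("plug in your IP-geolocation service here")

@app.route("/")
def choose_site():
    # Prefer the forwarded header when behind a proxy; fall back to remote_addr.
    ip = request.headers.get("X-Forwarded-For", request.remote_addr or "").split(",")[0].strip()
    try:
        city = lookup_city(ip)
    except Exception:
        city = ""  # if the lookup fails, fall back to the national site
    target = LOCAL_SITE if city.lower() == "city x" else NATIONAL_SITE
    return redirect(target, code=302)
```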
I hope this helps. Challenging. Good luck!
Katarina
-
Thank you! When I add photos, should I name them with locations in mind? Or are you saying that by having different photos, the search engines will recognize different content?
Also - the employees and leadership are the same, and even the external partners are the same. But I could be careful about how employee bios are added, so the content is unique on each site rather than duplicated - that's a good source of unique content if I plan carefully and keep it in mind. Thank you!
Driving directions are written out on the local site (the national site is digital-only), but I am thinking I might be able to reference a location in the testimonials, or the home city of the person offering each testimonial.
-
My friends, this is a big challenge for you. As MichaelAMG mentioned, if you do not take care with the content, the two sites will hurt each other. Here are some tips multi-location businesses can use to improve their location pages:
1. Use testimonials
2. Write out driving directions
3. Create employee bios
4. Add photos
-
You feel my pain! LOL, thanks. We are trying to rewrite content now, but their product offering (how they name their products, describe them, etc.) is IDENTICAL. The business partners they link to, and how they describe those offers, are IDENTICAL. The most I can hope for is to never mention the parent organization's city on the national site, EVER, and to mention it A LOT on the city-based site. We are hoping a top-level blog with posts containing lots of city-based vs. national keywords will help some, too. Do you think that if I pair weekly geo-sensitive blog posts with improved geo-sensitive page content, I will have a chance of defining separate content for "near me" geolocation purposes? We are working on robust on-page content with the proper geolocation keyword references now.
-
That sounds rough. What you will want to do is alter the content on your single-city website to reflect that you serve that city; then, when Google is looking for a match for a person near that city, it should see that site as the best match due to the weight it puts on geolocation. In the long run, you will want to rewrite all of the content on one site so that your two sites will not be hurting each other or look like copy/paste spam sites.
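One optional extra, not mentioned above, that can make the city signal explicit on the local site is LocalBusiness structured data. Here is a minimal sketch that just prints the JSON-LD, with placeholder business details throughout.

```python
# Minimal sketch: generate LocalBusiness JSON-LD for the city-based site so the
# address and served city are machine-readable. All details are placeholders.
import json

local_business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Local Brand",             # placeholder name
    "url": "https://www.local-brand-example.com",
    "telephone": "+1-555-555-0100",            # placeholder phone
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "City X",           # the city you want to rank for
        "addressRegion": "ST",
        "postalCode": "00000",
        "addressCountry": "US",
    },
    "areaServed": "City X",
}

# Paste the output into the local site's <head> inside a
# <script type="application/ld+json"> tag.
print(json.dumps(local_business, indent=2))
```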