Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Duplicate content on ecommerce sites
-
I just want to confirm something about duplicate content.
On an eCommerce site, if the meta-titles, meta-descriptions and product descriptions are all unique, yet a big chunk at the bottom (featuring "why buy with us" etc) is copied across all product pages, would each page be penalised, or not indexed, for duplicate content?
Does the whole page need to be a duplicate for this to be a concern, or would this large chunk of text, bigger than the product description, have an effect on the page?
If this would be a problem, what are some ways around it? Because the content is quite powerful, and is relevant to all products...
Cheers,
-
Yes, duplicate content can harm your e-commerce site. It can confuse search engines and make it harder for your pages to rank well. Here are some simple ways to deal with it:
Use Canonical Tags: This tells search engines which version of a page is the main one.
Unique Product Descriptions: Try to write unique descriptions for each product, even if they are similar.
Noindex, Follow Tags: For pages that you don't want indexed, use these tags to prevent search engines from listing them. For a full guide on handling duplicate content, check out this blog: https://www.resultfirst.com/blog/ecommerce-seo/how-to-handle-duplicate-content-on-your-ecommerce-site/
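As a concrete sketch of the first and third suggestions (the URL below is a placeholder, not from the thread), both tags go in the page's `<head>`:

```html
<!-- On a duplicate or variant page, tell search engines which version
     is the main one. The href is a placeholder; use your preferred URL. -->
<link rel="canonical" href="https://www.example.com/products/main-product" />

<!-- Or, to keep a page out of the index while still letting crawlers
     follow its links, use a robots meta tag: -->
<meta name="robots" content="noindex, follow" />
```

Note that canonical tags are a hint rather than a directive, so search engines may ignore them if the pages differ significantly.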
I hope it will be helpful for you.
-
@Dr-Pete Thanks, exactly what I was looking for. Thank you very much!
-
With the caveat that this is a 7-yo thread -- I'd say that it's generally more of a filter these days (vs. a Capital-P penalty). The OEM or large resellers are almost always going to win these battles, and you'll be at a disadvantage if you duplicate their product descriptions word-for-word.
Can you still rank? Sure, but you're going to have an easier time if you can add some original value. If you aren't allowed to modify the info, is there anything you can add to it -- custom reviews (not from users, but say an editorial-style review), for example? You don't have to do it for thousands of products. You could start with ten or 25 top sellers and see how things go.
-
What do you suggest as a solution if you are a reseller of a product and you are using the same descriptions (measurements, characteristics, etc.)? Especially if your wholesaler demands that you not alter the titles and descriptions.
-
So you are saying that all resellers selling, for example, an X model of sports shoe will get penalised because they are using the same description? Test: take a phrase or a paragraph from the most authoritative brand and paste it into Google. You will get results from other resellers. They don't actually look "penalized" if you check their PA scores...
-
I'm going to generally agree with (and thumb up) Mark, but a couple of additional comments:
(1) It really varies wildly. You can, with enough duplication, make your pages look thin enough to get filtered out. I don't think there's a fixed word-count or percentage, because it depends on the nature of the duplicate content, the non-duplicate content, the structure/code of the page, etc. Generally speaking, I would not add a long chunk of "Why Buy With Us" text - not only is it going to increase duplicate-content risks, but most people won't read it. Consider something short and punchy - maybe even an image or link that goes to a site with a full description. That way, most people will get the short message and people who are worried can get more details on a stand-alone page. You could even A/B test it - I suspect the long-form content may not be as powerful as you think.
(2) While duplicate content is not "penalized" in the traditional sense, the impact of it can approach penalty-like levels since the Panda updates.
(3) Definitely agreed with Mark that you have to watch both internal and external duplication. If you're a product reseller, for example, and you have a duplicate block in your own site AND you duplicate the manufacturer's product description, then you're at even more risk.
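To illustrate point (1) above, the long repeated block could be replaced with a short teaser that links to a single stand-alone page holding the full copy (the class name, wording, and URL here are made up for the example):

```html
<!-- Short, non-duplicated blurb on each product page; the full
     "Why Buy With Us" copy lives once at its own stand-alone URL. -->
<p class="why-buy-teaser">
  Free returns, a 2-year warranty, and expert support on every order.
  <a href="/why-buy-with-us">Learn more about why to buy with us</a>
</p>
```

This keeps the duplicated footprint on each product page to a sentence or two, and the stand-alone page can then be A/B tested against the long-form version.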
-
James - Great question. Let me provide a little guidance; we help manage SEO for a bunch of ecommerce sites.
I'm going to lump several of Google's "focus areas" together: duplicate content, shallow content, and copied duplicate content. On an ecommerce site, all three can be the same or interchangeable thing. Here are the major issues to focus on:
A lot of ecommerce sites, in the past, were able to generate substantial SEO value by listing products in variations of sizes and colors with brief descriptions, creating thousands of pages of what used to be considered unique content (shallow content). THOSE DAYS ARE GONE. Assuming you still have the standard information copied and pasted on every page, as you mention above, ideally you want 250 unique words of description per product. At a bare minimum you should have 100 words. In addition to the on-page content, make sure your meta descriptions are unique. Remember, unique means relevant content that is different.
With duplicate content issues, Google isn't penalizing you to hurt your ranking, but they will only give SEO value to the page they think is unique. For example, if you have 40 pages of the same product with small variations in color, size, or SKU, and little to differentiate the pages, they will count those 40 pages as one page. You lose the opportunity to build 39 pages of unique content value.
The last thing to be careful of is carrying products that other companies also have (you are a distributor, supplier, or wholesaler, not the manufacturer). The manufacturer posts standard info, and a bunch of resellers copy and use it. YOU WILL BE PENALIZED BY GOOGLE FOR THIS BECAUSE IT IS COPIED DUPLICATE CONTENT.
Most important point to re-emphasize: you know you are going to have some duplicate content on a website, and it is likely that if you are selling different variations of the same product, you will have a lot of the same stuff. Again, make sure you have unique and different content focused on your keywords. Target at least 50% different or unique content on each page as a MINIMUM. Hope this helps. Mark