Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
How does Google recognize original content?
-
Well, we wrote our own product descriptions for 99% of the products we have. They are all descriptive, with at least 4 bullet points highlighting each product's best features so shoppers don't have to read the whole description. So instead of using the manufacturer's description, we spent $$$$ working with a copywriter, and we still do the same thing whenever we add a new product to the website.
However, since we send a product datafeed to Amazon and Google, they use our product descriptions too. I always wait a couple of days until Google crawls our product pages before I send recently added products to Amazon or Google. I believe that if Google crawls our product page first, we will be recognized as the owner of the content? Am I right? If not, I believe Amazon is taking advantage of my original content.
I am asking because we are a relatively new ecommerce store (online since Feb 1st). While we didn't have a lot of organic traffic in the past, I saw our organic traffic drop by about 50% in April, seemingly affected by the latest Google update. We never bought a link or did black-hat link building; actually, we didn't do any link building at all until last month. So did Google decide we have shallow or duplicated content and drop our rankings? Our organic traffic has been improving very slowly since then, but the gains amount to only about 5%-10% of our current daily traffic.
What do you guys think? Do you think all our original content effort is going to waste?
-
Some believe that Google takes the code of your website into consideration. This would imply that duplicate content only applies to creating multiple blogs, all coded the same way with the same text - a tactic used by many people running automated software.
In my experience, this is just a rumor. Movie news blogs and websites tend to churn out identical news stories, including pictures, video, and text, and I have not seen any of those sites held back in their rankings.
-
Thanks.
About ten years ago I sold a lot of stuff on Amazon. Things were going well; I was the only person selling a nice selection of items. Then they started to sell the same items - and sold them at such a low price there was no way for me to make a profit. Impossible. It was like working really, really hard for someone who would become an almost impossible-to-beat competitor and dominate your SERPs for the next decade.
-
(offers napkin to EGOL to wipe up coffee spittle)
-
Excellent points by EGOL.
Amazon and Walmart are two-edged swords that cut one way (toward you). I understand why businesses go that route, but it is very difficult to win. Sometimes someone does, though:
About 15 years ago, a friend of mine took over the US arm of a German toy distributor, and they created a very cool doll. Everyone at the German company and everyone on the US marketing team screamed that they had to take it to Walmart. She politely refused and said, "Let Walmart come to me." She then went all over hawking the doll and ended up on HSN (I think that was the original big TV sales channel). About a year in, everyone wanted these dolls, and Walmart did not have them.
When Walmart called, she named the price - she did not have to kiss anyone's... They were pleased to do the kissing.
One of my favorite stories of all time.
-
Well, it sounds like I am screwed, since we have been sending our feeds to Amazon for the last 7 months. I am going to update the feed and remove the descriptions from the Amazon feed, but I don't know if it will help me at all. By the way, I am talking about Amazon ads, not selling on Amazon. However, if Amazon doesn't have a product in their database, they basically use your description to create a product page that says the product is available on an external website.
-
However since we are using a product datafeed and send it to amazon and google, they use our product descriptions too.
*spits coffee*
Whoa! I would not do that. I would remove or replace those descriptions on Amazon if at all possible.
When you sell on Amazon, any content, any image, any anything that you put on their site will be used against you. And, if you strike gold there then Amazon will quickly become your competitor.
This is exactly why I don't sell on Amazon. They solicit me a couple of times a year to sell my stuff on their site. No way. I did that in the past, and my work benefited Amazon more than it benefited me - and it benefited my competitors too.
I always wait couple of days until google crawl our product pages before i send recently added products to amazon or google. I believe if google crawls our product page first, we will be the owner of the content? Am i right? If not i believe amazon is taking advantage of my original content.
This is not true. I don't care who says it is true, I am going to argue. No way. I'll argue with anybody about this, even the big names at Google. They do a horrible job of attributing the first publisher. Horrible. Horrible.
I have published a lot of content given to me by others, and other people have stolen my content. I can tell you with assurance that the powerful publisher usually wins... and if a LOT of people have grabbed your content, you can lose even to a ton of weak sites.
Google does not honor the first publisher. They honor powerful publishers - like Amazon. Giving Amazon content that you are going to publish on your own website is feeding the snake!
So google thought that we have a shallow or duplicated content and dropped our rankings?
If your content is on Amazon, they are probably taking your traffic. Go out and look at the SERPs.
-
Serkie
Given that these are product descriptions, but they apply only to you selling the products (even if it is through Amazon/Google), I think there are a couple of ways you can go. One would be to add author markup if that is possible; I don't know how many products you are dealing with or what type of eCommerce or other platform you may be using.
Second, within your actual text, you could state authorship and place a link back to your site (likely at the very end of the description).
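As a rough sketch of the markup idea (the store name, URL, and field values here are all hypothetical, and your platform may already generate this for you), product pages can declare the product and its brand with schema.org structured data. A small script can build the JSON-LD block to embed in each product page:

```python
import json

def product_jsonld(name, description, brand, page_url):
    """Build a schema.org Product JSON-LD block so the description
    is tied to your brand and your page URL (hypothetical values)."""
    data = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": name,
        "description": description,
        "brand": {"@type": "Brand", "name": brand},
        "url": page_url,
    }
    # Embed the returned string inside a
    # <script type="application/ld+json"> tag on the product page.
    return json.dumps(data, indent=2)

print(product_jsonld(
    "Example Widget",
    "Our own copywritten description...",
    "YourStore",
    "https://www.example-store.com/widget",
))
```

This doesn't stop anyone from copying the text, but it does put a machine-readable ownership signal on the page that scrapers usually don't bother to replicate.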
Last, if you register a copyright (no, not the circled "c" most people use - the real thing), it can be fairly inexpensive. Depending on how you package it for the copyright office, we find it can run about a dollar a page. That would give you ownership should you ever have an issue with someone using your description without authorization (obviously you are giving it to Amazon and Google).
A final note: when you started rewriting the descriptions, my guess is you wrote, changed, and rewrote. In the event you ever had to defend yourself or prove in court that you are the actual owner, the documents showing how you arrived at the final version are invaluable.
I don't know if this is what you were looking for, but I hope something here will help.
Best
-
For our ecommerce sites, we always make sure the content in our product feeds is original and distinct from the content on our pages. That way, the feed listings don't poach rankings from our own sites, and we cover a broader range of search terms and more avenues to be found through.
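One way to sanity-check that practice (a sketch with made-up strings, not tied to any particular feed format) is to compare each feed description against its on-page counterpart and flag near-duplicates before the feed goes out:

```python
from difflib import SequenceMatcher

def too_similar(feed_text, page_text, threshold=0.9):
    """Flag feed copy that is nearly identical to the on-page copy.

    SequenceMatcher.ratio() returns 1.0 for identical strings and
    falls toward 0.0 as the texts diverge; the 0.9 threshold is an
    arbitrary starting point to tune for your own catalog.
    """
    ratio = SequenceMatcher(None, feed_text.lower(), page_text.lower()).ratio()
    return ratio >= threshold

page = "Hand-stitched leather wallet with six card slots and RFID blocking."
feed = "Slim leather wallet for six cards, hand-stitched, blocks RFID skimming."

print(too_similar(page, page))  # identical copy is flagged
print(too_similar(feed, page))  # a genuine rewrite passes
```

Run over a whole catalog, a check like this catches the products where someone pasted the page description straight into the feed.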
-
Google typically looks at who published the content first, as well as the authority of the sites that host it. You could be running into problems because Amazon has much more authority.