Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies.
Product schema GSC Error 'offers, review, or aggregateRating should be specified'
-
I do not have a sku, global identifier, rating or offer for my product. Nonetheless, it is my product. The price is variable (as it's insurance), so it would be inappropriate to provide a high or low price. Therefore, these items were not included in my product schema. The Structured Data Testing Tool showed two warnings, for the missing sku and global identifier.
Google Search Console gave me an error today that said: 'offers, review, or aggregateRating should be specified'
I don't want to be dishonest by supplying any of these, but I also don't want to have my page demoted in the search results. BUT I DO want my item to show up as a product. Should I forget the product schema? Advice/suggestions?
Thanks in advance.
-
Really interested to see that others have been receiving this too - we've had this flagged on a couple of sites / accounts over the past month or two
Basically, Google Search Console's schema error view is 'richer' than that of Google's stand-alone Structured Data Testing Tool, which has been left behind a bit in terms of changing standards. Quite often you can put the pages highlighted by GSC (Google Search Console) into Google's schema tool and they will show as having warnings only (no errors), yet GSC says there are errors (very confusing for a lot of people)
Let's look at an example:
- https://d.pr/i/xEqlJj.png (screenshot step 1)
- https://d.pr/i/tK9jVB.png (screenshot step 2)
- https://d.pr/i/dVriHh.png (screenshot step 3)
- https://d.pr/i/X60nRi.png (screenshot step 4)
... basically the schema tool separates issues into two categories, errors and warnings
But Google Search Console's view of schema errors is now richer and more advanced than that (so adhere to GSC's specs, not the schema tool's, if they ever contradict each other!)
What GSC is basically saying is this:
"Offers, review and aggregateRating are recommended-only properties and usually cause a warning rather than an error if omitted. However, we are now taking a more nuanced view. Any one of these fields / properties can be omitted, but at least one of the three MUST now be present - otherwise the warning becomes an error. So to be clear: if one or two of these are missing, it's not a big deal - but if all three are missing then, to us at Google, the item no longer constitutes a valid product"
So what are the implications of having schema which generates erroneous, invalid products in Google's eyes?
This was the key statement I found from Google:
Google have this document on the Merchant Center (all about Google Shopping paid activity): https://support.google.com/merchants/answer/6069143?hl=en-GB
They say: "Valid structured markup allows us to read your product data and enable two features: (1) Automatic item updates: Automatic item updates reduce the risk of account suspension and temporary item disapproval due to price and availability mismatches. (2) Google Sheets Merchant Center add-on: The Merchant Center add-on in Google Sheets can crawl your website and uses structured data to populate and update many attributes in your feed. Learn more about using Google sheets to submit your product data. Prevent temporary disapprovals due to mismatched price and availability information with automatic item updates. This tool allows Merchant Center to update your items based on the structured data on your website instead of using feed-based product data that may be out of date."
So basically, without 'valid' schema mark-up, your Google Shopping (paid) results are much more likely to be rejected, as Google's organic crawler passes data to Google Shopping through schema (and presumably it will only do this if the schema is marked as non-erroneous). Since you don't (well, you haven't said anything about this) use Google Shopping (PLA - Product Listing Ads), this 'primary risk' is mostly mitigated
It's likely that without valid product schema, your products will not appear as 'product' results within Google's normal, organic results. As you know, occasionally product results make it into Google's normal results. I'm not sure if this can be achieved without paying Google for a PLA (Product Listing Ad) for the hypothetical product in question. If webmasters can occasionally achieve proper product listings in Google's SERPs without PLA, e.g. like this:
https://d.pr/i/XmXq6b.png (screenshot)
... then be assured that, if your products have schema errors, you're much less likely to get them listed in such a way for free. In the screenshot I just gave, they are clearly labelled as sponsored (meaning that they were paid for). As such, I'm not sure how much of an issue this would be
For product URLs which rank in Google's SERPs which do not render 'as' products:
https://d.pr/i/aW0sfD.png (screenshot)
... I don't think that such results would be impacted as heavily. You'll see that even with the plain-text / link results, you sometimes get schema embedded, like those aggregate product review ratings. Obviously if the schema had errors, the richness of the SERP may be impacted (the little stars might disappear, for example)
Personally I think that this is going to be a tough one that we're all going to have to come together and solve collectively. Google are basically saying: if a product has no individual review they can read, no aggregate star rating from a collection of reviews, and no offer (a product must have at least one of these three things), then to Google it doesn't count as a product any more. That's how it is now; there's no arguing or getting away from it (though personally I think it's pretty steep, and they may even back-track on this one at some point, as it's relatively infeasible for most companies to adopt for all their thousands of products)
You could take the line of re-assigning all your products as services, but IMO that's a very bad idea. I think Google will cotton on to such 'clever' tricks pretty quickly and undo them all. A product is a product, a service is a service (everyone knows that)
Plus, if your items are listed as services they're no longer products and may not be eligible for some types of SERP deployment as a result of that
The real question for me is, why is Google doing this?
I think it's because marketers and SEOs have known for a long time that any type of SERP injection (universal search results, e.g. video results, news results, product results injected into Google's 'normal' results) is more attractive to users, and because people 'just trust' Google, those results get a lot of clicks
As such, PLA (Google Shopping) has been relatively saturated for some time now, and maybe Google feel that the quality of their product-based results has dropped in some way. It would make sense to pick 2-3 things that really define the contents of a trustworthy site which is being more transparent with its user-base, and then to re-define 'what a product is' around those things
In this way, Google will be able to reduce the amount of PLA results, reduce the amount of 'noise' they are generating and just keep the extrusions (the nice product boxes in Google's SERPs) for the sites that they feel really deserve them. You might say, well if this could result in their PLA revenue decreasing - why do it? Seems crazy
Not really though, as Google make all their revenue from the ads that they show. If it becomes widely known that Google's product-related search results suck, people will move away from Google (in fact, Google have often cited Amazon as their leading competitor, not another search engine)
People don't want to search for website links any more. They want to search for 'things'. Bits of info that pop out (like how you can use Google as a calculator or dictionary now, if you type your queries correctly). They want to search for products, items, things that are useful to them
IMO this is just another step towards that goal
Thank you for posting this question as it's helped me get some of my own thoughts down on this matter
-
I had a similar issue, as we offer SaaS solutions at various different price points.
I resolved the problem by changing the entity type from Product to Service. Then you no longer need sku or other product-related parameters.
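For reference, a re-typed block along those lines might look like the sketch below (the service name, provider and description are hypothetical placeholders, and as noted above, re-typing products as services has its own trade-offs). Service has no required sku, offers, review or aggregateRating properties:

```python
import json

# Hypothetical Service markup for a variably priced offering
service = {
    "@context": "https://schema.org",
    "@type": "Service",
    "serviceType": "Home insurance",  # placeholder service type
    "provider": {
        "@type": "Organization",
        "name": "Example Insurance Co",  # placeholder provider name
    },
    "areaServed": "GB",
    "description": "Quote-based insurance with variable pricing.",
}

# Serialise for embedding in a <script type="application/ld+json"> tag
print(json.dumps(service, indent=2))
```

Note that none of the Product-specific properties (sku, gtin, offers) appear here, which is exactly why the GSC Product error goes away - at the cost of the page no longer being eligible for product-style results.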