Does Google ACTUALLY ding you for having long Meta Titles? Or do studies just suggest a lower CTR?
-
I do SEO at an agency and have many clients. I always get the question, "Will that hurt my SEO?" When it comes to meta title and even meta description length, I understand that Google will truncate them, which may result in a lower CTR, but does it actually hurt your ranking? I see that in many cases Google will find keywords within a long meta description and display those, and in other cases it will simply truncate it. Is Google doing whatever it wants willy-nilly, or is there data behind this?
Thank you!
-
I think meta descriptions are important.
They are your first chance to display a call to action to a customer and get them to click through to your site. Hence a poorly written or truncated one is probably not as enticing as one that stays within roughly 160 characters and does not get cut off.
We have acted for several clients where we optimized the MD and improved the CTR by 0.08% (i.e. less than 1%), but that has amounted to over 20,000 additional clicks on their site a year.
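To put that in perspective, here is a back-of-the-envelope sketch of that arithmetic in Python; the impression figure is purely illustrative (simply the number implied by those two figures), not a real client number.

```python
# Back-of-the-envelope: how a small CTR lift becomes a large click gain.
# The impression count is illustrative only, not a real client number.
annual_impressions = 25_000_000   # hypothetical yearly search impressions
ctr_lift = 0.0008                 # a 0.08 percentage-point CTR improvement

extra_clicks = annual_impressions * ctr_lift
print(f"Extra clicks per year: {extra_clicks:,.0f}")  # -> 20,000
```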
I also loved Rand's Whiteboard Friday, which indirectly addresses the issue and supports my view (though perhaps not as strongly) that dwell time is a significant ranking factor.
https://moz.com/blog/impact-of-queries-and-clicks-on-googles-rankings-whiteboard-friday
On your questions directly:
Will it hurt your SEO? - Yes, for two possible reasons:
1/ you keyword stuff it.
2/ no one clicks through because you have a bad MD.
On truncation - there are exceptions, but Google generally does not truncate if you fit within their pixel/character limit (see the rough check sketched below).
My view - draft and implement your MDs properly...
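A rough pre-publish check along these lines is easy to script. The ~60-character title and ~160-character description cut-offs below are commonly cited approximations rather than official limits, since Google actually truncates by pixel width; a minimal sketch:

```python
# Rough pre-publish length check for page titles and meta descriptions.
# Google truncates snippets by pixel width, so these character limits are
# commonly used approximations, not official thresholds.
TITLE_LIMIT = 60
DESCRIPTION_LIMIT = 160

def check_snippet(title: str, description: str) -> list[str]:
    """Return warnings for fields likely to be truncated in the SERP."""
    warnings = []
    if len(title) > TITLE_LIMIT:
        warnings.append(f"Title is {len(title)} chars (over {TITLE_LIMIT}); it may truncate.")
    if len(description) > DESCRIPTION_LIMIT:
        warnings.append(f"Description is {len(description)} chars (over {DESCRIPTION_LIMIT}); it may truncate.")
    return warnings

print(check_snippet(
    "Does Google ding you for long meta titles? A very long example title that runs on",
    "A short, enticing summary with a clear call to action that fits comfortably.",
))
```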
Hope that assists.
-
Great question, and I certainly heard the "will this hurt my SEO" thing all the time as a consultant. A couple of thoughts...
- To my knowledge, there is no specific algorithmic feature that would lower a page's rank because its meta description is too long.
- Long meta descriptions, however, may be truncated (as you pointed out) or ignored and replaced altogether by Google if it finds a more appropriate subsection of text on the page.
- A succinct, well-written meta description may help with CTR, which itself may be a ranking factor.
- Google has stated that they want you to write good meta descriptions, for what it is worth.
What I try to say to clients is, "Are you prepared to build a top-10 website in your industry?" If they are sweating over good meta descriptions, they aren't ready to compete in the big leagues.
Related Questions
-
Product pages - should the meta description match our product description?
Hi, I am currently adding new products to my website and was wondering, should I use our product description (which is keyword optimised) in the meta description for SEO purposes? Or would this be picked up by Google as duplicate content? Thanks in advance.
Algorithm Updates | | markjoyce1 -
Anyone experience google penalties for full-screen pop-ups?
Although we always recommend against onload pop-ups for clients (we feel they affect the user experience), we do have a few clients that insist on them. I was reading this article the other day https://searchenginewatch.com/2016/05/17/how-do-i-make-sure-my-site-is-mobile-friendly/ which led me to https://support.google.com/webmasters/answer/6101188 and I'm happy to see that Google is going to treat this type of content as grounds for a downgrade when it comes to rank. My question is twofold: Has anyone experienced a drop in organic traffic on mobile due to this update? And do you think this will include user-triggered content like photo galleries, bookings, and email sign-ups? We haven't noticed any drops yet, but it is something we will be keeping a close eye on in the next little while. Let's hear what the community has to say 🙂
Algorithm Updates | | VERBInteractive1 -
Google sets brand/domain name at the end of SERP titles
Hi all, I am experiencing that Google puts our domain name at the end of the titles in SERPs. So if I have a title "See our super cool website", Google would show "See our super cool website - Betxpert.com" in the SERPs. Well, this is okay, apart from the fact that I myself often put the brand name in the title AND the fact that Google misspells the site name. The brand is BetXpert with an upper-case X... so when I get a SERP with "See our super cool website - BetXpert - Betxpert.com" I am annoyed 🙂 Anyone out there know how to tell Google the EXACT brand name, so that they do not display a value the site owner does not want? -Rasmus
Algorithm Updates | | rasmusbang0 -
Google is forcing a 301 by truncating our URLs
Just recently we noticed that google has indexed truncated urls for many of our pages that get 301'd to the correct page. For example, we have: http://www.eventective.com/USA/Massachusetts/Bedford/107/Doubletree-Hotel-Boston-Bedford-Glen.html as the url linked everywhere and that's the only version of that page that we use. Google somehow figured out that it would still go to the right place via 301 if they removed the html filename from the end, so they indexed just: http://www.eventective.com/USA/Massachusetts/Bedford/107/ The 301 is not new. It used to 404, but (probably 5 years ago) we saw a few links come in with the html file missing on similar urls so we decided to 301 them instead thinking it would be helpful. We've preferred the longer version because it has the name in it and users that pay attention to the url can feel more confident they are going to the right place. We've always used the full (longer) url and google used to index them all that way, but just recently we noticed about 1/2 of our urls have been converted to the shorter version in the SERPs. These shortened urls take the user to the right page via 301, so it isn't a case of the user landing in the wrong place, but over 100,000 301s may not be so good. You can look at: site:www.eventective.com/usa/massachusetts/bedford/ and you'll notice all of the urls to businesses at the top of the listings go to the truncated version, but toward the bottom they have the full url. Can you explain to me why google would index a page that is 301'd to the right page and has been for years? I have a lot of thoughts on why they would do this and even more ideas on how we could build our urls better, but I'd really like to hear from some people that aren't quite as close to it as I am. One small detail that shouldn't affect this, but I'll mention it anyway, is that we have a mobile site with the same url pattern. http://m.eventective.com/USA/Massachusetts/Bedford/107/Doubletree-Hotel-Boston-Bedford-Glen.html We did not have the proper 301 in place on the m. site until the end of last week. I'm pretty sure it will be asked, so I'll also mention we have the rel=alternate/canonical set up between the www and m sites. I'm also interested in any thoughts on how this may affect rankings since we seem to have been hit by something toward the end of last week. Don't hesitate to mention anything else you see that may have triggered whatever may have hit us. Thank you,
Michael
Algorithm Updates | | mmac0 -
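For a situation like the one above, one quick way to confirm what the truncated URL actually returns is to request it without following redirects and inspect the status code and Location header; a minimal sketch using the third-party requests library:

```python
# Fetch the truncated URL without following redirects to see exactly what it
# returns. Requires the third-party `requests` library (pip install requests).
import requests

truncated = "http://www.eventective.com/USA/Massachusetts/Bedford/107/"

response = requests.get(truncated, allow_redirects=False, timeout=10)
print(response.status_code)               # 301 would confirm a permanent redirect
print(response.headers.get("Location"))   # should point at the full canonical URL
```
-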
When doing directory submissions, should we submit a unique description and title each time?
Hello Moz Members, I just want to clarify something. We do directory submissions to 50 sites. For example: I have 10 keywords to target, and I am doing directory submissions. I have 10 unique titles and 10 unique descriptions. I just need to submit these 10 keywords to 50 directories: 10 keywords * 50 directories = 500 submissions. I will just submit the same 10 unique titles and 10 unique descriptions to these 500 directories. So will that be counted as duplicate content and duplicate titles in every directory, or do I have to submit a unique description and unique title every time I do a directory submission? Please help me with these questions; I am really confused about how I should proceed with directory submissions. If anyone has a list of fast-approval directory sites, please share it with me. Regards & Thanks, Chhatarpal Singh
Algorithm Updates | | chhatarpal0 -
Proper Way To Submit A Reconsideration Request To Google
Hello, In previous posts, I was speaking about how we were penalized by Google for unnatural links. Basically, 50,000 out of our 58,000 links were coming from 4-5 sites with the same exact anchor text and img alt tags. This obviously was causing our issues. Needless to say, I went through the complete link profile to determine that all of the links besides these were of natural origin. My question here is: what is the accepted protocol for submitting a reconsideration request? For example, how long should it be? Should I disclose that I was in fact using paid links, and that I have now removed (or at least nofollowed) them? I want to make sure that the request is as good as it should be so I can get our rankings up in a timely manner. Also, how long until the request is typically acknowledged? Thanks
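As an aside, for a link-profile review like the one described above, tallying anchor text and linking domains from a backlink export makes skewed patterns obvious. A minimal sketch, assuming a CSV export; the file name and the anchor_text and source_domain columns are hypothetical and will vary by tool:

```python
# Tally anchor text and linking domains from a backlink export.
# "backlinks.csv", "anchor_text" and "source_domain" are hypothetical names;
# adjust them to whatever your backlink tool actually exports.
import csv
from collections import Counter

anchors, domains = Counter(), Counter()
with open("backlinks.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        anchors[row["anchor_text"]] += 1
        domains[row["source_domain"]] += 1

print("Top anchor texts:", anchors.most_common(5))
print("Top linking domains:", domains.most_common(5))
```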
Algorithm Updates | | BestOdds0 -
Stop google indexing CDN pages
Just when I thought I'd seen it all, google hits me with another nasty surprise! I have a CDN to deliver images, js and css to visitors around the world. I have no links to static HTML pages on the site, as far as I can tell, but someone else may have - perhaps a scraper site? Google has decided the static pages they were able to access through the CDN have more value than my real pages, and they seem to be slowly replacing my pages in the index with the static pages. Anyone got an idea on how to stop that? Obviously, I have no access to the static area, because it is in the CDN, so there is no way I know of that I can have a robots file there. It could be that I have to trash the CDN and change it to only allow the image directory, and maybe set up a separate CDN subdomain for content that only contains the JS and CSS? Have you seen this problem and beat it? (Of course the next thing is Roger might look at google results and start crawling them too, LOL) P.S. The reason I am not asking this question in the google forums is that others have asked this question many times and nobody at google has bothered to answer, over the past 5 months, and nobody who did try, gave an answer that was remotely useful. So I'm not really hopeful of anyone here having a solution either, but I expect this is my best bet because you guys are always willing to try.
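For what it's worth, one way to audit what indexing signals the CDN hostname currently sends is to check its robots.txt and the X-Robots-Tag response header (a header-level noindex is one commonly suggested option when you cannot place a robots file on the CDN). A rough sketch, with a hypothetical hostname and path:

```python
# Audit the indexing signals a CDN hostname currently exposes.
# The hostname and path are hypothetical; requires the third-party
# `requests` library (pip install requests).
import requests

cdn = "https://cdn.example.com"

robots = requests.get(f"{cdn}/robots.txt", timeout=10)
print("robots.txt status:", robots.status_code)  # 404 means no robots rules are served

page = requests.get(f"{cdn}/some-static-page.html", timeout=10)
print("X-Robots-Tag:", page.headers.get("X-Robots-Tag"))  # "noindex" would block indexing
```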
Algorithm Updates | | loopyal0 -
How long does a news article stay on Google's 'News' section on the SERP?
Our site is recognised as a news source for our niche - was just wondering if anyone had any idea how long the news story stays on the front page of the SERP once Google picks it up?
Algorithm Updates | | DanHill0