Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Does Google ACTUALLY ding you for having long Meta Titles? Or do studies just suggest a lower CTR?
-
I do SEO at an agency and have many clients. I always get the question, "Will that hurt my SEO?" When it comes to meta title (and even meta description) length, I understand Google may truncate it, which can result in a lower CTR, but does it actually hurt your ranking? In many cases I see Google find keywords within a long meta description and display those; in other cases it will simply truncate it. Is Google doing whatever it wants willy-nilly, or is there data behind this?
Thank you!
-
I think meta descriptions are important.
They are your first chance to present a call to action to a searcher and get them to click through to your site. Hence, a poorly written or truncated description is probably not as enticing as one that stays within roughly 160 characters and doesn't get cut off.
We have acted for several clients where we optimized the MD and improved the CTR by 0.08% (i.e. less than 1%), but that amounted to over 20,000 additional clicks to their site a year.
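(To put those numbers in context: assuming that 0.08% is an absolute change in CTR, 20,000 extra clicks a year implies roughly 20,000 / 0.0008 = 25,000,000 annual impressions, so small CTR gains only add up like this on very high-traffic sites.)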
I also loved Rand's Whiteboard Friday below, which indirectly addresses the issue and aligns with my view (though perhaps stated less strongly) that dwell time is a significant ranking factor:
https://moz.com/blog/impact-of-queries-and-clicks-on-googles-rankings-whiteboard-friday
On your questions directly:
Will it hurt your SEO? Yes, for two possible reasons:
1/ you keyword-stuff it;
2/ no one clicks through because you have a bad MD.
On truncation: there are exceptions, but Google generally does not truncate if you fit within its pixel/character limit. A quick way to audit your descriptions in bulk is sketched below.
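If you want to check this at scale, here is a minimal sketch of a bulk audit, assuming Python with the requests and beautifulsoup4 packages. The 160-character threshold is only a rough proxy (Google's real cutoff is pixel-based), and the URL list is a placeholder:

```python
# A minimal sketch (not an official tool) for auditing meta description
# length across a list of URLs. The 160-character threshold is the rough
# limit discussed above, not an exact Google cutoff.
import requests
from bs4 import BeautifulSoup

LIMIT = 160  # rough character threshold before truncation becomes likely

def audit_meta_description(url: str) -> None:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    tag = soup.find("meta", attrs={"name": "description"})
    if tag is None or not tag.get("content"):
        print(f"{url}: no meta description found")
        return
    desc = tag["content"].strip()
    status = "OK" if len(desc) <= LIMIT else "LIKELY TRUNCATED"
    print(f"{url}: {len(desc)} chars, {status}")

for url in ["https://www.example.com/"]:  # placeholder URL list
    audit_meta_description(url)
```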
My view: draft and implement your MDs properly...
Hope that assists.
-
Great question, and I certainly heard the "will this hurt my SEO?" thing all the time as a consultant. A couple of thoughts...
- To my knowledge, there is no specific algorithmic feature that would lower a page's rank because its meta description is too long.
- Long meta descriptions, however, may be truncated (as you pointed out), or ignored and replaced altogether if Google finds a more appropriate snippet of text on the page. Note that truncation is based on pixel width rather than a strict character count; see the rough sketch after this list.
- A succinct, well-written meta description may help with CTR, which itself may be a ranking factor.
- Google has stated that they want you to write good meta descriptions, for what it's worth.
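Since the truncation point is pixel-based, a rough width estimate can be more informative than counting characters. The sketch below is only a heuristic: the per-character widths and the 920px description budget are assumptions (community estimates), not published Google numbers:

```python
# Crude pixel-width estimate for SERP snippets. The glyph widths are
# rough guesses for a ~13px sans-serif font, and the 920px budget is a
# community estimate for desktop description snippets, not an official
# figure from Google.
NARROW = set("iljtf.,;:'|!I ")
WIDE = set("mwMW@")

def estimated_width_px(text: str) -> float:
    width = 0.0
    for ch in text:
        if ch in NARROW:
            width += 4.0   # narrow glyphs
        elif ch in WIDE:
            width += 11.0  # wide glyphs
        else:
            width += 7.0   # everything else
    return width

DESCRIPTION_BUDGET_PX = 920  # assumed desktop snippet budget

desc = "Your meta description here..."
w = estimated_width_px(desc)
print(f"{w:.0f}px:", "fits" if w <= DESCRIPTION_BUDGET_PX else "may truncate")
```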
What I try to say to clients is, "Are you prepared to build a top-10 website in your industry?" If they are sweating over something as basic as good meta descriptions, they aren't ready to compete in the big leagues.
Related Questions
-
Google Cache
So, when I gain a link I always check to see if the page that is linking to me is in the Google cache. I've noticed recently that more and more pages are actually not showing up in Google's cache, yet still appear in search results. I did read an article from someone who works at Google a few weeks back saying that there is sometimes an error with the cache and occasionally it will not display. This week, my own website isn't showing up in the cache, yet I'm still ranking in the SERPs. I'm not worried about it (my site is mostly white hat), but has there been any indication that Google is phasing out the ability to check the cache of a website?
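For what it's worth, the manual check described above can be scripted along the lines of the sketch below. This is an illustration, not an official API: the webcache.googleusercontent.com endpoint mirrors the old cache: operator, Google may block automated requests, and the public cache feature has since been retired entirely.

```python
# Hedged sketch of checking whether Google serves a cached copy of a page.
# Not an official API: the endpoint mirrors the cache: search operator and
# may be rate-limited or blocked for automated clients.
import requests

def is_cached(url: str) -> bool:
    cache_url = f"https://webcache.googleusercontent.com/search?q=cache:{url}"
    resp = requests.get(cache_url, timeout=10,
                        headers={"User-Agent": "Mozilla/5.0"})
    # A 200 suggests a cached copy exists; a 404 typically means none does.
    return resp.status_code == 200

print(is_cached("example.com"))  # hypothetical domain
```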
Algorithm Updates | ThorUK
-
Very strange, inconsistent, and unpredictable Google ranking
I have been searching through these forums and haven't come across anyone facing the same issue I am. The folks on the Google forums are certain this is an algorithm issue, but I just can't see the logic in that, because this appears to be an issue fairly unique to me. I'll take you through what I've gone through. Sorry for it being long. Website URL: https://fenixbazaar.com
1. In early February, I made the switch to HTTPS with some small hiccups. Overall the move was smooth: redirects were all in place, the sitemap was fine, and indexing was fine.
2. One night, my organic traffic dropped by almost 100%. All of my top-ranking articles completely disappeared from rankings. Top keyword searches no longer yielded my best-performing articles on the front page of results, nor on the last page. My pages were still being indexed, but keyword searches weren't returning them. I went from 70-100 active users to 0.
3. The next morning, everything was fine. Traffic was back up, and top keywords yielded results for my site on the front page. All was back to normal. The only problem was that the same thing happened that night, and again for the next three nights. Up and down.
4. I had a developer/SEO guy look into my backend to make sure everything was okay. He said there were some redirection issues, but nothing that would cause such a significant drop. No errors in Search Console. No warnings.
5. Eventually the issue stopped and my traffic recovered. Then everything went great: the site was accepted into Google News, I installed AMP pages perfectly, and my traffic boomed for almost 2 weeks.
6. At this point, numerous issues with my host provider, price increases, and an incredibly outdated cPanel forced me to change hosts. I did so without any issues, although I lost a number of articles (albeit low-traffic ones) in the move. These now return 404s and are no longer in the sitemap.
7. After the move there were a number of AMP errors, which I resolved; I now sit at 0 errors. Perfect... or so it seems.
8. Last week I applied for HSTS preload and am awaiting submission. My site was in working order and appeared set to be submitted. I applied after I changed hosts.
9. The past 5 days or so have seen good traffic, fantastic traffic to my AMP pages, great Google News tracking, and links from high-authority sites. Good performance all round.
10. I woke up this morning to find 0 active people on my site. I did a Google search and noticed my site isn't even the first result for its own name. The site doesn't even rank for its own name! It is still indexed, but searches no longer return its pages. I checked Search Console and realized the sitemap had been "processed" yesterday with most pages indexed, which is weird because it was submitted and processed about a week earlier. I resubmitted the sitemap and it appears to have been processed and approved immediately. No change to search results.
11. All top-ranking content that previously placed in the carousel or "Top Stories" in Google News is gone. Top-ranking keywords no longer return my pages: I went through the top 10 ranking keywords for my site, and my pages don't appear anywhere in the results, going as far back as page 20 (the last page). The pages are still indexed when I check; they simply don't appear in search results. It's happening all over again!
Is this an issue any of you have heard of before? Where a site is still being indexed, but has been completely removed from search results, only to return within a few hours? Up and down? I suspect it may be a technical issue: first with the move to HTTPS, and now with changing hosts. The fact that the sitemap says it was processed yesterday suggests maybe it was updated to remove the 404s (there were maybe 10), and now Google is attempting to reindex. Could this be viable? The reason I am skeptical of it being an algorithm issue is that within a matter of hours my articles rank again for certain keywords, and this issue has only happened after a change to the site was applied. Any feedback would be greatly appreciated 🙂
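Given the HTTPS migration and host move described in steps 1-6, one quick sanity check is to walk the redirect chain for a few key URLs and confirm every hop is a single clean 301. A minimal sketch, assuming Python with the requests package; the domain comes from the question, but the exact entry points checked are illustrative:

```python
# Print every hop in a URL's redirect chain to spot loops, long chains,
# or 302s where a 301 was intended.
import requests

def show_redirect_chain(url: str) -> None:
    resp = requests.get(url, timeout=10, allow_redirects=True)
    for hop in resp.history:
        print(f"{hop.status_code} {hop.url} ->")
    print(f"{resp.status_code} {resp.url} (final)")

# Domain from the question; the http/www variants are assumed entry points.
for start in ("http://fenixbazaar.com/", "http://www.fenixbazaar.com/"):
    show_redirect_chain(start)
```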
Algorithm Updates | fenixbazaar
-
US domain pages showing up in Google UK SERP
Hi, our website, which was predominantly for the UK market, was set up with a .com extension, and only two years ago other domains were added: US (.us), IE (.ie), EU (.eu) and AU (.com.au). Last year in July, we noticed that a few .us domain URLs were showing up in UK SERPs, and we realized the sitemap for the .us site was incorrectly referring to the UK (.com) site. We corrected that and the .us URLs stopped appearing in the SERPs, though I'm not sure if this actually fixed the issue or was coincidental. However, in the last couple of weeks more than 3 .us URLs have been showing for each brand search made on Google UK, and sometimes they replace the .com results altogether. I have double-checked the PA for the US pages; it is far below that of the UK ones. Has anyone noticed similar behaviour, and/or could anyone please help me troubleshoot this issue? Thanks in advance, R
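One thing worth auditing in a multi-ccTLD setup like this is whether every variant declares consistent hreflang/alternate annotations, since missing or mismatched annotations are a common cause of the wrong country version ranking. Below is a hypothetical sketch of such a check, assuming Python with the requests and beautifulsoup4 packages; the example.com/example.us domains are placeholders, not the poster's actual sites:

```python
# List each site's <link rel="alternate" hreflang="..."> annotations so
# mismatches between ccTLD variants are easy to spot.
import requests
from bs4 import BeautifulSoup

DOMAINS = [
    "https://www.example.com/",  # placeholder for the UK-targeted .com site
    "https://www.example.us/",   # placeholder for the US .us site
]

for domain in DOMAINS:
    html = requests.get(domain, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for link in soup.find_all("link", rel="alternate"):
        print(domain, link.get("hreflang"), "->", link.get("href"))
```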
Algorithm Updates | RaksG
-
Deindexed from Google Images Sep 17th
We have a travel website that has ranked in Google for 12-14 years. The site produces original images with branding on them, and they have been ranking well for years. There have been no site changes. We have a Moz spam score of 1/17 and a Domain Authority of 59. On Sep 17th, all our images just disappeared from Google Image Search. Even searching for our domain plus the keyword "photo" returns nothing. I've checked Search Console and there's no email from Google, and I see no postings on Moz or elsewhere about search algorithm changes involving images. I'm at a loss here... does anyone have some advice?
Algorithm Updates | danta
-
Meta Keyword Tags
What is the word on Meta Keyword Tags? Are they good to have, or bad? Our biggest competitor seems to have them.
Algorithm Updates | Essential-Pest
-
Is it possible that Google may have erroneous indexing dates?
I am consulting for someone on a problem related to copied content. Both sites in question are self-hosted WordPress sites. The "good" site publishes a post; the "bad" site copies the post (without even removing the internal links to the "good" site) a few days later. On both websites the publishing date of each post is plainly visible, and it is clear that the "bad" site publishes the posts days later. The content thief doesn't even bother to fake the publishing date. The owner of the "good" site wants to have all the proof needed before acting against the content thief, so I suggested he also check in Google the dates the various pages were indexed, using Search Tools -> Custom Range to have the indexing date displayed next to the search results. For all of the copied pages the indexing dates also prove the "bad" site published the content days after the "good" site, but there are two exceptions: the very first two posts copied.
First post:
On the "good" website it was published on 30 January 2013.
On the "bad" website it was published on 26 February 2013.
In Google search both show up as indexed on 30 January 2013!
Second post:
On the "good" website it was published on 20 March 2013.
On the "bad" website it was published on 10 May 2013.
In Google search both show up as indexed on 20 March 2013!
Is it possible that there is an error in the dates shown in Google search results? I also asked for help on the Google Webmaster forums, but there the discussion shifted to "who copied the content" and "file a DMCA complaint," so I want to be sure my question is better understood here. It is not about who published the content first or how to take down the copied content; I am just asking whether anybody else has noticed this strange thing with Google indexing dates. How is it possible for Google search results to display an indexing date earlier than the date the copy was published, and exactly the same date the original article was published and indexed?
Algorithm Updates | SorinaDascalu
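As an aside on the check described in this question: the Search Tools -> Custom Range filter can also be reproduced directly as a search URL, which makes it easier to re-run and screenshot the evidence. The tbs date-range parameters below are long-standing but undocumented, so treat the exact format as an assumption:

```python
# Build a Google search URL restricted to a custom date range; the tbs
# parameter format (cdr:1, cd_min, cd_max) is undocumented and assumed.
from urllib.parse import quote_plus

query = 'site:example.com "first post title"'  # hypothetical query
cd_min = cd_max = "1/30/2013"  # the first post's original publish date
url = (f"https://www.google.com/search?q={quote_plus(query)}"
       f"&tbs=cdr:1,cd_min:{cd_min},cd_max:{cd_max}")
print(url)
```
-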
Google is forcing a 301 by truncating our URLs
Just recently we noticed that Google has indexed truncated URLs for many of our pages, which then get 301'd to the correct page. For example, we have:
http://www.eventective.com/USA/Massachusetts/Bedford/107/Doubletree-Hotel-Boston-Bedford-Glen.html
as the URL linked everywhere, and that's the only version of that page that we use. Google somehow figured out that it would still reach the right place via a 301 if it removed the HTML filename from the end, so it indexed just:
http://www.eventective.com/USA/Massachusetts/Bedford/107/
The 301 is not new. It used to 404, but (probably 5 years ago) we saw a few links come in with the HTML file missing on similar URLs, so we decided to 301 them instead, thinking it would be helpful. We've preferred the longer version because it has the name in it, and users who pay attention to the URL can feel more confident they are going to the right place. We've always used the full (longer) URL, and Google used to index them all that way, but just recently we noticed about half of our URLs have been converted to the shorter version in the SERPs. These shortened URLs take the user to the right page via the 301, so it isn't a case of the user landing in the wrong place, but over 100,000 301s may not be so good.
You can look at site:www.eventective.com/usa/massachusetts/bedford/ and you'll notice all of the URLs for businesses at the top of the listings go to the truncated version, but toward the bottom they have the full URL. Can you explain to me why Google would index a page that is 301'd to the right page and has been for years? I have a lot of thoughts on why they would do this, and even more ideas on how we could build our URLs better, but I'd really like to hear from some people who aren't quite as close to it as I am.
One small detail that shouldn't affect this, but I'll mention it anyway: we have a mobile site with the same URL pattern, http://m.eventective.com/USA/Massachusetts/Bedford/107/Doubletree-Hotel-Boston-Bedford-Glen.html. We did not have the proper 301 in place on the m. site until the end of last week. I'm pretty sure it will be asked, so I'll also mention we have the rel=alternate/canonical set up between the www and m. sites. I'm also interested in any thoughts on how this may affect rankings, since we seem to have been hit by something toward the end of last week. Don't hesitate to mention anything else you see that may have triggered whatever may have hit us.
Thank you,
Michael
Algorithm Updates | mmac
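A quick way to confirm the behaviour described above is to request the truncated form with redirects disabled and inspect the response headers. A minimal sketch, assuming Python with the requests package; the URL is taken from the question:

```python
# Verify that the truncated URL returns a 301 pointing at the full .html URL.
import requests

short = "http://www.eventective.com/USA/Massachusetts/Bedford/107/"
resp = requests.get(short, timeout=10, allow_redirects=False)
print(resp.status_code, resp.headers.get("Location"))  # expect 301 + full URL
```
-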
When doing directory submissions, should we submit a unique description and title?
Hello Moz members, I just want to clarify something. We do directory submissions to 50 sites. For example: I have 10 keywords to target, and I have 10 unique titles and 10 unique descriptions. I need to submit these 10 keywords to 50 directories: 10 keywords * 50 directories = 500 submissions. If I submit the same 10 unique titles and 10 unique descriptions to all 500 directories, will that count as duplicate content and duplicate titles in every directory? Or do I have to write a unique description and unique title every time I do a directory submission? Please help me with this question; I am really confused about how I should proceed with directory submission. If anyone has a list of fast-approval directory sites, please share the information with me. Regards & thanks, Chhatarpal Singh
Algorithm Updates | chhatarpal