When does it make sense to make a meta description longer than what's considered best practice?
-
I've seen all the length recommendations and understand the reasoning: anything longer gets cut off in the search results. But I've also noticed that Google will "move" the meta description if the user's search term appears in the cached version of the page.
I have a case where Google is indexing the pages but not caching the content (at least not yet). So we see the meta description just fine in the Google results, but we can't see any content when checking the Google cached version.
**My question is:** In this case, why would it be a bad idea to write a slightly longer (but still relevant) meta description, with the intent that one of the terms in that description could match the user's search terms and the description would "move" to highlight that term in the results?
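To make the idea concrete, here's a rough sketch of what such a longer description might look like; the page topic and wording are hypothetical, not from any real site:

```html
<!-- Hypothetical example: ~195 characters, well past the usual ~155-160 guideline,
     but containing several distinct phrases a searcher might actually type -->
<meta name="description" content="Compare standing desk converters by height range, weight capacity, and price. Includes assembly tips, ergonomic setup advice, and answers to common questions about monitor arms and keyboard trays.">
```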
-
I believe this is going to be such a tiny, tiny benefit that it might move you up from position 998 to 997, but if you are in the top 30 positions in the SERPs it will have zero impact.
It is also possible that the longer your description, the more any single keyword's impact gets diluted.
I would invest my time in short, hot descriptions that elicit clicks.
-
Thanks, Oleg.
Couldn't it help with ranking (even if only a little) if the longer meta description contains a term that wouldn't have been on the page otherwise?
-
Sure, it could match the search term and shift what's displayed in the SERPs, but it won't help with ranking; Google doesn't treat the meta description as a ranking signal. The other downside is that instead of a complete message (which is the goal of a meta description), you are letting Google decide which parts to show, and you may not get your message across as you would like.
The ideal scenario is fitting your keywords within the recommended meta description length (commonly cited as roughly 155-160 characters). There's no harm in going over; it's just not the best-case scenario.
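For contrast with the longer example above, a description that stays within the guideline might look something like this (again, the page and wording are hypothetical):

```html
<!-- Hypothetical example: keywords up front, one complete message, ~120 characters -->
<meta name="description" content="Compare standing desk converters by height, capacity, and price, plus setup tips to get your workstation ergonomics right.">
```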