Google not using meta description
-
I have seen several posts about why Google may not be using the meta description. The most common reason given is that it found other text on the page that is more relevant. While I have seen this to be the case on some pages, most of the time it is not. Is there any way to alert Googlebot that it is pulling the wrong info, info that would not be good for the user?
-
Try to use the header tags properly:
Only one h1
A few h2s
etc.
-
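A quick way to sanity-check that advice is to count the heading tags on a page. Here is a minimal sketch using Python's standard-library HTML parser; the sample markup is hypothetical, just to show the idea:

```python
from collections import Counter
from html.parser import HTMLParser

class HeadingCounter(HTMLParser):
    """Tallies h1-h6 tags so you can spot pages with more than one h1."""
    def __init__(self):
        super().__init__()
        self.counts = Counter()

    def handle_starttag(self, tag, attrs):
        if tag in {"h1", "h2", "h3", "h4", "h5", "h6"}:
            self.counts[tag] += 1

# Hypothetical page fragment: one h1, a few h2s.
page = """
<h1>Beachfront Vacation Rentals</h1>
<h2>Two Bedroom Condos</h2>
<h2>Three Bedroom Condos</h2>
"""

counter = HeadingCounter()
counter.feed(page)
print(counter.counts["h1"])  # 1 -- exactly one h1, as recommended
```

If the h1 count comes back as anything other than 1, that is the kind of simple markup error that can confuse Google's snippet selection.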
@zoooky Thanks for the info, I had the same problem.
-
There is no way to tell Google directly that it is using the wrong meta description. However, it is possible to influence it. Basically you have two options:
- Technical SEO: take a good look at the page layout and the quality of the HTML on the page. You may find a simple technical error, such as JavaScript hiding text or incorrect use of headings.
- User intent: review the user intent behind the context of the page. If Google sees the page as not relevant to the query, you will be fighting an uphill battle.
-
Google rewrites meta descriptions for two main reasons.
-
The first is that the meta description does a poor job of summarizing the page.
-
The second is that other text on the page more accurately matches the search query or intent.
Unfortunately, there is no alert option that tells Googlebot to show a predefined meta description.
-
-
@pau4ner Thanks. Yeah, that is what I was figuring. The problem is that it is way off. For example, my description could be:
Kolea at Waikoloa Beach Resort offers two and three bedroom vacation rentals on the beach.
Google is making it:
1BR, 2BR, Houses, Condos, and on and on. It is pulling info that makes zero sense.
-
Hi, there is nothing that can be done to "force" Google to use the meta description. Just as with titles, Google can choose whether or not to honor the meta description you wrote.
However, I have found that using the exact keyword you want that post to rank for increases the odds of Google using your meta description. If the keyword consists of several words, make sure to use them in the exact same order in your meta description.
I hope that helps!
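If you are checking many descriptions, that "same words, same order" advice can be verified with a small script. A sketch of the idea (the function name and sample strings are my own, not from any SEO tool):

```python
import re

def keyword_in_order(description, keyword):
    """Return True if every word of the keyword appears in the description
    in the same order, case-insensitively, with only whitespace between
    the words."""
    pattern = r"\b" + r"\s+".join(re.escape(w) for w in keyword.split()) + r"\b"
    return re.search(pattern, description, re.IGNORECASE) is not None

desc = "Kolea at Waikoloa Beach Resort offers beachfront vacation rentals."
print(keyword_in_order(desc, "waikoloa beach resort"))  # True
print(keyword_in_order(desc, "resort beach waikoloa"))  # False
```

A description that fails this check still has all the words, but not as the exact phrase searchers type, which (per the advice above) lowers the odds of Google using it.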
-
It is always advisable to have a meta description of 150-160 characters. Anything beyond that will be truncated by Google.
Try restructuring your content so that it fits within 150-160 characters.
All the best!
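Checking lengths by hand gets tedious, so here is a minimal sketch of an automated check. The 160-character cutoff follows the rough figure above; note that Google actually truncates by pixel width, so treat the number as an approximation:

```python
def check_description(text, limit=160):
    """Return (status, preview), where preview approximates what Google
    would display if the description exceeds the character limit."""
    if len(text) <= limit:
        return "ok", text
    # Trim to the limit and mark the cut with an ellipsis, as SERPs do.
    return "too long", text[:limit - 1].rstrip() + "…"

desc = ("Kolea at Waikoloa Beach Resort offers two and three bedroom "
        "vacation rentals on the beach.")
status, preview = check_description(desc)
print(status)  # ok
```

Running this over every page's description is an easy way to find candidates for Google to rewrite.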
Related Questions
-
My posts are ignored by Google, not sure why
I have been investigating this problem for some time now and there must be a technical problem: my posts seem to be ignored by Google. For example, if I put the title of my article, "Lincolnshire Caravan Owners Struggle To Compete With Haven", into Google, other sites come up instead of my website: https://www.in2town.co.uk/skegness-news/lincolnshire-caravan-owners-struggle-to-compete-with-haven/ I am trying to find out what technical problem is stopping Google from displaying my post. Can anyone advise me on what tools to use and how to find out what is going wrong?
Technical SEO | | headlinesplus0 -
GoogleBot still crawling HTTP/1.1 years after website moved to HTTP/2
Whole website moved to the https://www. HTTP/2 version 3 years ago. When we review log files, it is clear that, for the home page, GoogleBot continues to access only via the HTTP/1.1 protocol.
- Robots file is correct (simply allowing all and referring to the https://www. sitemap)
- Sitemap is referencing https://www. pages, including the homepage
- Hosting provider has confirmed the server is correctly configured to support HTTP/2 and provided evidence of HTTP/2 access working
- 301 redirects are set up for non-secure and non-www versions of the website, all to the https://www. version
- Not using a CDN or proxy
- GSC reports the home page as correctly indexed (with the https://www. version canonicalised) but still shows the non-secure version of the website as the referring page in the Discovery section
- GSC also reports the homepage as being crawled every day or so
Totally understand it can take time to update the index, but we are at a complete loss to understand why GoogleBot continues to use only HTTP/1.1, not HTTP/2. A possibly related issue, and of course what is causing concern, is that new pages of the site seem to index and perform well in the SERPs... except the home page. This never makes it to page 1 (other than for the brand name) despite rating multiples higher in terms of content, speed etc. than other pages which still get indexed in preference to the home page. Any thoughts, further tests, ideas, direction or anything will be much appreciated!
Technical SEO | | AKCAC1 -
Google Search Console - Excluded Pages and Multiple Properties
I have used Moz to identify keywords that are ideal for my website and then I optimized different pages for those keywords, but unfortunately rankings for some of the pages have declined. Since I am working with an ecommerce site, I read that having a lot of Excluded pages on the Google Search Console was to be expected so I initially ignored them. However, some of the pages I was trying to optimize are listed there, especially under the 'Crawled - currently not indexed' and the 'Discovered - currently not indexed' sections. I have read this page (link: https://moz.com/blog/crawled-currently-not-indexed-coverage-status ) and plan on focusing on Steps 5 & 7, but wanted to ask if anyone else has had experience with these issues. Also, does anyone know if having multiple properties (https vs http, www vs no www) can negatively affect a site? For example, could a sitemap from one property overwrite another? Would removing one property from the Console have any negative impact on the site? I plan on asking these questions on a Google forum, but I wanted to add it to this post in case anyone here had any insights. Thank you very much for your time,
SEO Tactics | | ForestGT
Forest0 -
Excessive use of KeyWord?
Hey I have an Immigration website in South Africa
Technical SEO | | NikitaG
MigrationLawyers.co.za and the website used to be divided into two categories:
1st part - South African Immigration
2nd part - United Kingdom Immigration
Because of that we made all the pages include the words "South Africa" in the titles, e.g.
...ers.co.za/work-permit-south-africa
...ers.co.za/spousal-visa-south-africa
...ers.co.za/retirement-permit-south-africa
...ers.co.za/permanent-residence-south-africa
I'm sure you get the idea. We have since removed the UK part of the website and are now left with only the SA part. Now my question is: is this bad? Will Google see it as spammy, since I'm targeting "South Africa" in almost every link on the website? Should I stick to this structure for new pages, or try to avoid any more use of "South Africa"? Perhaps I can change something as it currently stands? Kind regards,
Nikita0
Google Index Speed Opinions
Hello Everyone, Under normal circumstances, new posts to my site are indexed almost instantly by Google. I know this because an occasional search with quotation marks surrounding the 1st paragraph of text displays my newly published page. I use this tactic from time to time to ensure contributors aren't syndicating content. My question is this: I've noticed over the last day or so that my newly published articles are not yet indexed. For example, an article that was published over 24 hours ago does not appear to be indexed yet. Is this cause for concern? Is there an average wait time for indexation? XML issue? Thanks in advance for the help/insight.
Technical SEO | | JSOC0 -
What else: struggling with Google position
Hi. I understand everyone is offering their time for free here so any advice or support is much appreciated. http://www.cytronex.com
Technical SEO | | AdamJamesCytronex
PA 44 || mR 4.6 || mT 5.73 || 986 links from 43 Root Domains
DA 33 || 3,942 links from 71 Domains
We've dropped from position 25ish to position 70ish in keyword searches for 'electric bikes'. I've tried everything and I just don't understand! It's genuine content, the actual product is increasingly popular, and we have several links from sites which are (well, to my mind) of reasonable quality. I've only just been brought in to look at this, and my lack of any SEO or web experience is not putting my boss off expecting an instant solution 😞 As I'm only just getting to grips with it, Analytics was only installed about a month ago, so I can't pinpoint the moment when it dropped. We're consistently out-positioned by sites with lower PA/DA scores. Any insight anyone might have would be amazing! Thanks
Adam0 -
Is a 302 useful here?
We have a site that had one super successful viral video a couple of years back and basically the site needs a ton of work to even be functional. We don't have the time or the resources to even touch it. Our video is still getting tons of views today and I'm fairly certain it's the only reason the site still gets traffic. Most of the views come from youtube which prompts them to check out the site. We plan on going back to the site at a later date, but for now wanted to redirect it to another site of ours. In this case is it best practice to 302? or is a 301 still the proper solution?
Technical SEO | | ClaytonKendall0 -
Is the RSS feed created using Google Reader automatic?
I created an RSS feed to information on my blog using the notes capability of Google Reader and FeedBurner. When I update my blog, does this feed recognize the change, or do I need to do a manual update of Google Reader?
Technical SEO | | casper4340