Open Graph Meta Description...
-
Does my HTML meta description tag have to be the same as my Open Graph meta description?
I'm having problems pulling my meta description through into Google SERPs, and I wondered if it's because my OG data is not consistent.
Thanks Guys,
Kay
-
Thanks Vijay!
-
Hi Paul,
We have a meta description which is quite generic and has a call to action highlighting the free shipping on our product pages. I guess Google doesn't think this is product-specific enough. I will rewrite it and see what happens.
No, Google is not pulling through my OG data. It's just pulling through random pieces of the page that don't really make sense, e.g. product IDs, snippets of on-page header info, etc. It's almost as if the custom meta description isn't there at all.
Thanks for your help.
Kay
-
The page meta-description is completely unrelated to the Open Graph data as far as the search crawlers are concerned, Kay.
Have you confirmed that the meta-description is explicitly declared on each page?
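One quick way to confirm this across pages is a small audit script. A minimal sketch in Python using only the standard library (the sample HTML below is illustrative, not taken from Kay's site):

```python
from html.parser import HTMLParser

class MetaAuditor(HTMLParser):
    """Collects <meta name="description"> and <meta property="og:description"> values."""
    def __init__(self):
        super().__init__()
        self.found = {}

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        a = dict(attrs)
        # Regular meta descriptions use name=..., Open Graph tags use property=...
        key = a.get("name") or a.get("property")
        if key in ("description", "og:description"):
            self.found[key] = a.get("content", "")

# Hypothetical page head; in practice you would feed the fetched HTML of each URL.
sample = """<head>
  <meta name="description" content="Free shipping on all widgets.">
  <meta property="og:description" content="Our widgets ship free, worldwide.">
</head>"""

auditor = MetaAuditor()
auditor.feed(sample)
same = auditor.found.get("description") == auditor.found.get("og:description")
print("tags present:", sorted(auditor.found), "| identical:", same)
# → tags present: ['description', 'og:description'] | identical: False
```

Run against each template on the site, this would show whether the meta-description is actually being emitted, independently of whatever Google chooses to display.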
Also to note: there are many circumstances where Google will rewrite your meta-description to something it "thinks" is better. Very annoying, but the only antidote is to ensure your own meta-description is well matched to the page.
Are you saying that the search engines are picking up your OG description instead of your meta-description?
Paul
-
Hi Kay,
**og:description:** This metadata descriptor is very similar to the meta description tag in HTML. It's where you describe your content, but instead of showing on a search engine results page, it shows below the link title on Facebook.
Unlike a regular meta description tag, it won’t affect your SEO. (So, don’t spend too much time figuring out how to sneak in keywords.) However, it’s a good idea to make it compelling because you want people to click on it.
You are not limited to a character count, but it's best to stay around 200 characters. In some cases, depending on the link/title/domain, Facebook can display up to 300 characters, but I suggest treating anything above 200 as extra.
Example:
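For illustration, a minimal sketch of how the two tags can sit side by side in a page's `<head>` (the copy and values here are placeholders, not from any real site):

```html
<head>
  <!-- What search engines read for the SERP snippet -->
  <meta name="description" content="Shop our full range of widgets, with free shipping on every order.">

  <!-- What Facebook reads for the share preview -->
  <meta property="og:title" content="Acme Widgets">
  <meta property="og:description" content="Browse hundreds of widgets, all with free worldwide shipping.">
</head>
```

The two descriptions can differ; search engines use the first, Facebook uses the second.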
Source : https://blog.kissmetrics.com/open-graph-meta-tags/
I hope this helps. If you have further questions, please feel free to respond.
Regards,
Vijay