Does Google ignore duplicate meta descriptions?
-
Hi there SEO mozzers,
I am dealing with a website that has duplicate meta descriptions (we know this is bad). As a result, Google ignores the meta descriptions entirely, pulls content from the page instead, and displays that in the SERP.
I have already read https://moz.com/blog/why-wont-google-use-my-meta-description, but I was wondering if there is more information out there.
Any tips are appreciated!
-
Hi Nikki,
Thank you for taking the time to answer my question.
I was worried that this was precisely the case.
-
Yes, if your meta descriptions are duplicates, Google won't use them and will instead pull a snippet of content from the page. If you want Google to use the meta description you've assigned, write original, quality descriptions for each page. If there's absolutely no way for you to assign individual descriptions to pages (I've seen some page structures built to share one meta title and description, which is ridiculous), make sure the first few sentences of each page work as a description, since that is the information Google will most likely pull.
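If it helps to scope the problem before rewriting anything, a quick audit is to pull each page's meta description and group the URLs that share one. Here is a minimal Python sketch of that idea; it assumes the third-party requests and beautifulsoup4 packages are installed, and the URL list is only a placeholder (in practice, feed it from your sitemap or a crawl export):

```python
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

# Placeholder URLs; in practice, feed this from your sitemap or a crawl export.
URLS = [
    "https://www.example.com/",
    "https://www.example.com/category/widgets/",
    "https://www.example.com/category/gadgets/",
]

pages_by_description = defaultdict(list)

for url in URLS:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    tag = soup.find("meta", attrs={"name": "description"})
    description = (tag.get("content") or "").strip() if tag else "(missing)"
    pages_by_description[description].append(url)

for description, urls in pages_by_description.items():
    if len(urls) > 1:  # the same description is reused across pages
        print(f"{len(urls)} pages share: {description[:80]!r}")
        for url in urls:
            print(f"  {url}")
```

Any description that ends up attached to more than one URL is a candidate for a rewrite, or at least for checking that the first sentence or two of the page would read well as the fallback snippet.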
Related Questions
-
Defining duplicate content
If you have the same sentences or paragraphs on multiple pages of your website, is this considered duplicate content and will it hurt SEO?
Intermediate & Advanced SEO | | mnapier120 -
Error Meta Description
(Adult website.) Why is Google not reading the meta description set by the Yoast plugin? In a regular search, https://www.google.com.br/webhp?sourceid=chrome-instant&ion=1&espv=2&ie=UTF-8#q=robertinha, the result shows up as:
Sex videos - Porn videos
www.robertinha.com.br/
Robertinha.com.br. magnifier. facebook twitter plus. Home Page; Last updated: Tuesday, 14 April 2015. Home Page. Categories. Amateurs (227) · Cougars (6) ...
If I do a site: search for meusite.com.br it reads the description correctly, but the regular search does not. I do not understand it: https://www.google.com.br/webhp?sourceid=chrome-instant&ion=1&espv=2&ie=UTF-8#q=site:robertinha.com.br
Sex videos - Porn videos
www.robertinha.com.br/
Free sex videos: watch porn videos right now with hot, naughty girls having lots of sex.
Intermediate & Advanced SEO | | stroke0 -
When does it make sense to make a meta description longer than what's considered best practice?
I've seen all the length recommendations and understand the reasoning: anything beyond the limit gets truncated in the search results. But I've also noticed that Google will "move" the meta description if the term the user is searching for appears in the cached version of the page. So I have a case where Google is indexing the pages but not caching the content (at least not yet). We see the meta description just fine in the Google results, but we can't see the content cache when checking Google's cached version. **My question is:** In this case, why would it be a bad idea to write a slightly longer (but still relevant) meta description, with the intent that one of the terms in that description could match the user's search terms and the description would "move" to highlight that term in the results?
Intermediate & Advanced SEO | | navidash0 -
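On the length question above, the trade-off can be sanity-checked offline: see roughly where a draft description would be truncated and whether the target terms only appear in the part that gets cut. A small Python sketch; the ~155-character cutoff is a common rule of thumb rather than a fixed Google limit (Google actually truncates by pixel width), and the sample text and terms are made up:

```python
TYPICAL_CUTOFF = 155  # rough character guideline; Google actually truncates by pixel width

description = (
    "Compare widget models side by side, check full specs, and read hands-on "
    "reviews from real owners before you buy. Free shipping on orders over $50, "
    "easy exchanges, and a 30-day return policy."
)
target_terms = ["hands-on reviews", "30-day return policy"]

visible, overflow = description[:TYPICAL_CUTOFF], description[TYPICAL_CUTOFF:]
print(f"length={len(description)}, likely truncated: {bool(overflow)}")

for term in target_terms:
    if term.lower() in visible.lower():
        where = "visible part"
    elif term.lower() in description.lower():
        where = "overflow only"
    else:
        where = "missing"
    print(f"{term!r}: {where}")
```

If a term only appears past the cutoff, you are relying entirely on Google "moving" the snippet to show it, which is exactly the gamble the question describes.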
Issue with Robots.txt file blocking meta description
Hi, Can you please tell me why the following error is showing up in the SERPs for a website that was just re-launched 7 days ago with new pages (301 redirects are built in)?

"A description for this result is not available because of this site's robots.txt – learn more."

Once we noticed it yesterday, we made some changes to the file and reduced the number of items in the disallow list. Here is the current robots.txt file:

```
# XML Sitemap & Google News Feeds version 4.2 - http://status301.net/wordpress-plugins/xml-sitemap-feed/
Sitemap: http://www.website.com/sitemap.xml
Sitemap: http://www.website.com/sitemap-news.xml
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
```

Other notes: the site was developed in WordPress and uses the following plugins: WooCommerce, All-in-One SEO Pack, Google Analytics for WordPress, and XML Sitemap & Google News Feeds. Currently, in the SERPs, it keeps jumping back and forth between showing the meta description for the www domain and showing the error message above. Originally, WP Super Cache was installed; it has since been deactivated, removed from wp-config.php, and deleted permanently. One other thing to note: we noticed yesterday that an old XML sitemap was still on file, which we have since removed, and we resubmitted a new one via WMT. Also, the old pages are still showing up in the SERPs. Could it just be that this will take time, for Google to review the new sitemap and re-index the new site? If so, what kind of timeframes are you seeing these days for new pages to show up in the SERPs? Days, weeks? Thanks, Erin
Intermediate & Advanced SEO | | HiddenPeak0 -
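For the robots.txt question above, one quick sanity check is to test the live file against the exact URLs showing the error, using Python's standard-library robots.txt parser. The domain and paths below are placeholders, not the real site:

```python
from urllib.robotparser import RobotFileParser

# Placeholder domain and paths; substitute the real site.
ROBOTS_URL = "http://www.website.com/robots.txt"
TEST_URLS = [
    "http://www.website.com/",
    "http://www.website.com/some-product-page/",
]

parser = RobotFileParser()
parser.set_url(ROBOTS_URL)
parser.read()  # fetches and parses the live robots.txt

for url in TEST_URLS:
    allowed = parser.can_fetch("Googlebot", url)
    print(f"{'ALLOWED' if allowed else 'BLOCKED':7}  {url}")
```

If URLs still report as blocked even though the current file looks fine, or everything reports as allowed yet the SERP message persists, Google may simply be working from a cached copy of the old robots.txt, which fits the "give it time" explanation.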
Duplicate content for images
On SEOmoz I am getting duplicate content errors in my on-site report. Unfortunately it does not specify what that content is. We are getting these errors for our photo gallery, and I am assuming the reason is that some of the photos are listed in multiple categories. Can this be the problem? What else could it be? How can we resolve these issues?
Intermediate & Advanced SEO | | SEODinosaur0 -
Google Not Indexing Description or correct title (very technical)
Hey guys, I am managing the site http://www.theattractionforums.com/. If you search the keyword "PUA Forums", it will be in the top 10 results; however, the title shown is simply "PUA Forums" rather than the title tag in the code, and no description displays at all (despite there being one in the code). Any page other than the home page that ranks shows the correct title and description. We're completely baffled! Here are some interesting bits and pieces:

- It shows up fine on Bing.
- If I go into GWT and Fetch as Googlebot, the home page comes back as "Unreachable".
- We previously found that it was serving 'index.htm' before 'index.php', and that page was blank. I've since fixed this in the .htaccess so it redirects, but that hasn't solved the problem.
- I've used meta tags to stop Google pulling the description etc. from the Open Directory; that didn't change anything.
- It's vBulletin and is running vBSEO.

Any suggestions at all, guys? I'll be forever in the debt of anyone who can solve this; it's proving near impossible to fix. Here is the .htaccess file, which may be part of the issue:

```
RewriteEngine On
DirectoryIndex index.php index.html
Redirect /index.html http://www.theattractionforums.com/index.php
RewriteCond %{HTTP_HOST} !^www.theattractionforums.com
RewriteRule (.*) http://www.theattractionforums.com/$1 [L,R=301]
RewriteRule ^((urllist|sitemap_).*\.(xml|txt)(\.gz)?)$ vbseo_sitemap/vbseo_getsitemap.php?sitemap=$1 [L]
RewriteCond %{REQUEST_URI} !(admincp/|modcp/|cron|vbseo_sitemap/)
RewriteRule ^((archive/)?(.*\.php(/.*)?)?)$ vbseo.php [L,QSA]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME} !^(admincp|modcp|clientscript|cpstyles|images)/
RewriteRule ^(.+)$ vbseo.php [L,QSA]
RewriteRule ^forum/(.*)$ http://www.theattractionforums.com/$1 [R=301,L]
```

Intermediate & Advanced SEO | | trx0
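Since Fetch as Googlebot reports the home page as "Unreachable" in the thread above, one way to narrow things down is to request the page with a Googlebot user-agent and compare the status code, final URL, and body size against a normal browser-style request. A rough standard-library Python sketch; nothing in it is specific to vBulletin or vBSEO:

```python
from urllib.error import HTTPError, URLError
from urllib.request import Request, urlopen

URL = "http://www.theattractionforums.com/"
USER_AGENTS = {
    "googlebot-style": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    "browser-style": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
}

for name, user_agent in USER_AGENTS.items():
    request = Request(URL, headers={"User-Agent": user_agent})
    try:
        with urlopen(request, timeout=15) as response:
            body = response.read()
            # A 200 with an empty or tiny body would match the old blank index.htm symptom.
            print(f"{name}: status={response.status} final_url={response.geturl()} bytes={len(body)}")
    except (HTTPError, URLError) as exc:
        print(f"{name}: request failed: {exc}")
```

If the Googlebot-style request comes back empty, errors out, or lands on an unexpected URL while the browser-style request looks fine, something in the rewrite rules (or in front of them) is treating bot traffic differently, which would explain why only the home page is affected.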
Duplicate Content On A Subdomain
Hi, We have a client who is close to completing a site aimed specifically at the UK market (they're doing this in-house, so we've had no say in how it will work). The site will be almost a duplicate (in terms of content, targeted keywords, etc.) of a section of the main site that sits on the root domain; the main site is targeted toward the US. The only differences will be certain spellings and the currency. If this new UK site were to sit on a subdomain of the main site, which is a .com, would this cause duplicate content issues? I know there wouldn't be an issue if the new site were on a separate .co.uk domain (according to Matt Cutts), but it looks like the client wants it on a subdomain. Any help/advice would be greatly appreciated.
Intermediate & Advanced SEO | | jasarrow0 -
Google Places - How do we rank
So, Google Places showing up in search results is a great feature... but how can we get our results to the top? I mean, I can see some terrible websites appearing at the top of Google Places with Places pages that have no activity whatsoever. Is there a trick to this at all? What can we do to increase our ranking in Google Places, now that our old good rankings appear BELOW the map results? Cheers
Intermediate & Advanced SEO | | kayweb0