Duplicate meta descriptions
-
Hi All
Does having quite a few duplicate meta descriptions hurt SEO? I am worried that I have too many and am thinking this could be the reason for my recent drop in search visibility.
Thanks in advance.
Andy
-
Thanks for your responses and for clearing this up.
-
Thanks Kevin!
-
Donna--good point. I revised my previous answer to note that meta descriptions are a display factor that can influence CTRs but not rankings. Meta keywords and descriptions do not influence rankings; instead, they can help increase click-throughs. Including a good, unique meta description increases the chance that Google will use it, and more often than not it will earn a higher CTR than the snippet Google auto-generates when no meta description is included. Great clarification.
-
Hi Andy,
Assuming you mean rankings when you say visibility, having duplicate meta descriptions won't have any effect, because Google doesn't take meta descriptions into account when ranking web pages.
That said, I've found that when there are multiple meta description tags on a page, Google tends to use the first one it encounters. When there are none, it selects what it deems to be relevant from the page content and displays that.
Studies have shown that using your primary keyword phrase and being different or helpful in your meta description helps increase click-thru rates. Inclusion of your primary keyword phrase reinforces the relevance of the search result and makes the search result stand out (because search terms are highlighted in bold). Being different or helpful gives the searcher what he or she wants - the solution to a problem or to be entertained.
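If you want to check a page yourself, here's a minimal sketch (assuming the `requests` and `beautifulsoup4` packages are installed; the URL is a placeholder) that lists every meta description tag in source order:

```python
import requests
from bs4 import BeautifulSoup

# Placeholder URL - swap in the page you want to inspect.
url = "https://www.example.com/some-page"
html = requests.get(url, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

# Every <meta name="description"> tag, in the order it appears in the source.
descriptions = [
    tag.get("content", "").strip()
    for tag in soup.find_all("meta", attrs={"name": "description"})
]

print(f"Found {len(descriptions)} meta description tag(s)")
if descriptions:
    # Per the answer above, the first tag in the source is the one Google tends to use.
    print("First in source:", descriptions[0])
```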
-
As you probably know, all meta descriptions should be unique. Will a couple of duplicates hurt? Probably not, but I advise against them because the meta description is a display factor. Furthermore, some people will actually eliminate any duplicate meta descriptions entirely and let Google auto-create the snippet. Please see Matt Cutts' video on the topic.
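To find duplicates across a set of pages, a small script like this can group pages by their meta description (again just a sketch, assuming `requests` and `beautifulsoup4`; the URL list is a placeholder, not a real sitemap):

```python
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

# Placeholder URLs - in practice, feed in your sitemap or crawl export.
urls = [
    "https://www.example.com/",
    "https://www.example.com/about",
    "https://www.example.com/services",
]

by_description = defaultdict(list)
for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    tag = soup.find("meta", attrs={"name": "description"})
    description = tag.get("content", "").strip() if tag else "(missing)"
    by_description[description].append(url)

# Any description shared by more than one URL is a duplicate worth reviewing.
for description, pages in by_description.items():
    if len(pages) > 1:
        print(f"Duplicate ({len(pages)} pages): {description}")
        for page in pages:
            print("  -", page)
```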
Related Questions
-
Main menu duplication
I am working on a site that has just gone through a migration to Shopify at the very same time as Google did an update in October, so there have been problems from day one. All main menu categories have subsequently, over the past 6 weeks, fallen off a cliff. All aspects of the site have been reviewed in terms of technical, link profile and on-page factors, and the site is in better shape than several ranking competitors. One issue that I'd like some feedback on is the main menu, which has 4 iterations in the source: desktop, desktop (sticky), mobile, and mobile (sticky - appears as a second desktop sticky, but I assume it is for mobile). These "duplicated" menus are the top-level menu items only; the rest of the nested menu items are included within the last mobile menu option. So the desktop menu in the source doesn't include any of the sub-menu items, the mobile version carries all of these, and there are 4 versions of the top-level main menu items in the source. Should I be concerned? Considering we have significant issues, should this be cleaned up?
Intermediate & Advanced SEO | | MickEdwards0 -
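One way to confirm how many copies of the menu actually ship in the HTML described above is a quick parse of the source - a rough sketch (not from the thread), assuming `requests` and `beautifulsoup4` are installed, with a placeholder URL and selectors that may need adjusting to the theme's markup:

```python
from collections import Counter

import requests
from bs4 import BeautifulSoup

# Placeholder URL - point this at the page whose menus you want to count.
soup = BeautifulSoup(
    requests.get("https://www.example.com/", timeout=10).text, "html.parser"
)

navs = soup.find_all("nav")
print(f"{len(navs)} <nav> blocks in the source")

# Count how often each top-level link target repeats across those nav blocks.
hrefs = Counter(a.get("href") for nav in navs for a in nav.find_all("a", href=True))
for href, count in hrefs.most_common(10):
    if count > 1:
        print(f"{href} appears {count} times")
```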
301 redirect to avoid duplicate content penalty
I have two websites with identical content, Haya and Ethnic. Both websites have similar products. I would like to get rid of ethniccode, and I have already started to de-index it. My question is: will I get any SEO benefit, or will it be harmful, if I 301 redirect only the URLs below? https://www.ethniccode/salwar-kameez -> https://www.hayacreations/collections/salwar-kameez https://www.ethniccode/salwar-kameez/anarkali-suits -> https://www.hayacreations/collections/anarkali-suits
Intermediate & Advanced SEO | | riyaaaz0 -
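For a migration like the one above, a small check that each old URL really returns a 301 to the intended target can be run before and after the change - a sketch assuming `requests` is installed, with the (apparently truncated) URLs from the question treated as placeholders:

```python
import requests

# Old URL -> expected new URL, mirroring the pairs in the question (placeholders).
redirect_map = {
    "https://www.ethniccode/salwar-kameez":
        "https://www.hayacreations/collections/salwar-kameez",
    "https://www.ethniccode/salwar-kameez/anarkali-suits":
        "https://www.hayacreations/collections/anarkali-suits",
}

for old_url, expected in redirect_map.items():
    # Don't follow the redirect - we want to see the status and Location header themselves.
    resp = requests.get(old_url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    ok = resp.status_code == 301 and location == expected
    print(f"{old_url}: {resp.status_code} -> {location or '(no redirect)'} {'OK' if ok else 'CHECK'}")
```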
Duplicate content - Images & Attachments
I have been looking at the GWT HTML improvements report on our new site and I am scratching my head over how to stop some elements of the website showing up as duplicates for meta descriptions and titles. For example, the blog area: the description "This blog is full of information and resources for you to implement; get more traffic, more leads an..." is reported as duplicated across /blog/, /blog/page/2/, /blog/page/3/, /blog/page/4/, /blog/page/6/ and /blog/page/9/. The pages have rel canonicals on them (using Yoast WordPress SEO) and I can't see a way of stopping the duplicate content. Can anyone suggest how to combat this, or is there nothing to worry about?
Intermediate & Advanced SEO | | Cocoonfxmedia0 -
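A quick way to see what canonical each paginated blog URL actually declares is a sketch like this (assuming `requests` and `beautifulsoup4`; the domain and paths are placeholders modelled on the question):

```python
import requests
from bs4 import BeautifulSoup

base = "https://www.example.com"  # placeholder domain
paths = ["/blog/", "/blog/page/2/", "/blog/page/3/", "/blog/page/4/"]

for path in paths:
    soup = BeautifulSoup(requests.get(base + path, timeout=10).text, "html.parser")
    # The rel="canonical" link tag Yoast (or any SEO plugin) outputs, if present.
    link = soup.find("link", rel="canonical")
    canonical = link.get("href") if link else "(none)"
    print(f"{path} -> canonical: {canonical}")
```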
Duplicate Meta Descriptions in Press Releases
We have a client that does multiple press releases a year. One issue we noticed is that every press release has the same meta description tag and the duplicates are starting to really add up. Unfortunately the client does not want to create specialized meta descriptions for new press releases due to legal restrictions (every new meta description must be reviewed). What should we do about this?
Intermediate & Advanced SEO | | RosemaryB0 -
Some Tools Not Recognizing Meta Tags
I am analyzing a site which has several thousand pages, checking the headers, meta tags, and other on-page factors. I noticed that the spider tool on SEO Book (http://tools.seobook.com/general/spider-test) does not seem to recognize the meta tags for various pages. However, using other tools, including Moz, it seems the meta tags are being recognized. I wouldn't normally be concerned with why one tool is not picking up the tags, but the site suffered a large traffic loss and we're still trying to figure out what remaining issues need to be addressed. Also, many of those pages once ranked in Google and now cannot be found unless you do a site: search. Is it possible that something is blocking the tags so that some tools or crawlers can read them easily but others cannot? This would seem very strange to me, but the above is what I've witnessed recently. Your suggestions and feedback are appreciated, especially as this site continues to battle Panda.
Intermediate & Advanced SEO | | ABK7170 -
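One thing worth ruling out in a case like the one above is whether the page serves different markup to different user agents - a sketch, assuming `requests` and `beautifulsoup4`, with a placeholder URL and placeholder user-agent strings:

```python
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/some-page"  # placeholder
user_agents = {
    "browser": "Mozilla/5.0",
    "googlebot-like": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    "generic-spider": "SEO-Spider-Test",
}

# Fetch the same page with each user agent and compare what meta data comes back.
for label, ua in user_agents.items():
    html = requests.get(url, headers={"User-Agent": ua}, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    title = soup.title.string.strip() if soup.title and soup.title.string else "(none)"
    desc_tag = soup.find("meta", attrs={"name": "description"})
    desc = desc_tag.get("content", "").strip() if desc_tag else "(none)"
    print(f"[{label}] title: {title!r} | description: {desc!r}")
```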
Meta NoIndex tag and Robots Disallow
Hi all, I hope you can spend some time to answer the first of a few questions from me 🙂 We are running a Magento site - the layered/faceted navigation nightmare has created thousands of duplicate URLs! Anyway, during my process of tackling the issue, I disallowed in robots.txt anything in the query string that was not a "p" (allowed for pagination). After checking some pages in Google, I did a site:www.mydomain.com/specificpage.html search and a few duplicates came up along with the original, with "There is no information about this page because it is blocked by robots.txt". So I had also added a meta noindex, follow to all these duplicates, but I guess it wasn't being read because of robots.txt. So, coming to my questions: did robots.txt block access to these pages? If so, were they already in the index, and after disallowing them with robots.txt, could Googlebot no longer read the meta noindex? Does meta noindex, follow on pages actually help Googlebot decide to remove those pages from the index? I thought robots.txt would stop and prevent indexation? But I've read this: "Noindex is a funny thing, it actually doesn't mean 'You can't index this', it means 'You can't show this in search results'. Robots.txt disallow means 'You can't index this' but it doesn't mean 'You can't show it in the search results'." I'm a bit confused about how to use these, both to prevent duplicate content in the first place and to help address duplicate content once it's already in the index. Thanks! B
Intermediate & Advanced SEO | | bjs2010 -
Duplicate Content on Press Release?
Hi, we recently held a charity night in store and had a few local celebs turn up, etc. We created a press release to send out to various media outlets; within the press release were hyperlinks to our site and links on certain keywords to specific brands on our site. My question is: should we be sending a different press release to each outlet to avoid duplicate content, or is sending the same release out to everyone OK? We will be sending approximately 20 of these out, some going online and some not. So far it has been picked up by one local paper website, a massive football website and a local magazine site - all pretty much the same content and a few pics. Any help, hints or tips on how to go about this if I am going to be sending out to a load of other sites/blogs? Cheers
Intermediate & Advanced SEO | | YNWA0 -
Duplicate content - canonical vs link to original and Flash duplication
Here's the situation for the website in question: the company produces printed publications which go online as a page-turning Flash version and as a separate HTML version. To complicate matters, some of the articles from the publications get added to a separate news section of the website. We want to promote the news section of the site over the publications section. If we were to forget the Flash version completely, would you: a) add a canonical in the publication version pointing to the version in the news section? b) add a link in the footer of the publication version pointing to the version in the news section? c) both of the above? d) something else? What if we add the Flash version into the mix? As Flash still isn't as crawlable as HTML, should we noindex the Flash pages? Is HTML content duplicated in Flash as big an issue as HTML-to-HTML duplication?
Intermediate & Advanced SEO | | Alex-Harford0