Should I remove all meta descriptions to avoid duplicates as a short term fix?
-
I’m currently trying to implement Matt Cutts's advice from a recent YouTube video, in which he said that it is better to have no meta descriptions at all than to have duplicates.
I know that there are better alternatives, but, if forced to make a choice, would it be better to remove all meta descriptions from the site than to leave the duplicates in place (perhaps keeping a single meta description on the home page)? This would be a short-term fix prior to making changes to our CMS to allow us to add unique meta descriptions to the most important pages.
I’ve seen various blogs across the internet which recommend removing all the tags in these circumstances, but I’m interested in what people on Moz think of this.
The site currently has a meta description which is duplicated across every page on the site.
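Before deciding, it can help to see exactly which URLs share a description. A minimal stdlib-only sketch of grouping pages by their meta description — the URLs and HTML snippets below are hypothetical, not from the actual site:

```python
from collections import defaultdict
from html.parser import HTMLParser


class MetaDescriptionParser(HTMLParser):
    """Extracts the content of <meta name="description"> from an HTML page."""

    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if attrs.get("name", "").lower() == "description":
                self.description = attrs.get("content")


def extract_description(html):
    parser = MetaDescriptionParser()
    parser.feed(html)
    return parser.description


def group_by_description(pages):
    """pages: {url: html}. Returns {description: [urls]} for duplicates only."""
    groups = defaultdict(list)
    for url, html in pages.items():
        groups[extract_description(html)].append(url)
    return {desc: urls for desc, urls in groups.items() if len(urls) > 1}


# Hypothetical pages sharing one boilerplate description
pages = {
    "/": '<meta name="description" content="Welcome to our shop">',
    "/about": '<meta name="description" content="Welcome to our shop">',
    "/contact": '<meta name="description" content="Contact us">',
}
print(group_by_description(pages))  # {'Welcome to our shop': ['/', '/about']}
```

A report like this makes it easy to prioritise which duplicated pages get unique descriptions first.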
-
Yes. If you can quickly write unique meta descriptions, start doing that immediately; if the site is not big and you can finish in 1-2 weeks, there is no need to delete the existing meta descriptions. If you see that it will take more than 2 weeks, it is better to delete the duplicated meta descriptions. But the best solution is to write unique meta descriptions immediately for the home page and the other important pages that rank right now.
-
Thanks Marc for answering what is in many ways an unfair question.
I definitely agree that the long-term objective should be different and relevant meta descriptions, as you say. It's also good to know that each of the approaches I suggested was ultimately bad practice, even if one of them is less bad than the other.
-
This is a tough choice between two bad "mistakes". If you leave the meta description blank or empty, you don't use the potential you have; if you have duplicates, you have the same problem, plus the worry about what could happen because Matt Cutts mentioned the topic.
You can't expect a really serious recommendation, because both are a "no-go" to me at the moment. If you leave it blank, Google will decide what to display as the snippet in its SERPs. If your pages have good, relevant text, this won't end in disaster, but what if there is nothing Google can pick from? You know what I mean? (Many shops have this problem.) On the other hand, this announcement might be a hint that meta descriptions are headed the way of meta keywords, which have no relevance anymore. The only recommendation I would give you now: try it out!
Take one page or a few more and delete the meta description. Leave it blank and wait a little; sooner or later you will see the effect in the SERPs, and based on that result you can make your decision. Don't do it with your start/home/entry page; use URLs that are linked deeper in the site.
BUT your long-term goal should be: create different and relevant meta descriptions.
Related Questions
-
How do we avoid duplicate/thin content on +150,000 product pages?
Hey guys! We have a rather large product range (150,000+ book titles) on our eCommerce site. We get book descriptions as metadata from our publishers and display them on the product pages. This content is obviously not unique, as many other sites display the same book descriptions. It is important for us to rank on those book titles, so my question to you is: how would you go about it? It seems like a rather unrealistic task to paraphrase 150,000+ (and growing) book descriptions. As I see it, there are these options:
1. Don't display the descriptions on the product pages (but then those pages become even thinner!)
2. Display the (duplicate) descriptions, but put noindex on those product pages in order not to punish the rest of the site (not really an option, though).
3. Hire student workers to produce unique product descriptions for all 150,000 products (seems like a huge and expensive task).
But how would you solve such a challenge? Thanks a lot! Cheers, Tommy.
Intermediate & Advanced SEO | Jacob_Holm
Not sure how we're blocking homepage in robots.txt; meta description not shown
Hi folks! We had a question come in from a client who needs assistance with their robots.txt file. Metadata for their homepage and select other pages isn't appearing in SERPs. Instead they get the usual message "A description for this result is not available because of this site's robots.txt – learn more". At first glance, we're not seeing the homepage or these other pages as being blocked by their robots.txt file: http://www.t2tea.com/robots.txt. Does anyone see what we can't? Any thoughts are massively appreciated! P.S. They used wildcards to ensure the rules were applied for all locale subdirectories, e.g. /en/au/, /en/us/, etc.
Intermediate & Advanced SEO | SearchDeploy
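One way to sanity-check rules like these is Python's stdlib `urllib.robotparser`. A caveat: Python's parser does simple prefix matching with first-match-wins ordering, so it does not reproduce Google's wildcard (`*`) or most-specific-rule semantics; treat it only as a first pass. The rules below are hypothetical stand-ins for locale wildcards, not t2tea.com's actual file:

```python
from urllib import robotparser

# Hypothetical rules in the spirit of the question's locale subdirectories;
# the real robots.txt may differ.
rules = """\
User-agent: *
Allow: /en/us/
Disallow: /en/
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("*", "http://www.example.com/"))        # homepage: no rule matches -> True
print(rp.can_fetch("*", "http://www.example.com/en/us/"))  # Allow matches first -> True
print(rp.can_fetch("*", "http://www.example.com/en/au/"))  # Disallow /en/ matches -> False
```

If the homepage comes back allowed here but Google still shows the "blocked by robots.txt" message, the culprit is often a wildcard rule that only Google's matcher (not this simple prefix matcher) interprets as covering the page, which is exactly what Google Search Console's robots.txt tester is for.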
Measure impact from new meta descriptions
Hi guys, I'm looking to implement new meta descriptions across a site and I want to measure the impact. So far I'm thinking of extracting the CTR data from GWT for the last 90 days to get the most accurate CTR averages for each URL. Then, once the new meta descriptions have been implemented, compare the new CTR with the old CTR averages across URLs. Do you think this would be the most accurate way of measuring the impact? Cheers, Chris
Intermediate & Advanced SEO | jayoliverwright
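The before/after comparison described above can be sketched in plain Python once the per-URL clicks and impressions are exported from GWT; the URLs and numbers here are invented:

```python
def ctr(clicks, impressions):
    """Click-through rate as a fraction; 0 when there were no impressions."""
    return clicks / impressions if impressions else 0.0


def ctr_deltas(before, after):
    """before/after: {url: (clicks, impressions)}.
    Returns {url: ctr_change} for URLs present in both periods."""
    return {
        url: round(ctr(*after[url]) - ctr(*before[url]), 4)
        for url in before
        if url in after
    }


# Hypothetical 90-day windows before and after the new meta descriptions
before = {"/widgets": (50, 1000), "/about": (10, 500)}
after = {"/widgets": (80, 1000), "/about": (10, 500)}
print(ctr_deltas(before, after))  # {'/widgets': 0.03, '/about': 0.0}
```

One design note: comparing only URLs present in both periods avoids skew from pages that gained or lost impressions entirely, though ranking-position changes between the two windows can still confound a pure CTR comparison.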
Issue with Site Map - how critical would you rank this in terms of needing a fix?
A problem has been introduced into our sitemap whereby previously excluded URLs are no longer being correctly excluded. These return an HTTP 400 Bad Request server response to crawlers, although they do redirect correctly for users. We have around 2,300 pages of content and around 600-800 of these previously excluded URLs. An example would be http://www.naturalworldsafaris.com/destinations/africa-and-the-indian-ocean/botswana/suggested-holidays/botswana-classic-camping-safari/Dates and prices.aspx (the page does redirect correctly for users). The site is currently being rebuilt and only has a life span of a few months, and with this in mind, the cost our current developers have quoted for resolving the issue is quite high. I was just wondering: how critical an issue would you view this as? Would it be sufficient (bearing in mind this is an interim measure) to give these pages a canonical or a redirect, even though they would remain on the sitemap? Thanks, Kate
Intermediate & Advanced SEO | KateWaite
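As an interim measure short of touching the CMS, the sitemap itself could be regenerated without the bad URLs. A stdlib sketch that filters a sitemap by a status-check callback — the sitemap contents and the checker below are hypothetical (in practice the callback would issue an HTTP request and test for a 400):

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"


def filter_sitemap(xml_text, is_ok):
    """Drop <url> entries whose <loc> fails the is_ok(url) check."""
    ET.register_namespace("", NS)
    root = ET.fromstring(xml_text)
    for url in list(root):
        loc = url.find(f"{{{NS}}}loc").text
        if not is_ok(loc):
            root.remove(url)
    return ET.tostring(root, encoding="unicode")


sitemap = f"""<urlset xmlns="{NS}">
  <url><loc>http://example.com/good.aspx</loc></url>
  <url><loc>http://example.com/bad.aspx</loc></url>
</urlset>"""

# Stand-in for a real HTTP check that would drop URLs returning 400
cleaned = filter_sitemap(sitemap, lambda u: "bad" not in u)
print("bad.aspx" in cleaned)  # False
```

This only stops the bad URLs being resubmitted to crawlers via the sitemap; it doesn't fix the 400 responses themselves, so it's a complement to, not a substitute for, the canonical/redirect question above.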
Links not removed
Hello, I want some help regarding bad links. I uploaded a disavow file to Webmaster Tools 4-5 months ago, but the links are still showing in the backlinks to my site and have not been removed. Can anyone help with this? Why do they still appear in the backlinks to my site? Why haven't they been removed? Thanks in advance, Falguni
Intermediate & Advanced SEO | Sanjayth
Duplicate Content for Deep Pages
Hey guys, For deep, deep pages on a website, does duplicate content matter? The pages I'm talking about are image pages associated with products, and they will never rank in Google, which doesn't concern me. What I'm interested to know, though, is whether the duplicate content would have an overall effect on the site as a whole? Thanks in advance, Paul
Intermediate & Advanced SEO | kevinliao
Meta Keywords: Should we use them or not?
I am working through our site and see that meta keywords are being used heavily and unnecessarily. Each of our info pages has 2 or 3 keyword phrases built into it. Should we just duplicate those keyword phrases into the meta keywords field, add further keywords beyond them, or not use the field at all? Thoughts and opinions appreciated.
Intermediate & Advanced SEO | Towelsrus
Press Release and Duplicate Content
Hello folks, We have been using press releases to promote our clients' businesses for a couple of years, and we have seen great results in referral traffic and SEO. Recently one of our clients asked us to publish the PR on their website as well as distributing it through PRWeb and Marketwire. I don't think this will be a duplicate content issue for our client's website, since I believe Google can recognize which version of the content was published first, but I'd be happy to get some of the Moz community's opinions. Thank you
Intermediate & Advanced SEO | Aviatech