Having problems resolving duplicate meta descriptions
-
Recently, I’ve recommended to the team running one of our websites that we remove duplicate meta descriptions. The site currently has a large number of these and we’d like to conform to SEO best practice. I’ve seen Matt Cutts’s recent video, ‘Is it necessary for every page to have a meta description?’, where he suggests that webmasters write meta descriptions for their most strategically important pages, but that it is better to have no meta description than duplicates. The website currently has one meta description that is duplicated across the entire site.
This seemed like a relatively straightforward suggestion, but it is proving much more challenging to implement across a large website. The site’s developer has tried to resolve the meta descriptions, but says that the current meta description is a site-wide value. It is possible to create 18 distinct replacements for 18 ‘template’ pages, but any sub-pages of these will inherit the value and create more duplicates. Would it be better to:
- Have no meta descriptions at all across the site?
- Stick with the status quo and have one meta description site-wide?
- Make 18 separate meta descriptions for the 18 most important pages, but still have 18 sets of duplicates across the sub-pages of the site?
Or…is there a solution to this problem which would allow us to follow the best practice in Matt’s video?
Any help would be much appreciated!
-
That sounds like an interesting suggestion and definitely something to look into, thank you. Sadly, the developer for the site is on holiday until next Monday, so I won't be able to get an answer until next week.
Theoretically, if the changes were not possible, would it be better to have one single meta description on the home page and none across the rest of the site? Or would it be better to leave the site as it is?
-
I think your best option is to build out your CMS to add meta description values for each page. Your developer should be able to build the CMS so that you can inject a meta description value for whichever page you are working on. This is pretty standard for in-house CMSes, WordPress, and Drupal.
If your meta description is a site-wide value, then the developer has simply put one value into the header that loads for every page. As you know, best practice is to be able to customize this per page. Building 18 template pages is more work than modifying the CMS to inject a meta value, so I wouldn't recommend it.
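As a rough illustration of that per-page injection (a minimal sketch with a hypothetical page record, not a real CMS API): each page carries an optional meta description field; if one has been written, it is emitted, and if not, the tag is simply omitted rather than falling back to a site-wide duplicate, in line with the "no description beats a duplicate" advice above.

```python
# Minimal sketch: emit a per-page meta description, or no tag at all.
# The page dict and "meta_description" key are assumptions for
# illustration, not the poster's actual in-house CMS.

from html import escape

def meta_description_tag(page: dict) -> str:
    """Return a <meta name="description"> tag for this page, or an
    empty string when no unique description has been written.

    Omitting the tag lets search engines generate their own snippet,
    which is preferable to repeating one duplicate description
    site-wide.
    """
    description = page.get("meta_description")
    if not description:
        return ""  # no tag at all: better than a duplicate
    return (f'<meta name="description" '
            f'content="{escape(description, quote=True)}">')

pages = [
    {"url": "/", "meta_description": "Acme Ltd: widgets and gadgets."},
    {"url": "/contact", "meta_description": None},  # no unique copy yet
]

for page in pages:
    print(page["url"], "->", repr(meta_description_tag(page)))
```

The key design choice is the empty-string branch: the template layer drops the tag entirely instead of substituting a shared default.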
Is this an option for you?
-
If it is an in-house CMS I see no reason why you can't make your developer do the work to get it exactly how you want it. Otherwise, what's the bloody point in having a bespoke CMS?
Devs will nearly always say things aren't possible when they are. It's a constant battle. I know because I've battled it before.
I should say that I am not involved in this battle currently - our current dev is incredibly accommodating and just does everything I ask - believe me, it's a breath of fresh air and makes a massive difference. I'm now in a situation where things our old dev said were impossible have suddenly become possible!
-
Hi there, thanks for the reply. We are using an in-house CMS.
-
What kind of CMS are you using? Is it an in-house one or Wordpress/Drupal/etc.?