Having problems resolving duplicate meta descriptions
-
Recently, I recommended to the team running one of our websites that we remove its duplicate meta descriptions. The site currently has a large number of these and we’d like to conform to SEO best practice. I’ve seen Matt Cutts’s recent video, ‘Is it necessary for every page to have a meta description?’, in which he suggests that webmasters write meta descriptions for their most tactically important pages, but that it is better to have no meta description than a duplicated one. The website currently has a single meta description that is repeated across the entire site.
This seemed like a relatively straightforward suggestion, but it is proving much more challenging to implement across a large website. The site’s developer has tried to resolve the issue, but says that the current meta description is a site-wide value. It is possible to create 18 distinct replacements for the 18 ‘template’ pages, but any sub-pages of these will inherit the value and create more duplicates. Would it be better to:
- Have no meta descriptions at all across the site?
- Stick with the status quo and have one meta description site-wide?
- Make 18 separate meta descriptions for the 18 most important pages, but still have 18 sets of duplicates across the sub-pages of the site?
Or… is there a solution to this problem that would allow us to follow the best practice in Matt’s video?
Any help would be much appreciated!
-
That sounds like an interesting suggestion and definitely something to look into, thank you. Sadly, the developer for the site is on holiday until next Monday, so I won't be able to get an answer until next week.
Theoretically, if the changes were not possible, would it be better to have one single meta description on the home page and none across the rest of the site? Or would it be better to leave the site as it is?
-
I think your best option is to extend your CMS so that it stores a meta description value for each page. Your developer should be able to build the CMS so that you can inject a meta description value for whichever page you are working on. This is pretty standard for in-house CMSs, WordPress, and Drupal.
If your meta description is a site-wide value, then the developer has simply put one value into the header that loads on every page. As you know, best practice is to be able to customise this per page. Building 18 template pages is more work than modifying the CMS to inject a meta value, so I wouldn't recommend it. A rough sketch of what the injection could look like is below.
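Here is a minimal PHP sketch of that idea; get_current_page() and its meta_description field are hypothetical stand-ins for whatever your in-house CMS actually exposes:
<?php
// Shared header template (sketch): use the page's own meta description
// when one has been set, otherwise fall back to the site-wide default.
$page = get_current_page(); // hypothetical CMS helper
$description = !empty($page->meta_description)
    ? $page->meta_description
    : 'Site-wide default description';
?>
<meta name="description" content="<?php echo htmlspecialchars($description); ?>">
With a field like that in the CMS admin, every page (including the sub-pages of your 18 templates) can carry its own value without the templates being touched again.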
Is this an option for you?
-
If it is an in-house CMS, I see no reason why you can't make your developer do the work to get it exactly how you want it. Otherwise, what's the bloody point in having a bespoke CMS?
Devs will nearly always say things aren't possible when they are. It's a constant battle. I know because I've battled it before.
I should say that I am not involved in this battle currently. Our current dev is incredibly accommodating and just does everything I ask; believe me, it's a breath of fresh air and makes a massive difference. Things our old dev said were impossible have suddenly become possible!
-
What kind of CMS are you using? Is it an in-house one or WordPress/Drupal/etc.?
-
Hi there, thanks for the reply. We are using an in-house CMS.
Related Questions
-
Directory with duplicate content? What to do?
Moz keeps finding loads of pages with duplicate content on my website. The problem is it's a directory page for different locations. E.g. if we were a clothes shop, we would be listing our locations:
www.sitename.com/locations/london
www.sitename.com/locations/rome
www.sitename.com/locations/germany
The content on these pages is all the same, except for an embedded Google map that shows the location of the place. The problem is that Google thinks all these pages are duplicate content. Should I set a canonical link on every single page saying that www.sitename.com/locations/london is the main page? I don't know if I can use canonical links, because the page content isn't identical due to the embedded map. Help would be appreciated. Thanks.
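For reference, the canonical tag being described is a single line in the head of each location page, all pointing at whichever URL is chosen as the primary one; a sketch using the example URL from the question, assuming London were picked:
<link rel="canonical" href="http://www.sitename.com/locations/london">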
Intermediate & Advanced SEO | nchlondon
-
Travel site schema problems
I'm developing a travel site for my home state (Kansas - http://www.kansasisbeautiful.com, though it still has some development work in progress), but I'm struggling to find schema to work with for some items. So far my site is laid out by both region (northeast Kansas, western Kansas, etc.) and location type (waterfalls, parks, etc.). I'm currently working on coding in schema markup. I've found schema types for waterfalls, parks, and landmarks, but I'm struggling to find anything for scenic drives (or highways, drives, anything related), hiking/biking trails, and regions (northeast Kansas, southeast Kansas, etc.). The question I have is: what can I do to still put some kind of markup on an item when there's nothing available that fits what I'm trying to mark up?
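One common fallback is to use the closest broad type schema.org does offer, such as TouristAttraction or plain Place, and carry the specifics in the name and description. A minimal JSON-LD sketch; the byway here is only an illustrative placeholder, not markup from the site:
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "TouristAttraction",
  "name": "Example Scenic Byway",
  "description": "A scenic drive through the Flint Hills region of Kansas.",
  "address": {
    "@type": "PostalAddress",
    "addressRegion": "KS"
  }
}
</script>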
Intermediate & Advanced SEO | msphoto
-
Disallow duplicate URLs?
Hi community, thanks for answering my question. I have a problem with a website. My website is http://example.examples.com/brand/brand1 (the good URL), but I have two filters to show something, and these generate two more URLs:
http://example.examples.com/brand/brand1?show=true (if we apply one filter)
http://example.examples.com/brand/brand1?show=false (if we apply the other filter)
My question is: should I disallow these filters in robots.txt like this?
Disallow: /*?show=*
Intermediate & Advanced SEO | thekiller99
-
Google not displaying meta description
Hi, one of my clients is seeing the following message in the SERPs: "A description of the page is not available because of this site's robots.txt". The site is built on WordPress, and I realised that the setting that blocks bots from crawling the site was checked by default. So I turned it off, fixed robots.txt, and submitted the sitemap again. Since then it's been almost 10 days and the problem still exists. Can anyone tell me what should be done to fix it, or whether there's a way to get Google to recrawl the pages?
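For anyone comparing notes: in older WordPress versions that setting writes a blanket block into the virtual robots.txt, so the before/after looks roughly like this (a sketch, not the client's actual file):
# Blocked - what "Discourage search engines from indexing this site" produces:
User-agent: *
Disallow: /
# Unblocked - crawling allowed:
User-agent: *
Disallow: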
Intermediate & Advanced SEO | mayanksaxena
-
Duplicate content question
Hi there, I work for a theater news site. We have an issue where our system creates a chunk of duplicate content in Google's eyes, and we're not sure how best to solve it. When an editor produces a video, it simultaneously 1) creates a page with its own static URL (e.g. http://www.theatermania.com/video/mary-louise-parker-tommy-tune-laura-osnes-and-more_668.html); and 2) displays said video on a public index page (http://www.theatermania.com/videos/). Since the content is very similar, Google sees them as duplicates. What should we do about this? We were thinking one solution would be to dynamically canonicalize the index page to the static page whenever a new video is posted, but would Google frown on this? Alternatively, should we simply nofollow the index page? Lastly, are there any solutions we may have missed entirely?
Intermediate & Advanced SEO | TheaterMania
-
Duplicate privacy policy pages
I work for a small web agency, and I noticed that many of the sites we build have been using the same privacy policy. Obviously it can be a bit of a nightmare to write a unique privacy policy for each client, so is Google likely to class this as duplicate content and apply a penalty? They must realise that privacy policies are likely to be the same or very similar, as most legal writing tends to be! I can block the content in robots.txt or meta noindex it if necessary, but I just wanted to get some feedback to see whether this is necessary.
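Of the two blocks mentioned, a meta noindex is generally the safer option, since a robots.txt disallow stops crawling but doesn't guarantee the page drops out of the index (and it stops Google from seeing a noindex at all). The tag is one line in the head of each policy page:
<meta name="robots" content="noindex, follow">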
Intermediate & Advanced SEO | Jamie.Stevens
-
Having a hard time with duplicate page content
I'm having a hard time redirecting website.com/ to website.com. The crawl report shows both versions as duplicate content. Here is my htaccess:
RewriteEngine On
RewriteBase /
#Rewrite bare to www
RewriteCond %{HTTP_HOST} ^mywebsite.com
RewriteRule ^(([^/]+/)*)index.php$ http://www.mywebsite.com/$1 [R=301,L]
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME}.php -f
RewriteRule ^(.*)$ $1.php [NC,L]
RewriteCond %{HTTP_HOST} !^.localhost$ [NC]
RewriteRule ^(.+)/$ http://%{HTTP_HOST}$1 [R=301,L]
I added the last two lines after seeing a Q&A here, but I don't think it has helped.
Intermediate & Advanced SEO | cgman
-
Duplicate Content
Hi everyone, I have a site in the UK on a .co.uk domain, and the same site in Ireland on a .ie domain. The only differences are the prices and perhaps different banners. The .ie site pulls all of its content from the .co.uk domain. Is this classed as content duplication? I've had problems in the past in which Google struggled to index the website. At the moment the site appears completely fine in the UK SERPs, but for Ireland I just have the title and domain appearing in the SERPs, with no extended title or description, because of the confusion I caused Google last time. Does anybody know a fix for this? Thanks
Intermediate & Advanced SEO | royb