Is there any code to prevent duplicate meta descriptions on blog pages?
-
I use rel=canonical on the blog pages, and to prevent duplicate titles I use the %%page%% code in the category page title.
Is there any similar code for the description?
-
Well, I checked the site and you are already adding the page number to the title. Then do the same for the categories or any other area that is being "paged".
blog/page/2/
title: Blog dental de Clinicas Dentales Propdental - Página 2 de 59
desc: Lee sobre cierto tema y aprende bla bla bla - Página 2
In my case I would remove the "de 59" just to avoid the 70-character limit in titles.
You can get great info here: http://moz.com/learn/seo/title-tag and here: http://moz.com/learn/seo/meta-description
Hope that helps.
-
I am using WordPress and the Yoast SEO plugin.
I have specific metas for each post; the duplicate meta descriptions are on the different blog pages. Maybe it should be like that:
http://www.propdental.es/blog/
http://www.propdental.es/blog/page/2/
http://www.propdental.es/blog/page/3/
same or no description on pages 2 and 3, and a different title because I placed the %%page%% code in the title of the blog page.
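For what it's worth, Yoast's %%page%% snippet variable is generally available in the meta description templates as well as the title templates, so a paginated archive can get a distinct description per page. A sketch only — the wording below is a placeholder, not your actual template:

```text
# Yoast SEO archive templates (description text is a placeholder)
Title template:            Blog dental de Propdental %%page%%
Meta description template: Lee sobre salud dental en el blog de Propdental %%page%%
```

%%page%% expands to something like "Página 2 de 59" only on paged URLs, so page 1 keeps the plain description.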
-
It would help if you told us a little more about the platform you are using. If it is WordPress, you should go with the Yoast SEO plugin, which will make all the changes and adjustments for you, while letting you set specific metas for specific posts.
Related Questions
-
Quick Fix to "Duplicate page without canonical tag"?
When we pull up Google Search Console, in the Index Coverage section, under the category of Excluded, there is a sub-category called 'Duplicate page without canonical tag'. The majority of the 665 pages in that section are from a test environment. If we were to include in the robots.txt file a wildcard covering every URL that starts with the particular root URL ("www.domain.com/host/"), could we eliminate the majority of these errors? That solution is not one of the 5 or 6 recommended solutions that the Google Search Console Help text suggests. It seems like a simple, effective solution. Are we missing something?
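A minimal sketch of the robots.txt block described above, assuming the test environment really does live under /host/ (note that robots.txt blocks crawling, not indexing, so URLs Google already knows about may linger in the report for a while):

```text
# robots.txt — keep crawlers out of the test environment (path is an assumption)
User-agent: *
Disallow: /host/
```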
Technical SEO | | CREW-MARKETING1 -
Meta description issue on Google
Hello, I have a small issue on Google with our meta description tag not always being properly displayed. If you search for the term Globe Car (in two words), everything is displayed properly: http://screencast.com/t/YQCUkJnk Now do the same search for the term GlobeCar (in one word) and the meta tag set in our homepage seems to be totally ignored; Google is now displaying something seemingly pulled out of a hat: http://screencast.com/t/K0KeeRGSgspV Does anyone have an idea what would cause this? Thanks!
Technical SEO | | GlobeCar1 -
How to deal with duplicated content on product pages?
Hi, I have a webshop with products in different sizes and colours. Each item has a different URL with almost the same content (title tag, product descriptions, etc.). To prevent duplicate content, I'm wondering about the best way to solve this problem, keeping in mind: - It is impossible to create one page/URL per product with filters on colour and size - It is impossible to rewrite the product descriptions to make them unique I'm considering canonicalizing the rest of the colour/size variations, but the disadvantage is that if the product is not in stock it disappears from the website. Looking forward to your opinions and solutions. Jeroen
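For reference, the canonicalization being considered is just a link element on each colour/size variant pointing at the chosen main product URL — a sketch with a made-up URL:

```html
<!-- Placed in the <head> of every variant page; the URL is a placeholder -->
<link rel="canonical" href="https://www.example.com/product/main-product/" />
```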
Technical SEO | | Digital-DMG0 -
Need Help With WWW vs. Non-WWW Duplicate Pages
A friend I'm working with at RedChairMarket.com is having duplicate page issues. Among them, both www and non-www URLs are being generated automatically by his software framework, ASP.NET MVC 3. How should we go about finding and tackling these duplicates? Thanks!
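Since ASP.NET MVC sites typically run on IIS, one common way to collapse www/non-www duplicates is a 301 rule in web.config using the URL Rewrite module — a sketch only, with example.com standing in for the real host:

```xml
<!-- web.config (requires the IIS URL Rewrite module); host names are placeholders -->
<system.webServer>
  <rewrite>
    <rules>
      <rule name="Canonical www host" stopProcessing="true">
        <match url="(.*)" />
        <conditions>
          <add input="{HTTP_HOST}" pattern="^example\.com$" />
        </conditions>
        <action type="Redirect" url="http://www.example.com/{R:1}"
                redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```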
Technical SEO | | BrittanyHighland0 -
Once duplicate content found, worth changing page or forget it?
Hi, the SEOmoz crawler has found over 1000 duplicate pages on my site. The majority are based on location, and unfortunately I didn't have time to add much location-based info. My question is: if Google has already discovered these, determined they are duplicates, and chosen the main ones to show in the SERPs, is it worth updating all of them with localized information so Google accepts the changes and maybe considers them different pages? Or do you think they'll always be considered duplicates now?
Technical SEO | | SpecialCase0 -
Catch 22 on duplicate page titles
Hi all, I'm quite new to the SEO space so I apologise if all the information below isn't technically perfect. I ran the SEOmoz Pro tool for the first time a month ago (fantastic tool). It picked up a wealth of errors on our site that we are now working on. The problem: we use dynamic pages to display job listings pulled from our database, and these have picked up many duplicate page titles and duplicate content. For example: Landing page: http://www.arm.co.uk/jobs/it-contract-jobs/sec=itcontractjobs Page 2: http://www.arm.co.uk/jobs/1/-/-/2/itcontractjobs-/9999/2 Page 3: http://www.arm.co.uk/jobs/1/-/-/2/itcontractjobs-/9999/3 Following the results of the Moz tool, we have now noindexed and nofollowed the dynamic pages, and the errors have dramatically dropped, great! However, on reflection, we generate quite a lot of traffic to individual jobs listed on our website. By nofollowing the pages we have stopped passing any 'juice' to them, and by noindexing we may be taking them out of Google's index completely. These dynamic pages and individual job listings do generate a lot of traffic to our website via organic search. We do submit the sitemap to Google, which should index the individual jobs that way. So, the question is (I hope this is making sense): are the gains of reducing errors picked up in the Moz tool (to improve overall site performance) likely to outweigh the traffic generated by these dynamically generated pages being indexed and followed by Google? Ultimately we would like the static landing pages to retain a stronger page rank. Any guidance is very much appreciated. Best Regards,
Sam.
Technical SEO | | ARMofficial0
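One middle ground sometimes used for the pagination dilemma above is noindex combined with follow, so the paged listings stay out of the index while crawlers can still reach (and pass equity to) the individual job URLs linked from them — a sketch of the tag involved:

```html
<!-- On the paged listing URLs only: keep them out of the index,
     but let crawlers follow the links to individual job pages -->
<meta name="robots" content="noindex, follow" />
```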