Best practice: unique meta descriptions on blog 'tag' pages
-
Hi everyone,
I'm curious: are there best practices for adding unique meta descriptions to blog tag pages? (I'm using WordPress.)
For instance, with Platinum SEO, the meta description on an original post is either the excerpt or a custom sentence you specify. It doesn't appear that Platinum SEO allows custom descriptions on tag pages.
I'd love to hear your thoughts.
Thanks!
Peter
-
Kane is correct here.
Generally, you don't want to index tag pages: you'll run into duplicate-content issues, and you'll be adding thin pages that don't provide much value.
In terms of user navigation, tag pages can be convenient, but they're not a strategy for growing your site. Here's another example of a personal site (http://www.onblastblog.com/blog/) that puts a "noindex" tag on its category page and chose to remove tag pages altogether.
Only if unique content is written on a category page should it be considered for indexing.
If you're using WordPress, the best plugin for managing your basic title and description data is Yoast.
Biznit, since you wrote that message a couple of years ago, I hope you haven't exploited tag pages by creating hundreds of them and indexing them.
-
Hi chaps, nice thread. Is there a plugin that will take the title and description of your tags and put them into the page? It seems kind of obvious. I'm happy to write unique content for each tag page (I have the same issue; a lot of my tag pages rank well), but I want to improve those tag pages with unique content. It seems ridiculous that it isn't easy to add a unique content "area" without resorting to a text widget that only shows up on a given tag page (perhaps using Widget Logic... zzz).
I have about 200 tag pages. If I could just have a plugin that uses the tag title as the page's H1 and the tag description as a unique intro paragraph that appears automatically on each one (if I choose to write it), that would be better. All I'd have to do is keep the descriptions up to date in the WordPress back end?
I'm amazed that no one has done this, because tags and categories are a great way to build a deep, SEO-friendly website. With time and effort, it should be just as easy to edit those pages as it is to edit a regular page; why it isn't is a mystery to little old me. But I'll settle for an easy way to create a unique header and paragraph for the moment!
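For what it's worth, something close to this can be done in the theme itself rather than a plugin. Here's a minimal sketch of a tag.php template, assuming a standard WordPress theme: the template tags used are core WordPress functions, but the markup and the "tag-intro" class name are just illustrative.

```php
<?php
// Hypothetical tag.php sketch: renders the tag name as the page H1 and
// the tag's Description field (set under Posts > Tags) as a unique
// intro paragraph, then lists posts using excerpts only.
get_header(); ?>

<h1><?php single_tag_title(); ?></h1>

<?php if ( tag_description() ) : ?>
    <div class="tag-intro"><?php echo tag_description(); ?></div>
<?php endif; ?>

<?php if ( have_posts() ) : while ( have_posts() ) : the_post(); ?>
    <h2><a href="<?php the_permalink(); ?>"><?php the_title(); ?></a></h2>
    <?php the_excerpt(); // excerpts, not full posts, to limit duplication ?>
<?php endwhile; endif;

get_footer();
```

With a template like this, keeping the descriptions up to date in the WordPress back end is all the ongoing maintenance the tag pages would need.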
-
My opinion is that tag pages, if done correctly, are not considered duplicate content; just make sure that you only use excerpts on your tag pages and not the full article. Custom descriptions for tags aren't found in Platinum SEO because there's no need: you can assign tag descriptions directly from WordPress > Posts > Tags. Just like you, my tag pages rank very well and bring me a lot of traffic.
-
Gotcha. I agree - don't cut off the traffic if those pages are already ranking. Unique descriptions for each tag page are definitely the way to go, if you can keep up with all of your tags. This will obviously be much easier if you use 1-3 tags per post, and not 10 random ones. I personally try to reuse tags as much as possible, and combine similar ones... helps keep your sanity.
For the record, I consider the unique paragraph approach the ideal solution for categories and tags, but typically tags are harder to keep up with.
-
Thanks for replying, Kane. I'm with you re: noindexing tags. I'm all for it, but I have a few tags that return a reasonable amount of traffic on a daily basis. I'd hate to cut that off, so I guess I'm stuck dealing with it on a tag-by-tag basis (ugh).
After I posted this, I figured out how to activate the tag description on my blog (it was a theme issue). So now it's off to set up custom descriptions...
-
I usually noindex WordPress tag pages, meaning I tell Google not to show them in search results. This is because they don't add much SEO value to the site; tag pages are typically 100% duplicate content.
That said, IF you provide a custom description at the top of each tag page, you could make an argument for keeping them indexed, since there would now be relevant unique content at the top of each page.
Here's an example from a personal site where I keep my category pages indexed: http://www.seattlehomestead.com/category/gardening/. See the paragraph at the top of the page? That's the minimum amount of unique content you'd want on a tag or category page to keep it indexed.
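For reference, "noindexing" a tag page just means emitting a robots meta tag in its head. This is a sketch of the kind of tag plugins such as Yoast output when you set taxonomy archives to noindex (the exact directive values a given plugin emits may vary):

```html
<!-- Keeps the page out of search results while still letting -->
<!-- crawlers follow its links through to your posts. -->
<meta name="robots" content="noindex,follow">
```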
Related Questions
-
Website blog is hacked. What's the best practice to remove bad URLs?
Hello! Our site was hacked, which created a few thousand spam URLs on our domain. We fixed the issue, and all the spam URLs now return 404, but the Google index still shows a couple of thousand bad URLs. My question is: what's the fastest way to remove the URLs from the Google index? I created a sitemap of the bad URLs and submitted it to Google; I'm hoping Google will crawl them, since they're in the sitemap, and drop them from the index because they return 404. Are there any tools to get a full list of the Google index? (Search Console downloads are limited to 1,000 URLs.) A Moz site crawl gives a larger list, but it includes URLs that aren't in the Google index. I'm looking for a tool that can download results from a site: search. Is there any way to remove the URLs from the index in bulk? Removing them one by one will take forever. Any help or insight would be very appreciated.
Technical SEO | | ajiabs1 -
Brand page meta tag
I have around 100 brands on my website (with 100 different pages). Please suggest the best way to create meta tags for all the brand pages.
Technical SEO | | Obbserv0 -
My Home Page meta title on Google isn't what it should be
Hey guys, my website is http://www.oxfordmeetsfifth.com. According to SEOcentro, my website should appear to Google as "Fashion Tips for Women | Oxford Meets Fifth". I have used the Yoast plugin and enabled "force rewrite titles" to ensure that is the home page meta title. It also appears correctly in the browser. Could anyone advise why this is the case? Thanks in advance!
Technical SEO | | OxfordMeetsFifth0 -
Removal of date archive pages on the blog
I'm currently building a site that has an archive of blog posts by month/year, but from a design perspective I'd rather not have these on the new website. Is the correct practice to 301 these to the main blog index page? To let them 404? Or to keep them after all? Many thanks in advance. Andrew
Technical SEO | | AndieF0 -
Best practices for repetitive job postings
I have a client who is a recruiter for skilled trades jobs. They post quite a few jobs on their job board on a regular basis, and they frequently have postings that are very similar to older jobs, or multiple current postings that are similar to each other. Looking at their Webmaster Tools and a site: search in Google, it does appear they have some duplicate content issues. We're thinking it's because of the similar job posts. What is the best practice for dealing with this? And is there any way to correct the situation so that the number of "omitted due to similarity" results declines? Thanks for your help!
Technical SEO | | PlusROI0 -
Duplicate Meta Descriptions From Pages That Don't Exist
Hi guys, I am hoping someone can help me out here. I have had a new site built with a unique theme, using WordPress as the CMS. Everything was going fine, but after checking Webmaster Tools today I noticed something that I just cannot get my head around. Basically, I am getting duplicate-page warnings on a couple of things, one of which I think I understand but don't know how to clear.
First, I get a duplicate meta description warning for url 1: / and url 2: /about/who-we-are. I understand this, as the who-we-are page is set as the homepage through the WordPress Reading settings, but is there a way to make the duplicate meta description warning disappear?
The second one is for /services/57/ and /services/. Both URLs lead to the same place, although I never created the /services/57/ page. It does not show on the XML sitemap, but Google obviously sees it because it triggers a warning in Webmaster Tools. If I press edit on the /services/57/ page, it just edits the /services/ page. Is there a way I can remove the /57/ page safely, or a method to ensure Google at least does not see it? Probably a silly question, but I cannot find a comprehensive answer to this. Thanks in advance
Technical SEO | | southcoasthost0 -
Language/country redirect best practice?
Hi, what is SEO best practice when it comes to redirecting users from www.domain.com to their language/country-specific version, say www.domain.com/de for Germany? From what I heard in one of the Whiteboard Fridays, it seems to be JavaScript based on IP and browser language, which then sets a cookie; is that correct? Or should we let our users manually select their language/country on their first visit? Any suggestions appreciated, thanks!
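A rough sketch of that pattern, done server-side in PHP rather than client-side JavaScript. The domain, cookie name, supported locales, and fallback are all placeholders; treat this as an illustration of the approach (detect once, remember in a cookie, redirect with a 302 so search engines still crawl each locale), not a drop-in implementation.

```php
<?php
// Hypothetical sketch: pick a locale path from the visitor's
// Accept-Language header, remember it in a cookie, and redirect.
function pick_locale(string $accept_language, array $supported): string {
    foreach (explode(',', $accept_language) as $part) {
        // Each entry looks like "de-DE;q=0.9"; keep the primary subtag.
        $lang = strtolower(substr(trim($part), 0, 2));
        if (in_array($lang, $supported, true)) {
            return $lang;
        }
    }
    return 'en'; // assumed fallback locale
}

if (!isset($_COOKIE['locale'])) {
    $locale = pick_locale($_SERVER['HTTP_ACCEPT_LANGUAGE'] ?? '', ['en', 'de', 'fr']);
    setcookie('locale', $locale, time() + 31536000, '/'); // remember for a year
    // 302, not 301: the per-locale URLs should remain crawlable.
    header('Location: https://www.domain.com/' . $locale . '/', true, 302);
    exit;
}
```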
Technical SEO | | rtora0 -
How Best to Handle 'Site Jacking' (Unauthorized Use of Someone else's Dedicated IP Address)
Anyone can point their domain at any IP address they want. I've found at least two domains (same owner), totally unrelated to each other and to us, that are currently pointing at our IP address. The IP address is on our dedicated server (we control the entire physical server) and is exclusive to that one domain, so it isn't a virtual-hosting misconfiguration issue. This has caused Google to index their two domains with duplicate content from our site (found by searching for site:www.theirdomain.com). Their site does not come up in the first 50 results for any of the keywords we rank for, so Google obviously knows they are the duplicate content, not us (our site has been around for 12 years, much longer than theirs). Their registration is private, and we have not been able to contact these people. I'm not sure whether this is just a DNS mistake on the two domains or someone doing it intentionally to try to harm our ranking. It has been going on for a while, so it is most likely not a mistake: with two live sites, they would have noticed long ago that they were pointing at the wrong IP. I can think of a variety of actions to take, but I can find no information anywhere on what Google officially recommends in this situation, assuming you can't get a response. Here are my ideas.
a) Approach it as a digital copyright violation and go through the lengthy process of having their site taken down. Pro: eliminates the issue. Con: sort of a pain, and we could be leaving some link juice on the table?
b) Modify .htaccess to 301-redirect any URL not using our domain to our domain. This means Google would see several domains all pointing to the same IP, with every domain except ours 301-redirecting to ours. Not sure if that will harm (or help) us. Wouldn't we then receive link juice from any site out there linking to these other domains? Con: Google will see the context of those backlinks, and their link text will not be related at all to our site. In addition, if any of these other domains pointing to our IP have backlinks from bad neighborhoods, I assume it could hurt us?
c) Modify .htaccess to return a 404 Not Found or 403 Forbidden error? I posted in other forums and got suggestions that are all over the map; in many cases the posters didn't even understand what I was talking about, thinking these were just normal backlinks. Argh! So I'm taking this to "The Experts" on SEOMoz.
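For what it's worth, option (b) can be sketched as a short .htaccess fragment using Apache's mod_rewrite; "example.com" stands in for the real canonical domain here.

```apache
# Sketch: 301-redirect any request whose Host header is not the
# canonical domain back to the canonical domain, preserving the path.
RewriteEngine On
RewriteCond %{HTTP_HOST} !^(www\.)?example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```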
Technical SEO | | jcrist1