Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
Should I disable the indexing of tags in Wordpress?
-
Hi,
I have a client that is publishing 7 or 8 news articles and posts each month. I am optimising selected posts and I have found that they have been adding a lot of tags (almost like using hashtags).
There are currently 29 posts but already 55 tags, each of which has its own archive page, and all of which are added to the site map to be indexed (https://sykeshome.europe.sykes.com/sitemap_index.xml).
I came across an article (https://crunchify.com/better-dont-use-wordpress-tags/) that suggested that tags add no value to SEO ranking, and as a consequence Wordpress tags should not be indexed or included in the sitemap.
I haven't been able to find much more reliable information on this topic, so my question is: should I get rid of the tags from this website and focus on pages, posts and categories (redirecting existing tag pages back to the site home page)?
It is a relatively new website and I am conscious of the fact that category and tag archive pages already substantially outnumber actual content pages (posts and news) - I guess this isn't optimal.
I'd appreciate any advice.
Thanks
-
Yes, it would be best if you turned the tags option off. It improved performance for me, for example on the Shillong Teer Club chart site.
-
Disabling the indexing of tags in WordPress can be beneficial for SEO purposes, as it prevents search engines from indexing individual tag pages, which may otherwise lead to duplicate content issues. However, whether to disable tag indexing depends on your specific website goals and content structure. If you use tags sparingly and they add value to your site's organization, leaving them indexed may be beneficial. Evaluate your SEO strategy and content structure to determine the best approach for your WordPress site.
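For anyone who wants to try this without a plugin, here is a minimal sketch of how tag archives could be marked noindex from a theme's functions.php. It assumes WordPress 5.7+ (which introduced the wp_robots API); most SEO plugins such as Yoast or Rank Math expose the same behaviour as a simple toggle in their taxonomy settings, which is usually the easier route.

```php
// Minimal sketch, assuming WordPress 5.7+ (wp_robots API).
// Marks tag archive pages as "noindex, follow" while leaving
// posts, pages and category archives untouched.
add_filter( 'wp_robots', function ( $robots ) {
    if ( is_tag() ) {
        $robots['noindex'] = true;  // keep tag archives out of the index
        $robots['follow']  = true;  // but still let crawlers follow links on them
    }
    return $robots;
} );
```

Removing the tag archives from the XML sitemap is a separate step, usually handled in whichever SEO plugin generates the sitemap.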
-
I'm having the same problem right now. My site is fairly new, but it was already receiving some organic traffic; then, all of a sudden, traffic sank like a brick and I can't work out why.
The only thing that changed was adding tags to blog posts, which I think might be creating duplicate content, so I'm proceeding to disable those. I will leave categories alive for the moment because they were bringing traffic, but if nothing changes after that I will de-index them as well. The site in question is sluthpass; I hope I can recover traffic after disabling those annoying tags.
-
Hello experts, I have disabled the tags, but they are still showing in Google. What should I do now? You can check redeemcodecenter.com. Thanks in advance.
-
If you have a large number of tags that don't add clear value to your site's content, disabling tag indexing in WordPress can be beneficial for search engine optimization. However, if your tags are well-curated and provide meaningful navigation for your users, enabling indexing can improve discoverability and site organization. Evaluate the relevance and usefulness of your tags, considering both SEO considerations and user experience before making a decision.
-
I experienced the same problem. I once read an article saying that Google prefers websites with a neat structure, and in my opinion it is difficult to make tags more structured on my website. My site is malasngoding.com. What do you think?
-
I was also looking for the answer to the same question for my website https://abcya.in/. I think tags should not be indexed.
-
I don't think it's a good idea. I'm testing tons of articles with and without tags for my website, Dizzibooster. It seems that adding tags provides an edge for indexing purposes. However, you can test these things yourself.
-
I am facing the same issue on my website AmazingFactsHindi. As per our expert discussion and after reading this forum, I decided to de-index all the tag and category pages that are creating duplicate content issues on our website.
-
We had similar questions on SEO. We experimented with disabling tags for the last 4 weeks. The only impact I have been able to find so far is that the thecodebuzz website (https://www.thecodebuzz.com/) stopped getting hits for a few impressions that were based on tag keywords. We are still evaluating the impact.
-
Heyo,
If your tags and categories are providing value to your users or helping with your site's SEO, you might not want to remove them from search engine indexes. I disabled them on my site OceanXD, and it was a good decision for me.
-
I have the same question, but I have found that blocking category and tag pages is good for SEO.
I have a blog site, Tech News Blog. I created around 400 tags, but I saw that this was creating duplicate content issues. In my opinion, de-indexing tags and categories is better for SEO.
-
@JCN-SBWD You can index your tags as long as it doesn't affect the indexing of your posts. Tags do get traffic as well. The only reason why I stopped indexing my tags is that it affected the indexing of my posts: tags got indexed in a matter of minutes, while it took hours, sometimes days, before my posts got indexed.
-
I would recommend disabling tag indexing, as there are cases where you have multiple tags for the same topic. You can index categories, as mentioned above, since they are more structured and define your website in some way. If you write a custom excerpt for each post, it helps category pages have unique content for each post excerpt.
-
It's a good idea to block tags, since they are duplicate content and may dilute the performance of your real pages. But if you find that certain tag or author pages bring valid traffic, you can make an exception for them. It's up to you.
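If you do want to carve out exceptions, a variation on the earlier snippet could keep a handful of high-performing tag archives indexable while noindexing the rest. This is only a sketch, again assuming WordPress 5.7+, and the tag slugs in the allowlist are hypothetical placeholders.

```php
// Minimal sketch: noindex all tag archives EXCEPT a hypothetical
// allowlist of slugs that are known to earn organic traffic.
$indexable_tags = array( 'case-studies', 'how-to-guides' );

add_filter( 'wp_robots', function ( $robots ) use ( $indexable_tags ) {
    if ( is_tag() && ! is_tag( $indexable_tags ) ) {
        $robots['noindex'] = true;  // drop the low-value tag archives
        $robots['follow']  = true;  // still allow link discovery
    }
    return $robots;
} );
```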
-
Can you please explain what exactly you do?
-
Many thanks for the prompt response and also for confirming my suspicions, it is much appreciated.
The robots suggestion is handy too.
-
Personally, I usually do this as well as blocking them in robots.txt to save on crawl allowance, but you should noindex first: if Google is blocked from crawling via robots.txt, how will it ever see the noindex tags? So the two steps need to be staggered.
I find that tag URLs result in quite messy SERPs, so I prefer to de-index those and then really focus on adding value to the 'actual' category URLs. Because categories have a defined structure, they're better for SEO (IMO).
Categories are usually good for SEO if you tune and tweak them (and if their architecture is linear), but tags are very messy.
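To illustrate the staggering: only after the tag archives have been recrawled and dropped from the index (typically a few weeks after the noindex goes live) would a robots.txt rule like the sketch below make sense, and only if the site's tag archives actually live under the default /tag/ path - both of those are assumptions here.

```
# Step 2 only: add this AFTER Google has seen the noindex and dropped the tag pages.
# Assumes the default WordPress permalink base for tag archives (example.com/tag/...).
User-agent: *
Disallow: /tag/
```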
Related Questions
-
Mass Removal Request from Google Index
Hi, I am trying to cleanse a news website. When this website was first made, the people that set it up copied all kinds of articles they had as a newspaper, including tests, internal communication, and drafts. This site has lots of junk, but this kind of junk was on the initial backup, aka before 1st-June-2012. So, removing all mixed content prior to that date, we can have pure articles starting June 1st, 2012! Therefore my dynamic sitemap now contains only articles with a release date between 1st-June-2012 and now. Any article that has a release date prior to 1st-June-2012 returns a custom 404 page with a "noindex" meta tag, instead of the actual content of the article. The question is how I can remove all this junk from the Google index as fast as possible, given that it is no longer on the site but still appears in Google results. I know that for individual URLs I need to request removal via https://www.google.com/webmasters/tools/removals. The problem is doing this in bulk, as there are tens of thousands of URLs I want to remove. Should I put the articles back in the sitemap so the search engines crawl the sitemap and see all the 404s? I believe this is very wrong. As far as I know this will cause problems because search engines will try to access non-existent content that is declared as existent by the sitemap, and return errors in Webmaster Tools. Should I submit a DELETED ITEMS SITEMAP using the <expires> tag? I think this is for custom search engines only, and not for the generic Google search engine (https://developers.google.com/custom-search/docs/indexing#on-demand-indexing). The site unfortunately doesn't use any kind of "folder" hierarchy in its URLs, but instead the ugly GET params, and a folder-based pattern is impossible since all articles (removed junk and actual articles) are of the form http://www.example.com/docid=123456. So, how can I bulk remove all the junk from the Google index, relatively fast?
Intermediate & Advanced SEO | ioannisa
-
301s being indexed
A client website was moved about six months ago to a new domain. At the time of the move, 301 redirects were set up from the pages on the old domain to point to the same page on the new domain. New pages were set up on the old domain for a different purpose. Now, almost six months later, when I do a query in Google on the old domain like site:example.com, 80% of the pages returned are 301 redirects to the new domain. I would have expected this to go away by now. I tried removing these URLs in Webmaster Tools but the removal requests expire and the URLs come back. Is this something we should be concerned with?
Intermediate & Advanced SEO | IrvCo_Interactive
-
Links from non-indexed pages
Whilst looking for link opportunities, I have noticed that the website has a few profiles from suppliers or accredited organisations. However, a search form is required to access these pages and when I type cache:"webpage.com" the page is showing up as non-indexed. These are good websites, not spammy directory sites, but is it worth trying to get Google to index the pages? If so, what is the best method to use?
Intermediate & Advanced SEO | maxweb
-
Lowercase VS. Uppercase Canonical tags?
Hi MOZ, I was hoping that someone could help shed some light on an issue I'm having with URL structure and the canonical tag. The company I work for is a distributor of electrical products and our e-commerce site is structured so that our URLs (specifically, our product detail page URLs) include a portion (the part #) that is all uppercase (e.g. buy/OEL-Worldwide-Industries/AFW-PG-10-10). The issue is that we have just recently included a canonical tag in all of our product detail pages, and the programmer that worked on this project has every canonical tag in lowercase instead of uppercase. Now, in GWT, I'm seeing over 20,000-25,000 "duplicate title tags" or "duplicate descriptions". Is this an issue? Could this issue be resolved by simply changing the canonical tag to reflect the uppercase URLs? I'm not too well versed in canonical tags and would love a little insight. Thanks!
Intermediate & Advanced SEO | GalcoIndustrial
-
Do you add 404 page into robot file or just add no index tag?
Hi, I've got differing opinions on this so I wanted to double-check what your take is. We've got a /404.html page and I was wondering if you would add this page to robots.txt so it wouldn't be indexed, or would you just add a noindex tag? What would be the best approach? Thanks!
Intermediate & Advanced SEO | Rubix
-
Meta tags - are they case sensitive?
I just ran the wordtracker tool and noticed something interesting. The tool didn't pick up our meta description. It's strange as our meta descriptions appear in organic search results and Moz never reported missing meta descriptions. After reviewing other pages, I noticed our meta description tag is written as the following: name="Description" content=" I never thought about this, but are meta tags case sensitive? Should it be written as: name="description" content=" Thoughts?
Intermediate & Advanced SEO | Bio-RadAbs
-
De-indexed Link Directory
Howdy Guys, I'm currently working through our 4th reconsideration request and just have a couple of questions. Using Link Detox's (www.linkresearchtools.com) new tool, they have flagged up 64 links that are toxic and should be removed. After analysing them further, most of them are link directories that have now been de-indexed by Google. Do you think we should still ask for them to be removed, or is this a pointless exercise as the link has effectively already been removed because the directory has been de-indexed? Would like your views on this, guys.
Intermediate & Advanced SEO | ScottBaxterWW
-
Rel=canonical tag on original page?
Afternoon All,
We are using Concrete5 as our CMS system; we are due to change, but for the moment we have to play with what we have got. Part of the C5 system allows us to attribute our main page into other categories, via a page aliaser add-on. But what it also does is create several URL paths and duplicate pages depending on how many times we take the original page and reference it in other categories. We have tried C5 canonical/SEO add-ons but they all seem to fall short. We have tried to address this issue in the most efficient way possible by using the rel=canonical tag. The only issue is the limitations of our CMS system. We add the canonical tag to the original page header and this will automatically place this tag on all the duplicate pages and in turn fix the problem of duplicate content. The only problem is that the canonical tag is on the original page as well, but referencing itself, effectively creating a tagging circle. Does anyone foresee a problem with the canonical tag being on the original page but in turn referencing itself? What we have done is try to simplify our duplicate content issues. We have over 2,500 duplicate page issues because of this aliasing add-on and want to automate the canonical tag addition, rather than go to each individual page and manually add this tag, so the original reference page can remain the original. We have implemented this tag on one page at the moment, with 9 duplicate pages/URLs, and are monitoring, but was curious if people had experienced this before or had any thoughts?
Intermediate & Advanced SEO | Jellyfish-Agency