Pointless WordPress Tagging: Keep or Unindex?
-
Simple as that.
Pointless, random tags that serve no purpose other than adding apparent bulk to the website. The tag archives just show duplicate content, the keywords themselves are essentially random, and most of the tags are only used on one post.
If I remove them, however, the site will probably drop from around 650 pages to 450 (assuming I keep any tags that were used more than once).
I have read through some of the other posts on here, and I know that Google will do some of the work as far as duplicate content is concerned. As far as UX is concerned, though, all of these tags are worthless. Thoughts?
-
I think it would be nice to see more "This was my problem, and here is how I fixed it" posts on YouMoz.
Keep up the great work!
-
YES! I was thinking about turning this whole thing into a blog post about what I did to fix it. I'm trying to come up with a catchy title.
"The webs we weave, when we practice to deceive"
That may be a little too on the nose. But I think you catch the drift!
Thanks again to both of you!
-
Nice work!
-
Sounds like you've been busy! It will be interesting to see how all of this plays out over the next few months. Keep notes of what you've done; it could make for a good blog post.
-
Update
I have officially removed ALL tags. I also found out that our previous web guy had added some rules to the .htaccess that changed how the URLs were displayed, so there were no "tag" segments showing in the URLs, which was news to me! Previously I had been using a quick-redirect plugin, but with the number of 301s involved I figured it was time to move them into the .htaccess instead.
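For anyone doing something similar, the redirect rules I'm describing look roughly like this (just a sketch: example.com and the tag slug are placeholders, and the right target for each redirect depends on your own site):

    # Send a removed tag archive to the most relevant surviving page
    Redirect 301 /tag/old-tag-name/ https://example.com/relevant-category/

    # Or sweep every remaining /tag/ URL to a single landing page
    RedirectMatch 301 ^/tag/ https://example.com/blog/

Both directives are part of Apache's mod_alias, so they work in a plain .htaccess file.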
It has been a few weeks since the changes.
- Pages Crawled: 194
- High Priority Issues: 9
- Medium Priority Issues: 324 (although I know what caused this, and it won't appear in the next audit)
And for the first time in the little over a month that I have been working here, we finally saw green in our traffic numbers! We are up 4%.
Thanks again for all your help!
-
UPDATE
So as of 7/9 I have removed 423 tags and redirected them to 2 of our main tags.
I isolated the 2 most effective tags and spread them across the 160 posts that we have. I limited it to one tag per post so as not to create duplicate content, and because I am leaning towards removing tags altogether.
The other reason I am doing this is that our categories seem to be gaining more traction than our tags.
My thought moving forward would be to take all of these and fold them into the categories.
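(Side note for anyone auditing their own tags before folding or cutting: if WP-CLI happens to be available on the server, it can show tag usage in one command. This is only a sketch, with a placeholder term ID, and not a description of the exact process I used.)

    # See how many posts each tag actually touches
    wp term list post_tag --fields=term_id,name,count

    # Delete a tag that turns out to be dead weight (123 is a placeholder ID)
    wp term delete post_tag 123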
My Audit reads as follows:
- Pages Crawled dropped from 624 to 240
- High Priority Issues dropped from 42 to 4
- Medium Priority Issues dropped from 141 to 102
It is much too early to tell how things are going with traffic. Our visits are down 2% (and nobody checks into rehab until at least a week or two after the holiday weekend), but the number of keywords sending us visits is up by 9.
I will keep you all informed!
-
As much as I wish these pages were not a problem, and given that I could probably write a blogella (a short-story blog) about just how messy this website I inherited actually is, I am inclined to think they are doing more harm than good. Our numbers are staggering:
We have 660 indexed pages and, as you can see, 440 of them are WordPress tag pages.
On the link side, we have 145 different root domains that account for 11,000 inbound links.
We have something like 18K internal links.
Things are not good over here.
Almost none of the pages have meta tags, alt tags, or proper h1/h2/h3 structure, etc.
-
For example, we have 27 articles and 440 tags... that should give you some insight into the website I'm trying to clean up...
With this information, I can say that I would delete all 440 tags if this were my website. I would not need to think about it. These pages are going to be duplicate content and dangerous to the health of the site. They will also be a power sink.
-
That's skewed any way you look at it. But still: put them all in a secondary sitemap.html so they are not orphans, remove them from the sitemap.xml, place a noindex in the HTML head, and still try to consolidate where possible.
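To be concrete, the noindex is just one meta element in the head of each tag archive you want kept out of the index (a rough sketch; in WordPress this is usually switched on through an SEO plugin rather than edited into the theme by hand):

    <!-- in the <head> of each tag archive you want de-indexed -->
    <meta name="robots" content="noindex, follow">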
In general, we do not want to get rid of pages that are not a problem, because they can receive organic traffic for untargeted keywords that we have no other real way of discovering. The web is an organic momentum flux, not a solid-state structure. It needs some degree of unintended, uncalculated behavior, including in website structure. Otherwise the sum of the parts of all pages in Google would equal the value of Google, which is not the case. Google connects dots; we interpret and find new meanings and relations, and that translates into traffic we did not expect.
The momentum flux is a joke, of course. It's a quantum state, of course.
-
I don't want to speak for EGOL here, but I don't think he is suggesting we CUT everything. What I got from his post was: pull what's worthless and redirect it (or, as you say, consolidate it) into what's worthwhile.
The webmaster before me was writing articles with a spinner, or at least I believe he was, so we end up with a title and then 30 WP tags. Of those 30 keywords, maybe 5 will be tagged on 5 other posts, 5 will be tagged on at least one other post, 2 will be branded, and the rest are one-off keywords that are very random and almost partial sentences.
For example, we have 27 articles and 440 tags... that should give you some insight into the website I'm trying to clean up...
-
No, do not use a 301, and certainly do not remove any pages from the index as suggested here. That's foolish, uncalled for, and potentially harmful, compared with the near-zero risk of simply letting the pages be and only making them less prominent to users of the site. And if you really feel you need to cut the number of tag pages drastically, then use rel=canonical instead of a 301!
Consolidate, not decimate! When we 301, we assimilate pages into one page: we say the old page is gone for good and the new one is now the page for the old link. This diffuses the keywords the old page was found for, because it melts all the different pages that 301 to one target into a single page. However, when we use a canonical URL, we consolidate the pages into one new page that bundles the old ones. When we search for a page that has a canonical pointing to another page, it still ranks next to the new page for a while; only the title in the search results is the one of the page the canonical refers to. With a 301, the old page disappears completely from the index and the Google cache, along with the internal keyword associations it had before. So use canonical, not 301! And my advice: consolidate to one useful tag page with a real body of work, optimize that page for a primary keyword like 'seo news' or something, and leave the pages that already have a 301 as they are, but don't link to them anymore from here on.
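To illustrate, the canonical is just a link element in the head of each thin tag page, pointing at the one tag page you decide to keep (a rough sketch; example.com and the tag slug are placeholders):

    <!-- on every thin tag archive, pointing at the consolidated tag page -->
    <link rel="canonical" href="https://example.com/tag/seo-news/">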
Hope this is helpful.
Gr Daniel
-
Powerful quote regarding Google / Search Engine dependency!
-
"I don't trust having Google do stuff for me that I could do myself, because plenty of times Google says how they are gonna do things and then change their mind without tellin' anybody." -EGOL
That may be one of the best industry quotes I have ever read....
-
I would do a 301.
If you use the URL removal tool, that only works for Google.
If you do a 301, that is on your server, and every attempt to access that page goes where you want it to.
I don't trust having Google do stuff for me that I could do myself, because plenty of times Google says how they are gonna do things and then change their mind without tellin' anybody.
-
EGOL strikes again! That was my thought.
So would it be better to remove the tags and do a 301, as opposed to removing them from the index with the URL Removal tool? Or are you saying to add a 301 to them?
-
Go into your analytics and see if they are pulling any traffic from search. See if they are pulling in any traffic from referrals or social media.
I am willing to bet that those pages are dead weight.
These types of pages do not exist on any of my sites.
So, if you find in the analytics that these pages are dead weight, then delete them, use 301 redirects, and turn them off in your content manager.
If you have content that you want to promote or that lots of people are lookin' at, then give those pages links in obvious locations on every page of your website. People will look at that... they will rarely click a tag.