Pointless WordPress Tagging: Keep or Unindex?
-
Simple as that.
Pointless random tags that serve no purpose other than adding apparent bulk to the website. They just produce duplicate content, they are literally random keywords, and for the most part each tag is only used on one page.
If I remove them, however, they will probably drop our site from around 650 pages to 450 (assuming I keep any tags that were used more than once).
I have read through some of the other posts on here, and I know that Google will do some work as far as duplicate content is concerned. As far as UX goes, though, all of these tags are worthless. Thoughts?
-
I think it would be nice to see more "This was my problem, and here is how I fixed it" posts on YouMoz.
Keep up the great work!
-
YES! I was thinking about turning this whole thing into a blog post about what I did to fix it. I'm trying to come up with a catchy title.
"The webs we weave, when we practice to deceive"
That may be a little too on the nose, but I think you catch the drift!
Thanks again to both of you!
-
Nice work!
-
Sounds like you've been busy! It will be interesting to see how all of this plays out over the next few months. Keep notes of what you've done; it could make for a good blog post.
-
Update
I have officially removed ALL TAGS. I also found out that our previous web guy had placed some syntax in the .htaccess that was altering how URLs were displayed, so there were no "tag" pages in the URL, which was news to me! Previously I had been using a quick-redirect plugin, but with the number of 301s involved, I figured it was time to upgrade to .htaccess and plug them in there.
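For anyone doing the same cleanup, here is a minimal sketch of the kind of .htaccess entries I mean. Every path below is hypothetical; substitute your own retired tag URLs and wherever you want each one to land.

```apache
# Hypothetical one-off 301s for individual retired tag pages:
Redirect 301 /tag/some-random-keyword/ /category/news/
Redirect 301 /tag/another-one-off-phrase/ /category/news/

# Or catch every remaining /tag/ URL with a single rule (needs mod_rewrite):
RewriteEngine On
RewriteRule ^tag/ / [R=301,L]
```

One thing to watch: Redirect (mod_alias) and RewriteRule (mod_rewrite) are processed independently, so pick one approach per URL rather than stacking both.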
It has been a few weeks since the changes.
- Pages Crawled 194
- High Priority Issues: 9
- Medium Priority Issues: 324 (although I know what caused this, and it won't be in the next audit)
And for the first time in the little over a month that I have been working here, we finally saw green in our traffic numbers: we are up 4%!
Thanks again for all your help!
-
UPDATE
So as of 7/9 I have removed 423 tags and redirected them to two of our main tags.
I isolated the two most effective tags and spread them across the 160 posts that we have. I limited it to one tag per post so as not to create duplicate content, and because I am leaning toward removing tags altogether.
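If anyone wants to script that re-tagging step rather than doing it by hand, a rough sketch is below. The keeper slugs are invented, and it simply alternates the two tags across posts instead of picking the best fit for each one, so treat it as a starting point only.

```php
<?php
// One-off cleanup script (run once, e.g. via WP-CLI's `wp eval-file`).
// Replaces every post's tag list with one of two keeper tags.
$keepers = array( 'keeper-tag-one', 'keeper-tag-two' ); // hypothetical slugs

$post_ids = get_posts( array(
    'numberposts' => -1,
    'fields'      => 'ids',
) );

foreach ( $post_ids as $i => $post_id ) {
    // Third argument false = replace existing tags rather than append.
    wp_set_post_tags( $post_id, array( $keepers[ $i % 2 ] ), false );
}
```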
Part of the reason I am doing this is that our categories seem to be gaining more traction than our tags.
My thought moving forward would be to take all of these and fold them into the categories.
My Audit reads as follows:
- Pages Crawled dropped from 624 to 240
- High Priority Issues dropped from 42 to 4
- Medium Priority Issues dropped from 141 to 102
It is much too early to tell how things are going with traffic. Our visits are down 2% (and nobody checks into rehab until at least a week or two after the holiday weekend), but the number of keywords sending visits is up by 9.
I will keep you all informed!
-
As much as I wish that these pages were not a problem (I could probably write a blogella, a short-story-length blog, about just how messy this website I inherited actually is), I am inclined to think that they are doing more harm than good. Our numbers are staggering:
We have 660 indexed pages, and as you can see, 440 of them are WordPress tag pages.
On the link side, we have 145 different root domains accounting for 11,000 inbound links.
We have something like 18K internal links.
Things are not good over here.
Almost none of the pages have meta tags, alt tags, or proper h1, h2, h3 structure.
-
"For example, we have 27 articles and 440 tags... that should give you some insight into the website I'm trying to clean up..."
With this information, I can say that I would delete all 440 tags if this were my website. I would not need to think about it. These pages are going to be duplicate content and dangerous to the health of the site. They will also be a power sink.
-
That's skewed any way you look at it. But still: put them all in a secondary sitemap.html so they are not orphans, remove them from sitemap.xml, place a noindex in the HTML head, and still try to consolidate where possible.
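If it helps, here is a minimal WordPress sketch of the noindex step, assuming you add it through the active theme's functions.php and only want the tag archives noindexed:

```php
<?php
// In the active theme's functions.php (or a small site plugin).
// Prints <meta name="robots" content="noindex,follow"> on tag archive
// pages only, leaving posts, pages, and category archives indexable.
add_action( 'wp_head', function () {
    if ( is_tag() ) {
        echo '<meta name="robots" content="noindex,follow">' . "\n";
    }
} );
```

The noindex,follow combination keeps the tag pages out of the index while still letting crawlers follow the links they contain.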
In general we do not want to get rid of pages that are not a problem, as they can receive organic traffic for untargeted keywords that we have no other real way of discovering. The web is an organic momentum flux, not a solid-state structure; it needs some degree of unintended, uncalculated behavior in its site structure too. Otherwise the sum of the parts of all pages in Google would equal the value of Google, which is not the case. Google connects dots; we interpret and find new meaning and relations, which translates into traffic we did not expect.
The momentum flux is a joke, of course. It's a quantum state, of course.
-
I don't want to speak for EGOL here, but I don't think he is suggesting cutting everything. What I got from his post was: pull what's worthless and redirect (or, as you say, consolidate) to what's worthwhile.
The webmaster before me was writing articles with a spinner, or at least I believe he was, so we end up with a title and then 30 WP tags. Of those 30 keywords, maybe 5 will be tagged to 5 other posts, 5 will be tagged in at least one other post, 2 will be branded, and the rest are one-off keywords that are very random and almost partial sentences.
For example, we have 27 articles and 440 tags... that should give you some insight into the website I'm trying to clean up...
-
No, do not use a 301, and certainly do not remove any pages from the index as mentioned here. That's foolish and uncalled for, and potentially harmful, compared with the near-zero risk of letting them be and simply making them less prominent to users of the site. And if you really feel you need to cut the number of tag pages drastically, then use rel=canonical instead of a 301!
Consolidate, don't decimate! When we 301, we assimilate pages into one page: we say the old page is gone for good and the new one now answers for the old link. This dilutes the keywords each old page was found for, as it melts all of the pages that 301 to a destination into one. When we use a canonical URL instead, we consolidate the pages into one new page that bundles the old ones. A page with a canonical pointing to another page still ranks alongside the new page for a while; only its title in search becomes that of the page the canonical refers to. With a 301 it disappears from the index completely, and Google's cache along with it, together with the internal keyword associations it had before. So use canonical, not 301! And my advice: consolidate to one useful tag page with a real body of work, optimize it for a primary keyword like "seo news" or something, leave the old pages be, and just stop linking to them from then on.
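To illustrate the consolidation, here is a minimal WordPress sketch, assuming the tag page you keep has the hypothetical slug 'seo-news':

```php
<?php
// In the active theme's functions.php (or a small site plugin).
// Points every other tag archive's canonical at the one consolidated
// tag page, so the old tag pages fold into it rather than disappear.
add_action( 'wp_head', function () {
    if ( ! is_tag() || is_tag( 'seo-news' ) ) {
        return; // not a tag archive, or it is the keeper itself
    }
    $keeper = get_term_by( 'slug', 'seo-news', 'post_tag' );
    if ( ! $keeper ) {
        return;
    }
    $link = get_term_link( $keeper );
    if ( ! is_wp_error( $link ) ) {
        echo '<link rel="canonical" href="' . esc_url( $link ) . '">' . "\n";
    }
} );
```

Bear in mind that a canonical is a hint rather than a directive; search engines can ignore it if the tag pages do not look like near-duplicates of the target.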
Hope this is helpful.
Gr Daniel
-
Powerful quote regarding Google / Search Engine dependency!
-
"I don't trust having Google do stuff for me that I could do myself, because plenty of times Google says how they are gonna do things and then change their mind without tellin' anybody." -EGOL
That may be one of the best industry quotes I have ever read....
-
I would do a 301.
If you use the URL removal tool, that only works for Google.
If you do a 301, that is on your server, and every attempt to access that page goes where you want it.
I don't trust having Google do stuff for me that I could do myself, because plenty of times Google says how they are gonna do things and then change their mind without tellin' anybody.
-
EGOL strikes again! That was my thought.
So it would be better to remove the tags and do a 301, as opposed to removing them from the index with the URL Removal tool? Or are you saying add a 301 to them?
-
Go into your analytics and see if they are pulling any traffic from search. See if they are pulling in any traffic from referrals or social media.
I am willing to bet that those pages are dead weight.
These types of pages do not exist on any of my sites.
So, if you find in the analytics that these pages are dead weight then delete them and use 301 redirects, and turn them off in your content manager.
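If your content manager is WordPress, the redirect half of that might look something like the sketch below. The /blog/ destination is hypothetical, it assumes the default /tag/ permalink base, and a server-level rule in .htaccess works just as well.

```php
<?php
// In the active theme's functions.php (or a small site plugin).
// After deleting the tags, 301 any request still hitting a /tag/ URL
// to a page worth keeping, whether or not the tag still exists.
add_action( 'template_redirect', function () {
    $path = wp_parse_url( $_SERVER['REQUEST_URI'], PHP_URL_PATH );
    if ( $path && 0 === strpos( ltrim( $path, '/' ), 'tag/' ) ) {
        wp_redirect( home_url( '/blog/' ), 301 );
        exit;
    }
} );
```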
If you have content that you want to promote or that lots of people are lookin' at, then give those pages links in obvious locations on every page of your website. People will look at that... they will rarely click a tag.