How would you handle 12,000 "tag" pages on a WordPress site?
-
We have a WordPress site where /tag/ pages were not set to "noindex" and they are driving 25% of the site's traffic (roughly 100,000 visits year to date). We can't simply "noindex" them all now, or we'll lose a massive amount of traffic. We can't possibly write unique descriptions for all of them. And we can't just do nothing, or a Panda update will come by and ding us for duplicate content one day (surprised it hasn't already). What would you do?
-
Yep, already implemented. Good point though.
-
Definitely. I start with the 30-day view, then go to YTD, then push the start date back to 1/1/2011. That's my three-step process every time I'm investigating a situation.
I've seen at least 20 of our sites decline in traffic in the past few months due to the April & June Panda updates. The dates of decline in Webmaster Tools (Traffic > Search Queries) line up perfectly with the various recent Panda updates.
Fixing /tag/ issues is one thing...but we have a monumental task of rewriting massive amounts of product descriptions next. We also have a fair amount of noindexing and canonicalizing to do with our syndicated content. We'll be better for it in the end. I only wish I had known about these situations much sooner.
As I tell everyone, protect your unique content with all you've got...and keep duplicate content nowhere near your site. It's just too risky.
-
Additionally, make sure your posts have rel=canonical.
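For reference, WordPress core has output rel=canonical on single posts since version 2.9 (via rel_canonical() hooked to wp_head), but themes and plugins sometimes remove or replace it. A minimal sketch to verify and restore it, assuming no SEO plugin is already managing canonicals:

```php
<?php
// A minimal sketch: re-add core's canonical output on singular posts
// if a theme or plugin has unhooked it. Assumes no SEO plugin is
// already printing its own canonical tag (that would cause duplicates).
add_action( 'wp_head', function () {
    if ( is_singular() && false === has_action( 'wp_head', 'rel_canonical' ) ) {
        rel_canonical();
    }
}, 1 );
```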
-
Are you looking at your analytics as far back as early 2011?
I've come across people who were hit on a known Panda update day and weren't aware of it...as strange as that may sound.
-
Thank you both...and we're thinking alike. I recently went through our 60+ WordPress sites addressing the issue of indexable /tag/ pages and also ensuring they weren't in the sitemap via our Sitemap plugin.
For the sites that had hundreds or thousands of /tag/ pages but very little traffic to them in Google Analytics (Search > Organic, with Landing Page as the primary dimension)...I just went ahead and set them all to "noindex".
For sites where the /tag/ pages were driving a fair amount of traffic (10% of site total or more), I had our editors write unique descriptions for the top 50-100 (like we do with category pages) and then we set the rest to "noindex,follow" via the meta robots tag.
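For anyone who'd rather script that split than manage it tag by tag in a plugin, here's a rough sketch of the "keep the top tags indexable, noindex the rest" approach for a theme's functions.php. The slugs in $keep_indexed are hypothetical placeholders; in practice they'd come straight from your analytics export of the top 50-100 tag pages:

```php
<?php
// A rough sketch: noindex,follow every tag archive except a whitelist.
// The slugs below are hypothetical; fill them from your analytics data.
add_action( 'wp_head', function () {
    if ( ! is_tag() ) {
        return;
    }
    $keep_indexed = array( 'example-tag-one', 'example-tag-two' );
    $tag = get_queried_object();
    if ( $tag && ! in_array( $tag->slug, $keep_indexed, true ) ) {
        echo '<meta name="robots" content="noindex,follow">' . "\n";
    }
} );
```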
For this one site...I just haven't found an easy solution that didn't leave an uneasy feeling in my stomach. It's tough to give up 25% of your traffic in hopes that Google will get it right and rank your real content higher in place of these /tag/ pages.
Uh oh...I just checked Analytics and our organic traffic started creeping down around July 13th. When I look at just the /tag/ pages in the organic landing pages report, I see that their traffic dropped by 50-60%. Something bad is happening. I am setting them to "noindex" immediately.
Definitely can't wait to read your post. I'll be writing my own on www.kernmedia.com in the near future as well.
-
Looking forward to that post, Dan.
-
Hi
I'm actually going to be addressing this exact question in a post for Moz in the coming weeks - so keep an eye out for that.
But in short, here's what I do:
Analytics
- run a landing pages report, filtered to tag pages, over the last three months
- apply an advanced segment to see Google-only traffic
- dump the report into a CSV
Webmaster Tools
- view an impressions/clicks report by top pages (not keywords) - also zoom out as far as you can
- filter for web only (not images)
- dump the report into a CSV
VLOOKUP in Excel
Using a VLOOKUP in Excel, combine the two reports by matching rows on URL (you'll end up discarding some non-tag pages from WMT). The end result will be a master spreadsheet with the following columns:
- URL
- Impressions
- Clicks
- Avg. position
- Visits
- Pages/visit
- Avg. visit duration
- % new visits
- Bounce rate
(These are all the default report metrics. I actually prefer a custom landing page report in Analytics, but this works fine.)
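If Excel chokes on a report this size, the same join can be scripted. A rough PHP sketch, with the assumption (hypothetical filenames and column order) that both exports are saved with the URL in the first column:

```php
<?php
// A rough sketch of the VLOOKUP step in code. Assumes analytics.csv and
// wmt.csv (hypothetical names) each start with a URL column; WMT rows
// with no Analytics match are discarded, as noted above.
$analytics = array();
$fh = fopen( 'analytics.csv', 'r' );
fgetcsv( $fh ); // skip the header row
while ( ( $row = fgetcsv( $fh ) ) !== false ) {
    $analytics[ $row[0] ] = array_slice( $row, 1 ); // key metrics by URL
}
fclose( $fh );

$in  = fopen( 'wmt.csv', 'r' );
$out = fopen( 'master.csv', 'w' );
fputcsv( $out, array( 'URL', 'Impressions', 'Clicks', 'Avg. position', 'Visits',
                      'Pages/visit', 'Avg. visit duration', '% new visits', 'Bounce rate' ) );
fgetcsv( $in ); // skip the header row
while ( ( $row = fgetcsv( $in ) ) !== false ) {
    if ( isset( $analytics[ $row[0] ] ) ) { // URL present in both reports
        fputcsv( $out, array_merge( $row, $analytics[ $row[0] ] ) );
    }
}
fclose( $in );
fclose( $out );
```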
Analyze
Then you do your sorting, filtering, etc., to decide how valuable the tag traffic has been. In general, you're looking for an overwhelming case for the value those pages add. They might get visits, but what's the on-site behavior like? Maybe they get visits, but only from a small handful of tag pages?
In the upcoming post, I'll cover more about how to analyze this report.
As Klarke put it so well, the actual posts should rank in their place. Those tend to produce better results when people land on them.
Remove
If you decide to remove them, do so carefully. Do it on a weekend or just before a slow period. If you use Yoast, simply select the option to noindex tag archives.
Also, remember to exclude tags from your XML sitemap.
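If you'd rather handle the sitemap exclusion in code than through plugin settings, here's a hedged sketch using Yoast SEO's wpseo_sitemap_exclude_taxonomy filter (assuming your installed version exposes it; Yoast's "noindex tag archives" setting normally takes care of this for you):

```php
<?php
// A hedged sketch: drop the post_tag taxonomy from Yoast's XML sitemap.
// Assumes the installed Yoast SEO version exposes this filter with this
// signature; the plugin's own tag-archive noindex setting usually
// removes tags from the sitemap automatically.
add_filter( 'wpseo_sitemap_exclude_taxonomy', function ( $exclude, $taxonomy ) {
    return 'post_tag' === $taxonomy ? true : $exclude;
}, 10, 2 );
```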
Then watch Webmaster Tools and keep an eye out for their removal from the index.
--- I did this process on a site with 9,000 tag pages in the index, and the results were very good.
-Dan
-
I would "noindex,follow" them. Don't block them with robots.txt.
With that many pages, you're certainly running the risk of being hit by Panda. Those tag pages shouldn't be ranking; instead, the individual posts should be in those positions. If I were you, I would take the chance and do the noindex, with the expectation that Google will appropriately rank the posts in their place.
I'd say those are better odds than losing 50-80% of traffic in a Panda update.
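To make the mechanics concrete, here's a minimal sketch of that blanket "noindex,follow" on every tag archive via wp_head (an SEO plugin setting achieves the same thing; the key point is that a robots meta tag only works if Google can crawl the page, which is exactly why a robots.txt block would defeat it):

```php
<?php
// A minimal sketch of the blanket approach: emit noindex,follow on
// every tag archive. The meta tag is only seen if crawlers can reach
// the page, which is why blocking /tag/ in robots.txt would defeat it.
add_action( 'wp_head', function () {
    if ( is_tag() ) {
        echo '<meta name="robots" content="noindex,follow">' . "\n";
    }
} );
```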