How would you handle 12,000 "tag" pages on a WordPress site?
-
We have a WordPress site where /tag/ pages were not set to "noindex", and they are driving 25% of the site's traffic (roughly 100,000 visits year to date). We can't simply "noindex" them all now, or we'll lose a massive amount of traffic. We can't possibly write unique descriptions for all of them. And we can't just do nothing, or a Panda update will come by and ding us for duplicate content one day (I'm surprised it hasn't already). What would you do?
-
Yep, already implemented. Good point though.
-
Definitely. I start with the 30-day view, then go to YTD, then push the start date back to 1/1/2011. That's my three-step process every time I'm investigating a situation.
I've seen at least 20 of our sites decline in traffic in the past few months due to the April & June Panda updates. The dates of decline in Webmaster Tools (Traffic > Search Queries) line up perfectly with the various recent Panda updates.
Fixing /tag/ issues is one thing...but we have a monumental task of rewriting massive amounts of product descriptions next. We also have a fair amount of "no-indexing" or canonicalizing to do with our syndicated content. We'll be better for it in the end. I only wish I knew about these situations much sooner.
As I tell everyone, protect your unique content with all you've got...and keep duplicate content nowhere near your site. It's just too risky.
-
Additionally, make sure your posts have rel=canonical.
-
Are you looking at your analytics as far back as early 2011?
I've come across people who were hit on a known Panda update day and weren't aware of it...as strange as it may sound.
-
Thank you both...and we're thinking alike. I recently went through our 60+ WordPress sites addressing the issue of indexable /tag/ pages and also ensuring they weren't in the sitemap via our sitemap plugin.
For the sites that had hundreds or thousands of /tag/ pages but very little traffic in Google Analytics (Search > Organic with Landing Page as the primary dimension), I just went ahead and set them to "noindex".
For sites where the /tag/ pages were driving a fair amount of traffic (10% of site total or more), I had our editors write unique descriptions for the top 50-100 (like we do with category pages) and then we set the rest to "noindex,follow" via the meta robots tag.
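For anyone wanting to automate that 10%-of-total check across many sites, here's a rough plain-Python sketch; the CSV column names are assumptions, not the exact GA export headers, so adjust them to whatever your report actually contains:

```python
# Hypothetical sketch: estimate what share of organic visits land on /tag/ pages,
# from a Google Analytics landing-page export. Column names are assumptions.
import csv
from io import StringIO

def tag_traffic_share(rows, url_key="landing_page", visits_key="visits"):
    """Return the fraction of visits whose landing page contains /tag/."""
    total = sum(int(r[visits_key]) for r in rows)
    tag = sum(int(r[visits_key]) for r in rows if "/tag/" in r[url_key])
    return tag / total if total else 0.0

# Example with made-up numbers (400 of 2,000 visits land on tag pages):
sample = StringIO(
    "landing_page,visits\n"
    "/tag/widgets/,300\n"
    "/blog/post-1/,500\n"
    "/tag/gadgets/,100\n"
    "/,1100\n"
)
rows = list(csv.DictReader(sample))
share = tag_traffic_share(rows)  # 0.2, i.e. 20% of organic traffic
```

If `share` comes back above your threshold, the site goes in the "write unique descriptions for the top pages" bucket; otherwise it's a candidate for a blanket noindex.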
For this one site...I just haven't found an easy solution that didn't leave an uneasy feeling in my stomach. It's tough to give up 25% of your traffic in hopes that Google will get it right and rank your real content higher in place of these /tag/ pages.
Uh oh...I just checked Analytics, and our organic traffic started creeping down around July 13th. When I look at just the /tag/ pages in the organic landing pages section, I see that they dropped in traffic by 50-60%. Something bad is happening. I am setting them to "noindex" immediately.
Definitely can't wait to read your post. I'll be writing my own on www.kernmedia.com in the near future as well.
-
Looking forward to that post, Dan.
-
Hi
I'm actually going to be addressing this exact question on a post for Moz in the coming weeks - so keep an eye out for that.
But in short, here's what I do:
Analytics
- run a report for landing tag pages (with a filter) - over the last three months
- apply an advanced segment to see Google-only traffic
- dump the report into a CSV
Webmaster Tools
- view an impressions/clicks report by top pages (not keyword) - also zoom out as far as you can
- filter for web only (not images)
- dump the report into a CSV
VLookup in Excel
Using a VLOOKUP in Excel, combine the two reports, matching data to the URLs (you'll end up discarding some non-tag pages from WMT). The end result will be a master spreadsheet with the following columns:
- URL
- Impressions
- clicks
- avg position
- visits
- pages/visit
- avg visit duration
- % new visits
- bounce rate
(These are all the default report metrics. I actually prefer a custom landing page report in analytics, but this works fine.)
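If Excel isn't handy, the same join can be sketched in plain Python. The field names below are assumptions standing in for the real WMT/GA export headers, and the join logic mirrors the VLOOKUP: WMT rows on the left, Analytics metrics pulled in by URL, non-tag pages discarded:

```python
# Rough VLookup-style join of the two reports, keyed on URL.
# Field names are assumptions, not the exact export headers.

def merge_reports(wmt_rows, ga_rows):
    """Left-join GA landing-page metrics onto WMT rows by URL; keep tag pages only."""
    ga_by_url = {r["url"]: r for r in ga_rows}
    merged = []
    for w in wmt_rows:
        url = w["url"]
        if "/tag/" not in url:
            continue  # discard non-tag pages that WMT reports
        g = ga_by_url.get(url, {})  # may be empty if GA saw no visits
        merged.append({
            "url": url,
            "impressions": w.get("impressions"),
            "clicks": w.get("clicks"),
            "avg_position": w.get("avg_position"),
            "visits": g.get("visits"),
            "bounce_rate": g.get("bounce_rate"),
        })
    return merged

# Tiny made-up example: one tag page, one non-tag page that gets dropped.
wmt = [
    {"url": "/tag/widgets/", "impressions": 900, "clicks": 40, "avg_position": 8.2},
    {"url": "/about/", "impressions": 50, "clicks": 5, "avg_position": 3.0},
]
ga = [{"url": "/tag/widgets/", "visits": 35, "bounce_rate": "82%"}]
master = merge_reports(wmt, ga)  # one row, for /tag/widgets/
```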
Analyze
Then you do your sorting, filtering, etc., to decide how valuable the tag traffic has been. In general, you're trying to find an overwhelming reason for the value those pages add. They might get visits, but what's the on-site behavior like? Maybe they get visits, but only from a small handful of tag pages?
In the post I do, I'll cover more about how to analyze this report etc.
As Klarke put it so well, the actual posts should rank in their place. Those tend to perform better when people land on them.
Remove
If you decide to remove, do so carefully. Do it on a weekend or just before a slow period. If you use Yoast, simply select the option to noindex tag archives.
Also, remember to exclude tags from your XML sitemap.
Then watch Webmaster Tools and wait for their removal from the index.
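To spot-check that the noindex actually made it into the page markup, here's a minimal stdlib-only sketch; in practice you'd fetch each tag URL and feed its HTML through this, but the example parses a saved snippet so it stands alone:

```python
# Check whether a page's HTML carries a robots meta tag with "noindex".
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Flags pages whose <meta name="robots"> content includes noindex."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            if "noindex" in (a.get("content") or "").lower():
                self.noindex = True

def has_noindex(html):
    parser = RobotsMetaParser()
    parser.feed(html)
    return parser.noindex

# Example: a tag archive after flipping the Yoast setting.
page = '<html><head><meta name="robots" content="noindex,follow"></head></html>'
print(has_noindex(page))  # True
```

Running this over your list of tag URLs confirms the plugin change actually took effect before you start watching for de-indexation in Webmaster Tools.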
--- I did this process on a site with 9,000 tag pages in the index, and the results were very good.
-Dan
-
I would "noindex,follow" them. Don't block them with robots.txt.
With that many pages, you're certainly running the risk of being hit by Panda. Those tag pages shouldn't be ranking; instead, the individual posts should be in those positions. If I were you, I would take the chance and do the noindex, with the expectation that Google will appropriately rank the posts in their place.
I'd say those are better odds than losing 50-80% of traffic in a Panda update.