Should I Allow Blog Tag Pages to be Indexed?
-
I have a WordPress blog with settings currently set so that Google does not index tag pages. Is this a best practice that avoids duplicate content, or am I hurting the site by taking eligible pages out of the index?
-
Thanks everyone for the insights and for taking the time to help. I'm using the WP All in One SEO plugin and will continue to leave the tag pages set to noindex.
-
I would block all tags until you create unique content for each tag.
Since it's WordPress, you can create a tag template file and add content into it. Then you can add that tag to your sitemap and remove it from robots.txt. That way you avoid duplicate content and boost the site with more SEO-friendly content.
You should create a template file for each tag and just work on them one at a time.
For example, you might start with tag-seo.php, then tag-design.php, and create specific content for those.
-
Have you added Google Webmaster tools code and verified your site? This will give you fantastic insight into what is being crawled and if there are any dupe content problems present now.
Do you have an SEO plugin installed? Are canonicals set up? This would solve any problems for you if they exist.
***** OUR SCENARIO *****
We noindex pages, categories, and tags because we want our posts to rank on their own merit. We still assign categories and tags to posts for internal navigation, but feel our posts stand on their own and are more of a benefit. Your experience and navigation may vary. We have never had a duplicate content problem, and we have operated our WP.org blog since before canonical tags existed.
-
Different people have different opinions on it. As a general rule, I recommend clients not allow indexing of tag pages, just to be sure there are no duplicate content conflicts. This is even more important when several tags are assigned to each article: quite often you'll end up with a "tag" page that has only one or two articles, creating both a duplicate content risk and a thin content page.
The other potential source of duplicate content issues is articles showing up on multiple tag pages.
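If you want to confirm that your SEO plugin is actually emitting the noindex directive on tag archives, you can fetch a tag page and check for the robots meta tag. Here's a minimal stdlib-only Python sketch; the sample HTML strings below are placeholders, not output from any particular plugin:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of every <meta name="robots"> tag in a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        if attrs.get("name", "").lower() == "robots":
            self.directives.append(attrs.get("content", "").lower())

def is_noindexed(html):
    """True if any robots meta directive on the page contains 'noindex'."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in d for d in parser.directives)

# Placeholder markup standing in for a fetched tag archive and a normal post.
tag_page = '<html><head><meta name="robots" content="noindex,follow"></head></html>'
post_page = '<html><head><meta name="robots" content="index,follow"></head></html>'
print(is_noindexed(tag_page))   # True: the tag archive carries noindex
print(is_noindexed(post_page))  # False: the post is indexable
```

Run it against the HTML of a live tag URL (via your HTTP client of choice) and you'll know the setting is actually reaching the page, rather than trusting the plugin's checkbox.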
Related Questions
-
E-Commerce Site Collection Pages Not Being Indexed
Hello Everyone, This is not really my strong suit, but I'm going to do my best to explain the full scope of the issue and really hope someone has some insight. We have an e-commerce client (can't really share the domain) that uses Shopify; they have a large number of products categorized by Collections. The issue is that when we do a site: search of our Collection pages (site:Domain.com/Collections/) they don't seem to be indexed. Also, not sure if it's relevant, but we recently did an overhaul of our design. Because we haven't been able to identify the issue, here's everything we know/have done so far:
- We ran a Moz crawl check and the Collection pages came up.
- We checked organic landing page analytics (source/medium: Google) and the pages are getting traffic.
- We submitted the pages to Google Search Console.
- The URLs are listed in the sitemap.xml, but when we submitted the Collections sitemap.xml to Google Search Console, 99 URLs were submitted and none came back as indexed (unlike our other pages and products).
- We tested the URLs in GSC's robots.txt tester and they came up as "allowed," but just in case, below is the language used in our robots.txt:
Intermediate & Advanced SEO | | Ben-R
User-agent: *
Disallow: /admin
Disallow: /cart
Disallow: /orders
Disallow: /checkout
Disallow: /9545580/checkouts
Disallow: /carts
Disallow: /account
Disallow: /collections/+
Disallow: /collections/%2B
Disallow: /collections/%2b
Disallow: /blogs/+
Disallow: /blogs/%2B
Disallow: /blogs/%2b
Disallow: /design_theme_id
Disallow: /preview_theme_id
Disallow: /preview_script_id
Disallow: /apple-app-site-association
Sitemap: https://domain.com/sitemap.xml
A Google cache: search currently shows a /collections/all page we have up that lists all of our products. Please let us know if there are any other details we could provide that might help. Any insight or suggestions would be very much appreciated. Looking forward to hearing all of your thoughts! Thank you in advance. Best,
-
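One thing worth noting about the rules above: `Disallow: /collections/+` (and its percent-encoded variants) only blocks collection URLs containing a literal `+`, which Shopify uses for multi-tag filtering; it does not touch `/collections/all` or other plain collection URLs. A quick sanity check with Python's stdlib robots.txt parser, using a trimmed copy of those rules (the domain is a placeholder):

```python
import urllib.robotparser

# Trimmed copy of the Shopify-style rules quoted in the question.
ROBOTS_TXT = """\
User-agent: *
Disallow: /admin
Disallow: /collections/+
Disallow: /collections/%2B
Disallow: /collections/%2b
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# A plain collection URL: no rule prefix matches, so it is crawlable.
print(rp.can_fetch("Googlebot", "https://domain.com/collections/all"))      # True
# A "+"-joined multi-tag filter URL is caught by the /collections/+ rule.
print(rp.can_fetch("Googlebot", "https://domain.com/collections/+summer"))  # False
# And /admin is blocked, as expected.
print(rp.can_fetch("Googlebot", "https://domain.com/admin"))                # False
```

So robots.txt is consistent with GSC's "allowed" verdict here; the missing collection pages are more likely an indexing-priority or content issue than a crawl block.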
Does a >70 character title tag affect a page's ranking in search?
We are a publication that puts out hundreds of articles a month. We have 5,000+ medium-priority errors showing that our title elements are too long. The title tag is structured like this: [Headline] | [Publication Name that is 23 characters]. However, since we are a publication, it's not practical for us to limit the length of our title tags to 70 characters or less, because doing so would make the titles of our content seem very unnatural. We also don't want to remove the branding, because we want it to go with the article when it's shared (and to appear when some titles are short enough to allow room in SERPs). I understand the reasons for limiting titles to 70 characters or less for SERP friendliness: we try to keep key phrases at the front, people are more likely to click on a page if they know what it's about, etc. My question is: do the longer titles affect the page's ability to rank in search? To put it a different way, if we altered all 5,000+ title tags to fit within 70 characters, would the page authorities and our site's domain authority increase? I'd like to avoid cleaning up 5,000 pages if the medium-priority errors aren't really hurting us. Any input is appreciated. Thanks!
Intermediate & Advanced SEO | | CatBrain1 -
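Not an answer to the ranking question itself, but for triaging which of the 5,000+ titles actually overrun the limit, a throwaway script helps. The brand string and headlines below are made up, and note that Google truncates by pixel width (roughly 600px), so 70 characters is only a proxy:

```python
def build_title(headline, brand="Example Publication Name", sep=" | "):
    """Assemble the title tag the way the template does: [Headline] | [Brand]."""
    return f"{headline}{sep}{brand}"

def flag_long_titles(headlines, limit=70):
    """Return (title, length) pairs for assembled titles past the limit.

    The 70-character cutoff is only a proxy for Google's ~600px truncation.
    """
    titles = (build_title(h) for h in headlines)
    return [(t, len(t)) for t in titles if len(t) > limit]

headlines = [
    "Short headline",
    "A much longer, more descriptive headline that editors tend to write",
]
for title, length in flag_long_titles(headlines):
    print(length, title)
```

Sorting the flagged list by length (or by traffic, if you join it against analytics data) lets you clean up the worst offenders first instead of touching all 5,000 pages.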
Do I use H1 tag for logo or page content?
Should the H1 tag be used for the main page content or the logo? I understand the original method was to H1 the logo with the main search term; does this still hold true, or should the H1 be content-focused?
Intermediate & Advanced SEO | | seoman100 -
Do I need to re-index the page after editing URL?
Hi, I had to edit some of the URLs, but Google is still showing my old URLs in search results for certain keywords, which of course now return 404. Crawling with Screaming Frog reports 301s and "page not found" errors and still surfaces the old URLs. Why is that? And do I need to re-index the pages with the new URLs? Is "Fetch as Google" enough to do that, or is there any other advice? Thanks a lot; I hope the topic will help someone else too. Dusan
Intermediate & Advanced SEO | | Chemometec0 -
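As a side note on the situation above: the usual fix is to 301 each old URL directly to its new counterpart and then let Google recrawl. If you export your old-to-new URL mapping from a crawl, a small sketch like this (the URLs are hypothetical) can confirm each old URL resolves in a hop or two, without chains or loops:

```python
def resolve_redirect_chain(url, redirects, max_hops=10):
    """Follow a URL through an old -> new mapping (e.g. from a crawl export).

    Returns (final_url, hops). Raises ValueError on a loop or an
    excessively long chain, both of which waste crawl budget.
    """
    seen = {url}
    hops = 0
    while url in redirects:
        url = redirects[url]
        hops += 1
        if url in seen or hops > max_hops:
            raise ValueError(f"redirect loop or chain too long at {url}")
        seen.add(url)
    return url, hops

# Hypothetical mapping: /old-post chains through an interim URL.
redirects = {
    "/old-post": "/interim-post",
    "/interim-post": "/new-post",
}
print(resolve_redirect_chain("/old-post", redirects))  # ('/new-post', 2)
```

Any entry that resolves in more than one hop is a chain worth collapsing into a single 301, and any URL missing from the mapping entirely is a candidate for the 404s described above.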
Can Googlebot read canonical tags on pages with JavaScript redirects?
Hi Moz! We have old location pages that we can't redirect to the new ones because they use AJAX. To preserve PageRank, we are putting canonical tags on the old location pages. Will Googlebot still read these canonical tags if the pages have a JavaScript redirect? Thanks for reading!
Intermediate & Advanced SEO | | DA20130 -
Link Removal Request Sent to Google, Bad Pages Gone from Index But Still Appear in Webmaster Tools
On June 14th the number of indexed pages for our website in Google Webmaster Tools increased from 676 to 851. Our ranking and traffic have taken a big hit since then. The increase in indexed pages is linked to a design upgrade of our website, made June 6th. No new URLs were added; a few forms were changed, and the sidebar and header were redesigned. Also, Google Tag Manager was added to the site. My SEO provider, a reputable firm endorsed by Moz, believes the extra 175 pages indexed by Google, pages that do not offer much content, may be causing the ranking decline. My developer submitted a page removal request to Google via Webmaster Tools around June 20th. Now when a Google search is done for site:www.nyc-officespace-leader.com, 851 results display. Would these extra pages cause a drop in ranking? After the removal request, the number in the Google search results appeared to drop to 451 for a few days; now it is back up to 851, and Google Webmaster Tools still lists 851 pages. My rankings drop more and more every day. At the end of the displayed Google search results for site:www.nyc-officespace-leader.com, very strange URLs are displaying, like: www.nyc-officespace-leader.com/wp-content/plugins/... If we can get rid of these issues, should ranking return to what it was before? I suspect this is an issue with sitemaps and robots.txt. Are there any firms or coders who specialize in this? My developer has really dropped the ball. Thanks everyone!! Alan
Intermediate & Advanced SEO | | Kingalan10 -
To index or not to index search pages - (Panda related)
Hi Mozzers, I have a WordPress site with Relevanssi, the search engine plugin (free version). Questions: Should I let Google index my site's internal search result pages? I am scared the page quality is too thin, and then the Panda bear will get angry. This plugin (or my previous search engine plugin) created many of these "no-results" URIs: /?s=no-results%3Ano-results%3Ano-results%3Ano-results%3Ano-results%3Ano-results%3Ano-results%3Akids+wall&cat=no-results&pg=6 I have added a robots.txt rule to disallow these pages and did a GWT URL removal request, but links to these pages are still being displayed in Google's SERPs under "repeat the search with the omitted results included." So will this affect me negatively, or are these results harmless? And what exactly is an omitted result? As I understand it, it is a page Google found a link to but can't display because I block Googlebot. Thanks in advance, guys.
Intermediate & Advanced SEO | | ClassifiedsKing0 -
Why are so many pages indexed?
We recently launched a new website and it doesn't consist of that many pages. When you do a "site:" search on Google, it shows 1,950 results. Obviously we don't want this to be happening, and I have a feeling it's affecting our rankings. Is this just a straight-up robots.txt problem? We addressed that a while ago and the number of results isn't going down; it's very possible that we still have it implemented incorrectly. What are we doing wrong, and how do we start getting pages "un-indexed"?
Intermediate & Advanced SEO | | MichaelWeisbaum0