WordPress noindex
-
Hi there,
Does anyone know of a way to noindex all the "previous entries" pages in a WordPress blog? They usually show at domain.com/page/2/ etc. They are the small snippets that provide a summary of all your posts.
I've not been able to find a plugin to do this.
Thanks so much!
-
Thanks, this helped me
-
Brilliant. Glad I could help.
-
I fully understand that. People at my work fully advocate the use of plugins, but I personally believe that some things should be hard-coded into the theme (this being one of those things).
Chances are that once you find an SEO plugin you tend to stick with it until you notice issues or find one that does things better.
I'm surprised that WordPress doesn't do more to help prevent duplicate content.
-
Thanks so much, Ben. This has done exactly what I wanted.
-
I agree, Ben, that coding it into the theme itself is the best way to do this. However, since there are a few ways to approach this task, we wanted to offer various solutions.
-
Also, if you use a plugin to set the posts to nofollow and you later want to change SEO plugins, chances are those settings won't be transferred. If you code the solution into the theme it's plugin-independent and (in my eyes) the best way to go about avoiding duplication in WordPress.
-
This seems overly complex for a coding solution. All you need is three lines of code (max) in your theme's header.php (see my reply above).
-
That will only put noindex on the date archives.
I noticed the same problem and wrote an article on how to avoid duplication in WordPress. For a fresh WordPress installation with 10 published posts, Google ends up indexing close to 30 URLs, all pointing back to just 10 unique articles, because of the date, author and category archive pages.
In the header.php of your theme add the following code:
<code><?php if ($paged > 1) { echo '<meta name="robots" content="noindex,follow" />'; } ?>
<?php if (is_author()) { echo '<meta name="robots" content="noindex,follow" />'; } ?>
<?php if (is_trackback()) { echo '<meta name="robots" content="noindex,follow" />'; } ?></code>
If you want, this could be condensed to:
<code><?php if ($paged > 1 || is_author() || is_trackback()) { echo '<meta name="robots" content="noindex,follow" />'; } ?></code>
You can read more at my blog: http://www.laceytechsolutions.co.uk/blog/wordpress-development/avoiding-duplicate-content-in-wordpress
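If it helps, the decision the snippet makes can be sketched as a standalone function. This is just an illustration: the WordPress conditionals ($paged, is_author(), is_trackback()) are passed in as plain values here so the logic can be run outside WordPress, and the function name is made up for the example.

```php
<?php
// Sketch of the noindex decision, outside WordPress. The real WP
// globals/conditionals are replaced by plain parameters so this is
// self-contained; in a theme you would use the snippet above instead.
function robots_meta_for(int $paged, bool $is_author, bool $is_trackback): string
{
    // Paged archives (/page/2/ etc.), author archives and trackback
    // URLs all get a noindex meta tag; everything else gets nothing.
    if ($paged > 1 || $is_author || $is_trackback) {
        return '<meta name="robots" content="noindex,follow" />';
    }
    return '';
}

// A paged archive such as domain.com/page/2/ gets the noindex tag:
echo robots_meta_for(2, false, false), "\n";
// The first page of the blog emits nothing, so it stays indexable:
echo robots_meta_for(1, false, false) === '' ? 'indexable' : 'noindex', "\n";
```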
-
There is a plugin way – and a coding way…
I will explain both
The plugin version is good for individual posts; the coding way covers your archives in bulk…
PLUGIN VERSION
Using a plugin called WordPress SEO by Yoast, you can go into your previous posts and do the following:
Open the POST dashboard in your WordPress blog.
Find the post/s you are interested in working with.
Click ‘Edit’ post
Then scroll down to the WordPress SEO by Yoast section at the bottom and click ADVANCED.
From there.. you can select noindex / nofollow for that post – as well as a few advanced meta tags.
You can do this for any or all of your posts…
CODING VERSION
Step 1
In your theme folder (/wp-content/themes/yourtheme/) locate the following files:
archive.php
header.php
Open them both up.
Step 2
First thing to do is SAVE the header file as: header-archive.php
With that new version open, add the following code into the <head> section:
<code><meta name="robots" content="noindex,nofollow" /></code>
Then save it.
Upload that file to your wp theme folder online.
Step 3
Now go to the open archive.php file and find and replace the following:
FIND:
<code><?php get_header(); ?></code>
REPLACE WITH:
<code><?php get_header('archive'); ?></code>
(Passing 'archive' tells WordPress to load header-archive.php instead of the default header.php.)
Save and upload that file to your theme folder as well.
Now you have just MASS added a NOINDEX/NOFOLLOW to all of your archive pages – but NOT to the individual post pages themselves.
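To recap how the two files fit together, the relevant fragments end up roughly like this. This is a sketch assuming the standard WordPress behaviour where get_header('archive') includes header-archive.php; it is not a complete template.

```php
<?php
// header-archive.php — identical to header.php except its <head>
// also contains the blanket archive tag:
//   <meta name="robots" content="noindex,nofollow" />

// archive.php — the only change is loading the alternate header.
// get_header('archive') makes WordPress include header-archive.php
// instead of the default header.php:
get_header('archive');

// ...rest of the archive template (the loop, sidebar, footer) unchanged.
?>
```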
Hope this helps!
-
It should. If that does not work for some reason, try Yoast. One of these two plugins should be able to deindex these pages for you. I presume you are using the latest version of WordPress as well as the plugin.
-
Hi Najul, thanks. I already use All in One SEO and have that option selected, but it hasn't put noindex on these pages.
-
Hi Syed,
Thanks for the post. I did that a couple of months ago, and it has removed the description, but the pages are still indexed. I'm trying to get them deindexed.
Thanks
-
If you use the All in One SEO plugin, you can enable a setting that puts a noindex on all the archive-level pages. I hope this helps.
-
I don't know of any plugin that does this, but why not simply do it via robots.txt:
<code>User-agent: Googlebot
Disallow: /page/specificfolder/</code>
where 'specificfolder' could be '2', as in your example (domain.com/page/2/) – or use Disallow: /page/ to cover every paginated page. One caveat: a robots.txt disallow blocks crawling rather than applying a noindex, so URLs that are already indexed may linger in the results for a while.
Also, if you want to noindex specific pages/posts, you could do it by using this plugin: http://yoast.com/wordpress/meta-robots-wordpress-plugin/