Yup! A page is a much better way to target a keyword than a category archive. That better fits the concept of a landing page.
Hope the video is helpful too
Agreed with Gianluca, I reference that article a lot.
This is another article I've found helpful too. He tells you step by step what to do.
And this is an article on the same blog with advanced tips.
Hi There
To noindex pages, there are a few methods;
Use a meta noindex without a robots.txt block - I think that is why some of yours may not be removed: robots.txt blocks crawling, so Google cannot see the noindex.
Use a 301 redirect - this will eventually kill off the old pages, but it can definitely take a while.
Canonical it to another page. And as Chris says, don't block the page or add extra directives. If you canonical the page (correctly), I find it usually drops out of the index fairly quickly after being crawled (example tags for the first and third methods follow this list).
Use the URL removal tool in Webmaster Tools plus robots.txt or a 404. If you 404 a page or block it with robots.txt, you can then go into Webmaster Tools and do a URL removal. This is NOT recommended in most normal cases, though, as Google prefers it be reserved for "emergencies".
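For reference, here's roughly what the tags for the first and third methods look like in the page head (the URL is just a placeholder):

```html
<!-- Method 1: ask engines to drop the page but still follow its links -->
<meta name="robots" content="noindex, follow">

<!-- Method 3: point engines at the preferred version of the page -->
<link rel="canonical" href="http://www.example.com/preferred-page/">
```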
The only method that removes pages within a day or two guaranteed is the URL removal tool.
I would also examine your site since it is new, for something that is causing additional pages to be generated and indexed. I see this a lot with ecommerce sites where they have lots of pagination, facets, sorting, etc and those can generate lots of other pages which get indexed.
Again, as Chris says, you want to be careful to not mix signals. Hope this all helps!
-Dan
Hi Ria
99.9% certain Google 'sees' all of those as the same in terms of character/word separation. I don't think OptionA/OptionB etc. will all be seen as one keyword.
However Patrick has the right idea - to question if you really need one page or if things can be broken into separate pages.
I'd optimize for readability and clicks too
-Dan
As far as I thought, the important thing is that your feed shows up in feed readers. Can you subscribe to and view your RSS feed in a variety of different feed readers?
Yes, so long as the ? appears only in URLs that would result in duplicate content, or content that would not be desirable to crawl, it will have that effect.
-Dan
I do not know of such a tool - maybe try SEMRush? They have a lot by way of competitive analysis.
Hi Shaun
Yes, in terms of keeping strictly to Google's guidelines, I agree that Yoast should in theory use either prev/next or canonical on subpages, but not both.
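For what it's worth, a sketch of what correct markup could look like on page 2 of an archive (URLs are placeholders) - prev/next only, with no canonical pointing back at page 1:

```html
<link rel="prev" href="http://www.example.com/category/news/">
<link rel="next" href="http://www.example.com/category/news/page/3/">
```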
I'm honestly not certain which other setting it could be, as "subpages of archives" is the only one I know of that handles pagination.
Could there be another plugin or your theme (or custom coding in header.php) causing a conflict? One thing you can do is shut off other plugins one by one to diagnose. You can switch themes or switch to the default header.php file included with WordPress, but I (for obvious reasons) do NOT recommend doing that on a live website. I'm not sure if you have a testing environment.
Are you using a framework like Thesis or Genesis? Sometimes those can cause unexpected things to happen as well.
-Dan
Hi
From what I have seen, if the NEW structure is done well (good architecture, good on-page optimization, better navigation, better keyword optimization in the new URLs - it's all an improvement on the old navigation) and you 301 everything correctly, you should see an improvement in rankings, although this can take a week or two.
And of course submit a new XML sitemap. I also block the old pages/directories in robots.txt (this may be a little overkill, but I do it anyway).
One extra tip: do a site:www.mydomain.com search in Google to uncover any pages in their index you may not know about or have overlooked, and be sure to 301 those to the most relevant pages as well (a sketch of what those redirects can look like is below).
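If the site is on Apache, those one-off 301s can go in .htaccess. A minimal sketch, with made-up paths on the mydomain.com example above:

```apache
# 301 stray old URLs uncovered via the site: search to their closest new equivalents
Redirect 301 /old-category/old-page/ http://www.mydomain.com/new-category/new-page/
Redirect 301 /old-category/ http://www.mydomain.com/new-category/
```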
-Dan
Hi Phil
I don't know that I've seen this exact issue before, but I've seen similar situations recently as well. I almost wonder if this could have to do with which data center you happen to be hitting, or if perhaps their database for some reason pushed back to a prior index. Or you could be seeing part of a test in one of those instances.
You could try asking others to check the cache from different places geographically. Have you also tried different browsers? I know some of that could be a stretch, but testing all variables can't hurt.
At the end of the day, I don't think this is a huge issue if Googlebot crawls the site at a healthy rate. So I'd run through Webmaster Tools etc. and make sure crawl depth is where it should be.
-Dan
Hi There
This can be common with new sites. Google will test out its rankings in the beginning to see how users respond etc. - and then it may "settle" to a more normal ranking.
These small adjustments to on-page elements like H1s (not a strong signal) are probably going to make very small differences. I would recommend looking at other factors such as backlinks, domain authority, local citations, etc.
You can use the Moz Keyword Difficulty tool to gather lots of metrics around why certain sites rank where they do, and where you stand compared to those that are ranking.
Take a look through these link building strategies as well as GetListed for some help on the links and citations.
Hope that helps!
-Dan
I'd look at this in a holistic manner. If your site looks great, functions great, has good information, promotes trust among visitors, has good on-page optimization, etc., this is likely not a concern. The search engines know that an e-commerce site is going to have many add-to-cart buttons on a page and across a site, and I'm sure they take this into consideration when judging one. As long as you're not doing anything sneaky with the buttons, like hiding text behind them, I'm sure you're fine.
Of course, an example of your site - screenshot or link - is always most helpful when determining these things.
-Dan
Hey There
I wrote a post which walks through how to decide what to do with tags. Essentially, here's what I would do;
(This assumes you have Yoast SEO for your SEO plugin).
Hope that helps!
I also did this post on setting up WordPress for the Moz blog which you may find helpful.
-Dan
Hi
How are you checking rankings?
I see a few on-site issues;
The first thing to do is check Google Webmaster Tools. It is the best source for spotting problems with the site. Correct every possible thing in there first.
Before thinking it's something malicious, I would
Hope it helps!
-Dan
Hi Shaun
Dan here, one of the Moz Associates - we're very sorry for the delay!
I've attached a screenshot of my own personal company site which uses the Yoast Plugin - just want to verify the code as seen here is what I would consider "correct" and best practice for WordPress pagination.
That code did not require any custom coding or anything. So either we need to get the Yoast settings correct, or something else may be interfering with Yoast.
Please first try going to: Yoast SEO->Titles/Meta and select "Noindex subpages of archives". This to my knowledge is the only setting that needs to be made to handle pagination correctly.
Let us know if that works - and again, apologies for the delay. Sometimes we have quite a backlog and don't pick up right away if the community has not appropriately answered a question.
Thanks!
-Dan
Hi Andrea!
Funny, you just tweeted to me today and then I was assigned to help you out with this question
Anyway, just want to make sure I understand exactly how things are setup to see if your question has been answered.
Is this correct?
You used www.wordpress.org (not .com) and installed a WP blog on a domain hosted with Go Daddy? In other words, you installed the WP files etc into your Go Daddy hosting account?
And you want that blog to map to like blog.yourdomain.com ?
Let me know, thanks!
-Dan
It is true they will not "penalize" for tags or archives directly, but you can make your site much better by doing many of the things Mike recommends above.
I wrote a post talking through how to assess your tags, and deciding which ones to delete and/or noindex: here.
Here's the elephant in the room, too: you may rank for those tags, but do they bring traffic? What are the on-site metrics for your tag traffic? Bounce rate? Time on page?
It is true you may rank for some tags, but in general they rarely provide traffic, or the right traffic, compared to the content itself.
-Dan
URLs will always be important in lots of ways;
I don't think URLs are something Google looks at in a simplistic way like "oh, your URL says X keyword, we're gonna weigh that keyword more" - I think all of these factors feed into the importance of URLs, and I don't see that value diminishing anytime soon.
-Dan
Hey Fabio
Regarding #2, I'd give it a little more time. 301s take a little longer to drop out, so maybe check back in a week or two. Technically the URL removal will mainly work if the content now 404s, is noindexed, or is blocked in robots.txt, but with a redirect you can do none of those, so you just have to wait for them to pick up on the redirects.
-Dan
Hi!
Little late to the party here - thanks Geoff for helping out!!
While creating excerpts for the tag pages would certainly be great, I'd suggest doing a crawl of your own site with something like Screaming Frog SEO Spider.
I just did a crawl, and see a bunch of issues needing attention:
You've got some additional issues to work out besides just the tags thing.
Check Webmaster Tools to confirm this as well; Google Webmaster Tools will show you everything you need to fix!
-Dan
First - I would noindex them with a robots meta noindex, and not use the robots.txt disallow. The whole point is to not have them in the index; robots.txt will prevent crawling but will not remove pages from the index. So noindex the archives and remove the robots.txt disallow.
Then - just wait. WMT data can take months to catch up. I would not worry about the data in WMT so much though if you know you've got the right settings.
-Dan
Hi There
You should ensure the content either returns a 404, carries a meta robots noindex, or is blocked in robots.txt.
And then use the URL removal tool again and see if that works.
Hi Anthony
There's two separate things here, so I'll attack each one individually - and give you some insight to the thought process.
Should You Move To WordPress
It's unquestionable that for design, practicality, and long-term benefits, you should move them to WordPress. The issue is really all about the ranking. So the questions become: WHY is the site ranking now, and can you preserve that ranking through a migration?
I would look at things in the following ways.
My inclination would be to say that you can safely migrate - so long as it's done correctly... see next.
How To Migrate To WordPress
I should preface and say that I have no experience with Google Sites. But let's assume the following things are possible.
Obviously without a deeper look I can't guarantee this is foolproof. But I believe you should be fine - the main thing is just those 301s.
Also - couldn't hurt to just try this on one site first!
-Dan
I agree John, this is the most user-friendly way to do the redirects. The only thing I would add: shut OFF the feature which automatically adds redirects when you change URLs. It sounds like a nice feature, but it can get confusing because I found it adds them a little too aggressively. Best to use the plugin but keep it on manual.
Hmm I thought the lazy load might have been an issue but all the content is showing the text-only cache: http://webcache.googleusercontent.com/search?q=cache:https://www.claydip.com/airbnb.html&strip=1
Claydip - try doing a fetch and render in Google Webmaster Tools: http://googlewebmastercentral.blogspot.com/2014/05/rendering-pages-with-fetch-as-google.html - and this will show you exactly what Googlebot sees for content.
One thing I can tell you too - I think your page could use some CRO. For example the "call to action" at the bottom "check out our demo" is barely noticeable.
Comparing your page to the top ranked one: http://www.cogzidel.com/airbnb-clone/ - they have
Does your page convert well? Try to make it a really great landing page with a good conversion rate (do you drive PPC traffic to it?).
Hi There
Should have explained better
If you type cache: in front of any web URL - for example, cache:apple.com - you get;
And see the "cache" date? This is not the same as the crawl date, but it can give you a rough indication of how often Google might be looking at your pages.
So try that on some of your tag archives and if the cache date is say 4+ weeks ago maybe Google isn't looking at the site very often.
But it's odd they haven't been removed yet, especially with the URL removal tool - that tool usually only takes a day. Noindex tags usually only take a week or two.
Have you examined the source code to make sure it does in fact say "noindex" in the robots tag - or that there isn't a conflicting duplicate robots tag? Sometimes WordPress themes and plugins both try adding SEO tags, and you can end up with duplicates.
-Dan
Hey There
Sounds like you are all set - just want to add that the type of page you're referring to (page/2 etc.) is "subpages", and you'll want to look into noindexing those as well, in addition to "tags" and "categories". That should also fix the errors you're seeing in the Moz report.
-Dan
Thanks for that! Just want to add for Courtney - if she uses the Yoast SEO plugin, you can also edit .htaccess right in WordPress without having to FTP.
Robert
Hmmm ok - this could possibly be an architectural and/or internal linking issue confusing Google as to which page it should rank. Or maybe it's your keyword targeting on both pages. Would you say the subcategory page is more optimized/focused on the keyword than the homepage?
Hey There
You want to look for this;
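Roughly this, in the page's head (the exact content value can vary a little by theme/plugin):

```html
<meta name="robots" content="noindex, follow">
```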
You can just do a ctrl-F (to search text in the source) and type in "noindex" - it should be present on the tag archives.
-Dan
I would suggest going to Analytics, segmenting by organic search traffic, and seeing if anyone has landed on those pages from search results in the last 2-3 months. If Google is not returning them in search results, and they are not bringing traffic, Google usually favors cleaning pages out of the index that don't need to be there.
If you don't want to noindex them, you can add "Page 2" etc to the title tags to eliminate the duplicate title errors in the crawl report.
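If you're running Yoast SEO, its title templates can append the page number for you via snippet variables - a sketch (check the plugin's docs for the exact variable names; %%page%% is the one I believe handles this):

```
%%title%% %%page%% %%sep%% %%sitename%%
```

On a paginated archive, that should render something like "News - Page 2 of 5 | Example Site".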
-Dan
Hi Christopher
I think you all have this sorted out, but it's a common question (and I agree that WP is a little lacking here) - so I addressed it in my Mozinar a few months ago (pages 60-66 of the downloadable slide deck).
But to sum up that section - your image linking options are;
I would also recommend using "noindex" for media in Yoast SEO.
-Dan
It was probably indexed so late because Google couldn't find it
I just crawled the whole site with Screaming Frog and that URL wasn't picked up in the crawl --> http://screencast.com/t/xzunkNR3K
But it's in your sitemap --> https://www.policygenius.com/blog/post-sitemap.xml - which explains why it took Google so long to find it
Hi There
1. For the flash file NoReflectLight.swf - I would do a removal request in WMT and maintain the blocking in robots.txt of /plugins/
2. When you do a URL removal in WMT, the files need to either be blocked in robots.txt, have a noindex on them, or 404. Doesn't that sort of link redirect to your affiliate product? In other words, if I were to try to visit /go/affiliate-product/, would it redirect to www.affiliateproductwebsite.com? Or does /go/affiliate-product/ load its own page on your site?
3. I would maintain the robots.txt blocking on /plugins/ - if no other files from there are indexed, they will not be in the future.
-Dan
Hey Mark
Good idea? Yep!
Basic process;
I definitely recommend getting a dev to assist you if you don't feel comfortable!
Hope that helps.
-Dan
Hey There
The post suggests keeping specific tags that are receiving traffic indexed. You can do this on a tag by tag basis with Yoast. The post also does not recommend deleting tags, just noindexing them.
I would suggest keeping tags that are bringing the most traffic indexed, while just noindexing the rest. Do not delete them. Out of your 1,000+ tags - how many are responsible for the 700+ visits?
There's also the option to 301 redirect tags to the most relevant post or category. But again, I would only do this with tags that aren't bringing substantial traffic.
-Dan
No problem. Screaming Frog (or any crawler) won't pick it up, because it's not being linked to within the website (it's an "orphaned" page).
Google could still index them because they are in the sitemap, but it took so long because they are not actually linked to from the website.
So... if it's not supposed to be indexed at all in the first place, you can add a meta "noindex" tag to the page and remove it from the sitemap. Then you'll be all set
Hi There
For all these cases above, this may be a situation where you've BOTH blocked these in robots.txt AND added noindex tags. You cannot block the directories in robots.txt and get them deindexed, because Google cannot then crawl the URLs to see the noindex tag.
If this is the case, I would remove any disallows for /tag/ etc. in robots.txt, allow Google to crawl the URLs to see the noindex tags, wait a few weeks, and see what happens.
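As a sketch, the robots.txt change would be something like this (the /tag/ path is assumed from your setup):

```
# Before - this blocks crawling, so Google never sees the noindex tags:
User-agent: *
Disallow: /tag/

# After - drop the Disallow so the noindex tags can be crawled:
User-agent: *
Disallow:
```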
As far as the URL removal not working, make sure you have the correct subdomain registered - www or non-www etc for the URLs you want removed.
If neither one of those is the issue, please write back so I can try to help you more with that. Google should noindex the pages in a week or two under normal situations. The other thing is, check the cache date of the pages. If the cache dates are prior to the date you added the noindex, Google might not have seen the noindex directives yet.
-Dan
Hey Chris
Personally, I would use a different host that allows you to edit .htaccess! And I would 301 redirect to the new urls.
Go with long term. Even if you lose 0.1% of link value, the non-.html URLs will provide a much better UX in my opinion, and you'll have a standard setup.
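Once you're on a host where .htaccess is editable, a minimal Apache sketch of that redirect (assuming the new URLs are simply the old ones minus .html) would be:

```apache
RewriteEngine On
# Skip requests for real files, then 301 /page.html to /page
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.+)\.html$ /$1 [R=301,L]
```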
-Dan
Hey Thomas - thanks for jumping in! Just to clarify;
Escaped fragments are not recommended by Google anymore. Google now recommends Progressive Enhancement such as the History API pushState. See this article for details:
https://googlewebmastercentral.blogspot.com/2015/10/deprecating-our-ajax-crawling-scheme.html
Here's another post that you may want to check out:
https://builtvisible.com/javascript-framework-seo/
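To make the pushState idea concrete, here's a minimal JavaScript sketch (the .product-link and #content selectors are made up) - content loads via AJAX, but every state keeps a real, crawlable URL that the server can also render directly:

```javascript
document.querySelectorAll('a.product-link').forEach(function (link) {
  link.addEventListener('click', function (event) {
    event.preventDefault();
    fetch(link.href)
      .then(function (response) { return response.text(); })
      .then(function (html) {
        document.querySelector('#content').innerHTML = html;
        // Update the address bar without a full page load
        history.pushState({ url: link.href }, '', link.href);
      });
  });
});

// Re-render the right content when the user hits back/forward
window.addEventListener('popstate', function (event) {
  if (event.state && event.state.url) {
    fetch(event.state.url)
      .then(function (response) { return response.text(); })
      .then(function (html) {
        document.querySelector('#content').innerHTML = html;
      });
  }
});
```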
-Dan
Hey Aron
Just wanted to chime in on the wordpress bit. EGOL nailed the core answer though. But for the noindex, yes you can just noindex any pages you want to and this isn't going to cause any issues. Noindexed pages do not count towards Panda or low user metrics in the algo, so it's a great way to let the content exist but not have it cause trouble in the SERPs.
-Dan
Hi Chris
As I said in the other comment (and I'm just seeing your comment here) - this is crazy! Use a different host that "likes" whatever plugin you want and allows you full access to everything.
-Dan
Hi Angela
First off - where are you seeing the jump in 404's - Moz Tools? Google Webmaster Tools? Screaming Frog SEO Spider? Somewhere else?
When I visit that link (which points here), it does not 404 - it returns a 200 - which leads me to think the tool you are getting this from is wrong.
You can also use a "Header Checker" like this: http://urivalet.com/?http://www.turnerpr.com/blog/wp-login.php?redirect_to=http%3A%2F%2Fwww.turnerpr.com%2Fblog%2F2013%2F10%2Fdays-markus-freitag%2F#Report - and you can see it also returns a 200.
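Or from the command line, curl can do the same check - the -I flag requests the headers only, and the first line of the output shows the status code:

```
curl -I "http://www.turnerpr.com/blog/wp-login.php?redirect_to=http%3A%2F%2Fwww.turnerpr.com%2Fblog%2F2013%2F10%2Fdays-markus-freitag%2F"
```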
Now, let's assume the page really is returning a 404 and you want to fix it. The link comes from your blog setup: the site is requiring all users to log in before they comment. For example, if you go all the way to the bottom of this post: http://www.turnerpr.com/blog/2013/10/days-markus-freitag/ - under Post a Reply, you will see the link.
You can shut off this feature (which is normal to do) by going to Settings->Discussion-> and uncheck "users must be registered and logged in to comment".
Hope that helps!
-Dan
Hi Jenny
Yes, you can redirect to URLs with anchor tags, but to Gaston's point - now that you have everything on one page, they may not rank as well as before. It does depend a little on how much overlap there was across the different products to begin with. The new page might rank well for a little while, but as Google starts to take the new consolidated page into account, you may lose ranking. The root fix would be to maintain separate pages like before, if that's possible.
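If the redirects live in Apache's .htaccess, mod_rewrite can send an old product URL to its section on the consolidated page - a sketch with made-up paths; the NE (no escape) flag keeps Apache from encoding the # as %23:

```apache
RewriteEngine On
# Old standalone product page -> its anchor on the combined page
RewriteRule ^widgets/blue-widget/?$ /all-widgets/#blue-widget [R=301,NE,L]
```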
Hey There
Bradley's answer is great. I just want to add, you really should only worry about the suggestions here - http://developers.google.com/speed/pagespeed/insights/ - in regards to Google and speed / UX factors that would affect ranking.
You should also check for mobile errors in Webmaster Tools.
Mike
I think this is what you are looking for: https://moz.com/community/q/html-extension - also a recent question from the Q&A. I think Dana's link explains how to maintain the .html in WordPress and that debate was more about if 301's pass PR etc.
Whereas if you've already decided you want to redirect, that thread tells you how.
Hi There
Jesse is right, a 302 doesn't pass PageRank, but it may pass other signals (such as understanding of content, or associated penalties - these are just my guesses, by the way). Is this something where you are concerned about passing bad link signals? Or other undesired signals?
Also, technically a 302 is for "temporary" redirects, but people do misuse this temporary bit all the time and leave them more or less permanently
-Dan
Hey Becky
Marcus pretty well covered things, but wanted to point you to a video Matt Cutts did a few years back about discontinued products: https://www.youtube.com/watch?v=9tz7Eexwp_A - a good watch to get an idea of how Google may look at things, and he breaks out some options depending on the type of site you have.