Hey There - this has been throwing a lot of us off. There was a great thread a week or two ago here: http://moz.com/community/q/accuracy-of-search-volume-for-keyword-planner-v-old-keyword-tool
Michael and Kevin pretty much summed it up too
Thanks for that! Just want to add for Courtney - if she uses the Yoast SEO plugin, you can also edit .htaccess right in WordPress without having to use FTP.
I agree John, this is the most user-friendly way to do the redirects. The only thing I would add is - shut OFF the feature which automatically adds redirects when you change URLs. It sounds like a nice feature, but it can get confusing because I found it added redirects a little too aggressively. Best to use the plugin but keep it on manual.
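And if you do end up adding redirects manually in .htaccess, each one is typically just a single line. A minimal sketch, assuming an Apache server - the URLs here are placeholders:

Redirect 301 /old-page/ http://www.example.com/new-page/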
Hi - I think changing it back couldn't hurt to minimize the possible over-optimization signals.
-Dan
Hi Tim
Yup, take a look at this screenshot --> http://screencast.com/t/6j9zPt8ck1YL - that menu link did not exist on your old site (to my knowledge). This creates hundreds of internal links on your new site with the anchor text "gastric band hypnotherapy" targeting a page /gastric-band-hypnotherapy - which also has exact match backlinks pointing to it.
Just my theory, but the addition of this sitewide internal link may have been enough to trigger some sort of over-optimization signal at Google.
-Dan
Hi There
When you upgraded the site, did the URL change or did it stay the same? I noticed you also have a lot of your staging site indexed (probably unintentionally) - I would suggest getting that noindexed.
I do see they all redirect to the homepage of your site, but this might not be the best thing to do, since there is not great topical relevancy in deep pages redirecting to the homepage. I would just add meta robots noindex tags to them and not redirect them.
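For reference, the noindex tag is just one line in the <head> of each staging page you want dropped from the index:

<meta name="robots" content="noindex, follow">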
I think I answered my own question, as I see the archive.org version of the page from a few months ago - the URL is the same but just with capital letters. This is sending through a 301 redirect, but I doubt much is lost with that.
Unfortunately your backlinks seem to have an unnatural amount of commercial anchor text, especially for the page in question. So I am willing to bet that when you upgraded the site, it "stirred the pot" a little and Google looked a little closer at the site.
I think the "final straw" may have to do with the fact the new site has a sitewide anchor text link "gastric band hypnotherapy" in the main menu, whereas the old site did not. Google takes the over-optimized backlinks, adds them up with the new over-optimized internal links, and that could have triggered something.
Another question - did your rankings drop sitewide, or just for this page?
-Dan
Wow Dana - super helpful thanks so much for chiming in!
I am honestly not 100% sure at this point. I see the option to select broad, phrase or exact via a tiny little icon in the top right, however I am not convinced how well this is working. But the option is there, so you must still be able to choose. The default may be exact match, but you can still switch (whereas before, the default was broad match).
Hi Christopher
I think you all have this sorted out, but it's a common question (and I agree that WP is a little lacking here) - so I addressed it in my Mozinar a few months ago (pages 60-66 of the downloadable slide deck).
But to sum up that section - your image linking options are;
I would also recommend using "noindex" for media in Yoast SEO.
-Dan
Hi Laurence
Google tries to take the iFrame and associate it with the page it's on (in effect trying to view it as a single page) - so in their ideal world, you should look at the page and the iFrame all as one page and treat it accordingly.
In reality though, they don't always accomplish what they want, so you might be OK.
What I would do is check your cache and text-only cache and see if they're caching everything as one page.
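(You can pull up the cache right from Google's search box - e.g. cache:yoursite.com/the-page-with-the-iframe, a placeholder URL here - and then click "Text-only version" to see exactly what they stored.)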
Here's their documentation on iFrames
-Dan
Hi Patrick
It is a little tricky to say for sure without the context of the site. However, regarding the one link you've cited - those are typically not an issue. I'm pretty confident Google sees that as a "junk" site - scraped links, repurposed content, etc. - nothing "real".
I would worry about links that were clearly built in an attempt to over-optimize - whether through being paid, exact-match anchor text, sitewide links on low-ish quality sites, or links within content that is not at all relevant.
Conversely, you can be proactive by trying to build links that look as natural as possible - domain/brand/propword anchor text.
In fact I think Cyrus' video pretty astutely describes what to look out for.
You really have to discern whether links were done maliciously (if you did not build them) - those would be the ones to disavow.
Hope that helps!
-Dan
Thanks so much for that link Jeff!! Had no idea it was there. Will definitely be sharing.
I saw the AdWords help tab has an email address and phone number. I bet if you or someone paying for AdWords contacted them (and asked as a paying AdWords customer) you could get an answer.
You could also submit a question in the Google Webmaster Help forum, or their Google Plus Group - or join a hangout they hold every Friday.
-Dan
Wow that's crazy! The only thing I can think to suggest is submitting a help request or contacting them.
First thing I want to check;
Next, you may be wondering how I got back to the old tool;
-Dan
1. If you check the source code of your blog posts, there must be some sort of link to the feeds - possibly even in the header. I'm not 100% sure how the Moz crawler operates (whether it only spiders anchor links or also spiders referenced links in the header - pretty sure the latter) - but either way, that's how they're finding it: through some sort of link on the page.
You could try running a crawl with Screaming Frog SEO Spider and see if it also picks up the feed URLs - Screaming Frog will show you where it found the links as well.
2. Good question. Your theme may be displaying links to these things somewhere - the best way to find out is to crawl with Screaming Frog and it will show you which pages link to your feed and trackback URLs. Then if you don't need them, you can go into the editor and remove them from the code.
3. I agree with Thomas here - I would not block them with robots.txt. Rather, I would see if you can fix them at the source and remove the links if they are not needed.
-Dan
Funny enough - look what I just found: http://www.bobwp.com/wordpress-floating-social-bar-plugin/
Hi Matt
I'm afraid this might have to be something custom. I hunted around and couldn't find anything that does this by default. You might want to check WPMU's plugins or CodeCanyon. Sorry we couldn't find anything pre-made - but good luck!
-Dan
Hey Bill
I like to start with this standard setup (image/chart from my WordPress post on Moz);
Pages, Posts, Categories - Index
Tags, Dated Archives, Subpages, Author Archives - noindex
You can check out the full post - I will be updating the Yoast Screenshots very soon!
-Dan
Here's what to do;
First - I would noindex them with a robots meta noindex tag - and not use the robots.txt disallow. The whole point is to get them out of the index. The robots.txt will prevent crawling but won't remove pages from the index. So noindex the archives and remove the robots.txt disallow.
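To illustrate the difference (/archives/ is just an example path): a robots.txt line like "Disallow: /archives/" only stops crawling - and it actually prevents Google from ever re-crawling those pages to see a noindex tag. So remove that line, and output <meta name="robots" content="noindex, follow"> on the archive pages instead.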
Then - just wait. WMT data can take months to catch up. I would not worry about the data in WMT so much though if you know you've got the right settings.
-Dan
In addition to what EGOL suggested, which is right on - you could also check to see which pages are indexed but not receiving traffic (for, say, the last 3 months). I would do this by crawling the site and comparing an export of your product pages to an export of your organic landing pages from analytics. Any products that Google has indexed, but that are not ranking or bringing visits from search, are good ones to noindex until you make them better.
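If it's helpful, here's a rough sketch of that comparison in Python - the file names and column headers are assumptions, so adjust them to match your actual crawl and analytics exports:

import csv

# URLs from a site crawl export (e.g. Screaming Frog) - assumed column name "Address"
with open("crawl_export.csv", newline="") as f:
    crawled = {row["Address"] for row in csv.DictReader(f)}

# Organic landing pages from analytics (last ~3 months) - assumed column "Landing Page"
# Note: analytics usually exports paths while crawlers export full URLs - normalize first if needed
with open("organic_landing_pages.csv", newline="") as f:
    landing = {row["Landing Page"] for row in csv.DictReader(f)}

# Pages that get crawled/indexed but earned no organic visits
for url in sorted(crawled - landing):
    print(url)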
-Dan
Hey There
This is an odd one for sure, and a really crowded space with a ton of EMDs and PMDs so a lot of noise. But I have a few hunches just after looking around a bit.
1. Their site is visually blazing fast compared to chillisauce.
2. Their site is one of the few I could find where the entire domain is hyper-focused on "stag in edinburgh" - and it has the benefit of the homepage being the page to rank, or match topically, for those types of keywords. Chillisauce and some others are broader in what the site is about - so architecturally, or link-wise, it's perhaps not as easy for Google to rank/credit those types of pages within the site in this case. And as Doug pointed out, perhaps their on-site is not so strong.
3. User metrics could be playing a role - CTR in serps etc. They actually stand out as being the least spammy of most other results, and have a shorter domain name (less spammy looking).
But yeah, there are plenty of anomalies - all exact match anchors for linking, fewer links, a borderline spammy/keyword-stuffed website. Has this site done well for a really long time?
-Dan
Hi there - I'm not 100% sure without looking, but I do know that depending on where you get your PR, it might be "toolbar PageRank", which is not updated often and not current to your actual PR. Did the other pages have any PR before?
But as Mike said, I doubt this has caused an actual drop in PR.
-Dan
I think what you're really looking to do is use multi-touch funnels and attribution. The metric you are trying to measure is called "first touch" - meaning the source of the first visit gets the credit.
If your analytics is set up for goals and/or ecommerce tracking - which is all pretty easy to do - you can see all of the default multi-touch reports in GA.
Here's their intro guide and guide on setting it up.
-Dan
It is true they will not "penalize" for tags or archives directly, but you can make your site much better by doing many of the things Mike recommends above.
I wrote a post talking through how to assess your tags, and deciding which ones to delete and/or noindex: here.
Here's the elephant in the room too: you may rank for those tags, but do they bring traffic? What are the on-site metrics for your tag traffic? Bounce rate? Time on page?
It is true you may rank for some tags, but in general they rarely provide traffic, or the right traffic, compared to the content itself.
-Dan
Hey Adam
The best place to start is really to diagnose where and when the exact organic traffic loss happened.
Is there an exact date in analytics when organic traffic dropped off? Or was it gradual? If it's an exact date, you can match it up against the algo history and determine if it's Panda or Penguin related.
Then secondly, I would look at your average position report in Webmaster Tools under queries to see if your rankings have dropped. Sometimes traffic loss can occur without a ranking drop, so you'd want to confirm that's in fact what it is. Or maybe you're already tracking rankings.
If it is lost ranking then you may want to start segmenting organic traffic in analytics. Is the loss across all keywords? Can you pinpoint a few high volume keywords that were hit? Or were certain pages hit?
Let's walk through the site a little on video;
(the first few seconds is a little jumpy, sorry!)
First time trying this in the Moz Q&A. Let me know what you think - helpful?
Lastly - your question about the crazy indexation numbers in WMT. I see this a lot too. This tells me Google is not sure which pages on your site are important. There are probably a lot of extra, unimportant pages being crawled. I would mitigate this with a great XML sitemap, as well as noindexing things like subpages and pages that present the same content just filtered and/or sorted differently.
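A minimal sketch of the XML sitemap idea (example.com is a placeholder) - the point is that only the URLs you actually care about go in:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/important-page/</loc></url>
  <url><loc>http://www.example.com/another-important-page/</loc></url>
</urlset>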
Hope that helps!
-Dan
Hey There
Categories can certainly bring traffic, so I typically do index them. (I do recommend noindexing subpages of archives though - page 2 and onward, etc.)
But if your goal is to target more keywords, that's typically best done by creating new posts or pages around them, if they are unique enough to warrant new content. It's best to keep your categories to a minimum (maybe 7-10 top-level categories - you can nest more under those if you have to) - and design your categories for the best information architecture. Let your posts and pages carry more keywords.
I would not recommend trying anything as an SEO "trick", like using the canonical on pages that target different keywords. The canonical is only for pages that have nearly identical content.
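For reference, the tag itself looks like this (placeholder URL) - and it belongs only on a page whose content essentially duplicates the page it points to:

<link rel="canonical" href="http://www.example.com/original-page/" />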
Hope that helps!
-Dan
Hey There
Determining Which Keywords To Target
1. Create List - Create a long list of as many keywords as you can come up with for each category or product. Throw in everything you can think of - and use www.ubersuggest.org and the Google keyword tool. Use two columns in a spreadsheet: one for the keyword, and a second to label the category or product.
2. Gather Metrics - Then we're going to add the most "bare bones" metrics. Grab exact match search volume for each keyword from the AdWords keyword tool. Run rankings on the site for each keyword (although your site is not launched, so you probably don't have rankings). You can also pull CPC (to get an idea of how commercial a keyword is) or Moz Difficulty for an idea of competitiveness. Add your metrics to new columns as you go.
3. Analyze & Choose keywords - Using sorting and/or filters. Look at keywords for one product or category at a time. Basically, you want to choose keywords with a good amount of search volume that you think you could rank for, and they're relevant. You could create a column to tage keywords you're going to use.
How Many To Target
You should really target one keyword and its variations. For example - if it's turmeric supplements, you'd target "supplements made with turmeric", "turmeric health supplements", etc.
Follow one of my favorite SEO flowcharts of all time (made by Rand) to decide if you should split words into separate pages.
Product / Brand Name
I assume you are selling 3rd party products? Abso-heck-lutely include the brand and product name. People are gonna be searching on those brand and product names. These are very transactional / high purchase-intent searches - and if you can serve up a result that's compelling enough to get the click, there's a high chance you'll get the purchase.
Hope that helps
-Dan
Hey There. Yes, I generally utilize mainly custom reports and segments for clients, as everyone's goals and markets are different. Essentially, just creating the reports won't bring any "results" - our analysis and utilization of the data, the takeaways and next-step actions, are what bring the results.
But segmentation is totally essential to analytics: the words of Avinash say it much better than I can ---> http://www.kaushik.net/avinash/web-analytics-segments-three-category-recommendations/
-Dan
Hi Anthony
100 links per page is not a rule anymore at all. Check out Matt's video from 2011.
I think you may have been hurt by some link building though. Check out this screenshot of your top anchor text according to Open Site Explorer. You'll see a lot of commercial keywords as anchor text - and this is the sort of thing Google is not looking at so favorably anymore. You'll want to see if you can change the anchor text of those links to branded keywords where possible (not all, but as many as you can). In addition, you'll want to think about building new links that are more natural. I like PointBlank SEO's list of strategies: http://pointblankseo.com/link-building-strategies
And you should also focus a little on local SEO and local citations and review acquisition. I love Whitespark's Local Citation Finder and Moz's GetListed.org for help there.
It is great to clean up on-site stuff too of course! My post on WordPress SEO should help out there.
Good luck!
-Dan
Hi There
There are two possible approaches I can think of that you may want to try: segmenting by landing page, or segmenting by keyword/source.
**Landing Page** - depending on the site, there may be specific pages (like products for ecommerce) that are obviously more purchase-based. You can create a segment to show you all traffic that came by way of landing on one of these pages.
**Keyword/Source** - there are also purchase-intent keywords like "buy", "for sale", "for purchase", etc. - you can create a keyword segment which includes all of your purchase-intent words and excludes branded terms, "how to" informational-intent words, etc. (see the example condition below). There may be certain sources that you know are informational - like if you're linked to in a resource, that's an informational visit - but if your product is in a "top 10 products" list, that referral traffic may be more purchase-intent.
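For the keyword segment, a regex condition along these lines is one way to do it - the word lists are just a starting point and purely illustrative, so tailor them to your market:

Include - Keyword - Matches Regex: buy|purchase|for sale|price|discount
Exclude - Keyword - Matches Regex: yourbrand|how to|what is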
So really there's no blanket way to do this. You have to determine what determines the intent of traffic for your site and create segments around those parameters.
Hope that helps!
-Dan
Thanks Randy! It was my understanding that they were identical as well.
I too have seen this behavior, and don't know of a one-size-fits-all solution. Personally, I'd start examining things with the specific URL. Do other results in those SERPs have stars?
-Dan
I second Lynn's answer. You need to find where the link is coming from to begin with. Could also use Screaming Frog SEO Spider or Webmaster Tools - they will all get you the same thing. Find out where the bad URLs are linked from, and then you can narrow down the source of bad code or whatever it may be.
-Dan
These: http://screencast.com/t/p120RbUhCT
They appear on every page I looked at, and they take up the entire area "above the fold", pushing the content "below the fold".
-Dan
Ahh. I see. You just need to "noindex" the pages you don't want in the index. As far as how to do that with blogengine, I am not sure, as I have never used it before.
But I think a bigger issue is the giant box areas at the top of every page. They are pushing your content way down. That's definitely hurting UX and making the site a little confusing. I'd suggest improving that as well.
-Dan
Hi There... that address does not seem to work for me. Should it be .net? http://www.dotnetblogengine.net/
-Dan
Andrew
I doubt that one thing made your rankings drop so much. Also, what type of CMS are you on? Duplicate content like that should be controlled through indexation for the most part, but I don't recognize that type of URL structure as belonging to any particular CMS.
Are just the title tags duplicate or the entire page content? Essentially, I would either change the content of the pages so they are not duplicate, or if that doesn't make sense I would just "noindex" them.
-Dan
I don't know of one, but internally I keep what I call a "word map" in Excel - where I have columns for state, adjectives like "best", "top", etc. I tend to create them from scratch each time for each client, but will just cut and paste from client to client when certain columns relate.
-Dan
Mike
I think this is what you are looking for: https://moz.com/community/q/html-extension - also a recent question from the Q&A. I think Dana's link explains how to maintain the .html in WordPress, and that debate was more about whether 301s pass PR, etc.
Whereas if you've decided you already want to redirect, that tells you how.
Hi There
Where are they appearing in WMT? In crawl errors?
You can also control crawling of parameters within webmaster tools - but I am still not quite sure if you are trying to remove these from the index or just prevent crawling (and if preventing crawling, for what reason?) or both?
-Dan
Hi, sorry... I just meant that if you did have a few other spammy links from other sites, those could hurt you. I doubt the woorank ones are hurting, but perhaps the other 3 spammy sites you mentioned could have been.
Thanks for the details. It may not be those links, but if there were even a few others, they could potentially hurt the site. I had a client who had done some spammy link building around 2011 with another agency, and just those few links hurt the site. I would try removing or disavowing anything you can find that's spammy and not helping, and see if that helps anything.
The types of things to de-optimize on your own site would be more like keyword stuffing in the titles, URLs, headers and internal anchors - not so much blocking pages from being crawled. Blocking crawling is typically only for improving crawl efficiency on really large sites, or when you just don't want Google to pay attention to that content - but it won't fix a Penguin issue if the problem is on-site.
-Dan
It depends. If the sitewide links were;
Those issues could hurt you coming from one site. If the links are exact match commercial anchors, I'd see about changing them to brand or domain name anchors. If the site linking to you is on a random topic, or you're among a big giant list of links, or part of a link exchange - get the links removed or disavow them.
-Dan
I agree with David here. Someone made a custom category plugin for WordPress which will do what David is suggesting. I would not block anything, but rather noindex what you don't want in the index, which is typically for me tags, subpages, author and date archives.
-Dan
Penguin 2.0 is definitely from webspam and backlinks. I suggest using Google Webmaster Tools as your backlink dataset. They provide links that are more likely to be causing problems. It's extremely unlikely the internal links are hurting you if it is Penguin 2.0.
I'm wondering why you want to block crawling of these URLs - I think what you're going for is to not index them, yes? If you block them from being crawled, they'll remain in the index. I would suggest considering robots meta noindex tags - unless you can describe in a little more detail what the issue is?
-Dan