Best posts made by Dr-Pete
-
RE: Can increasing website pages decrease domain authority?
Yeah, one thing I think is critically important is to try to divorce yourself from your own creation and think in terms of what Google finds valuable. We all think our sites are the greatest and every page we create is a masterpiece, even when we'd ignore or trash the same kind of page on someone else's site. When you're talking about a 100X increase, brutal honesty with yourself is very important.
-
RE: Rel=canonical tag on original page?
My experience matches Nakul's. Technically, Bing claims you shouldn't have canonicals on the canonical page itself, but I've never seen evidence that they actually do anything about that or don't honor the tags. I think they just don't like processing the extra tags. Google originally suggested the same thing, but then softened their stance. I've NEVER seen a practical issue with it on Google.
So, by the book, Bing would rather you only put it on non-canonical URLs. Practically, though, it seems to work absolutely fine. I wouldn't lose sleep over it. So many big sites are doing this now that I don't think the engines can devalue the tactic. It's not spammy - it's just mildly inconvenient for them (some extra processing).
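If you want to sanity-check your own pages, here's a minimal sketch - assuming Python with the `requests` and `beautifulsoup4` packages, and a placeholder URL - that fetches a page and reports whether its rel=canonical is self-referencing:

```python
# Minimal sketch: check whether a page's rel=canonical points back to itself.
# Assumes the `requests` and `beautifulsoup4` packages; the URL is a placeholder.
import requests
from bs4 import BeautifulSoup

def get_canonical(url):
    """Fetch a page and return the href of its rel=canonical tag, if any."""
    resp = requests.get(url, timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")
    link = soup.find("link", rel="canonical")
    return link["href"] if link and link.has_attr("href") else None

url = "https://www.example.com/some-page/"
canonical = get_canonical(url)
if canonical is None:
    print("No canonical tag found")
elif canonical.rstrip("/") == url.rstrip("/"):
    print("Self-referencing canonical - fine in practice")
else:
    print(f"Canonical points elsewhere: {canonical}")
```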
-
RE: How long for authority to transfer from an old page to a new page via a 301 redirect? (& Moz PA score update?)
It can vary quite a bit. The page has to be recrawled/recached, which can take anywhere from hours to weeks, depending on how much authority the page has. That's usually the big delay. After that, Google may on occasion delay passing authority, but we don't have proof of that (there are just cases where it seems like they do).
If it's just a handful of pages, re-fetch them through Google Webmaster Tools. It never hurts to kick the crawlers.
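While you wait on the recrawl, you can at least confirm the redirect itself is in place. Here's a quick sketch, assuming Python's `requests` package; both URLs are placeholders:

```python
# Minimal sketch: confirm an old URL returns a 301 pointing at the new URL.
# Assumes the `requests` package; both URLs are placeholders.
import requests

old_url = "https://www.example.com/old-page/"
new_url = "https://www.example.com/new-page/"

resp = requests.get(old_url, allow_redirects=False, timeout=10)
location = resp.headers.get("Location")  # may be relative, depending on the server

if resp.status_code == 301 and location == new_url:
    print("301 is in place - the rest is just recrawl/recache time")
else:
    print(f"Unexpected response: {resp.status_code} -> {location}")
```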
-
RE: How long for authority to transfer from an old page to a new page via a 301 redirect? (& Moz PA score update?)
I wouldn't get too hung up on the Moz timeline, since our metrics are a broader model that's only correlated with what Google does. If Google has crawled/cached the 404 and the page actually is no longer in the index, then that page should stop inheriting and passing link equity. It can get complicated, because sometimes 404s have inbound links and other issues tied to them that can confuse the crawlers. So, I'd say it's situational.
Moz (specifically, OSE) can help you determine what links still exist to those URLs, which really should guide whether you let them stay 404s or 301-redirect them to something relevant. The other aspect of the decision is just whether something relevant exists. If you've clearly built a page to replace the old one, then 301-redirect it. If the old page is something that ceased to exist for a reason, then a 404 is probably fine, unless that old page had a ton of inbound links. In that case, the 404 has essentially cut off those links.
The problem is that those inbound links are still out there, so it's not that the authority has ceased to exist. It's that you've basically cut the pipe through which the authority flows.
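To make that decision concrete, here's a rough sketch of the logic. The URLs, link counts, and threshold are all hypothetical stand-ins for data you'd export from a tool like OSE:

```python
# Rough sketch of the 301-vs-404 decision: redirect old URLs that still have
# meaningful inbound links, let the rest stay 404s. The URLs, link counts,
# and threshold are hypothetical - in practice, export the data from OSE.
old_urls = {
    "/discontinued-product/": {"linking_domains": 42, "replacement": "/new-product/"},
    "/old-press-release/": {"linking_domains": 0, "replacement": None},
}

LINK_THRESHOLD = 5  # arbitrary cutoff for "worth preserving"

for url, info in old_urls.items():
    if info["linking_domains"] >= LINK_THRESHOLD and info["replacement"]:
        print(f"Redirect 301 {url} {info['replacement']}")  # Apache-style rule
    else:
        print(f"Let {url} 404 ({info['linking_domains']} linking domains)")
```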
-
RE: Is there another option for User Testing besides Usertesting.com
Ah, got it - the "A/B testing" in your questions threw me off. I was wondering if you were looking for a different type of service than they used to sell.
If you're looking for structured interviews, it definitely can get a lot more expensive very quickly. It used to be that this stuff ran $10K+, before companies like UserTesting.com came along a couple of years back.
I've occasionally heard good things about UserBrain (https://userbrain.net/), but I haven't used their service personally. UserLytics (http://www.userlytics.com/sitepublic/) also used to be a lower-priced alternative, but again, it's been a while since I've dug into their service offerings and prices.
If it's something you plan to do a fair amount of, it may be worth training someone in-house and getting a basic setup. Steve Krug's more recent book (http://www.amazon.com/Rocket-Surgery-Made-Easy-Do-It-Yourself/dp/0321657292) is a great starting point. Some time and equipment investment up front may be enough to get you a lot of insights. The nice thing about in-person testing is that it's really exploratory - you don't have to be an expert to get at least some results.
-
RE: Domain / Page Authority - logarithmic
Unfortunately, I'm not sure it's as simple as an equation - we use a machine-learning algorithm and I think the result just happens to be logarithmic. I know who knows the answer, though, so let me ask and see if it's something we're able to discuss.
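In the meantime, here's a toy illustration of what a logarithmic 0-100 scale implies - emphatically NOT our actual formula (that comes out of the machine-learning model), just made-up numbers showing that each additional point costs proportionally more underlying "raw" authority than the last:

```python
# Illustration only - NOT the actual Domain Authority formula, which comes out
# of a machine-learning model. This just shows what a logarithmic 0-100 scale
# implies: each additional point costs proportionally more "raw" authority.
import math

def log_scaled_score(raw, max_raw=10_000_000):
    """Map a hypothetical raw authority value onto a 0-100 logarithmic scale."""
    if raw <= 1:
        return 0.0
    return min(100.0, 100 * math.log(raw) / math.log(max_raw))

for raw in (10, 1_000, 100_000, 10_000_000):
    print(f"raw={raw:>10,} -> score={log_scaled_score(raw):5.1f}")
```

In that toy mapping, every 100X increase in the raw value adds the same number of points, which is the practical upshot of a logarithmic scale.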
-
RE: Does the root domain hold more power than an inner page?
This is a very complex issue, but I think Jonathan's summed it up pretty well. Generally, home pages collect a lot of the "mass" of inbound links, and so they can overpower other pages. On the other hand, deep pages are easier to target to specific keywords and sometimes have targeted anchor text. I've seen cases where someone wanted the home page to rank, but a deep page was ranking, and I've seen the opposite.
Rand wrote about that general problem here:
http://www.seomoz.org/blog/wrong-page-ranking-in-the-results-6-common-causes-5-solutions
While it's not exactly what you're asking, it covers the general logic of why one page might win over another.
-
RE: Potential spam websites with high DA linking back to us
Unfortunately, it's very difficult to measure the impact of proactive disavow, for a number of reasons:
(1) The timeline of whether and when Google processes disavows isn't very transparent.
(2) The impact of bad links usually only shows up as a large drop, long after those links were placed.
For most of us, I think proactively disavowing links carries the risk that those links might actually be passing value. So, if you start carving away at your link profile before you're in any danger, you stand to lose as much as you might gain down the road.
Risk profiling is tricky, and there are certainly sites on the verge of penalties or at high risk who should be proactive. I can't tell you where on the continuum you fall from just a couple of linking domains, but my gut reaction is that disavowing links isn't a good use of your time and energy right now.
-
RE: Do links in the nav bar help SEO?
There's nothing wrong with doing this, as long as the "title" attribute is accurate (DON'T spam it with non-relevant keywords), but I haven't seen compelling evidence that it acts as a ranking signal.
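If you want to audit what's already in your nav, a quick sketch like this pulls the title attributes out so you can eyeball them for accuracy. It assumes Python with `requests` and `beautifulsoup4`; the URL and the presence of a `<nav>` element are assumptions about your markup:

```python
# Minimal sketch: list the title attributes on a site's nav links so you can
# eyeball them for accuracy (vs. keyword stuffing). Assumes `requests` and
# `beautifulsoup4`; the URL and the presence of a <nav> element are assumptions.
import requests
from bs4 import BeautifulSoup

resp = requests.get("https://www.example.com/", timeout=10)
soup = BeautifulSoup(resp.text, "html.parser")

nav = soup.find("nav")
if nav is None:
    print("No <nav> element found")
else:
    for a in nav.find_all("a"):
        print(f"anchor={a.get_text(strip=True)!r} title={a.get('title')!r}")
```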
-
RE: Comparing New vs. Old Keyword Difficulty Scores
I empathize with your frustration, and we certainly take it seriously. Let me first say that I've been involved in the Keyword Explorer project for a while, and I assure you that this was not about releasing a new product just to have something to do. Our goal was to really reinvent and help automate the keyword research process. We did re-work Keyword Difficulty as part of that, but there are many more features that we sincerely believe help simplify a difficult and time-consuming process. I'd encourage you to check out lists and Keyword Potential, as they help balance Difficulty with Volume and other considerations.
The changes to Keyword Difficulty were carefully considered and tested. That's not to say they're perfect, and we are evaluating them based on large-scale customer data as we collect it. There were issues with V1, though, that we felt needed addressing. The original Keyword Difficulty score tended to bunch up on the middle values, didn't take into account the disproportionate impact of the top of the SERP, and handled missing data poorly. We may have overcompensated on the bunching up problem, based on what we're seeing over a lot of data, and are looking to address that ASAP.
I'm not clear on what tool you were comparing, but it's important to note that Keyword Difficulty isn't like volume, which has a real-world answer (Google won't tell us what it is, but there is one, in theory). So, every tool measures difficulty a bit differently. It doesn't really make sense to compare different tools - that difference won't be meaningful. Keyword Difficulty, in our design, isn't meant to be used in a vacuum - it's meant to be used to compare target keywords to each other. In other words, it's not so much that Keyword X scores a 30, but how it compares to Keyword Y. Our goal is to help you pick the best target from your list of potential targets, but any given score out of context isn't very useful. No single keyword tells the whole story.
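As a toy example of that relative-comparison idea - and to be clear, this is NOT our Keyword Potential formula, just made-up numbers and a made-up trade-off - you can see how ranking candidates against each other plays out:

```python
# Toy example of comparing keywords to each other - NOT Moz's Keyword Potential
# formula. The volumes, difficulties, and scoring trade-off are all made up.
import math

candidates = {
    "running shoes": {"volume": 90_000, "difficulty": 78},
    "trail running shoes": {"volume": 22_000, "difficulty": 61},
    "minimalist trail shoes": {"volume": 1_900, "difficulty": 34},
}

def potential(kw):
    """Hypothetical score: reward volume (log-scaled), penalize difficulty."""
    v = candidates[kw]
    return math.log10(v["volume"]) * (100 - v["difficulty"])

for kw in sorted(candidates, key=potential, reverse=True):
    print(f"{kw:24} potential={potential(kw):6.1f}")
```

With those made-up numbers, the lowest-volume keyword actually comes out on top - the point being that any one score only means something relative to your other candidates.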