
Posts made by Dr-Pete
-
RE: Can a .ly domain rank in the United States?
Best of luck! Definitely interested in hearing back about how it goes.
-
RE: Should I use rel=canonical on similar product pages.
There's no perfect solution, but Google's advice is to use rel=prev/next. This looks like pretty classic pagination. Rel-canonical is a stronger signal, but it's generally going to keep pages 2+ from ranking.
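Just to make that concrete, here's a rough sketch (Python purely for illustration; the /widgets URLs are made-up examples) of the rel=prev/next tags each page in the series would carry:

```python
def pagination_link_tags(base_url, page, total_pages):
    """Build the rel=prev/next <link> tags for one page of a paginated series.

    The base_url and ?page= parameter are hypothetical; adapt them to your
    own product-listing URL structure.
    """
    tags = []
    if page > 1:
        prev_url = base_url if page == 2 else f"{base_url}?page={page - 1}"
        tags.append(f'<link rel="prev" href="{prev_url}">')
    if page < total_pages:
        tags.append(f'<link rel="next" href="{base_url}?page={page + 1}">')
    return "\n".join(tags)


# Page 3 of a 10-page listing gets both a prev and a next tag:
print(pagination_link_tags("https://www.example.com/widgets", 3, 10))
```

With that in place, Google can treat the series as a unit, whereas a rel=canonical from pages 2+ back to page 1 tells Google to mostly ignore those deeper pages.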
-
RE: Can a .ly domain rank in the United States?
Can a .ly domain rank on Google.com? Sure. Here are many that do:
https://www.google.com/search?q=site%3Aly
Do they have a ranking disadvantage? That's a much tougher question. It's definitely in a gray area. Thanks to Bit.ly and some others, there is some precedent for these domains being treated as generic if other signals (geo-location, language, etc.) suggest the site isn't targeting Libya, but there's always more risk for a new site and a chance you'll need to spend more to get the same results.
The political situation is much more worrisome, though, because none of us can predict that. If there's a history of .ly domains being confiscated and if that confiscation was considered legal (even in the short-term), then there's no guarantee that won't happen again. That's a risk we can't calculate in SEO terms. The fact that even Bitly eventually went to bitly.com doesn't bode well.
My gut feeling is that (1) there are no magic domain names, and you're going to have to advertise/market/promote any new domain, and (2) given the geo-political situation there is more than average risk (vs., say, a .co domain or one of the new TLDs). There's a lot I don't know about your situation, but I think your hesitation is warranted.
-
RE: Dates appear before home page description in the SERPs- HUGE drop in rankings
Unfortunately, they're being pretty tight-lipped on this one. Seems like a glitch, but they don't seem to think it's related to the rankings drop. Possibly the two events co-occurred, and there was an algo update at the same time as the glitch. Honestly, though, it's not clear at all.
-
RE: Dates appear before home page description in the SERPs- HUGE drop in rankings
We're definitely seeing similar reports about the bad dates, and it has been brought to Google's attention at reasonably high levels (i.e. I'm confident they know about it, but it's hard to say what they're doing about it).
Unfortunately, it's unclear whether this was connected to a ranking drop in some cases or was a coincidence. We did see substantial movement in the algorithm right around November 10th (the date you posted this question), but, unfortunately, we have no confirmation.
Sorry, wish I had better info right now, but I'll try to find out more.
-
RE: Duplicate content on subdomains.
It would probably be better (and more likely to get you responses) if you started a new question - this one is three years old. Generally, I think it depends on your scope. If you need some kind of separation (corporate, legal, technical), then separate domains or sub-domains may make sense. They're also easier to target, in some ways. However, you're right that authority may be diluted and you'll need more marketing effort against each one.
If resources are limited and you don't need each country to be a fully separate entity, then you'll probably have fewer headaches with sub-folders. I'm speaking in broad generalities, though - this is a big decision that depends a lot on the details.
-
RE: Multiple Domains on 1 IP Address
Sorry, I'm confused about the setup. Hosts routinely run multiple sites off of shared IPs, but each domain name resolves as itself. Users and search bots should never see that redirection at all and shouldn't be crawling the IPs. This isn't an SEO issue so much as a setup issue. Likewise, any rel=canonical tags on each site would be tied to that site's specific domain name.
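If it helps, here's a rough sketch of the mechanism (the IP and hostnames below are placeholders, not your actual setup): several domain names can sit on one shared IP, and the web server picks which site to return based on the Host header.

```python
import http.client

# Placeholder values: a documentation-range IP and example hostnames.
SHARED_IP = "203.0.113.10"
DOMAINS = ("site-one.example", "site-two.example")

for domain in DOMAINS:
    # Same IP for every request; only the Host header changes. The server uses
    # that header to serve each domain's own site, so users and crawlers only
    # ever interact with the domain names, never the raw IP.
    conn = http.client.HTTPConnection(SHARED_IP, timeout=10)
    conn.request("GET", "/", headers={"Host": domain})
    response = conn.getresponse()
    print(domain, response.status, response.reason)
    conn.close()
```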
-
RE: Comparing New vs. Old Keyword Difficulty Scores
I empathize with your frustration, and we certainly take it seriously. Let me first say that I've been involved in the Keyword Explorer project for a while, and I assure you that this was not about releasing a new product just to have something to do. Our goal was to really reinvent and help automate the keyword research process. We did re-work Keyword Difficulty as part of that, but there are many more features that we sincerely believe help simplify a difficult and time-consuming process. I'd encourage you to check out lists and Keyword Potential, which help balance Difficulty with Volume and other considerations.
The changes to Keyword Difficulty were carefully considered and tested. That's not to say they're perfect, and we are evaluating them based on large-scale customer data as we collect it. There were issues with V1, though, that we felt needed addressing. The original Keyword Difficulty score tended to bunch up on the middle values, didn't take into account the disproportionate impact of the top of the SERP, and handled missing data poorly. We may have overcompensated for the bunching-up problem, based on what we're seeing across a lot of data, and are looking to address that ASAP.
I'm not clear on what tool you were comparing, but it's important to note that Keyword Difficulty isn't like volume, which has a real-world answer (Google won't tell us what it is, but there is one, in theory). So, every tool measures difficulty a bit differently. It doesn't really make sense to compare different tools - that difference won't be meaningful. Keyword Difficulty, in our design, isn't meant to be used in a vacuum - it's meant to be used to compare target keywords to each other. In other words, it's not so much that Keyword X scores a 30, but how it compares to Keyword Y. Our goal is to help you pick the best target from your list of potential targets, but any given score out of context isn't very useful. No single keyword tells the whole story.
-
RE: Comparing New vs. Old Keyword Difficulty Scores
Sorry about the frustration with the before/after. In the case of Keyword Difficulty, we may have adjusted it too aggressively and are testing a few changes that could soften that a little. I still believe the new score is an improvement in many ways, but we're looking to make an adjustment that will bring the new scores a little closer to the old ones.
For volume, it's a bit trickier, because we really feel that the new scores are better and based on richer data sources. We are working to adapt volume to other markets, as well. More on volume is in Russ's post:
Sweating the Details - Rethinking Google Keyword Tool Volume
In both cases, though, our keyword metrics aren't intended to be like ranking or authority. Ranking is something you measure over time, relative to itself - you care whether it went up or down. Our keyword metrics are intended to help you compare two keywords to each other. It's not so much about whether a keyword is more difficult today than last week (we expect that to be fairly stable over time, at least for most keywords), but whether Keyword X is a better bet for you to target than Keyword Y today.
-
RE: Comparing New vs. Old Keyword Difficulty Scores
The old tool is still active temporarily, and I'm not sure if we've finalized the shut-off dates. We hear your concerns regarding the new limits. The new Keyword Explorer collects much more data and is quite a bit more resource intensive, but we're trying to balance out the needs of users of the old tools as best we can.
-
Comparing New vs. Old Keyword Difficulty Scores
We've had a few questions regarding the new Keyword Difficulty score used in Keyword Explorer, and how it compares to the old score in our stand-alone Keyword Difficulty tool. Specifically, people want to know why some scores are much lower using the new tool.
There's a general discussion of the math behind the tool in this post:
Keyword Research in 2016: Going Beyond Guesswork
One of the problems we had with the original Keyword Difficulty score is that, because it's based on our Page Authority (PA) score and PA tends toward the middle of the 0-100 range, Difficulty got a bit bunched up. A Difficulty score in the low-to-mid 20s (via the old tool) is actually very low. So, we set out to re-scale the new tool to broaden that score and use more of the 0-100 range. We hoped this would allow more granularity and better comparisons.
While the logic is sound, we're concerned that we may have been too aggressive in this re-scaling, given recent feedback. So, we're going to be analyzing a large set of keywords (anonymously, of course) that people have run through the tool to see if too many Difficulty scores seem too low. If they do, we'll make some adjustments to the math.
In the meantime, please be aware that low scores may appear lower in the new tool and very high scores may appear higher. We wanted to address some of the limitations in V1 and feedback over the years, so the old and new scores really can't be compared directly in a meaningful way. We're sorry for any confusion this has caused, and we will re-evaluate if necessary.
-
RE: Keyword Explorer is Now Live; Ask Me Anything About It!
The new Difficulty score gets pretty aggressively scaled, because we found that the distribution of PA/DA was bunching up a bit, and fell almost entirely in the 25-85 range (looking across an entire SERP). So, a 25 gets scaled down to nearly zero, to give the new metric more granularity.
It looks like the CTR-adjusted PA for "Naturmode" on Google.de (for example) is right around 26, so it may be that the PAs have gone down a bit over time as well. Even by the non-weighted metric, I'd expect that to be a solid 10 points lower than the old difficulty score.
We're going to dig in and try to find out if the rescaling is too aggressive, once we have more data from real-world KWE usage. In general, though, you're going to see a bigger range of difficulty scores with the new metric. It's pretty tough to meaningfully compare the old and new scores.
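To make those two ideas a bit more concrete, here's a rough sketch - the CTR curve and the straight linear stretch of the roughly 25-85 range onto 0-100 are simplified assumptions for illustration, not the exact Keyword Explorer formula:

```python
# Illustration only: the CTR curve and the linear 25-85 -> 0-100 stretch below
# are simplified assumptions, not the production Keyword Explorer math.

ASSUMED_CTR = [0.30, 0.15, 0.10, 0.07, 0.05, 0.04, 0.03, 0.03, 0.02, 0.02]

def ctr_weighted_pa(page_authorities):
    """Average the PA of the ranking pages, weighted toward the top of the SERP."""
    weights = ASSUMED_CTR[:len(page_authorities)]
    return sum(pa * w for pa, w in zip(page_authorities, weights)) / sum(weights)

def rescale_difficulty(raw_score, low=25.0, high=85.0):
    """Stretch raw averages (which mostly land in the 25-85 range) across 0-100."""
    scaled = (raw_score - low) / (high - low) * 100.0
    return max(0.0, min(100.0, scaled))

# A SERP full of mid-20s PAs ends up with a weighted average in the mid-20s,
# which lands near the bottom of the new 0-100 Difficulty scale:
serp_pa = [29, 27, 26, 25, 25, 24, 23, 23, 22, 22]
raw = ctr_weighted_pa(serp_pa)
print(round(raw, 1), round(rescale_difficulty(raw), 1))  # roughly 26.6 and 2.7
```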
-
RE: Is there another option for User Testing besides Usertesting.com
Ah, got it - the "A/B testing" in your questions threw me off. I was wondering if you were looking for a different type of service than they used to sell.
If you're looking for structured interviews, it definitely can get a lot more expensive very quickly. It used to be that this stuff ran $10K+, before companies like UserTesting.com came along a couple of years back.
I've occasionally heard good things about UserBrain (https://userbrain.net/), but I haven't used their service personally. UserLytics (http://www.userlytics.com/sitepublic/) also used to be a lower-priced alternative, but again, it's been a while since I've dug into their service offerings and prices.
If it's something you plan to do a fair amount of, it may be worth training someone in-house and getting a basic setup. Steve Krug's more recent book (http://www.amazon.com/Rocket-Surgery-Made-Easy-Do-It-Yourself/dp/0321657292) is a great starting point. Some time and equipment investment up front may be enough to get you a lot of insights. The nice thing about in-person testing is that it's really exploratory - you don't have to be an expert to get at least some results.
-
RE: Is there another option for User Testing besides Usertesting.com
I'm still showing a basic pricing package of $49/video on UserTesting.com:
https://www.usertesting.com/plans
You mentioned A/B testing, though. Are they doing custom test setups now? I've only ever used them for qualitative testing (user videos, basically). Conversion consulting does tend to get more expensive fast. If you can give me a sense of the type/scope of your project, I can check out some of the companies I used to use and see if they're relevant.
-
RE: GWMT / Search Analytics VS OpenSiteExplorer
I know it's not always the answer people want to hear, but Matt's right - this is basically where we're at. OSE tends to focus on higher-authority links and quality over quantity. Unfortunately, while this works well for tracking the strengths in your link profile, it doesn't always do as well at tracking the weaknesses. We're very much interested in expanding the quantity as well, but it's a balancing act and, in the interest of full transparency, there are many engineering challenges.
People have compared our index to Majestic and Ahrefs in the blogosphere. Since I can't claim to be unbiased, I'd welcome you to read those posts and make your own judgments. In fairness to Majestic and Ahrefs, all three of us are somewhat transparent about our sources and at least our general methodologies. Unfortunately, Google is not very transparent about how they sample links or choose which data to show, so directly comparing any of the major SEO tools to Google Search Console proves a lot trickier. We're also not clear on Google's update cycle for that data.
-
RE: Duplicate content and http and https
If Google detects both http: and https: versions, they've started to automatically pick the https: version, but that's not consistent yet. In general, I think it's still important to set strong canonicalization signals. Google still separates your http: and https: sites in Google Search Console, too, so even they haven't quite made up their minds.
In general, Google is pushing sites toward https:, but that's a somewhat complex decision that depends on more than just SEO. If you're using https: and the https: URLs are indexed, then you should treat those as canonical and suppress the http: URLs, in most cases.
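If you want a quick way to sanity-check those signals on your own site, a rough sketch like this works (it assumes the third-party requests library, uses a placeholder hostname, and the regex is a crude check rather than a full HTML parser):

```python
import re
import requests  # third-party: pip install requests

def check_https_signals(hostname):
    """Rough audit of the two signals discussed above for one hostname:
    (1) does the http: version 301-redirect to https:?
    (2) does the page's rel=canonical tag point at an https: URL?
    """
    r = requests.get(f"http://{hostname}/", allow_redirects=False, timeout=10)
    redirects_to_https = (
        r.status_code in (301, 308)
        and r.headers.get("Location", "").startswith("https://")
    )

    page = requests.get(f"https://{hostname}/", timeout=10)
    match = re.search(
        r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)',
        page.text,
        re.IGNORECASE,
    )
    canonical = match.group(1) if match else None

    return {
        "http_redirects_to_https": redirects_to_https,
        "canonical_url": canonical,
        "canonical_is_https": bool(canonical and canonical.startswith("https://")),
    }

# Placeholder hostname; point it at your own site.
print(check_https_signals("www.example.com"))
```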