Posts made by Dr-Pete
-
RE: A Blog Structure Dilemma We're Facing...
Agreed - Google is consolidating subdomain links in Google Webmaster Tools, but as far as I know, that does not reflect a change in how the algorithm works. Subdomains can still fragment and split link-juice. The change is more of an accounting trick, for lack of a better word.
-
RE: .CA or .COM?
No, you're not crazy. It's basically two local SEO problems in one, and I'm honestly not sure where the sweet spot is. If you over-optimize for Phoenix, you may hurt your standing on Google.ca.
I suspect you have to keyword optimize for Phoenix/Arizona, but geo-target for Canada. Since Canada pulls a lot of .com results, though, you can afford to have some Phoenix-based geo-cues. I'm not sure anyone can tell you the perfect balance. I'm going to see if the broader Moz team has experience with this issue.
-
RE: Trying to rank against keyword in domain
I'll just add the comment that you shouldn't push the exact-match anchor text too hard. You will definitely need some well-targeted links and copy, though.
If they really have PR0 and 2 back-links, I'd focus hard on quality, too. A couple of strong, relevant links could make a huge difference, much more so than dozens of spammy links. If Google is letting a site with 2 back-links rank at #1, it's most likely not just the exact-match domain at work - chances are the rest of the Top 10 has some quality issues, too.
-
RE: 7 years old domain sandboxed for 8 months, wait or make a domain change?
Unfortunately, there's no easy answer. I agree with Zsolt that this isn't a "sandbox" issue - it sounds like a classic (and severe) link-based penalty. The 301 can work, but it's not risk-free. Usually, you'll retain some of the link-juice and not carry the penalty over, but the penalty does transfer in some situations. There's no good way to tell whether (or when) it will.
I'm afraid you're right on reconsideration - you'd have to cut the vast majority of the bad links, and that's going to be very tricky. Your only other option, if the bad links are generally low-quality links (spammy article marketing, for example, as opposed to paid links), is to build strong, relevant links going forward and let the bad links fade out over time. That depends a lot on the severity and type of bad links, though.
If you've been waiting for things to change for 8 months and building decent links in that time, the 301 may be your best recourse. It's a bit of a last resort and it's not guaranteed to work, but it sounds like you may need to try it.
-
RE: Onsite Content - Word Count & KW Density
I think there are some good points here, but I want to warn that it really depends. There are sites with 250-word pages that do well, if that content is unique and isn't buried in ads, etc. If you have 1,000 words but it's all syndicated from other sites and jammed with ads, you could have Panda breathing down your neck.
I would generally not worry about keyword density. Write natural copy, with solid topic focus, and your keywords will organically end up represented in various forms. Google is a lot more sophisticated than just counting keywords or density these days, and trying to engineer the perfect number is more likely to harm you than help (as others mentioned). Plus, you can drive yourself crazy for something that will ultimately have a very small impact.
What I think is a lot more important is your overall keyword strategy. Instead of worrying about how many times a keyword appears on a page, focus on the structure of your site. Which pages target which keywords? Are there important variants that need their own content (and can you create unique content for them)? Are you spread too thin? I see many more problems caused by bad keyword strategy ACROSS sites than within any one page.
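If it helps to make that concrete, here's a rough sketch of the kind of keyword-to-page map I mean (the URLs and keywords are entirely made up) - even a simple spreadsheet or a few lines of Python makes the overlaps and gaps obvious:

```python
# Hypothetical keyword-to-page map - the pages and keywords are placeholders.
# The goal is to spot two problems: the same keyword targeted by multiple
# pages (cannibalization) and important variants with no page assigned at all.

keyword_map = {
    "/widgets/": ["blue widgets", "buy widgets"],
    "/widgets/blue/": ["blue widgets"],          # overlaps with /widgets/
    "/blog/widget-guide/": ["widget guide"],
}
planned_keywords = ["blue widgets", "buy widgets", "widget guide", "widget reviews"]

targeted = [kw for kws in keyword_map.values() for kw in kws]

overlaps = {kw for kw in targeted if targeted.count(kw) > 1}   # targeted by 2+ pages
gaps = [kw for kw in planned_keywords if kw not in targeted]   # no page yet

print("Possible cannibalization:", overlaps)   # {'blue widgets'}
print("No page assigned yet:", gaps)           # ['widget reviews']
```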
-
RE: How do I set up Google Analytics to track paid visitors from Bing
The other option for keyword tracking, although it's a lot tougher, is to use a separate URL for each keyword (and specify "keyword="). That will track what you bid on, not what people queried (which is what DKI does). It's time-consuming, but if you use the desktop editor, it's doable. Of course, if you use phrase-match and exact-match, it won't be that different from "{keyword}". I'd strongly suggest narrowing your targeting - it's starting to matter a lot in Bing (we all used to just put everything on broad-match for Bing/Yahoo).
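Just to illustrate the manual tagging (the campaign values here are made up - swap in your own), a hard-coded destination URL might look like:
http://www.example.com/landing-page/?utm_source=bing&utm_medium=cpc&utm_campaign=widgets&utm_term=blue+widgets
versus the dynamic version:
http://www.example.com/landing-page/?utm_source=bing&utm_medium=cpc&utm_campaign=widgets&utm_term={keyword}
With the first, Analytics records exactly the keyword you assigned to that URL; with the second, it records whatever the ad platform substitutes for {keyword} when the ad is served.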
-
RE: .CA or .COM?
So, just to clarify, she's basically looking for Canadians who want to buy property in Arizona? Wow, that's quite a niche, and not an easy SEO problem. You're crossing 2 geo-targeting streams, in a sense.
-
RE: How to extract URLs from a site (without bringing the server down!)
Just a follow-up to my endorsement. It looks like Screaming Frog will let you control the number of pages crawled per second, but to do a full crawl you'll need to get the paid version (the free version only crawls 500 URLs):
http://www.screamingfrog.co.uk/seo-spider/
It's a good tool, and nice to have around, IMO.
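If you'd rather script a one-off crawl yourself, here's a rough sketch of the same rate-limiting idea in Python (standard library only - the start URL, delay, and page cap are placeholders to adjust for your own site):

```python
import time
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

START_URL = "http://www.example.com/"  # placeholder - your site here
DELAY_SECONDS = 1.0                    # pause between requests to go easy on the server
MAX_PAGES = 500                        # safety cap - adjust as needed

class LinkParser(HTMLParser):
    """Collects href values from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url):
    # Note: a real crawler should also respect robots.txt
    domain = urlparse(start_url).netloc
    seen, queue = {start_url}, deque([start_url])
    while queue and len(seen) <= MAX_PAGES:
        url = queue.popleft()
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", errors="ignore")
        except Exception:
            continue  # skip pages that error out
        parser = LinkParser()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href).split("#")[0]
            # stay on the same domain and don't re-queue pages we've seen
            if urlparse(absolute).netloc == domain and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
        time.sleep(DELAY_SECONDS)  # the throttle that keeps the server happy
    return sorted(seen)

if __name__ == "__main__":
    for page_url in crawl(START_URL):
        print(page_url)
```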
-
RE: What is important for page rank?
Keep in mind that "quality" is a difficult term at best, and even if you could pin down a definition of quality, Google still has to translate that (imperfectly) into code. There are a lot of factors that go into determining the value of a link. Rand had a good post on the subject here:
http://www.seomoz.org/blog/10-illustrations-on-search-engines-valuation-of-links
Many of these aren't even quality issues, per se. For example, if you have a blog comment on a high-PR page, but you're 1 of 300 comments, that link isn't going to count for much. You could argue that's a quality issue, but it's also just simple math - the PR of that page just got split 300+ ways. Even if every single comment were relevant and topically appropriate, it wouldn't count for much.
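To make the "simple math" concrete, here's a toy calculation using the classic, simplified PageRank idea (a page passes its value, times a damping factor, split evenly across its outbound links). The numbers are invented and Google's real system is far more complex, but the dilution effect is the point:

```python
# Toy illustration of link-value dilution under the simplified PageRank model.
# All numbers are made up for illustration only.

DAMPING = 0.85
page_value = 6.0  # hypothetical value of the high-PR page

for outbound_links in (5, 300):
    passed_per_link = (page_value * DAMPING) / outbound_links
    print(f"{outbound_links} outbound links -> ~{passed_per_link:.4f} passed per link")

# Output:
# 5 outbound links -> ~1.0200 passed per link
# 300 outbound links -> ~0.0170 passed per link
```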
There's also TrustRank (we simulate it with mozTrust), the recent Panda factors, and other quality variables - they all tackle a different piece of the puzzle. Then, you've got user factors, like bounce rate, that are probably starting to come into play. So, I don't think you can just look at it as Quality vs. PR - there's a lot more in play.
Edit: Sorry, I was reading this as "link quality", not the quality of the site itself. There's no either/or - both content factors (including on-page factors like good Title tags) and link factors matter, along with social factors these days. The best approach is going to tackle all fronts.
-
RE: Custom Error and page not found responses
Just to add to Ryan's comments - if you had massive 500 issues, then you might theoretically argue that 301'ing would keep Google from crawling so many errors. At best, though, it's a band-aid, and maybe even a poor-fitting one. The better question is: why are those 500s occurring? Ultimately, they should be fixed, not patched.
Usually, Google isn't going to penalize a one-time 500 error or a short-term server problem. The only time I could see 301'ing is if you knew you had a major problem and couldn't fix it for a few days. The 301 (or possibly 302, in this case) could buffer you from crawl problems while you made the fixes. Obviously, that wouldn't be an ideal situation.
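As a side note, before deciding anything it's worth knowing exactly which URLs are throwing 500s and how widespread the problem is. A quick status check over a list of key URLs (a rough sketch - the URLs are placeholders) will tell you whether it's a handful of pages or sitewide:

```python
# Rough sketch: check HTTP status codes for a handful of key URLs.
# The URLs are placeholders - swap in your own list (or pull them from a sitemap).
from urllib.error import HTTPError, URLError
from urllib.request import urlopen

urls_to_check = [
    "http://www.example.com/",
    "http://www.example.com/products/",
    "http://www.example.com/contact/",
]

for url in urls_to_check:
    try:
        status = urlopen(url, timeout=10).getcode()
    except HTTPError as e:
        status = e.code          # 4xx/5xx responses raise HTTPError
    except URLError:
        status = "no response"   # DNS or connection failure
    print(url, status)
```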
-
RE: Ranking #1 for decent traffic keywords, but not receiving any traffic?
I'll just add that I wrote a post about this topic recently: