Best posts made by KrisRoadruck
-
RE: What's your best hidden SEO secret?
Oh man, this is me to a T. It's hard to explain to others that the rest of the time spent farting around online is really a primer for this. If I just tried to come in for an hour and leave, it would never be an in-the-zone hour.
-
RE: Advice on buying a domain name for a valuable link
I do this pretty regularly, so hopefully this will be of help. If you decide to go this route, there is a ton of diligence that needs to be done. First, keep in mind that post-Penguin there are a ton of people dumping essentially burned domains. When you are digging through your drop lists, start off by making sure the domain is still indexed by Google at all (a quick site:domain.com search will tell you). Next, take your list, dump it into a Google Doc, and use the Moz API (tutorial here: http://www.seomoz.org/ugc/updated-tool-seomoz-api-data-for-google-docs) to pull up the following stats on your list:
Page Authority
Domain Authority
Domain Trust
Root Linking Domains to Root Domain
Page Authority is really not overly important at this stage; you are just using it to figure out whether they were WWW or non-WWW for when you do your site rebuild (more on that later).
For the others, don't touch anything with a DA below 40, a DT below 4, or root linking domains below, say, 150.
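If you'd rather prefilter the list in code before eyeballing it, here's a minimal sketch in plain JavaScript (the field names are hypothetical; map them to whatever your sheet or API pull actually returns):
// Minimal sketch: prefilter a drop list by the thresholds above.
// Field names are hypothetical; rename to match your own data.
var MIN_DA = 40, MIN_DT = 4, MIN_RLD = 150;
function passesThresholds(d) {
  return d.domainAuthority >= MIN_DA &&
         d.domainTrust >= MIN_DT &&
         d.rootLinkingDomains >= MIN_RLD;
}
var shortlist = dropList.filter(passesThresholds);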
Once you have your remaining list (it'll be a lot smaller than the one you started with), you now have to basically check SEOmoz's work. A lot of times a domain that has been spammed all to hell will still have semi-decent numbers with Moz; they just aren't that good at detecting spammy links yet. Look for really high money-term anchor counts (bad), really high link-to-root-linking-domain ratios (means lots of sitewides, also bad), and things of that nature.
Once you've chucked out all the domains that are really ugly, take your now very, very small list and go to archive.org. You are looking to see how long the domain existed in its most recent state. Does it look like it changed hands a bunch, or has it been pretty much the same for years? Does archive.org have most of the site archived? You'll need that for site reconstruction. As others have mentioned, you basically want to restore it to its previous state as closely as you can, at least to start out with. If all that checks out, then you'll want to look at the whois info. You are going to want to register it with the same registrar it was at last time, and match the old whois data as best as you can. Make it look like they just forgot to renew for a while but then fixed it. Once the whole site (or as much of it as you can manage) is restored, just let that sucker camp for a few weeks. Let Google get used to it being back. Make sure they don't pull a PageRank reset on it for the drop (if it had some and it suddenly drops to zero, you might as well toss the thing out; it means Google knows it changed hands). If all looks good after a few weeks, slip your link in wherever it makes sense. A new page linked to from the home page would be ideal; adding a new link to an old content page is in most cases a pretty glaring sign of link manipulation.
Hope that helps
P.S. The version of the Google Doc included in that tutorial doesn't have DT or root linking domains built in. You'll need to mod the code slightly if you want to pull those stats. Here is the bit you need to change.
Add this:
"ptrp" : "domain trust" , // 524288
"pid" : "root domain links" // 8192
under this row:
"upa" : "page authority" , // 34359738368
and then change this row from this:
var SEOMOZ_ALL_METRICS = 103616137253; // All the free metrics
to this:
var SEOMOZ_ALL_METRICS = 103616669733; // All the free metrics
(The new value is just the old one plus the two new flag bits: 103616137253 + 524288 + 8192 = 103616669733.)
Now in the main sheet just add two columns, one with the heading "domain trust" and the other with "root domain links", and be sure to change the yellow-box formula to include the extra two columns during its fetch. Cheers
-
RE: Follow or Nofollow outbound links to resourceful, related site?
Most of the time, if it's worth linking out to, it's worth leaving followed. Nofollow really doesn't help you save any juice these days; it just tells Google you don't trust the site you are linking out to. I would say if you don't trust a site you are linking out to, maybe just don't link to it at all (unless it's for a blog post where you are poking fun at the site in question and the link is for reference purposes).
-
RE: Link Building: Feel like ive hit a wall
The very first thing that strikes me here is that you say you have about 12,000 links from only 48 domains.
Site age may be helping them, but I'd be willing to bet they probably have a more IP-diverse link profile than you do as well. Sounds like you have the niche-relevant links down, so now it's time to get outside that box and start going after other metrics. Here are the big six I look at:
Trust/Authority
Niche Relevance (you have this covered)
Segmentation (where the links appear on the page)
Anchor
IP diversity (links from many domains > many links from few domains) <- start here
Link-weight (link juice, PageRank, whatever you want to call it)
I'd be willing to bet your 80 bucks would be better spent on something like a Paul's or Angela's backlink packet than on renting one or two links a month. And at lower risk, too.
-
RE: Why does google not show my ecommerce category page when I have the same keywords for many products in the product title?
A great way to get around this would be to apply a different title within the category page linking to the product page than the one displayed after hitting the product page. People at the category level likely know they are in the men's shirt section, so doing brand + color + style (long, short, sweater, et cetera) and sizing options at that level for the title, and then having your H1 and title tags reflect the full string including "mens shirt" once the user reaches the actual product page, may be an ideal way to do this. Obviously this may take some adjustments in your CMS, but if what you are seeing seems definitive enough, this is a great middle ground and likely worth the coding effort to add an extra entry field for "category page product title". You may even be able to automate it by simply having your system drop the category text from product titles within a category page.
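As a rough illustration of that automation, here's a minimal sketch in plain JavaScript (the function name and matching rules are hypothetical; your CMS's templating layer will dictate the real shape):
// Minimal sketch: strip the category phrase from the full product title
// so the category page shows a shorter, non-repetitive link title.
// Assumes the category phrase contains no regex special characters.
function categoryPageTitle(fullTitle, categoryPhrase) {
  // e.g. ("Acme Blue Long-Sleeve Mens Shirt", "Mens Shirt") -> "Acme Blue Long-Sleeve"
  var pattern = new RegExp('\\s*' + categoryPhrase + '\\s*', 'i');
  return fullTitle.replace(pattern, ' ').trim();
}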
-
RE: Outsourcing development to external agencies
Hmm. Well, there is a LOT of ground to cover here, but I would say the short version is that it sounds like it's time to rebuild your website. Depending on the size and scope of the project, my firm may be willing to take it on pro bono. We actively give to charity every year, and though that usually means monetary donations, I would absolutely be interested in donating something that we do best!
Now to the core of the question: yes, having everything on a single domain is usually better. There are situations where subdomains are appropriate, but without more information I'd say this is likely not the case for you. I also think that having your subdomains hosted in different parts of the world is likely sending some really odd signals to Google as to the connection between your subdomains and the main site. I would definitely try to get the whole site hosted in one place.
I can't fathom a good reason to use iframes in the scenario you are describing instead of just self-hosting the content in the correct location.
-
RE: Did Google's Farmer Update Positively/Negatively Affect Your Search Traffic?
Oddly, the handful of sites I have that most probably should have been affected negatively actually saw boosts in traffic, CTR on ads, and eCPM on ads. Not huge jumps, but yeah, I benefited, which was odd. These domains are testbeds I set up a long time ago to find the upper limit of what you can "get away with" in Google, so I know where to draw the line.
Other interesting facts: I ran some tests over the weekend (they may not be large enough to be statistically relevant yet), but it seems the Farmer update has almost no impact on indexation of poor or duplicate content given enough raw link juice (no anchor, IP diversity, or any other cool factors, just a flat link from a big ol' bucket of link juice), which I find disappointing. =/ I expected a bit of a challenge after all this hoopla. Even though I rock the greyhat, I'm still pretty anti dupe/crap content.
-
RE: Will a Media Wiki guides section affect my sites SEO?
You may see some initial drop as Google figures out where your content went and how it's being repurposed (even with smart 301ing, you are still chunking up the content).
HOWEVER, wiki-style internal linking has proven to be very effective! I mean, all those rich anchors, links in the content instead of the nav... these are all nice signals.
I'm fairly certain MediaWiki comes outta the box with dofollow for internal links and nofollow for externals (the $wgNoFollowLinks setting covers the external side), and you can change all of this stuff with settings and plugins.
All in all, I'd say if you are willing to commit the time, it's going to be a net gain, both for your rankings (over time) and, more importantly, for your users.
-
RE: The perfect work environment?
Why stop at 3 monitors when you can have 6? The pics attached show my actual computer at the office.
6x 27in LED monitors on articulating arms.
-
RE: Seo is dead?
I think the "SEO is Dead" statement comes from a fundamental lack of understanding of what SEO actually is. While we search marketers tend to clump a lot of extra marketing tasks into the SEO bucket, SEO in its simplest form means exactly what it says.. search engine optimization. Search engines will always exist in one form or another. Even using the "FIND" function in twitter is tapping into a search engine. As long as there are people searching for things and thus information retrevial, optimizing that will always be a necessity. People tend to confuse short term tactics with long term definitions. Just because article marketing or link building or digg may no longer be the way to optimize something for search, or even if google and bing cease being the preferred search portals it in no way means SEO will go away, just the strategies and tactics and applications will change.
-
RE: How do I check if my IP is blocked?
IP-based bans are really rare with Google. If you can link to your site, that'd be great. I'm wondering if there might be a robots.txt problem or something else that arose from the move, as opposed to an IP-based block.
-
RE: Why has SEOmoz added G+ code to multiple pages?
Can't speak for the Moz staff, but meta tags aren't treated as links, and neither are rel attributes within an <a> tag. So no risk of "leaking link juice from every page". :-)
-
RE: Number of occurances of a keyword
Keyword density is irrelevant unless you are just spamming the crap out of the page with it. Assuming you are writing naturally, I'd have to guess it's a really long page to get your partial keyword in there 60 times. If you aren't writing naturally, fix it, of course. If, however, it's not a big block of text we are talking about here but rather an ecommerce page with a ton of product listings that just happen to contain the word (for example, a page listing different types of boots, where the word "boot" thus appears a bunch of times), I think Google is smart enough to figure out what's going on there and not ding you for it.
Hope that helps.
-
RE: Good SEO Companies
Define "smaller budget". The term is really subjective. Some companies consider $5k/month to be small; some people think $200 is small.
My company offers link building, and I like to think we are some of the best out there, but our monthly minimum is $2,500 and requires a multi-month contract.
I know of a few great freelancers that do a lot of local SEO starting at just $500 a month.
Another option would be to snap up the individual things you want done, sorta a la carte, from places like freelancer.com, oDesk, WickedFire, and Digital Point.
-
RE: How do I check if my IP is blocked?
Well, the first thing I'm seeing right off the bat is that Google seems to prefer the address you just gave me; however, that redirects to a subdomain reflecting your desired keyword. When did you do this? I would:
A) make sure that you are redirecting properly
B) if it's only been a short while, give Google time to figure out what's going on here
C) make sure that any links that previously pointed to the root domain get switched to your preferred address.
Edit: Take a look at this...
-
RE: Why has SEOmoz added G+ code to multiple pages?
I doubt a single string is going to add too much overhead. I'm not sure how they have designed their CMS, but it may just be a case of having it everywhere being easier to implement than page-specific placement. The other thing may just be making sure Google has an easy time associating sub-page content with the author/publisher without having to look up the main page. I'm simply guessing here though; I'm sure a Mozzer will come along and give you a more definitive answer once they see your question.
-
RE: Brand SERP Domination
You should be able to control positions 1-3 with just your primary domain. Work on pushing up some of your sub-pages for your brand term.
Once that's done, consider tossing a blog or something else (press kit, video content, employee directory) on a subdomain. That should net you another two spots on page one with your domain with a little effort.
Once that's done, Facebook, Twitter, Google+, LinkedIn, and YouTube can take the other five spots.
-
RE: Will cleaning up old pr articles help serps?
Article directories have been looked down on by Google for quite some time now (basically since Panda, long before Penguin). Couple that with Google coming down on overly aggressive money-anchor linking, plus the issue you have with semi-duped content there (I'm guessing you spun them so you could take the same article and shop it to all 10 directories each time), and yeah, that's a bad recipe. I'd definitely delete the articles if you still have access to all the accounts. 50 unique articles times 10 directories is, I'm guessing, 500 crap links in total? Ditch 'em and then work on getting some really awesome replacements (you won't need nearly as many). High-quality guest blogging is the new article submission. See if you can't get some guest author accounts at a few tightly related sites from your industry, or aim high and see about getting guest author accounts on general-purpose but super-authoritative sites (a few examples that allow this: National Geographic, cracked.com, Washington Times, guardian.co.uk, Investopedia; you get the idea). Dumping those old links and adding in even just 20-30 great ones is going to be a huge lift for you.
-
RE: Link building? I really dont get it is there an easy way
Easy
Cheap
Safe
Pick any 2.
-
RE: How do you track new inbound links?
Not sure about OSE, but in both Ahrefs and Majestic there is a "first discovered" date on links. Perhaps this should go in the feature-request bucket for the OSE team?
-
RE: Is there any way to see how one my keywords ranked historically before I started tracking it?
Russ is correct. SEMrush also has a bunch of historical ranking data.
-
RE: How to compete with duplicate content in post panda world?
Not a complete answer, but instead of rel-canonicaling your dynamic pages, you may just want to block them in robots.txt with something like:
Disallow: /*?
This will prevent Google from crawling any version of the page that includes a ? in the URL. Canonical is a suggestion, whereas robots is more of a command.
As you can see from this query:
Google has indexed 132 versions of that single page rather than follow your rel=canonical suggestion.
To further enforce this, you may be able to use a fancy bit of PHP code to detect whether the URL is dynamic and apply a robots noindex,noarchive on only the dynamic renderings of the page.
This could be done like this:
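As a minimal sketch (assuming your pages render through PHP; adapt the check to your setup):
<?php
// In the <head> template: if the request carries a query string,
// tell search engines not to index this dynamic rendering.
if (!empty($_SERVER['QUERY_STRING'])) {
    echo '<meta name="robots" content="noindex,noarchive">';
}
?>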
I also believe there are some filtering tools for this right within Webmaster Tools; worth a peek if your site is registered.
Additionally, where you are redirecting non-www subpages to the home page, you may instead want to redirect them to their www versions.
This can be done in .htaccess like this (redirect non-www to www):
RewriteEngine On
RewriteBase /
RewriteCond %{HTTP_HOST} ^yourdomain.com [NC]
RewriteRule ^(.*)$ http://www.yourdomain.com/$1 [L,R=301]
(The R=301 flag makes it a permanent redirect, and L stops further rewriting.)
This will likely provide both a better user experience and a better solution in Google's eyes.
I'm sure some other folks will come in with some other great suggestions for you as well.
-
RE: A tool to submit websites in directories
A small caveat to the endorsement above: IP diversity does seem to be a rather important ranking metric. It's been my experience that quantity and quality need not come from the same source. If you've got some great quality links and a bunch of quantity-only type links, it seems to still result in a net positive.
Six things I normally look at, link-building-wise (in no particular order):
Authority/Trust
Niche Relevance
Segmentation (links in content > links in sidebar, footer, comments)
Anchor
IP diversity
Linkweight (juice, PageRank, whatever)
As long as you have elements of each of these SOMEWHERE in your link profile, you generally do well.
Other than that, I agree with these fine folks. Also, automation is great for scaling, but it almost always leads to a great reduction in quality, and it could end up being a wasted effort if the rejection rate goes up enough as a result.