Questions created by icecarats
Unnatural Link Notification - Third Go Round, specific questions
Hi all, I'm posting what is sure to be a common question, but I can't find much by searching Q&A over the last month, so I thought I'd throw this out there. There are a lot of 'what do I do??' questions about the 'unnatural link notification', but most of them are from first-timers. We're pretty far along in the process and it feels like we're going nowhere, so I was hoping to pick the brains of anyone else who's 'been there'.

We have a client we inherited with an unnatural link profile; they were warned shortly after we took them on (around March was the first warning). We compiled an apologetic letter, specifically identified a previous agency who >was< doing bad things, mentioned things would be different from now on, and provided a list of links we were working to remove based on WMT, OSE, and some other sources. This was submitted in early June. Traffic on the main keyword plummeted; ranking went from top 5 to about mid-page 4.

We got hit with the same rash of unnatural link warnings on July 23 that everyone else did, and after looking around I decided not to respond to those. We then got a response to the reinclusion request submitted in June above, saying the site was still violating guidelines. This time I went all out and provided a Google Docs spreadsheet of the over 1,500 links we had removed, listed the other links that had no contact info (not even in WHOIS), listed the links we had emailed/contact-formed but got no response, everything. They responded to that recently, simply saying 'site still violates guidelines' with no other details, and I'm not sure what else I can do. The campaign above was quite an investment of resources and time, but I'm not sure how to continue most efficiently.

I promised specific questions, so here they are:

1. Are the link removal services (rmoov, removeem, linkdelete, et al.) worth investigating? To remove the 1,500 links I mentioned above I had a full-time (low-paid) person working for a week.
2. Does Google even reconsider after long engagements like this? Most of what I've read has said that reinclusion gets cleared up on the first or second request, and we're at bat for the third now. Due to the lack of feedback I don't know if their opinion is "nope, you just missed some" or "you are so blackhat you shouldn't even bother asking anymore".
3. One of the main link holders is this shady guy who runs literally thousands of directories the client appears in thanks to the previous SEO agency, and he wants $5 per link he removes. Should I mention this to Google, and do they even care? Or is it solely our responsibility?

Thanks in advance for any advice!
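For a sense of the grunt work involved, the verification side of it is basically this (a hypothetical Python sketch using requests + BeautifulSoup, just to illustrate what the paid removal services would presumably automate; the domain and URLs are placeholders, not anything we actually ran):

```python
import requests
from bs4 import BeautifulSoup

CLIENT_DOMAIN = "example-client.com"  # placeholder, not the real client

def still_links_to_client(linking_page_url: str) -> bool:
    """Fetch a page from the removal list and check whether it still
    contains a link to the client's domain."""
    try:
        resp = requests.get(linking_page_url, timeout=10)
    except requests.RequestException:
        return False  # unreachable pages get reviewed by hand instead
    soup = BeautifulSoup(resp.text, "html.parser")
    return any(CLIENT_DOMAIN in (a.get("href") or "")
               for a in soup.find_all("a"))

if __name__ == "__main__":
    removal_list = ["http://some-directory.example/page1",
                    "http://another-directory.example/page2"]
    remaining = [u for u in removal_list if still_links_to_client(u)]
    print(f"{len(remaining)} of {len(removal_list)} links still live")
```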
White Hat / Black Hat SEO | icecarats
SEOMoz Q&A having some issues?
Apologies if this has been asked. I was browsing Q&A and got a Ruby error page; I reloaded and everything was fine. However, a bunch of the topics are kind of messed up, and it looks like a chunk of them are missing now. For example, the 'questions I've answered' section of My Q&A only has the newest question and the oldest question, and when I sort the SEOMoz Tools category by newest I get one recent post and then the next one is from November of last year. You guys having database problems?
Moz Pro | icecarats
Roger keeps telling me my canonical pages are duplicates
I've got a site that's brand spanking new that I'm trying to get the error count down to zero on, and I'm basically there except for this odd problem. Roger got into the site like a naughty puppy a bit too early, before I'd put the canonical tags in, so there were a couple thousand 'duplicate content' errors. I put canonicals in (programmatically, so they appear on every page) and waited a week, and sure enough 99% of them went away.

However, there are about 50 that are still lingering, and I'm not sure why they're being detected as duplicates. It's an ecommerce site, and the duplicates are being detected on product pages, but why these 50? (There are hundreds of other products that aren't being flagged.) The URLs that are 'duplicates' look like this according to the crawl report:

http://www.site.com/Product-1.aspx
http://www.site.com/product-1.aspx

And so on. Canonicals are in place, and have been for weeks, and as I said there are hundreds of other pages just like this not having this problem, so I'm finding it odd that these ones won't go away. All I can think of is that Roger is somehow caching stuff from previous crawls? According to the crawl report these duplicates were discovered '1 day ago', but that simply doesn't make sense. It's not a matter of me messing up one or two pages, either; we made this site to be dynamically generated, and all of the SEO stuff (canonical, etc.) is applied to every single page regardless of what's on it. If anyone can give some insight I'd appreciate it!
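To illustrate what I mean by 'programmatically': the canonical generation is conceptually something like this (a rough Python sketch for illustration only, not the real ASP.NET code; the lowercasing step is my own assumption about how the case variants ought to collapse onto one URL):

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_url(request_url: str) -> str:
    """Build the canonical URL for the current request.
    Assumption (mine): lowercasing host and path collapses
    /Product-1.aspx and /product-1.aspx onto a single canonical target."""
    scheme, netloc, path, _query, _fragment = urlsplit(request_url)
    # Drop query/fragment so tracking parameters don't create extra "pages".
    return urlunsplit((scheme, netloc.lower(), path.lower(), "", ""))

def canonical_tag(request_url: str) -> str:
    return f'<link rel="canonical" href="{canonical_url(request_url)}" />'

if __name__ == "__main__":
    for url in ("http://www.site.com/Product-1.aspx",
                "http://www.site.com/product-1.aspx"):
        print(canonical_tag(url))  # both variants emit the same href
```

If that assumption is right, the same normalization could presumably be done as a 301 from the mixed-case URL instead of (or as well as) the canonical tag.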
Moz Pro | icecarats
Very Slow Advanced Reports
Hello All - I've been running some Advanced Reports again lately, and they seem much slower than I remember from the last time I ran some. Currently I've got one (Inbound Links) report at 2,500 out of 10,000 links retrieved through LSAPI, and it's been at that point for about 6 hours. Did something get cloggered on the reports, or is this just the expected performance?
Moz Pro | icecarats
So what's up with UpDowner.com?
I've noticed these guys in the link profiles of several sites I manage. They'll usually show up around 1,000-10,000 times in the backlink profile. From what I can tell they index websites, build up keyword relationships, and then when you search for something on their site (e.g. poker) they present a list of related sites with stats about them. The stats seem to be yanked straight from Alexa.

The backlinks come from the fact that every time 'your' site shows up in one of their search results, they put a little iframe on the page that contains your site. This means that if your site's name/keywords are pretty broad, you could be showing up thousands or tens of thousands of times as being linked from their pages that Google indexes. And Google indexes, boy do they ever. At the height they had over 53 million pages indexed; that has apparently shrunk to around 25 million. I believe their strategy is to generate a crap-load of automated content in the hopes they can cash in on obscure long tails.

So my questions for you guys are:

1. Are you seeing them in your backlinks too?
2. Should I block their spider/referrers?
3. What is their deal, man?
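On question 2, the bluntest option I can think of is refusing to be iframed at all. A minimal sketch (Python/Flask here purely for illustration, not necessarily what the affected sites run):

```python
from flask import Flask

app = Flask(__name__)

@app.after_request
def deny_framing(response):
    # Ask browsers not to render this site inside third-party iframes,
    # which is how UpDowner-style result pages appear to "contain" your site.
    response.headers["X-Frame-Options"] = "SAMEORIGIN"
    return response

@app.route("/")
def index():
    return "Hello"

if __name__ == "__main__":
    app.run()
```

That only tells browsers not to render the frame, of course; it wouldn't remove their already-indexed pages or the backlinks themselves.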
White Hat / Black Hat SEO | icecarats
Too Many On-Page Links: Crawl Diag vs On-Page
I've got a site I'm optimizing that has thousands of 'too many links on-page' warnings from the SEOmoz crawl diagnostic. I've been in there and realized that there are indeed too many (the rent is too damned high), and it's due to a header/left/footer category menu that's repeating itself. So I changed those links to nofollow, cutting my total links by about 50 per page.

I was too impatient to wait for a new crawl, so I used the On-Page Reports to see if anything would come up on the Internal Link Count/External Link Count factors, and nothing did. However, the crawl (eventually) came back with the same warning. I looked at the link count in the crawl details and realized that it's basically counting every single '<a href' on the page. Because of this, I guess my questions are twofold:

1. Is nofollow a valid strategy to reduce link count for a page? (Obviously not for the SEOmoz crawler, but for Google.)
2. What metric does the On-Page Report use to determine if there are too many internal/external links?

Apologies if this has been asked; the search didn't seem to come up with anything specific to this.
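For context, here's roughly how I understand the crawl-diagnostic count to work: every anchor with an href, nofollow or not (a Python/BeautifulSoup sketch of my own understanding, not anything official from the tool):

```python
from urllib.parse import urlparse
from bs4 import BeautifulSoup

def count_links(html: str, site_host: str):
    """Count anchors the way I assume the crawler does: every <a href>,
    regardless of nofollow, split into internal vs external by hostname."""
    soup = BeautifulSoup(html, "html.parser")
    total = internal = external = nofollowed = 0
    for a in soup.find_all("a", href=True):
        total += 1
        if "nofollow" in (a.get("rel") or []):
            nofollowed += 1
        host = urlparse(a["href"]).netloc
        if not host or host == site_host:
            internal += 1
        else:
            external += 1
    return {"total": total, "internal": internal,
            "external": external, "nofollow": nofollowed}

if __name__ == "__main__":
    html = ('<a href="/cat">Cat</a> '
            '<a rel="nofollow" href="/cat">Cat</a> '
            '<a href="http://other.com/">Other</a>')
    print(count_links(html, "www.site.com"))
```

If that's accurate, it would explain why the nofollows didn't move the number at all.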
Moz Pro | icecarats