In light of the all round astonishment I went and checked the actual numbers, rather than off the top of my head - it was 44% removed.
So still way better than I'd expected.
Have to say I haven't done it your way round - we've completed three rounds of email outreach then submitted a disavow file, so I don't want to guess too far on the advisability of doing it the other way around. For starters, I've no idea how well or how quickly Google handles domains removed from a disavow list.
Your first assumption needs comment though - every domain we're removing links from is a spammy website (a zero-quality SEO directory), the result of a submission package they bought years back before I joined.
We managed to get about 50% of the list removed from that email outreach process. Granted one was a directory network with over 100 domains, but even counting their domains as one we got about 40% success.
Around 10% responded with a demand for a fee - ranging from 99c to $50. Clearly all of those went straight into the disavow list.
So to summarise I was really surprised how successful the email outreach part was - we'd been expecting next to nothing by way of response rate rather than around 50% success.
Even if you do it disavow first, hoping for a Penguin update in the next three weeks so the file is actioned prior to your season starting is a little optimistic!
I'm hoping for a couple of second opinions on my conclusions here before I shoot 60% of our link profile in the head, because:
a) I've not needed to deal with an angry Penguin before,
b) it's not as black and white as most of the blog posts seem to find.
Domain is www.world-text.com
The Penguin Google Says...
● We have no unnatural links warning in WMT
● We have dropped out of the top 50 for all keywords except three. Those three are a couple of pages down.
● We still own page 1 for a brand search (World Text), though we lost the bottom result.
● Overall traffic down 25%, organic down 50%, SERPs impressions down 50-70%
I'd love to know where the remaining organic Google traffic is coming from as we currently rank for nothing (nothing useful, anyway); the analytics don't show the usual off-a-cliff drop to nothing, more a few quiet weeks.
Several other players in the sector appear to have been obliterated too - searches on several keywords are a total mess with total unknowns and spam showing up on page 1 SERPs.
Adwords traffic is up markedly, which I put down to the mess that is the SERPs.
Our adwords are nicely optimised and our top keywords are hitting quality scores of >=8, mostly 9 or 10.
Why did the Penguin Peck?...
Simple. Open Site Explorer should make it blindingly obvious what the issue is.
We have a horrible link profile that I have been muttering about for 18 months or more.
● Long before I joined, it seems they bought a 5,000 "search friendly" directory submissions package or equivalent
● Sadly not a single one of them was from the long obsolete SEOMoz directories list, or worth keeping.
● Every single last one of them has identical anchor text and description. Still, makes for easy analysis.
● The anchor text used isn't likely to be searched often
"World-Text.com Text Messaging Services"
● 3 of 4 originally submitted sites/directories have since expired, removed our links, etc.
Findings
740 judged bad domains:
All the identical anchor text SEO directories
a couple of others picked out as unwanted
the domains serving malware
Only 235 class Cs in these
144 have 0-2 indexed pages - not useful for a directory!
500 judged maybe:
No links found even though listed in WMT (a lot of these)
down, suspended etc (a lot of these too)
parked
links that are less than ideal but not identically anchored spam.
(for instance someone who had linked to us has had their site scraped and dropped on a dozen or more forums, including our link)
another pass of these in a week or so, just in case some come back to life...
400 judged good (well, harmless):
Natural links
real directories
all the assorted alexa like scrapers, domain valuation junk etc.
Executing Penguins and other vermin...
Apparently pretending to know nothing doesn't help, so...
● Disavow the three discovered sites serving malware
● Remove where possible every last one of the SEO directory entries avoiding any extorted payment. Track contacts and responses.
● Disavow the rest
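For anyone following the same route, the disavow file itself is just a plain text file uploaded through the disavow tool - one entry per line, `#` for comment lines, and a `domain:` prefix to disavow a whole domain rather than a single URL. The domains below are made-up examples:

```text
# Spammy SEO directories - outreach failed or a fee was demanded
domain:example-seo-directory.com
domain:another-web-directory.net
# A single URL can also be disavowed on its own
http://blog.example.com/some-spammy-page.html
```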
{Reconsideration request}
I KNOW this is for manual penalties, and we don't have one, BUT I've seen countless sources suggesting reconsideration for Penguin. Really?
Anyway that was far longer than intended, so thanks for making it this far! I look forward to comments!
Excellent response from Dana to which I can add only one thought:
What's their backlink profile like now? Do many of their product pages have links to them? Do any have a decent number of links?
If they 404 all those links it's going to hurt. You know that already.
Some stats of how their backlink profile is going to be affected might also help your case.
I've been trying Bing for about the last two months.
Still early days, but we imported our Adwords campaign and it's thus far working really quite well - we're getting some good traffic at a lower volume than Adwords, and clicks work out cheaper. Strangely, our best-performing keywords on Google aren't turning out to be those doing best on Bing, but it is proving worthwhile.
Beyond that it would certainly be worth hearing someone out, but I'd be asking many questions on who's in their network etc. Obviously you'll want to be reassured that any web properties they're showing ads on are relevant to your niche, and the right sort of visitors.
Unfortunately the only way to be sure with most forms of advertising is to throw a hundred pounds or two at it and see what happens!
It's all about link profile.
GoDaddy may appear to get away with site-wide footer links on domain holding pages, but they have a massive link profile, much of it gained during the whole SOPA/PIPA outrage.
Chances are it's not harming their ranking overly much, but it certainly isn't helping it much either.
Whereas a small web designer with a site wide footer link on every site they create could well have those as the vast majority of the total links - leading to a heavily skewed and very unnatural link profile.
Now then I'm sure that some of the larger web design and service outfits got where they are with the help of site wide footer links, site widgets etc, but Google moved the goalposts so you can't take that route any more.
Depending if it's a one-off or recurring event, you could also include a summary of what took place, and some teaser copy for next year: "Look at what you missed, watch this space for the 2014 conference" or some such.
Otherwise 301 redirects are probably the way to go.
I think it's equally probable both are right!
There are SO many variables that making a clear case one way or the other is unlikely. For example, techies are unlikely to ever click ads - many won't even see them, as they'll have AdBlock installed. At the other extreme there are many non-web-literate folks (yes, even these days) who, frankly, don't distinguish between ads and organic: top link = top answer. Even if that link is an ad, or an ad injected by the malware littering their PC.
In the middle there's a massive group who are aware, see ads and likely choose when / if to click ads. Different intent will lead to different likelihood of clicking ads in different circumstances. If searching for a product to buy I think most people will factor in sponsored results, at least at some point - perhaps organic didn't give the desired results, they want the widest range of products or companies to choose from so open every link, etc.
The Nielsen article seems to be talking more about branded search, certainly not money out searches, whilst the Wordstream infographic is talking of high commercial intent searches - ie money out searches. Unfortunately their sources are a bit small to make out and aren't clickable so I can't look back further.
But given the difference in search intent between the two pieces I'd be comfortable with them both being right!
I suspect the answer is something along the lines of "soon" (tm) with a pinch of "when it's ready".
Having been in software far too long, the absolute worst thing they could do is unleash it as beta to us when it's still a little flakey or inconsistent. Analytics that aren't consistent and correct are of marginal use (yes, ok I know about Google Analytics issues), and would probably generate a whole load of support requests - time that will be better served polishing and completing the service. Would probably lead to a lot more questions along the lines of "are we nearly there yet?" too!
I'm sure there's a few carefully chosen alpha testers who are willing to put up with pain, compare with Google and other analytics and not rely on it for any sort of decision yet.
Me, I'm happy to wait until they think it's resilient, consistent and accurate enough to survive us, the great unwashed.
At which point I hope to be very happy to have the option to kick GA into touch.
Moz crawls your site weekly, and any updates to duplicate content issues will get picked up at that time.
The actual day it gets crawled varies depending on account I believe. You can check when it's due in the Crawl Diagnostics section of the campaign overview. At the bottom right you should see last crawl date and when next is due.
So any fixes you're making won't be reflected until the next Moz crawl.
Going on the on-page "Factors Overview" box near the top, the max appears to be 36.
But the number you refer to is simply a count of instances of the particular key phrase on the page, which as far as I know has no maximum - it's simply an indicator of what the crawl found for that URL. Of course too high a keyword density leads to keyword stuffing and pages that are thoroughly unnatural to read!
We've long moved past the point where keyword density was the be-all and end-all of ranking, so beware trying to get it too high.
Check in the view-source of the page as some (Page Title, Meta Description etc) won't appear in an on-page search in browser if you're looking to confirm the value.
Your first point: creating fresh, relevant content on a regular basis, is the key one. This then hooks into catching long tail searches relevant to your business. The more you talk around the subject and business, providing use-cases, how-to guides, 37 things you never realised about x, the more long tail searches you're likely to catch. It's also far easier to place on the site as a blog post than trying to constantly produce evergreen content for the site and find some suitable place to put it.
It's also the logical place to inject a little opinion or humour - which perhaps will lead to some discussion or following. For the most part that's going to look out of place on the main site in amongst the sales pages.
For many small businesses, in many sectors - perhaps drain clearing, dentistry or invoice factoring - building a significant following is unlikely unless the posts are very far removed from the business. In such a case you're never, ever going to build the same following as if you were in some entertainment or tech sector discussing the latest iThing. I'm sure there are one or two exceptional folks out there who've managed to gain a following for some dry, dull industry, but that's infinitely more about them than the topic.
Many of the businesses that blog highly successfully are blogging about topics incidental to the business - for many web businesses the topics that catch notice and generate responses are the discussions about programming, or some problem they encountered with scaling or the tech they're using, rather than the payment service or whatever service they're actually offering. The posts on the actual service they provide get much less traction and sharing by comparison.
But it's also about gaining awareness, so that post on some uncooperative aspect of Wordpress or Apache might introduce new people to your actual service. Of course this is perhaps easier in tech, but in any sector there are connected and related sectors where the same can apply, so the wider the topics the better, even if not directly related to the business. That's not the same as saying constantly blog about iThings on your dentistry site, of course!
Establishing yourself as an authority: again, for many sectors, this is a bit of a stretch. With the best will in the world, the most surprising, enlightening posts on drain clearing are unlikely to get much traction on Google+ and Facebook, yet the engines are taking social success as a clear sign of authority. I've seen several valiant attempts to make "unsexy" businesses succeed on social, and to an extent, they have. As an example, a ladder and scaffold rental business spent much effort sharing photos of ladders in silly places, and achieved some success. Up against shares of an iThing, cats and zombies, not so much.
So, yes, it's very necessary, particularly for the very noticeable effect of long tail traffic. For the other factors, treat them as nice-to-haves and blog accordingly.
The other key thing about blogging is nearly everyone gives up far, far too early, so keep at it.
If they're in your industry it's not going to do you any harm. But from a low authority site it's not going to do a huge amount of good either!
So if it's going to take several hours of your time to get them to agree to link, it's probably not worth the effort. If it's 10 minutes of email back and forth, go for it.
Never discount a link just because it's a small company site though - it's not quite the same as a dodgy SEO directory of everything. A directory page is never going to gain authority, whereas a company site may grow in stature as they work on their own site and SEO efforts, so in a year or two the link might carry significantly more weight.
Likewise, never discount the human, rather than purely seo, value of a link - if it's a company in your industry, it might attract actual, interested, visitors.
Hi Mozzers,
We've just imported (around two weeks ago) our Adwords into Bing and are just evaluating it.
Pretty much across the board, but especially in our best-performing Ad Groups, we're seeing abysmal quality scores.
Case in point: our best ad group has mostly 10s, two 9s and one 7 in Adwords, yet nothing over 3 in Bing.
Specifically, landing page relevance is rated "poor", while keyword relevance and landing page experience are rated "no problem".
So, what specifically is Bing looking for on landing page relevance that's dramatically different to Adwords?
The Bing help references a blog post of 2 years ago suggesting increasing keyword count - yet the pages do well in organic search and adding more keywords to the copy will start to look artificial and stuffed, so I'm very reluctant to start there!
Any pointers?
I think it's less about submitting them to, as you put it, the vast amount of info graphic directories, and more about having something that will resonate with your audience.
If it's something you believe has good information, and will bring in business, then produce it and publicise it, but I'd hesitate to mass submit to every infographic directory you can find. Pick the x most well known, or most relevant to your industry.
If the information is good, and well presented you should gain real links from real people, which is significantly more useful than any number of directory submissions!
Personally I think infographics and videos are both in very overcrowded spaces these days.
As it's likely far more time-consuming to create one than a blog post or page of content, their creation is a fairly inefficient way of getting a link. But if it goes viral, of course...
Mind you, it's not just about the link. If you have something compelling or unique they can, and do, still work - slide decks and videos tend to attract different searches and searchers so they can still be a good way of drawing human traffic to the site.
Videos and images can appear in Google organic results, so increasing your presence.
So they're worth considering as part of your strategy so you have a presence on some different content sites in the hope of gaining exposure to different groups.
OK thanks - the simple fact you guys are affected too, gives me hope it'll be resolved soon enough!
Suspect it's just a wrinkle as a result of the change from seomoz to moz, but on seomoz the remember me checkbox had a time to live of a few weeks, possibly a month, which was excellent.
Since the great rebranding and migration (which is excellent by the way), time to live from ticking remember me is in the order of a couple of hours, which is pretty annoying.
Any chance the web folks could bump it up a little, better yet put it back to 30 days?
What strikes me is that at the same time as your pages crawled fell off a cliff, your page load time doubled. What changed round about then that would have caused an increase in load times?
If that was around the time you switched CDN, I'd be double and triple checking the new one was set up perfectly and is an improvement on cloudflare. Did all your pages suddenly get markedly bigger, perhaps due to another plugin being added? Pingdom tools is also great for picking apart load times.
That would be where I'd start...
Best way to get useful followers on G+ lately is to have your people and the page active in communities and discussions. You are much more likely to get interactive people adding you than by just going out and following a bunch of people - anyone with a decent following isn't going to notice or react to your adding them, as they probably get upwards of 50-100 new people a day adding them, most of whom turn out to be spam or fake accounts. I long ago stopped caring who added me on G+, or even looking!
Since communities arrived, stream following and circle sharing matter less. So get your page to join topical and relevant communities (and if there aren't any for your niche, consider starting one if you can cope with the overhead of managing/maintaining it - nothing is more off-putting than a dead community), and get active in them. If your people are constantly active in the widgets communities, posting good quality info and tech help, you'll gain a lot of good followers.
However, people who've made good/interesting/insightful/funny comments on threads or in communities I'm active in are likely to be pro-actively added, as I've decided they're worth a try in the stream.
G+ can be a heck of a time-sink, but if you're willing to put the time in the right places it's more than worthwhile.
I've seen conflicting views on this coming from Google, and I refer you to the link in my answer regarding Pete's comments about reconsideration.
So it may be wrong, but there does seem to be evidence that disavow does not action without either an update, or reconsideration.
Hence passing along the suggestion in good faith.
You might want to check out the responses to this question: http://moz.com/community/q/google-penguin-2-0-how-to-recover particularly the comments from Pete Meyers regarding reconsideration requests.
What I take away from all I've read on Penguin 2 is to proceed as follows:
Hope this helps!
I'd be tempted to hang fire a little while before "fixing" the problem, and give it a while longer to shake out.
We had something very similar around the time of Penguin 2 on one of our sites, and like you for only a selection of our keywords. Half of the dropped keywords have returned to roughly where they were. Of the other half most have increased, but are still down compared to pre-Penguin.
One keyword went from 3rd to not in top 50 and back to 4th this week.
Aside from burning a fair bit of time trying to figure out what was going on and if we'd been partially Penguined, I'm not sure I'm any the wiser! Nothing notable going on with key competitors, nothing news worthy or topical in the sector - just temporary obliteration of around a quarter of keywords.
Depends on how you see the content.
If it's evergreen, and in no way time sensitive, I'd be tempted to place it as a page. For example, a how-to relating to the specific product or service they supply is better not lost somewhere in a blog. Blog entries are great for a Google search, but not so great for an existing customer going to the website and trying to find specific information.
For items that have a sell by date (service changes, new features and the like) and more peripheral content, blog is the way to go.
It's also worth bearing in mind that if you submit your site to DMOZ, some categories no longer have volunteers dealing with applications, so your submission may sit in the queue for the next several years going nowhere. So even if it is important, it's not necessarily achievable.
So submit and then instantly forget about it and stop caring.
I'd be inclined to make the equivalent of a URL rewrite rule in the ASP equivalent of htaccess, so the server only serves www.mysite.com. Sorry, I don't know ASP, so I can't help with the syntax there.
Then I would use a canonical URL to tell bots which page is actually the index.
Lastly I'd scan your site and ensure that all links on your site that point to the home page use just a slash, as it's very likely there's a mix of variants in there.
This way, all links to the home page that crawlers see use just the canonical URL.
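Hedged heavily, since I don't run IIS myself: if the server has Microsoft's URL Rewrite module installed, a www-forcing rule in web.config looks roughly like this (mysite.com is a placeholder):

```xml
<!-- Sketch only: requires the IIS URL Rewrite module; mysite.com is a placeholder -->
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <rule name="Force www" stopProcessing="true">
          <match url="(.*)" />
          <conditions>
            <add input="{HTTP_HOST}" pattern="^mysite\.com$" />
          </conditions>
          <action type="Redirect" url="http://www.mysite.com/{R:1}" redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```

The canonical hint in the page head is then just `<link rel="canonical" href="http://www.mysite.com/" />`.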
No, it doesn't mean a single word - it simply means just one set of h1 tags.
The crawl is presumably finding pages with more than one h1 level title on the page.
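If you're looking at the page source, the aim is a single h1 with h2/h3 for the rest of the outline, for example:

```html
<h1>Page Title - the only h1 on the page</h1>
<h2>First section heading</h2>
<h2>Second section heading</h2>
```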
Whilst you can't get at the actual keywords behind "not provided", you can do a lot to mitigate the black hole using rank tracking.
You can add an Analytics event to track rank, which also shows average rank on not provided, per landing page. This doesn't give actual keywords, but you're now in a place where you can infer a lot. Assuming you have enough site traffic for it to be meaningful, you can now infer likely keywords based on what keywords bring traffic to those pages when you do have the data.
For example clicking into top events, you can now see a list of landing pages behind (not provided). Each page is shown, along with the number of events triggered (meaning visitors), and average value (meaning search rank on whatever hidden keywords were used).
Of course you have no way of knowing when someone used a unique long-tail search, or if for whatever reason the spread of keywords in "not provided" is actually different to visible keywords.
But it is enough to be useful, and vastly better than having no info at all on 20%+ of your traffic. Or in our case 40%+
To set up event based rank tracking, see this post from AJ Kohn : http://www.blindfiveyearold.com/new-ways-to-track-keyword-rank
The info you need is about half way down that post.
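A minimal sketch of the approach (my own condensation, not AJ Kohn's exact code): Google search referrers carry the clicked result's rank in a cd parameter, so you can pull that out and record it as the event value. The function name is my own invention.

```javascript
// Extract the rank (the "cd" parameter) from a Google search referrer.
// Returns null for non-Google referrers or referrers without a cd value.
function getRankFromReferrer(referrer) {
  // Only Google search referrers carry the cd (rank) parameter
  if (!/^https?:\/\/(www\.)?google\./.test(referrer)) return null;
  var match = referrer.match(/[?&]cd=(\d+)/);
  return match ? parseInt(match[1], 10) : null;
}

// In-page usage would then be something like (classic ga.js API of the era):
// var rank = getRankFromReferrer(document.referrer);
// if (rank !== null) {
//   _gaq.push(['_trackEvent', 'RankTracking', location.pathname,
//              document.referrer, rank, true]); // true = non-interaction
// }
```

That gives you average rank per landing page in Analytics even when the keyword itself is (not provided).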
Go to the Pro Dashboard.
Go to campaign settings (the text link under the view campaign button) for whichever campaign you need to change competitors for - the competitors are at the end of the page.
This week's crawl results were a little crazy for us too.
We dropped to 1 error (-171), and to 48 warnings (-538), yet most of the issues not reported this week are still most definitely there. I guess Roger was having a lazy week?
Is that a signed out, no cookies search giving you #1?
I've just searched Google UK and can't find you in the first 10 pages, so it seems the ranking report is about right.
If you were ranking and have suddenly dropped, how's your link profile? Any messages in Webmaster Tools?
"... if still want my tag pages to be indexed in Google, but i don't want to to influence my traffic negatively?"
Don't forget to consider how much value landing on a tag page gives to a site visitor (clue: usually none). It's all very well indexing and ranking for some extremely long tail term via tags, but what's the human who lands on it going to make of it? (Most likely contribute to that page's 100% bounce rate.)
By all means keep tags, and use them consistently to categorise your posts, but as a means of keyword stuffing (as I'd suspect it is with nigh on 600 tags), that time is long since past.
Good luck!
It's a bit hacky, but you can 301 them simply.
Create a directory in place of the pdf, called document.pdf or whatever, then inside that folder add a default index page (php or whatever you're using) to 301 requests for http://www.yourdomain.com/document.pdf to wherever the pdf now lives.
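As a sketch (the target path is a placeholder), the index.php inside that folder only needs to send the redirect:

```php
<?php
// index.php inside the directory named document.pdf - requests for
// /document.pdf hit this file and are sent on permanently
header('HTTP/1.1 301 Moved Permanently');
header('Location: http://www.yourdomain.com/downloads/document.pdf'); // placeholder target
exit;
```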
Blog and forum comments containing links are almost universally rel="nofollow" meaning you'll get no value from an SEO point of view.
Course that does precisely nothing to discourage the blog spammers.
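If you view the source of a typical comment, the link will look something like this (URL made up) - the rel="nofollow" is what tells the engines to pass no value:

```html
<a href="http://example.com/" rel="nofollow">commenter's site</a>
```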
Hi.
We set up our SEOMoz account yesterday, and the initial crawl showed up a number of errors and warnings which we were in the process of looking at and resolving.
I log into SEOMoz today and it's showing 0 errors:
Pages Crawled: 0 | Limit: 10,000
Last Crawl Completed: Nov. 27th, 2012 | Next Crawl Starts: Dec. 4th, 2012
Errors, warnings and notices show as 0, and the issues found yesterday show only in the change indicators.
There's no way of getting to the results seen yesterday other than waiting a week? We were hoping to continue working through the found issues!