Julian,
There are a few more listed here on screamingfrog's blog, and daniweb said they recovered. Seerinteractive also wrote about a client they helped recover. But you're right: there are very few examples of big sites that have recovered.
Teginder,
Yes, you're going to have to completely revise all of the content on your homepage. Since that content is shared on many other sites, your best bet is going to be to do a rewrite. Be sure to make it very different from what you have now--not just changing a few words. Moving forward, you should make sure the content on the rest of your pages is also unique on the web.
Hi Jason,
It looks like you're off to quite a start, but you still have a long way to go. Keep in mind that PageRank should not be the metric you focus on--traffic should be, and PageRank doesn't increase your traffic; content does--content that gets shared and linked to, that is. It's great to use your social channels and public relations, but that's not enough. If your content is not currently being shared and/or linked to in numbers similar to your competitors', you've got to figure out why and make changes there.
With a quick look at your back links and your content quality, I'd say you're going to need to up your investment in your content. Figure out a way to make it engaging and shareable (watch this video). If this is the industry you chose for your business, it's going to pay to be an authority in it--and to publish content that shows you're an authority. Focus on increasing and tracking those social stats for the next year or two and you'll start to see changes.
Hi Double Dutch, I'm Obtuse. : )
Here's one for you: Guide to Creating Viral Linkbait & infographics
But that's like putting a great gadget out on the showroom floor and hoping that, by itself, it will bring people into the store. Sure, some existing customers may come in and tell their friends when they leave, but in a day or so that gadget will be forgotten by many, and the rest will forget where they saw that cool thing.
Marketers will be out there trying to get into the store the exact kind of new customers who are already looking for that great gadget (content)--who also happen to be the ones who will most effectively pass on how great it was and where others can find it.
I think "viral," on the other hand (a term that gets used too often), is the rare kind of gadget that sparks growing interest without having to be marketed. Those gadgets are few and far between--and short-lived, too, for the most part.
Emerald, I'd say to sit tight, it'll show up.
Tim,
As long as you can reach the site remotely via a website address, you should be good to go. However, if the site is blocking crawlers via a robots.txt file or a meta robots tag, rogerbot won't access it. On the other hand, screamingfrog has a setting that tells it to ignore the robots.txt file if one exists.
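For reference, these are the two blocking mechanisms in question--a robots.txt rule that shuts out all compliant crawlers, and a per-page meta robots tag (both are generic examples, not taken from any particular site):

```text
# robots.txt at the site root -- disallows all compliant crawlers
User-agent: *
Disallow: /
```

The meta version goes in each page's `<head>`: `<meta name="robots" content="noindex, nofollow">`. A crawler that honors robots.txt won't fetch disallowed URLs at all, while the meta tag is only seen after the page is fetched.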
So Nightwing, you're asking for examples of linkable assets--such as a piece of content that ranks well enough to be found by people who benefit from linking to it for personal or professional reasons? Aren't there ton(nes) of examples of them?
They lower the visitor's perception of the quality of the site and the value they may get out of it.
I'm not aware of much (or any) research showing that changing anchor text on external links is harmful. Keep in mind, too, that you'd be eliminating most of them anyway. Again, I say: emphasize your brand more than the keywords. Diluting is good. Thematic is good, too.
I'd lean towards being aggressive in your changes time frame, as there's still a long way down to the bottom of the slope you're on and you don't want to wake up one morning and find yourself there.
I'd stick with your home page as the result for that search. Remember, "web design" is a service; "web design Brisbane" is a keyword, not a service.
Looking at (what I figure is) your website and your back links, I see you doing what has gotten a lot of designers in trouble--putting taglines with exact-match anchor text in the footers of all the pages on their clients' sites. If I were you, I'd make a list of 20 different possible anchor texts that also include your actual brand name, then go to every client site you're able to and either remove or nofollow all but one or two footer links on each site. The live links that remain should employ your list of varied anchor text. Moving forward from there, keep your new links thematic but diverse, and emphasize your brand.
Maybe you're better off noindexing the partial articles and linking from them to the main article with a "Read the full article" link, or something like that.
How many of these articles you have relative to the rest of your content could make a difference--a very small percentage probably wouldn't be an issue in the overall health of your site if you just left them as is.
Why are you consolidating?
I don't think you'll be able to find what you're looking for as far as historical ranking data for your site, but you can try over at semrush.com. Having taken a very quick look at your site, I'd suggest you get acquainted with redirects--and take some weeks (or more) to read up on SEO in general before you start making changes to your site. There are lots of things you can get wrong without even knowing it if you just jump in without some fundamentals under your belt.
If everything's already set up and you're ready to start wrapping your arms around the optimization process, Moz's guide to SEO is a good place to start.
Steve,
Has it been like this since the beginning or is this a relatively new issue? It seems kind of penalty-related, but have you looked closely at the differences in the back links of the two pages--internal and external? How about duplicate content coming from those countries--have you checked for that?
Hi John,
I'm wondering if you've set up your first campaign yet. If you haven't, use this guide to get started. You can enter your domain and keywords in the setup, and once per week you'll get your report on the ranking results.
Expresso, why not just 301 the individual pages to the consolidated article?
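In Apache, for instance, those page-level 301s could be a few lines of .htaccess (the paths here are invented for illustration):

```apache
# 301 each individual article URL to the consolidated article
Redirect 301 /articles/part-one.html /articles/complete-guide.html
Redirect 301 /articles/part-two.html /articles/complete-guide.html
```

Each old URL then passes its visitors (and most of its link equity) along to the single consolidated page.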
I hear you Luke. There's always the option to rely on traffic from other search engines or take the "things were so much better before there was Google" perspective.
I'm pretty sure you're not able to force a refresh of your campaign stats in between your normal weekly crawls. This tool will crawl the site, but it doesn't refresh your campaign. Specifically, what errors were found that you're trying to get rid of?
So, sounds like you're looking for a list of indexed pages? Will this tool help?
http://www.intavant.com/tools/google-indexed-pages-extractor/
Mag,
It seems that functionality is still in the works, or on the drawing table, or being looked into, or at least has been asked about many times, but is not yet available.
I believe at this time your best bet is to 1) submit a feature request, and 2) recreate the campaign in your new account.
I vote to 301 the crusty old links and be done with it. I don't think you'll be able to squeeze much more juice out of them in any other fashion so just set the 301s and move on.
Lawrence,
The short answer is they can help but don't build a strategy around them.
Here are a few things worth reading on it.
Hey JonnyG,
Be sure not to confuse links with URLs. Essentially, a link is a clickable element on a web page that, when clicked, takes the user to another URL. A URL is an address (non-clickable). A web page is the resource that exists at a URL.
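One way to see the difference: a URL is just a structured address that can be taken apart into pieces, while a link is merely the clickable thing that points at one. A quick sketch in Python's standard library (the URL below is a made-up example):

```python
from urllib.parse import urlparse

# A URL is an address with well-defined parts; nothing here is clickable.
parts = urlparse("https://www.example.com/blog/post-1?ref=home#comments")

print(parts.scheme)    # the protocol, e.g. https
print(parts.netloc)    # the host the resource lives on
print(parts.path)      # the path to the page on that host
print(parts.fragment)  # the in-page anchor, if any
```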
Anyway, the Internal Links tab shows how many links exist on your site that can take you to other pages on your site. However, if you click on the Health | Index Status tab, you'll get choices to see Basic and Advanced info on your indexed URLs. In the Advanced tab, you'll see the total number of pages Google has indexed on your site. Google's Webmaster Tools Help has a page on Index Status for more info.
In the HTML, your title is spread over 4 lines, and it seems that Moz is counting additional characters because of that. It seems as though Google is presenting it correctly, but you really should get that fixed.
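To illustrate with a made-up title: a title tag whose text spans several source lines picks up newlines and indentation in its raw character count, so collapsing the whitespace first gives the length a tool should really be reporting:

```python
import re

# A title whose text is spread over several lines in the HTML source
raw_title = """My Great
    Widget Store --
    Buy Widgets Online"""

# Counting the raw string includes the newlines and indentation
print(len(raw_title))

# Collapse every run of whitespace to a single space before measuring
title = re.sub(r"\s+", " ", raw_title).strip()
print(len(title), title)
```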
Llanero,
Here's a link to another thread on the topic here in Q&A. Looks like people like it.
I wouldn't be so sure that those expired domains are what's helping them rank. As their competitor, I'd more likely snicker at them behind their backs for spending the money on buying and hosting those domains in the belief that it's helping their main site in the search results. Here's an old Danny Sullivan article on the topic.
You don't need to use the word "sandbox," but there may be a delay in indexing the redirected content, based on how often your original site gets crawled. (The redirects that went into effect when seomoz changed to moz, for example, seemed to take effect right away; others might take a month or more.)
In your case, it sounds like you're moving from one domain to another. If that's the case, the redirect should be reflected in Google's search results as soon as the original domain is recrawled.
A plugin for that would be nice but I'm not aware of one (doesn't mean there's not one out there though).
Here's a nice tutorial on installing custom search on WordPress; after that, you can access Site Search reports from the Content section of Google Analytics. Can't help you with Joomla, though.
Looking at the back links for .org, I'd think seriously about just dropping that 301 from the home page of the .com site and any other pages that have bad links going to them.
I'm not sure why OSE shows links that are pointing to the .com site as back links to the .org site. I'd go ahead and delete those accounts, since it seems all those links point to .com anyway.
I'd be working to distance myself from the .com site as much as possible.
Marino,
www.xml-sitemaps.com and Screamingfrog might be your best bets but you might also try
180 days is the best-practice minimum for leaving a 301 in place. After that, you could remove the redirect, and that will leave all those links pointing to the .com unaffiliated with the .org site.
How did you do your 301s--page by page, or did you 301 the whole domain to the .org site? There are still a few URLs left in the index for that domain.
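For what it's worth, the two approaches look quite different in an Apache .htaccess (the domains here are placeholders, and the whole-domain version assumes mod_rewrite is enabled):

```apache
# Whole-domain 301: every path on the old domain goes to the same
# path on the new one
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-example\.com$ [NC]
RewriteRule ^(.*)$ https://www.new-example.org/$1 [R=301,L]

# Page-by-page 301s: one rule per URL
Redirect 301 /old-page.html https://www.new-example.org/new-page.html
```

The whole-domain version catches every URL, including ones you've forgotten about, which is why stray URLs can linger in the index when redirects were only done page by page.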
Maybe your blog is better optimized for the term than your home page is. If your blog is about door hangers and your home page is about door hangers, then something's gotta give. Or, maybe you changed something on your homepage? Certainly, you should take a look at all of your internal anchor text and see what your links are telling Google about which page should be ranking for that term.
Robert,
I've always looked at the 70 characters you get in the title as the most important 70 characters on the whole page, and every character you use there should be well thought out and have a strong reason for being there. Of course branding is important, but rarely do you really need to take up space in your title in order to rank for your brand. It's often better to use that space for a concise handful of words that vividly describe what the page is about.
If the video is useful for visitors to those pages, and the copy on those pages is otherwise unique, there's no problem putting the video on all the pages.
1.) Be sure to deal with Local Search to its fullest extent.
2.) What other options do they have but to target place names combined with types of insurance? I mean, you can't do anything about where they're located, right? They made that decision. All you can do is drive as much of the traffic that's looking for them as possible to their site. In a smaller market, it is what it is, and you'll find out what that is when you get there.
The crux of the matter often boils down to how much they're willing to pay you to get the traffic that's available to them. Less available traffic usually means less competition. Less competition usually means it's easier to reach the top of the search results. The easier it is to reach the top of the search results, the less time you have to spend with them as a client. So, do you charge them for your time, do you charge them for the value you bring them, or do you charge them based on the value they believe you're going to bring them?
Being a word guy, I have a little bit different take on it:
"Natural traffic" isn't a term that's widely used in SEO. In fact, I'm not sure I've ever actually heard it used. "Organic traffic" derives from "organic search results," which were originally the algorithmic, non-paid search results before there was Local Search. Organic traffic is the traffic that comes to your site from non-paid search results (and technically, also not from Local Search--although that might be debatable).
If this is happening throughout your site, and your site is more than a few months old and has some links to it, I'd be skeptical as to whether the content (or even the URLs) is being indexed--especially considering that you have to turn off your JS to view the source. Remember, the bot is like a stripped-down browser and can't process all of the code your browser can. How old is your site, and is any of it being indexed?
Also, do you have some sort of plugin installed to deter content from being duplicated from your site? I've seen that show the same symptoms as what you're describing.
On the one hand, I think every site has some number of questionable links pointing to it, so disavowing every questionable link shouldn't be a primary goal. On the other hand, if it's a free directory and not really human-edited, I'd ditch it. If it's a free directory and it's niche and well edited, I'd keep it. If it's a paid directory and it's "reviewed," I'd think about keeping it. If it's paid and well edited, I'd keep it.
In answer to your specific question: according to SerpIQ, the average content length for a web page that ranks in the top 10 results for a keyword on Google is at least 2,000 words. And the higher up you go on the search listings page, the more content each web page contains--until you reach about 2,500 words.
Stephan,
It's hard to go strictly by the numbers, as some worthwhile niche sites may never develop great amounts of domain authority. Your best bet is to go to the sites and check them out. It doesn't take much to get a feel for whether it's a real site that's up to date, with real information that is useful to your visitors and theirs. Look at the home page, look at the page with the link, look at their back links, look at the pages that show up at the top of a site:domain search, and make sure the homepage shows up in the search results for a selected quote from it. It's a very touchy-feely process.
Hi James,
Per Moz's Documentation on On-Page-Reports:
The On-page summary automatically generates reports for any of your campaign keywords that rank in the top 50 of your primary search engine. The URL that it grades is the same URL that appears in the search results.
For instance, if you have 75 keywords ranking in the Top 50, you should have 75 On-Page Reports. This generation happens automatically within 24 hours of when your rankings are updated.
Interesting question. It seems to me that hundreds or even thousands of spammy links from one domain pointing to 404 URLs on another domain isn't the kind of situation Google would penalize the receiving domain for. I think you're in a far better situation than those who have such a back link profile pointing to their homepage, because they don't have the option of deleting the page being linked to.
I would say that you don't need to disavow those links--just delete the profiles.
Rexjoec,
It is considered duplicate content, and the same holds true for www and non-www. You can 301 the non-www to www, and use rel=canonical to avoid the duplicate content issue between http and https.
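Sketched in Apache terms (the domain is a placeholder, and this assumes mod_rewrite is available):

```apache
# .htaccess -- 301 all non-www requests to the www host
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

For http vs. https, each page would then carry a tag like `<link rel="canonical" href="http://www.example.com/page.html">` in its `<head>`, pointing both versions at whichever one you've chosen as the preferred URL.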
I see it's commented out in the header of the homepage but I'd give removing that a shot just to remove any doubt. I had a client with both author and publisher markup on the homepage (it wasn't commented out, though) and had the same issue you're having.
Gagan,
Try removing the rel=publisher markup. I think it will work then.