I would ask a single question: what best practices do they follow post-Penguin/Panda? Depending on the answer, you will know who deserves a chance to work with you.
Thanks,
Rajesh Dhawan
Have you checked the source of traffic in your analytics report? That should be able to explain a lot.
Are you tracking SERPs (rank positions) for your site?
Add to that the fact that the domain has some good age. It was registered on 27-Sep-2000, which makes it a 12-year-old domain, to be precise.
You can't force the rate at which your site's old content is de-indexed and replaced by the new pages. Leave that to Google, and keep publishing fresh content in the meantime.
Cheers,
Rajesh Dhawan
Hiya,
Please look at the following link to find out the chronology of updates made by Google:
http://www.seomoz.org/google-algorithm-change
I see an update that happened on 3rd April; this could possibly explain the dip in your site traffic in the 1st week of April.
Coming to your 2nd question, how long was your site offline? Do you see any indexed items of your website in Google?
Finally, for the spiked traffic in July, do you see any organic traffic details, such as keywords? It could very well be a malicious script generating false views/hits on your site.
Thanks,
Rajesh Dhawan
Hiya,
Mat offers a correct insight into redirecting the old site to the new site.
This looks complicated though. For example, how many pages were there on the old site and have you set up a corresponding one-to-one mapping between the old site content and the new site content?
Is the old site still in the Google index? If not, then you may have to resurrect the old site and try to gain those rankings over a period of time.
Setting up a domain redirect will help if it was your domain that was ranking well for keywords. If you had tonnes of internal pages that were ranking, I would suggest that you bring the old site back up and work out a migration plan with a specialist. Otherwise, most of your internal pages' rankings will not carry forward to the new domain, and your conversions will eventually suffer.
Best of Luck,
Rajesh Dhawan
Hi Sorin,
If your site tanked on 25th-26th April, then it is safe to presume that your site was indeed hit by the Penguin update. There was a smaller Panda update that happened on 18th April as well. If you have access to Google Analytics or rank trackers, you should be able to tell by looking at the dates whether it was Panda or Penguin.
Coming to the issue of your small website, I would agree with Sha. It would be most appropriate if you can disclose the affected URL on the forum for real helpful suggestions.
Penguin penalized everything that was on the 'over' side. It punished sites for over-optimization at every level, whether links (both internal and external) or covert keyword stuffing across web pages. Penguin's impact was felt more extensively because it affected almost everything an SEO would have done to boost a site's rankings.
If you are really looking for a non-generalized approach to helping your cause, the website's URL is much needed.
Best Regards,
Rajesh Dhawan
What is it that is stopping your site from ranking higher? Is it bad on-page optimization, bad links, poor link profile, Penguin-inflicted penalty, Panda-inflicted penalty or just plain bad luck?
I am sure you have a probable reason in mind for your site's inability to climb higher. Before looking for a provider, you need to identify the weakness. With SEOmoz's tool chest, this should not be difficult.
Thanks,
Rajesh Dhawan
Unless your old site is penalized for some reason, I don't see why you should go for a new domain within the same traffic segment. With 5 years of domain age, chances are that your efforts will take less time to bring results on the old domain. With a new domain, things won't be as simple as they look. I would treat the site's age as a big advantage and stick with the old domain; overall costs will be lower for the same results.
Now, if for some reason the old domain or site was penalized, then that is a different matter altogether.
Thanks,
Rajesh Dhawan
Here is what you will need to do:
a) Set up a g+ profile for your individual account.
b) Connect it to your company's g+ page.
c) On the individual g+ profile, add your blog's root domain to the 'Contributor to' section.
d) From the individual posts, add a rel=author link pointing to the g+ profile URL.
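The markup for step (d) is a one-line change on each post's byline. A minimal sketch, assuming a placeholder Google+ profile ID and author name:

```html
<!-- On each blog post: link the byline to your Google+ profile.
     The profile ID and name below are placeholders. -->
<a href="https://plus.google.com/1234567890" rel="author">Rajesh Dhawan</a>
```

The 'Contributor to' link on the profile (step c) points back at the blog's root domain, which completes the two-way verification Google looks for.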
Cheers,
Rajesh Dhawan
An author page is not a necessity. You can use rel=author from individual blog posts as well; it will connect them with your g+ profile.
You are already linking to the root domain by using the 'Contributor to' section of your g+ profile. It works behind the scenes.
Now, you need to claim the content, either with a rel=author link at the individual blog post level, or via an author page that links to your g+ profile with rel=me and is itself linked from each post with a rel=author tag.
It sounds confusing, but it isn't actually.
Thanks,
Rajesh Dhawan
Send an email to help (at) seomoz.org for someone to have a look.
Here is what you need to do:
a) Create a g+ profile
b) in the 'contributor to' section, link out to the blog for which you write.
c) On each blog post, add a rel=author tag pointing back to the g+ profile, or set up an author page on the blog. From the author page, use a rel=me tag to link back to your g+ profile. If you set up an author page, each post on your blog will need to link to it using a rel=author tag.
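The author-page variant in step (c) involves two links. A minimal sketch, with placeholder paths and profile ID:

```html
<!-- On each blog post: point the byline at the author page (path is a placeholder) -->
<a href="/author/rajesh" rel="author">About the author</a>

<!-- On the author page: link back to the Google+ profile (ID is a placeholder) -->
<a href="https://plus.google.com/1234567890" rel="me">My Google+ profile</a>
```

Chaining post → author page → g+ profile, together with the 'Contributor to' link from step (b), closes the verification loop.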
Cheers,
Rajesh Dhawan
And here is how you reset the crawl:
1. On your webserver, edit the robots.txt file.
2. Block the seomoz bot from crawling the site by blocking its access to the root.
You can do so by adding the following lines:
User-agent: rogerbot
Disallow: /
This will end the crawl session.
But before you do this, it may be a good idea to check whether your site simply has a lot of content and outgoing links to crawl.
Rory,
What is the sub-domain that you are crawling? It may just be that there is a lot of content to crawl.
There are 2 aspects to adding a listing in Yahoo directory.
1. It helps your SEO efforts for Yahoo search, which is powered by Bing.
2. It helps improve the trust score of the site.
If your client has a budget for it, go for it. It is one of those links that are still held in high trust, and it could balance out a lot of spammy links in your site's overall backlink profile.
Rory,
I would guess that this crawl session has hung; it would be a good idea to start a new session. The crawl could have been interrupted midway by a server-side issue on your website or a temporary drop in the connection between the API server and your site's server.
Agreed, he needs to evolve a strategy that is based on 2 things:
a) Claiming authorship of his original content
b) Devising ways to prevent content theft
At the same time, if his site was hit by a Panda penalty, he stands to gain a lot from authorship.
Astrid,
David is right; you will put almost the entire backlink profile of the .net at risk. The size of that risk and the eventual outcome are things nobody can predict with certainty.
Tracy,
If you have no way to remove the links from those blog networks, then just go ahead and start a new site. I am presuming that the blog networks were linking to the root domain? If yes, then you would save a lot of effort and budget by starting over a new site.
It won't work without a proper headshot.
The author profile is just a way to authenticate yourself to Google. If you have the ability to set profiles for other people of your company, you should be fine.
There is no simple solution for your particular case. Your site ranks below an exact-match domain and another site, haaretz.com. It would be very difficult to rank above chriscornell.com, as it is an exact-match domain.
Haaretz.com has a domain authority of 91, while your site's domain authority is 40. Until you tell the search engines that the content posted on haaretz.com is originally yours, it will be difficult for your site to outrank haaretz.com.
As a first step, I would set up an author profile using g+. That should give you more visits, since your author profile information will include your headshot as well.
Besides this, I don't think there are many other options on the table. You will need to work on improving your site's overall authority.
A sitemap is just a file. So, as long as you can modify the automated sitemap file and insert the entries for the second CMS, you should be fine. You would only have a problem if there were 2 sitemap files in the root, as that would mean your site is serving conflicting sitemaps.
In Google Webmaster Tools, you can list the path of your sitemap file, and Googlebot will pick it up automatically.
So, your approach looks good to me.
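For reference, the merged result is just one XML file with entries from both systems. A minimal sketch, using placeholder URLs on example.com:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Entry generated by the first CMS (placeholder URL) -->
  <url>
    <loc>https://www.example.com/page-from-cms-1/</loc>
  </url>
  <!-- Entry inserted for the second CMS (placeholder URL) -->
  <url>
    <loc>https://www.example.com/page-from-cms-2/</loc>
  </url>
</urlset>
```

As long as this single file lives at one path and that path is the one submitted in Webmaster Tools, Google doesn't care how many systems generated the entries.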
Do you have 2 systems and a single sitemap or 2 systems with 2 sitemaps? What exactly do you mean when you say that you have 2 systems? Are they 2 different websites?
Do you mind sharing your site's URL? One of the easiest ways for you to rank higher than those sites would be to setup author profiles and claim yourself as the author of the original content. It would be a way to tell Google that it's your content.
It may take a few weeks time for things to settle down.