Website has been penalized?
-
Hey guys,
We have been link building and optimizing our website since the beginning of June 2010.
Around August-September 2010, our site appeared on second page for the keywords we were targeting for around a week. They then dropped off the radar - although we could still see our website as #1 when searching for our company name, domain name, etc. So we figured we had been put into the 'google sandbox' sort of thing. That was fine, we dealt with that.
Then in December 2010, we appeared on the first page for our keywords and maintained first-page rankings, even moving up the top 10, for just over a month. On January 13th 2011, we disappeared from Google for all of the keywords we were targeting; we don't even come up in the top pages for a company name search. We do still come up when searching for our domain name in Google, though, and we are being cached regularly.
Before we dropped off the rankings in January, we did make some semi-major changes to our site: we changed the meta description, reworked some content, and added a disclaimer to our pages with click-tracking parameters (at which point SEOmoz flagged our disclaimer pages as duplicate content). In response, we added the disclaimer URL to our robots.txt so Google couldn't access it, made the disclaimer an onclick link instead of an href, added nofollow to the link, and also told Google to ignore these parameters in Google Webmaster Central.
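As a sanity check on that kind of robots.txt change, Python's standard-library `urllib.robotparser` can simulate whether a crawler like Googlebot is blocked from a given path. This is just an illustrative sketch; the `/disclaimer` path and example.com domain are hypothetical stand-ins for the real disclaimer URL:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt blocking a disclaimer page from all crawlers
robots_txt = """\
User-agent: *
Disallow: /disclaimer
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Googlebot should be blocked from the disclaimer but not the homepage
print(parser.can_fetch("Googlebot", "https://www.example.com/disclaimer"))  # False
print(parser.can_fetch("Googlebot", "https://www.example.com/"))            # True
```

Running a check like this against the live robots.txt rules would have confirmed the disclaimer was actually disallowed before relying on it.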
We have fixed the duplicate content side of things now, we have continued to link build and we have been adding content regularly.
Do you think the duplicate content (across over 13,000 pages) could have triggered a loss in rankings? Or do you think it's something else? We also changed the index page's meta description and some subpages' titles and descriptions, and we fixed the HTML errors flagged in Google Webmaster Central and SEOmoz.
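For duplicate content at that scale, one common approach is to group pages by a fingerprint of their normalized text and flag any cluster with more than one URL. Here's a minimal sketch of the idea; the URLs and page bodies are invented for illustration:

```python
import hashlib
from collections import defaultdict

def content_fingerprint(html: str) -> str:
    """Hash a normalized version of the page text so trivial
    whitespace/case differences don't hide duplicates."""
    normalized = " ".join(html.lower().split())
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# Hypothetical crawl output: URL -> page body
pages = {
    "/movies/a": "Watch free movies online. Disclaimer text here.",
    "/movies/b": "Watch free movies   ONLINE. Disclaimer text here.",  # same text
    "/about":    "About our movie database.",
}

groups = defaultdict(list)
for url, body in pages.items():
    groups[content_fingerprint(body)].append(url)

duplicates = [urls for urls in groups.values() if len(urls) > 1]
print(duplicates)  # [['/movies/a', '/movies/b']]
```

A real audit would strip boilerplate (nav, footer, the disclaimer itself) before hashing, but even this crude version gives a count to compare against the 13,000 pages SEOmoz reported.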
The only other reason I can think of for a penalty is the link exchange script on our site, where people could add our link to their site and add theirs to ours. We did apply the nofollow attribute to those outbound links, though.
Any information that will help me get our rankings back would be greatly appreciated!
-
The links from our link exchange script accounted for about 2% of our website's total links; most of our link building has been through natural articles and websites posting about us. So even if Google discounted our links via the link exchange, this wouldn't have made us drop this much, would it?
I agree with you. That's a very small number and unlikely to be the problem.
The duplicate content issue is fixed.
Excellent!
I have removed the link exchange script.
Good....
If I link using www.freemoviedb.com as the anchor text, will it still help me rank for my keywords?
Yes, you'll want to keep a number of links out there with your keyword anchor text, as that still has a high effect on ranking for a particular term. But you'll want to present a link profile that has a more natural "mix" of keywords and your domain to avoid getting flagged as spammy.
-
Sure looks like either a penalty or a massive discounting of links to me. You're not banned, but you're way back in the results given your link profile. I took a quick look at your robots.txt and it looks fine.
If Google is still seeing 13,000 pages as duplicate content, that could be the issue as well, as Google's internal "quality score" for your site is not going to be pretty. But given the number of inbound links you have, I'm much more inclined to think that Google has identified your link exchange script and discounted all links related to it.
There's also some discussion out there on whether you can get dinged for over-optimizing your anchor text. See this article. While that case study covers a pretty small number of sites, if Tim's findings are accurate, you're definitely at risk: virtually all (99%) of both your internal and external links to your home page use "watch movies online" or "watch free movies online" as anchor text.
So here's what I would do:
#1 solve the duplicate content issue
#2 see what you can do to change the link exchange process to make it less recognizable to Google
#3 go vary your anchor text, adding new links and changing a few old ones like this profile link to use www.freemoviedb.com as the anchor text.
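For step #3, a quick tally of your backlink export shows how lopsided the anchor profile is. This is a sketch with invented link counts (the 80% threshold is an arbitrary rule of thumb, not a known Google cutoff):

```python
from collections import Counter

# Hypothetical backlink export: one anchor text per inbound link
anchors = (
    ["watch movies online"] * 60
    + ["watch free movies online"] * 35
    + ["www.freemoviedb.com"] * 3
    + ["freemoviedb"] * 2
)

counts = Counter(anchors)
total = len(anchors)
for anchor, n in counts.most_common():
    print(f"{anchor!r}: {n / total:.0%}")

# Flag profiles where the top two anchors dominate the whole profile
top_two_share = sum(n for _, n in counts.most_common(2)) / total
print("over-optimized?", top_two_share > 0.8)  # True: 95% on two keyword anchors
```

Re-running a tally like this after adding domain-name and brand anchors would show the "mix" trending toward something more natural.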