On-site links triggering anchor text algorithmic penalty?
-
I'm trying to figure out why a drop in rankings occurred, and I think it may be related to an increase in on-site links. I've attached images of the SEOmoz report showing a jump in links from a few hundred to around 15,000 within the space of a week. I think this may be due to some on-site work I did when I created categories (I use WordPress) for a large number of cities and towns in the UK. I soon realised I'd run into duplicate content issues and removed all these categories within a few days. As I added categories I also ran into 'too many on-page links' warnings, since each category I added created a new link and I ended up with hundreds on each page.
If you look at the analytics reports, I suffered a huge drop in rankings on the 10th of March, and I think this could be due to an on-site anchor text problem caused by adding the categories and in turn creating many on-site links. SEOmoz found these links on the 11th and 25th of February, and my guess is that Google found them around the same time. But if these links are the problem, why didn't my rankings drop until the 10th of March? Surely they would have dropped sooner? Would this cause a drop in rankings?
I've received an email from Google saying that no manual penalty was applied to the site after I submitted a reconsideration request, so it must be some kind of algorithmic penalty. Could this be the problem, and if not, what else should I look at? My backlink profile appears to be okay, and I've been careful to vary my anchor text with inbound link building.
I'm at a loss as to what to do next. Any help will be much appreciated!
-
Ok thanks.
Sam.
-
I'll need to wait until tomorrow to check on this in OSE, when they revert to the newer index again. All of my link exports are currently showing the link count from before the increase. I should be able to update you tomorrow after I get a chance to look.
OK, to update my response here: OSE is showing 14,000+ links as a result of your on-site changes. You can see them in a list of 745 top pages: http://www.opensiteexplorer.org/pages.html?page=16&site=www.top-10-dating-reviews.com&sort=page_authority. It looks like those pages have at least 70 links each, which more than accounts for the 14,000+ links being found.
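As a rough sanity check on those numbers (both figures are approximations read off the OSE top-pages report above, not exact counts), the arithmetic works out like this:

```python
# Rough sanity check using approximate figures from the OSE top-pages report.
pages_listed = 745     # "top pages" rows reported by Open Site Explorer
links_per_page = 70    # rough minimum number of links observed on each page

estimated_links = pages_listed * links_per_page
print(f"Estimated links: {estimated_links:,}")  # 52,150 -- well above the 14,000+ reported
```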
Open Site Explorer is updated roughly 1-2 times per month and shows data that is roughly 20-50 days old, depending on when you look at it and when the index was crawled. That's why you're still seeing these links in your report. If they don't go away within the next 1-2 OSE updates, then I'd look into it further.
--
Regarding the original question about whether internal links can hurt the domain, a Matt Cutts video was released yesterday partially addressing this:
Will multiple internal links with the same anchor text hurt a site's ranking?: http://www.youtube.com/watch?v=6ybpXU0ckKQ
That doesn't mean all of those duplicate-content pages couldn't have hurt rankings, but the links themselves were not the issue.
--
I'm still confused by the Analytics drop but that could be due to a number of things. I'd say the answer lies in digging through Analytics and finding out what exactly dropped that day.
-
Thanks for your reply. Creating an extreme number of categories is exactly what I did. I've deleted them now, but my SEOmoz link analysis still shows over 14,000 links, and I have no idea why. The site is http://www.top-10-dating-reviews.com (there is some adult content there). Any ideas appreciated!
-
OK, so assuming that the large jump in links is coming from internal links, here are a few ways that WordPress might create that many pages:
- Creating an extreme number of categories (more than 20-30) while using permalinks that contain /%category%/ and applying posts to multiple categories (a rough sketch of how this multiplies URLs follows this list).
- Using a theme that appends parameterized URLs such as ?replytocom to every comment reply button.
- Using a strange permalink setting that causes issues.
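To make the first mechanism concrete, here is a rough back-of-the-envelope sketch. The post and category counts are made up, and it only approximates how /%category%/ permalinks plus category archives can multiply crawlable URLs and internal links; it is not a description of WordPress internals.

```python
# Hypothetical illustration of how a large category taxonomy can multiply
# crawlable URLs and internal links in WordPress. All numbers are made up;
# substitute your own post and category counts.

posts = 200                  # published posts
categories = 150             # e.g. one category per UK city/town
avg_categories_per_post = 5  # each post filed under several locations

# With /%category%/ in the permalink, a post assigned to several categories
# can often be reached under more than one category path, so crawlers may
# discover roughly one URL per (post, category) pairing.
duplicate_post_urls = posts * avg_categories_per_post

# Each category also gets its own archive page (pagination ignored here).
category_archive_urls = categories

total_new_urls = duplicate_post_urls + category_archive_urls
print(f"Potential new URLs: {total_new_urls:,}")  # 1,150 in this example

# If every page lists all category links in a sidebar or menu, the internal
# link count grows with (pages * links per page), which explodes quickly.
print(f"Potential internal links: {total_new_urls * categories:,}")  # ~172,500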
If all of those pages are really new internal URLs, then I suppose it could have confused Google and affected your rankings, but since I haven't dealt with such an extreme amount of duplicate content added so quickly, I can't say for sure.
There are also plenty of ways that you could have triggered that many external links. Any sidebar or footer link on a large site could easily add thousands of links. I highly doubt this type of link would have caused a ranking drop on its own - it's no different than someone adding you to their blogroll.
This is a difficult question to answer properly without looking at the site or the exact links, because all I can do is list lots of hypothetical causes. If you'd like to include the domain or PM it to me, I'm happy to look at the website itself.
-
Thanks for your reply. The URLs I removed are 404'ing, so should I remove these URLs in Webmaster Tools or let them drop out of the index naturally? They keep popping up in Webmaster Tools as crawl errors.
-
It's a tricky situation. It seems like you were making many changes to your site, and it's always risky to add links with keyword-rich anchors; when there are too many of them built in a short time period, that's definitely dangerous.
First of all, get rid of everything you created in a "dangerous way," like the many internal links. Normally Google has strict parameters for evaluating a page, and once you're above a certain threshold you get hit. However, I think the threshold for recovery is even lower; it seems Google is stricter with you once you've tried to game its algorithm.
These are just my ideas, nothing confirmed, but I think you should clean up all the new links first and then look at your pages. Creating that many pages in such a short time suggests they were generated without much valuable content, which may be toxic for your recovery. Take a step back, recreate them at a slower pace, and hope Google reconsiders your position. However, if you don't have a manual penalty, you'll simply have to wait until the site recovers; reconsideration requests won't help you at all.