Optimizing internal links or over-optimizing?
-
For a while I hated the look of the internal links page in the Google Webmaster Tools account for a certain site.
With a total of 120K+ pages, the top internal link was the one pointing to the FAQ, with around 1M links. That was because, on every single page, both the header and the footer presented 5 links to the most popular questions.
Traffic to those FAQ pages is non-existent, the anchor text has no SEO value, and in theory 1M useless internal links is detrimental to the flow of page juice.
So I removed them, replacing the anchors with JavaScript to keep the functionality. I left only one "pure" link to the FAQ page in the footer (site-wide).
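For what it's worth, here's a minimal sketch of the kind of swap I mean (the class name, data attribute, and FAQ path are illustrative, not my actual markup):

```html
<!-- Before: a plain crawlable link, repeated in header and footer -->
<a href="/faq/popular-question">Popular question</a>

<!-- After: same click behavior for users, but no <a href> for Googlebot to count -->
<span class="faq-link" data-target="/faq/popular-question">Popular question</span>
<script>
  // Illustrative handler: navigate on click without a crawlable anchor
  document.querySelectorAll('.faq-link').forEach(function (el) {
    el.addEventListener('click', function () {
      window.location.href = el.dataset.target;
    });
  });
</script>
```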
And overnight, the internal links page of that GWT account disappeared. Blank, no links.
Now... hmm... I feel like... oops!
Yes, I am getting paranoid at the idea that the sudden disappearance of 1M internal links was not appreciated by Googlebot.
Has anyone had a similar experience?
Could Googlebot see this as over-optimizing and penalize the site?
Did I possibly trigger a manual review of the website by removing 1M internal links? I remember Matt Cutts saying that adding or removing 1M pages would raise a flag with Google's spam team and lead to a manual review, but 1M internal links?
Any idea?
-
Everything went back to normal in GWT after 48 hours.
-
Hi Cyrus, thanks for your answer.
There were 1M links to the FAQ page; the second most linked page had 300K. And half of that 1M were placed prominently in the top navigation bar. That's why I thought they were detrimental to link equity. Do you think they still don't make much difference?
-
Let's put it this way...
1. Google expects to see pages like FAQs, About Us, Contact pages, etc. have a high number of internal links.
2. The link equity "leaked" to these pages is usually negligible. Sure, there's a small amount of PageRank, but it's not really considered anything that would influence your on-page optimization. On the other hand, if the links were in obvious places, and there were a lot of them, I might be tempted to control my link equity as well. It really all depends on where the links are placed and how prominent they are on the page. "How likely is the user to click this link?" is a good question to ask, a.k.a. the "reasonable surfer" model.
3. At the same time, Google probably doesn't care too much that you removed those links, either. It's very rare that removing links is seen as over-optimization.
4. Google is getting very good at following JavaScript links. They may not show up in Google Webmaster Tools, but it's likely Google is parsing your JavaScript anyway. (I'm unsure how much link equity, if any, flows through these.)
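On the JavaScript point, the rough distinction (as I understand it, not something Google has spelled out) is whether a real URL string is discoverable in the markup or script. The paths and handler name below are hypothetical:

```html
<!-- Likely discoverable: the URL sits in the markup as a plain string -->
<span onclick="window.location.href='/faq'">FAQ</span>

<!-- Less likely discoverable: the URL is assembled at runtime -->
<span onclick="goTo('fa' + 'q')">FAQ</span>
```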
If you're concerned about it, simply watch your rankings, traffic, and crawl stats for problems. If any appear, revert your changes and hopefully everything will be fine.
Best of luck with your SEO!