Over-optimizing Internal Linking: Is this real and, if so, what's the happy medium?
-
I have heard a lot about the importance of a solid internal linking structure: it helps Google discover pages, understand your page hierarchy and how pages relate to each other, and pass link equity. It's often mentioned that anchor text should be optimized, but not too optimized.
You hear a lot of warnings about how over-optimization can be perceived as spammy: https://neilpatel.com/blog/avoid-over-optimizing/
But you also see posts and news like this saying that the internal link over-optimization warnings are unfounded or outdated:
https://www.seroundtable.com/google-no-internal-linking-overoptimization-penalty-27092.html
So what's the tea? Is internal linking over-optimization a myth? If it's real, where's the tipping point? Does it have to be super invasive and keyword-stuffy to negatively impact rankings? Or can simple, light optimization of internal links on every page trigger it?
-
Just so you know, exact-match anchor text (EMA), which is what people usually mean by 'over-optimised' links, is more of a concern for your off-site links. Google is much more lenient about your internal site structure. Obviously, if it hurt UX (e.g. site nav buttons with ridiculous amounts of text that become over-chunky and annoy users), that would be bad. If you can satisfy UX and also do some light keyword optimisation of your internal site links, I honestly don't see that as a massive problem. If anything, it just gives Google more context and direction.
I don't think internal link over-optimisation is a myth, because there's always someone stupid enough to pick up a spoon and run with it, taking it to ridiculous extremes that also hurt UX and the readability of the site. But as long as you don't go completely mental and the links make sense for users (they end up where they would expect to end up, with concise link or button text that doesn't bloat the UI), you're fine. Don't worry about this overly much, but don't take it to an unreasonable extreme either.
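To make that concrete, here is a hypothetical sketch (the URLs and anchor wording are made up): the first two anchors are concise and descriptive, which satisfies users and still gives Google keyword context; the third is the kind of stuffed anchor that bloats the UI and invites trouble.

```html
<!-- Light, sensible optimisation: short, descriptive anchors (hypothetical URLs) -->
<a href="/guides/internal-linking/">Internal linking guide</a>
<a href="/services/local-seo/">Local SEO services</a>

<!-- Over-optimised: a keyword-stuffed anchor that hurts UX and readability -->
<a href="/services/local-seo/">best cheap local SEO services local SEO agency local SEO expert</a>
```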
Related Questions
-
Canonical URLs for Two Domains
We have two websites: one we use for Google PPC (website 1) and one (website 2) we use for everything else. The reason is that we are in an industry Google AdWords doesn't like, so we built a whole other website that removes the product descriptions, as AdWords doesn't approve of many of them (nutrition). Right now that AdWords-approved website (website 1) is noindex/nofollow, because we didn't want to run into potential duplicate content issues in organic search, but the issue is we can't submit it to Google Shopping, as they require it to be indexable. Do you think removing the noindex/nofollow from website 1 and adding canonical URLs pointing to website 2 would resolve this issue (being able to submit it to Google Shopping) without causing any problems with duplicate content? I was thinking of adding the canonical tag to all pages of website 1 and pointing it to website 2. Does that make sense? Do you think that would work?
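For reference, a minimal sketch of what that cross-domain canonical would look like in the head of each page on website 1 (the domain and path are placeholders):

```html
<!-- On a product page of website 1, pointing Google at the matching page
     on website 2 as the canonical version. URLs are placeholders. -->
<head>
  <link rel="canonical" href="https://www.website2.example/products/widget-a/">
</head>
```

Keep in mind that a cross-domain canonical is a hint rather than a directive, so Google may still choose to index some website 1 pages.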
-
How to switch from URL-based navigation to Ajax, 1000s of URLs gone
Hi everyone, We have thousands of URLs generated by the numerous product filters on our ecommerce site, e.g. /category1/category11/brand/color-red/size-xl+xxl/price-cheap/in-stock/. We are thinking of moving these filters to Ajax in order to offer a better user experience and get rid of these useless URLs. In your opinion, what is the best way to deal with this huge move?
1. Leave the existing URLs responding as before: since they will disappear from our sitemap (they won't be linked anymore), I imagine robots will someday consider them obsolete?
2. 301 (permanent) redirect each one to the closest existing URL.
3. Mark them as gone (4xx).
I'd vote for option 2 (see the sketch below). Bots will suddenly see thousands of 301s, but this reflects what is really happening, right? Do you think this could result in some penalty? Thank you very much for your help. Jeremy
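A hedged sketch of option 2 in Apache terms, assuming the filter segments always follow the category path. The paths are the hypothetical ones from the question, and the "closest existing URL" is taken to be the parent category page:

```apache
# Hypothetical .htaccess sketch: 301 any old filter URL back to its parent
# category page. The path pattern is made up; adapt it to the real URL
# scheme (and test it on staging) before using.
RewriteEngine On
RewriteRule ^(category1/category11)/.+ /$1/ [R=301,L]
```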
-
Partial Match or RegEx in Search Console's URL Parameters Tool?
So I currently have approximately 1,000 of these URLs indexed, when I only want roughly 100 of them. Let's say the URL is www.example.com/page.php?par1=ABC123=&par2=DEF456=&par3=GHI789= All the indexed URLs follow that same kind of format, but I only want to index the URLs that have a par1 of ABC (that could be ABC123 or ABC456 or whatever). Using the URL Parameters tool in Search Console, I can ask Googlebot to only crawl URLs with a specific value. But is there any way to get a partial match, using regex maybe? Am I wasting my time with Search Console, and should I just disallow any page.php without par1=ABC in robots.txt?
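The URL Parameters tool only takes exact values, not patterns, but robots.txt can get close. Google resolves Allow/Disallow conflicts by rule length (the longest matching rule wins), so a more specific Allow can carve the par1=ABC URLs out of a broader Disallow. A sketch, assuming par1 is always the first parameter, as in the example above:

```
User-agent: Googlebot
# Block crawling of page.php in general...
Disallow: /page.php
# ...but allow it when par1 starts with ABC (the longer rule wins).
Allow: /page.php?par1=ABC
```

Note that this blocks crawling rather than removing already-indexed URLs, so the unwanted pages may take a while to drop out of the index.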
-
Site was moved, but still exists on the old server and is being outranked for its own name
Recently, a client went through a split with a business partner. They both had websites on the same domain, each within its own subdirectory, with a main landing page on the root that links to both sites. I.e. example.com is a landing page with links to example.com/partner1 and example.com/partner2. Partner 2 will be my client for this example. After the split, partner 2 downloaded his website and put it up on his own server, but he no longer has any FTP access to the old server, and partner 1 is refusing to cooperate in any way to have the site removed from it. They did add a 301 redirect for the home page on the old server, so example.com/partner2/index.html is 301'ing to the new site on the new server. HOWEVER, every other page is still live on that old server and is outranking the new site in every instance. The home page is also being outranked, even with the 301 redirect in place. What are some steps I can take to rectify this? The client's main concern is that this old website, containing the old partner's name, is outranking him for his own name and the name of his practice. So far, here's what I've been thinking: since the site has poor on-page optimization, I'll start by cleaning all of that up. I'll then optimize the home page to better reflect the client's name and practice through proper usage of heading tags, titles, alt attributes, etc., as well as the meta title and description. The only other thing I can think of would be to start building some backlinks? Any help/suggestions would be greatly appreciated! Thanks.
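For what it's worth, the page-by-page problem would disappear if partner 1 ever cooperated: a single site-wide rule on the old server could forward every /partner2/ URL, not just the home page. A hypothetical Apache sketch (the new domain is a placeholder):

```apache
# Hypothetical rule for the OLD server's .htaccess: forward every /partner2/
# URL to its counterpart on the client's new domain. Domain is a placeholder.
RewriteEngine On
RewriteRule ^partner2/(.*)$ https://www.newclientsite.example/$1 [R=301,L]
```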
-
My homepage doesn't rank anymore. It's been replaced by irrelevant subpages which rank around 100-200 instead of top 5.
Hey guys, I think I got some kind of penalty on my homepage. I was in the top 5 for my keywords. Then a few days ago, my homepage stopped ranking for anything except searches for my domain name in Google. sitename.com/widget-reviews/ previously ranked #3 for "widget reviews", but now sitename.com/widget-training-for-pet-cats/ is ranking #84 for "widget reviews" instead. Similarly, across all my other keywords, irrelevant, wrong pages are ranking. Did I get some kind of penalty?
-
New gTLDs: buy, or wait and see?
Is the new gTLD scheme from ICANN worth the money? I manage a brand that is relatively well known in our own market segment. Would I benefit from moving my international sites from .com and national TLDs to my own brand TLD? Are there any obvious SEO pros and cons?
-
Need to duplicate the index for Google in a way that's correct
Usually duplicate content is a quick fix, but I find myself in a little predicament. I have a network of career-oriented websites in several countries. The problem is that for each country we use a "master" site that aggregates all ads, working as a portal. The smaller niche sites carry some of the same info as the "master" sites, since it is relevant to those sites. The "master" sites have naturally gained the index for the majority of these ads. So the main issue is how to maintain the ads on the master sites and still get the niche sites' content indexed in a way that doesn't break Google's guidelines. I can of course fix this in various ways, ranging from iframes (no index, though) to bullet listing and small adjustments to the headers and titles of the content on the niche sites, but it feels like I'm cheating if I go down that path. So the question is: has someone else stumbled upon a similar problem? If so, how did you fix it?
-
Export a list of URLs in Google's index?
Is there a way to export an exact list of URLs found in Google's index?