What Are Latest Internal Linking Strategies?
-
I have been doing a little research, but all the articles are really old. Even the Moz site page is pretty old.
So I am wondering, has the strategy changed? Is it OK to still use internal links with your keywords in them? Do you use multiple links on a page? What about in a blog post? Do you nofollow them?
What are the thoughts out there on this?
-
Hi,
Just saw this article and recalled your question. Using exact-match anchor text for internal linking is still OK; however, that only holds if you use it occasionally. If it's overused, it's still bad for SEO, as you can pick up an over-optimization penalty. I guess the best approach is to mix and match and work in some co-citation/co-occurrence with long-tail keywords.
https://blog.kissmetrics.com/avoid-over-optimizing/
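If you want to sanity-check one of your own pages, here is a rough Python sketch, an illustration only: it assumes the beautifulsoup4 package is installed, and the sample HTML, target keyword, and 30% threshold are placeholders I made up, not figures Google publishes. It pulls the internal anchor texts from a page and shows what share are exact match:
```python
# Rough sketch only: pull the internal anchor texts from one page's HTML and
# see what share are an exact match for a target keyword. The sample HTML,
# target keyword, and threshold below are made-up placeholders.
from collections import Counter
from urllib.parse import urlparse

from bs4 import BeautifulSoup        # assumed installed: pip install beautifulsoup4

SAMPLE_HTML = """
<p>You can use <a href="/tools/keyword-explorer">this tool</a> to research keywords.</p>
<p>Read our <a href="/blog/keyword-research">keyword research</a> guide.</p>
<p>More tips in the <a href="/blog/keyword-research-checklist">keyword research</a> checklist.</p>
<p>There is also <a href="https://example.org/study">an external study</a> worth a look.</p>
"""


def internal_anchor_texts(html: str, own_domain: str) -> list:
    """Collect anchor text for links that are relative or stay on our own domain."""
    soup = BeautifulSoup(html, "html.parser")
    anchors = []
    for a in soup.find_all("a", href=True):
        host = urlparse(a["href"]).netloc
        if host in ("", own_domain):           # relative or same-domain = internal
            anchors.append(a.get_text(strip=True).lower())
    return anchors


anchors = internal_anchor_texts(SAMPLE_HTML, "example.com")
counts = Counter(anchors)
target = "keyword research"                    # hypothetical target keyword
exact_share = counts[target] / len(anchors) if anchors else 0.0

print(counts)
print(f"Exact-match share for '{target}': {exact_share:.0%}")
if exact_share > 0.3:                          # arbitrary sanity threshold, not a known Google limit
    print("Consider mixing in more branded, long-tail, and natural anchors.")
```
If one exact-match phrase dominates the internal anchors, that's your cue to vary them as described above.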
Thank you!
-
Yes, that makes sense. It doesn't appear that the Google algorithms are penalizing keyword anchor text on internal links unless you way overdo it and turn your site into a spamfest.
-
Hi Tommy,
Looking at your link, that page deals with external links. If you read the comments, it seems Neil thinks internal links with keyword anchor text are still OK.
-
Hi,
Some of the old internal linking practices still apply. You should still link internally to pass PageRank, and continue to link to deeper pages within your site. However, there are some updates on the anchor text you use for internal links, or even when linking out to external sites.
Over-optimized anchor text was targeted by Penguin 3.0 (or whatever numbering system people use). Exact-match keywords, keyword-rich anchor text, or any sort of aggressive anchor text optimization is now pretty risky. One of the takeaways from the update is to link naturally, with relevant, varied anchor text, to avoid being hit.
Here is an article from Quick Sprout on the Penguin 3.0 update that talks about anchor text: http://www.quicksprout.com/2014/05/14/how-to-avoid-getting-slaughtered-by-penguin-3-0/
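To see which pages sit too many clicks from the homepage (and so are probably starved of internal links), here is a minimal breadth-first crawl sketch, an illustration only: it assumes the requests and beautifulsoup4 packages, and the start URL and page cap are placeholders you would swap for your own.
```python
# Rough sketch only: breadth-first crawl of your own site to estimate how many
# clicks from the homepage each page sits at. START_URL and MAX_PAGES are
# placeholders; swap in your own values and crawl politely.
from collections import deque
from urllib.parse import urljoin, urlparse

import requests                      # assumed installed: pip install requests
from bs4 import BeautifulSoup        # assumed installed: pip install beautifulsoup4

START_URL = "https://www.example.com/"   # hypothetical homepage
MAX_PAGES = 200                          # small cap so the sketch stays polite


def crawl_depths(start_url: str, max_pages: int) -> dict:
    """Return {url: click depth from the start URL} for same-domain pages."""
    domain = urlparse(start_url).netloc
    depths = {start_url: 0}
    queue = deque([start_url])
    while queue and len(depths) < max_pages:
        url = queue.popleft()
        try:
            response = requests.get(url, timeout=10)
        except requests.RequestException:
            continue                     # skip pages that fail to load
        soup = BeautifulSoup(response.text, "html.parser")
        for a in soup.find_all("a", href=True):
            link = urljoin(url, a["href"]).split("#")[0]
            if urlparse(link).netloc == domain and link not in depths:
                depths[link] = depths[url] + 1   # one click deeper than the linking page
                queue.append(link)
    return depths


if __name__ == "__main__":
    for url, depth in sorted(crawl_depths(START_URL, MAX_PAGES).items(),
                             key=lambda item: item[1]):
        print(depth, url)
```
Sorting by depth puts the homepage first and the deepest pages last, which makes it easy to spot sections that could use links from stronger, shallower pages.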
Hope this helps.
-
Use your links in a way that encourages users to spend more time on the site. The more a site is seen as a positive resource of information, the better it will perform for users and in rankings. These links can use both keyword-match and broad-match anchor text.
Example one: You can use this tool to find out more about keyword research.
Example two: XYZ company has created a keyword research tool to help find relevant keywords.
(The bold text above would be the links.) We have found it works well to link as if you were speaking to a live person. As you touch on certain subjects or points of interest, use those opportunities to offer additional information as links.
Related Questions
-
Redirect inbound links to YouTube?
I have a website that's been going for 10 years or so; it doesn't get huge traffic, but it's fairly consistent. About 5 years ago I put the same content on YouTube: instructional how-to videos. The website offers slightly better content because there are images to accompany the step-by-step text below the videos. The text is more or less the same on YouTube and my website. Recently, YouTube has started to vastly outperform my website. For every page/video on my website, there is a YouTube page. They're basically competing against each other. Over the years I have accrued a fair number of links to my website. My question is, should I redirect my inbound links to the relevant YouTube pages and sacrifice my website? Thanks! Will
Intermediate & Advanced SEO | madegood
-
Is link equity / link juice lost to a blocked URL in the same way that it is lost to a nofollow link?
Hi, If there is a link on a page that goes to a URL that is blocked in robots.txt, is the link juice lost in the same way as when you add nofollow to a link on a page? Any help would be most appreciated.
Intermediate & Advanced SEO | Andrew-SEO
-
Is it OK that the root doesn't have any internal links?
Hi guys, On a website with more than 20,000 indexed pages, is it normal that the homepage (root) doesn't have any internal links counted, while other important pages have plenty of internal links? Consider that I added a homepage link to the top menu in the header of all pages, so the homepage link is repeated on all indexed pages, but Google didn't count it. The website technology is AngularJS. Thank you for helping me.
Intermediate & Advanced SEO | cafegardesh
-
Counting over-optimised links - do internal links count too?
To wit: in working out whether I have too many over-optimised links pointing to my homepage, do I look at just external links, or also at the links from my internal pages to my homepage? In other words, can a natural link profile from internal pages help dilute over-optimisation from external links?
Intermediate & Advanced SEO | Jeepster
-
Link masking in WordPress
In WordPress, I want to block Google from crawling my site via the primary navigation. I want to use anchor text links in the body and custom menus in the sidebar to make maximum benefit of the "first link counts" rule. In short, I want to obfuscate all of the links in my primary navigation without using the dreaded nofollow. I do not want to block other links to the pages (body text, custom menus, etc.). This would be site-wide. I'd rather not use Ajax or any type of programming unless it's part of a plugin. Can anyone make a simple, Google-friendly suggestion?
Intermediate & Advanced SEO | CsmBill
-
Advice on forum links
Hi guys, Looking for some good advice on forum links and their potential negative impact. I am analysing the links of a URL, and around 60% of the links are coming from a forum (on a different domain). The forum is very relevant, about the same product he is selling, and it also has a decent user base. This 60% accounts for roughly 6,500 links, all with varying keyword anchor texts, and some with excessive usage of a particular keyword anchor text. They are also all dofollow. They are a mixture of signature links and in-post links. The site they link to has been hit by Penguin and also has an EMD. My question is: even though these links are relevant and on a good site with good traffic, do you think they have likely been picked up by the Penguin algorithm? My initial thought was yes, only because they are all dofollow and mostly keyword based. But I'd love to hear thoughts on this, as well as possible recovery options, i.e. should he remove the forum links, reduce them drastically, or make them all nofollow so traffic can still pass through? Thanks!
Intermediate & Advanced SEO | ROIcomau
-
Internal JavaScript Links
Hi, We have a client who has internal links pointing to some relatively new pages that we asked them to implement. The problem is that instead of using standard HTML links, their developers have used JavaScript, e.g. javascript:GoTo... The new pages have links from the homepage (among others) and have been live for about 3-4 weeks now, yet are still to be indexed by Google, Bing & Yahoo. Is it possible that JavaScript links are making them difficult to find? Thanks in advance for any tips.
Intermediate & Advanced SEO | jasarrow
-
Link Architecture - Xenu Link Sleuth Vs Manual Observation Confusion
Hi, I have been asked to complete some SEO contracting work for an e-commerce store. The navigation looked a bit unclean, so I decided to investigate it first.
a) Manual observation: Within the catalogue view, I loaded up the page source, hit Ctrl-F, and searched for "href"; it turns out there are 750-odd links on this page, and most of the other sub-catalogue and product pages also have about 750 links. Ouch! My SEO knowledge is telling me this is non-optimal.
b) Link Sleuth: I crawled the site with Xenu Link Sleuth and found 10,000+ pages. I exported into Open Calc and ran a pivot table to count the number of pages per site level. The results looked like this:
Level 0: 1 page
Level 1: 42 pages
Level 2: 860 pages
Level 3: 3,268 pages
Now this looks more like a pyramid. I think this is because Link Sleuth can only read one 'layer' of the nav bar at a time; it doesn't 'hover' and read the rest of the nav bar (unlike what can be found by searching for "href" in the page source).
Question: How are search spiders going to read the site? Like in (a) or in (b)? Thank you!
Intermediate & Advanced SEO | DigitalLeaf