Varying Internal Link Anchor Text with Each New Page Load
-
I'm asking for people's opinions on varying internal anchor text. Before you jump in and say, "Oh yes, varying your anchor text is always a good idea", let me explain.
I'm not talking about varying anchor text on different links scattered throughout a site. We all know that is a wise thing to do for a variety of reasons that have been covered in many places. What I'm talking about is including semi-useful links below the fold and then varying the anchor text with each page load. Each time Googlebot crawls a page, it sees different anchor text for each link. That way, Googlebot is seeing, for example, 'san diego bars', 'taverns in san diego', 'san diego clubs', and 'pubs in san diego' all pointing to a San Diego bar/tavern/club/pub page.
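For concreteness, the setup being described could be sketched like this. This is a hypothetical illustration only: the anchor variants, URL, and function name are all invented, not taken from any actual site doing this.

```python
import random

# Hypothetical anchor-text pool for a single internal link target.
# A new variant is picked on every page render, so each crawl of the
# page may see different anchor text pointing at the same URL.
ANCHOR_VARIANTS = [
    "san diego bars",
    "taverns in san diego",
    "san diego clubs",
    "pubs in san diego",
]

def render_internal_link(href="/san-diego-bars"):
    """Return an <a> tag whose anchor text varies with each page load."""
    anchor = random.choice(ANCHOR_VARIANTS)
    return f'<a href="{href}">{anchor}</a>'
```

Each call produces a link to the same page with one of the four anchors chosen at random, which is exactly the per-load variation being asked about.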
I'm wondering if there is value in this approach. Will it help a site rank well for multiple search queries? Could it potentially be better than static anchor text as it may help Google better understand the targeted page? Is it a good way to protect a large site with a huge number of internal links from Penguin?
To summarize, we're talking about the impact of varying the anchor text on a single page with each page load as opposed to varying the anchor text on different pages.
Thoughts?
-
Thanks for everyone's input!
Without pointing any fingers, let's just say this is happening in the wild right now. It came as a bit of a surprise to me as I wouldn't expect Google to be fooled into ranking a site better for multiple keywords based on dynamic internal anchor text. To be clear, I have no evidence this technique is helping or that the motivation is to game Google for better rankings, but I haven't come up with any other reason.
If it is working, I must admit, it's pretty clever...
-
I would say test it out and see what happens. I would love to know the result. (A YouMoz post, perhaps?)
What I assume would happen:
The new link only counts when Googlebot crawls the page (and obviously not on each page load). Each time Googlebot crawls the page, it will see that an old link has been dropped and a new one added. So whatever value you gain from the new link, you will lose from the old one that is no longer there. So I really don't see the value to be had from an SEO point of view. But repeat visitors to your page may click through to those pages. (Again, testing it will give you solid proof.)
-
What comes to me is this: I don't think you'll get the value out of links with dynamic anchor text that you would get with static anchor text. A page's overall value and the value it passes on to other pages via links is iterative--it's not assigned after just a single pass of the bot. The dynamism would devalue the links, if not render them worthless altogether.
And even if you had one thousand variations of anchor text for each link and they did pass some sort of value, what do you think that footprint would look like after a year or two of Google crawls? Upon a manual review, someone there would say, "Huh, look at this, their links change all the time and each one is focused around a specific money term--I think it's obvious that they're trying to manipulate their rankings. Smack--here's a penalty for you."
-
Oh yes, varying your...oh wait sorry you didn't want that haha.
Erm, this is an interesting idea. On first read, my first thought was that you're trying to game the system, and that's never a good idea.
Then I thought a little more and I suppose it is very similar to dynamic content such as offers on your linking page, although it always points at one location.
In a way, it's like changing your anchor text manually to see what works best, but I think such frequent changes could end up getting noticed: a link anchor changing every time Google visits. Surely Google is clever enough to spot that pattern, and doesn't it smack of over-optimisation?
I bet others have already tried this - have you done any digging to see if you can find out what the impact was?