Link flow for multiple links to same URL
-
Hi there,
My question is as follows: How does Google handle link flow if two links on a given page point to the same URL? (Do they each pass link value individually, or not?)
This may sound like a newbie question, but there actually seems to be little evidence, and even less consensus, in the SEO community about this detail.
- Answers should include a source
- Information about the current state of the art at Google is preferable
- The question is not about anchor text, general best practices for linking, "PageRank is dead", etc.
We do know that the "historical" PageRank was implemented (a long time ago) without special handling for multiple links, as stated most recently by Matt Cutts in this video: http://searchengineland.com/googles-matt-cutts-one-page-two-links-page-counted-first-link-192718
On the other hand, many people in the SEO community say that only the first link counts. So far, however, I have not been able to find any data to back this up, which is quite surprising.
-
I totally agree on the focus point in general - it's not helpful to make layout decisions and the like with PageRank in mind.
But: for large websites (e.g. 100,000 pages and up), crawl rate, indexing, and the rankings of deeper parts of the site depend heavily on the internal link graph. Taking a closer look at the internal link graph gives us a lot of useful information in these cases, doesn't it?
Now: Think of links sitting in a template that gets used on 50,000 pages. A little change here is likely to cause quite a difference in the internal link graph.
For example, I've run PageRank simulations with both models on a smaller website with only 1,500 pages and 100,000 links. For many pages, the small modelling difference results in 20-30% more or less internal PageRank - and for these individual pages, that can be crucial for crawling, indexation, and rankings. Still not useful?
Since Moz runs its own iterative PageRank-like algorithms: how do you handle this with mozRank / mozTrust? Which model leads to better correlations with rankings?
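For what it's worth, here is a minimal sketch of the kind of simulation meant above: a toy iterative PageRank in which duplicate links are either counted individually (model A) or collapsed to unique target URLs (model B). The graph, parameters, and function are illustrative assumptions, not a description of Google's or Moz's actual algorithms.

```python
from collections import Counter

def pagerank(pages, outlinks, damping=0.85, iterations=50, dedupe=False):
    """Toy PageRank. outlinks maps page -> list of target pages (duplicates allowed).
    dedupe=False: every link counts (model A); dedupe=True: duplicates are collapsed (model B)."""
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    # Pre-compute per-page edge weights from the (possibly duplicated) outlinks.
    weights = {}
    for p in pages:
        links = outlinks.get(p, [])
        if dedupe:
            links = list(dict.fromkeys(links))  # keep only the first occurrence per target
        counts = Counter(links)
        total = sum(counts.values())
        weights[p] = {t: c / total for t, c in counts.items()} if total else {}
    for _ in range(iterations):
        rank = {
            p: (1 - damping) / n
               + damping * sum(rank[q] * targets.get(p, 0.0) for q, targets in weights.items())
            for p in pages
        }
    return rank

# Hypothetical three-page site: page "a" links twice to "b" and once to "c".
pages = ["a", "b", "c"]
outlinks = {"a": ["b", "b", "c"], "b": ["a"], "c": ["a"]}
print(pagerank(pages, outlinks, dedupe=False))  # model A: "b" ends up clearly above "c"
print(pagerank(pages, outlinks, dedupe=True))   # model B: "b" and "c" end up equal
```

Run over a full internal link graph, comparing the two outputs is one way to quantify the kind of difference described above.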
-
- The links both get PageRank flow...
- The link value gets divided, though, so it wouldn't exactly double the value.
- The link extraction process might choose to only select one link from the page based on certain factors (perhaps ignoring some links not because they are duplicative but based on location, or other qualifiers)
Here is Matt Cutts talking about this very issue. And here again. It is the closest thing we have to an answer.
I think the "first link counts" idea is really an extension of how PageRank works. Let's say a page has one outbound link. It gets 100% of the value passable by that page. Now, let's say the page adds another link, but it is the exact same link. Now each link gets 50%. The sum total is 100% - it is as if the second link were never added. But this calculation changes depending on the other links on the page. Say a page has two links on it: one to you, one to someone else. 50/50. If you get another, you jump to 67/33 - slightly better. As the number of links on the page increases, your additional link approaches a doubling of the first link's value. So on one end of the spectrum it is valueless; on the other end of the spectrum it doubles.
The other question is whether anchor text is counted for all of the links. Some experimentation indicates that only the first anchor text matters, which might also point to the selection / extraction process mentioned above.
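To put numbers on that, here is a minimal sketch of the arithmetic under the simple even-split model described above (the function and figures are purely illustrative, not a claim about how Google currently weights links):

```python
def target_share(links_to_you, other_links):
    """Fraction of a page's passable value that reaches your URL, assuming the
    value is split evenly across all outgoing links, duplicates included."""
    return links_to_you / (links_to_you + other_links)

print(target_share(1, 0))    # 1.0    - one link, nothing else: you get 100%
print(target_share(2, 0))    # 1.0    - duplicated link, nothing else: 50% + 50%, still 100%
print(target_share(1, 1))    # 0.5    - one link to you, one to someone else: 50/50
print(target_share(2, 1))    # 0.667  - add a duplicate: you jump to 67/33
print(target_share(1, 100))  # 0.0099 - among 100 other links, a duplicate...
print(target_share(2, 100))  # 0.0196 - ...almost doubles your share
```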
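If that hypothesis is right, the extraction step would behave roughly like the sketch below: a "first link counts" model that keeps only the first anchor text per unique target URL. The URLs and anchors are made up for illustration; this is an assumption, not a description of Google's actual parser.

```python
def extract_links(links):
    """links: list of (target_url, anchor_text) tuples in document order.
    Keeps only the first link - and therefore the first anchor text - per unique URL."""
    first_seen = {}
    for url, anchor in links:
        if url not in first_seen:  # later duplicates are dropped entirely
            first_seen[url] = anchor
    return list(first_seen.items())

page_links = [
    ("https://example.com/shoes", "running shoes"),  # first link: anchor counted
    ("https://example.com/shoes", "click here"),     # duplicate: anchor ignored
    ("https://example.com/socks", "wool socks"),
]
print(extract_links(page_links))
# [('https://example.com/shoes', 'running shoes'), ('https://example.com/socks', 'wool socks')]
```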
That all being said, I think I agree with Matt Cutts on this one. This is such a small issue that you really should focus on bigger picture stuff. It is interesting, yes, but not particularly useful.
I hope that helps!
Related Questions
-
What to do to index all my links of my website?
Ok, I have a new website with only 14,000 pages indexed by Google, but the potential is big: 1-2 million pages. What do I have to do to somehow get Google to index my website faster? This is my website: https://vmag.ro/
On-Page Optimization | TeodorMarin
-
To NoFollow or to NoIndex internal links
Hi all, I have recently taken over a fairly large e-commerce site that I am trying to "fix" and have come across something that I need a second opinion on. A Semrush audit has revealed that there are a heck of a lot of internal nofollow links (over 90,000) that point predominantly to four pages from the header of each page on the site: change-currency pages that show clients different currencies, plus a members login page. The pages are: /?action=changecurrency&currency=EUR /?action=changecurrency&currency=USD /?action=changecurrency&currency=GBP /members/ My opinion is that these pages should be noindexed and followed, rather than indexed and nofollowed. Any thoughts on this out there?
On-Page Optimization | cradut
-
Multiple domains for the same business
My client purchased over 500 URLs for targeting various customers and ranking for different keywords. It is for the same business, though. What is the best strategy to deal with this kind of approach, in your opinion? They use different meta data for each of the URLs, starting with the brand name in the meta title. Are there any other points to keep in mind when developing a strategy for all those URLs? Is this a good approach?
On-Page Optimization | alicaomisem
-
How to Structure URLs for Multiple Locations
We are currently undergoing a site redesign and are trying to figure out the best way to structure the URLs and breadcrumbs for our many locations. We currently have 60 locations nationwide and our URL structure is as follows: www.mydomain.com/locations/{location}, where {location} is the specific street the location is on or the neighborhood the location is in (i.e. www.mydomain.com/locations/waterford-lakes). The issue is that {location} is usually too specific and is not a broad enough keyword. The location "Waterford Lakes" is in Orlando, and "Orlando" is the important keyword, not "Waterford Lakes". To address this, we want to introduce state and city pages. Each state and city page would link to each location within that state or city (i.e. an Orlando page with links to "Waterford Lakes", "Lake Nona", "South Orlando", etc.). The question is how to structure this. Option 1: keep our existing URL and breadcrumb structure (www.mydomain.com/locations/{location}) and add state and city pages outside that path: www.mydomain.com/{area} and www.mydomain.com/{state}. Option 2: build the city and state pages into the URL and breadcrumb path: www.mydomain.com/locations/{state}/{area}/{location} (i.e. www.mydomain.com/locations/fl/orlando/waterford-lakes). Any insight is much appreciated. Thanks!
On-Page Optimization | uBreakiFix
-
Multiple Cities in Title Tag
My question is how to avoid having a spammy title. Currently I'm working on a project where a business serves four cities, but two of them are outside its home state. I'm trying to create a title tag that is appealing to the eye and does what I need it to do at the same time. I was wondering what everyone thought of this sample: Brand X Dealer Serving Newark, DE; New Castle, DE; Glens Mills, PA; and Springfield, PA. I know that too much repetition can be a bad thing, but this might not be a big deal since they are separate instances. Let me know what you all think. Thanks!
On-Page Optimization | OOMDODigital
-
URL best practices: use folders or not?
Hi, I have a question about URLs. The client has every URL written directly after the domain, with only one slash in each URL. Is this best practice, or do I need to use categories/folders? Thanks
On-Page Optimization | 77Agency
-
How long is too long for domain URL length?
I noticed one of the negatively correlated ranking factors was length of URL. I'm building a page from scratch; we are trying to rank for 'Minneapolis Fitness' and 'Minneapolis Massage'. Is www.minnnepolismassageandfitness.com just ridiculously long? Or does the exact match outweigh the penalty for URL length?
On-Page Optimization | JesseCWalker
-
Prevent link juice from flowing to low-value pages
Hello there! Most websites have links to low-value pages in their main navigation (header or footer), which makes them available from every other page. I'm thinking especially of "Conditions of Use" or "Privacy Notice" pages, which have no value for SEO. What I would like is to prevent link juice from flowing into those pages, while still keeping the links for visitors. What is the best way to achieve this? Put a rel="nofollow" attribute on those links? Put a "robots" meta tag containing "noindex,nofollow" on those pages? Put a "Disallow" for those pages in a robots.txt file? Or use JavaScript links that crawlers won't be able to follow?
On-Page Optimization | jonigunneweg