Link Age as an SEO Factor?
-
Hi Guys
I have a client who ranks well within a competitive sector of the travel industry. They are planning a CMS move which will involve changing from .cfm to .aspx. We will be doing the standard redirects etc.
However, Matt Cutts' statement here on 301 redirects got me thinking:
http://www.youtube.com/watch?v=zW5UL3lzBOA&t=0m24s
He says that basically you lose a bit of PageRank when you do a 301 redirect.
Now, we will potentially be redirecting 1000s of links, and my thinking is 'a lot of a little adds up to a lot'. In other words, 1000s of redirects may have a big enough impact to lose some rankings in a very competitive and aggressive space.
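To put rough numbers on that 'a lot of a little' worry, here's a purely hypothetical back-of-envelope sketch (Google has never published how much value a 301 keeps, so the pass-through figures below are invented for illustration only):

```python
# Back-of-envelope look at "a lot of a little adds up to a lot".
# ASSUMPTION: each 301 passes only a fixed fraction of its link value;
# the 0.85 / 0.90 / 0.99 figures are invented purely for illustration.
link_values = [1.0] * 5000  # pretend we have 5,000 inbound links of equal value

total = sum(link_values)
for passthrough in (0.85, 0.90, 0.99):
    kept = sum(value * passthrough for value in link_values)
    print(f"pass-through {passthrough:.0%}: kept {kept:,.0f} of {total:,.0f} "
          f"units of link value ({total - kept:,.0f} lost)")
```

Even a small per-redirect loss, multiplied across thousands of links, could add up to something meaningful in a tight space.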
So I recommended that we contact the sites with the highest-value links and ask them to manually change the links from .cfm to .aspx. This would mean there is no loss of value, as there would be with a 301 redirect.
But now I have another dilemma which I'm unsure about.
So the main question:
Is link age a factor in rankings? If I update any links, this will make said links new to Google, so if link age is a factor, would this also lessen the value passed initially?
-
Do you have the option of not displaying the extension on your URL? That way no matter what underlying language you use, you have the same URL and don't have to worry about updating links in the future.
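For example, something along these lines (a rough sketch only - the example URL and helper are made up, and in practice this normalisation would live in your server's rewrite or routing config) keeps the public URL free of any extension, so the backend can change underneath it:

```python
from urllib.parse import urlsplit, urlunsplit

# Whatever the platform of the day happens to bolt onto its URLs.
LEGACY_EXTENSIONS = (".cfm", ".aspx")

def extensionless(url: str) -> str:
    """Return the URL with any known file extension stripped from the path."""
    parts = urlsplit(url)
    path = parts.path
    for ext in LEGACY_EXTENSIONS:
        if path.endswith(ext):
            path = path[: -len(ext)]
            break
    return urlunsplit((parts.scheme, parts.netloc, path, parts.query, parts.fragment))

print(extensionless("http://www.example.com/ski-holidays/chamonix.cfm"))
# -> http://www.example.com/ski-holidays/chamonix
```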
-
The dev team is aware of the duplicate posting issue. I delete duplicate posts when I see them, but occasionally get errors myself.
-
Link age is not a factor.
The strength of the domain/page the link is coming from is a factor. (Would you want a day-old link from the front page of SEOmoz, or a two-year-old link from your buddy's blog?)
The only reason link age is thought to be a factor is that the older the link, the older the page it sits on, and the older the page, the more time it has had to build authority and thus pass more juice.
Great idea on getting those links manually changed!!!
-
Ho ho ho! Very whimsical indeed
For your sanity, you should know there have been issues with this for all of us recently - and Delete Reply doesn't work.
( Hmmm, I wonder if SEOMoz will get penalised for all of this duplicate content??? )
-
From a developer's point of view: if you do not already have the new system in place, I would suggest an MVC move rather than .aspx on the .NET platform, and put a .cfm handler in place to map the pages at the controller level. Google will not know there has been a change and your site will perform much faster. Microsoft is tending to move away from .aspx towards the more structured MVC approach anyway.
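Very roughly, the idea looks like this (a toy Flask sketch purely to show the shape - their real implementation would be ASP.NET MVC routing, and the /ski-holidays/<slug>.cfm pattern is invented): the new application simply answers the old .cfm URLs directly, so nothing is redirected and nothing changes in Google's eyes.

```python
from flask import Flask

app = Flask(__name__)

# The public URL keeps its old .cfm form; only the code behind it changes.
# Because the response is a normal 200, there is no 301 and no link value
# to worry about - as far as crawlers can tell, nothing has moved.
@app.route("/ski-holidays/<slug>.cfm")
def ski_holiday(slug):
    return f"<h1>Ski holiday: {slug}</h1>"

if __name__ == "__main__":
    app.run()
```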
-
Apparently I was quick enough to answer it twice before you could answer once!
I tried to delete the 2nd post but I get a site error. C'est la vie!
-
Update your links to get 100% of your link juice back.
See here for more info: http://www.seomoz.org/qa/view/48932/link-age-vs-domain-age
EDIT: DANG! How quick are you Ryan ;o) Were you at Vivid Lime's house when he started writing the question?!?
-
If I update any links, this will make said link new to Google, so if link age is a factor, would this also lessen the value passed initially?
My answer to you is NO. But you should be aware there is at least some dispute on the topic.
The SEOmoz point of view (which I agree with): http://www.seomoz.org/blog/age-of-site-and-old-links-whiteboard-friday
Another point of view: http://www.the-system.org/2011/01/google-algorithm-change-attacks-spam-or-does-it/