What is the longest you would go back to resurrect links that should have been 301s?
-
I have never considered anything beyond a site that was redeveloped a month or two ago, but an interesting prospective client has come along, and that begs a question.
They had their site "redesigned" in April 2014, and it appears whoever did the work did not realize what a 301 was for. Looking at ahrefs or MajesticSEO, they have gone from roughly 15,000 referring pages to 500, and the timeline perfectly intersects the redesign. Sooooo, just wondering if any of you geniuses have ever gone back that far to try and pull off a 301... I am actually just putting together a link building / content marketing plan, but I thought it was an interesting question.
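As a quick sanity check on a drop like that, a rough Python sketch along these lines can confirm the old URLs really do 404 now (the CSV filename and "URL" column are placeholder names for an ahrefs export):

    # Rough sketch: confirm the old linked pages now return 404s.
    # "old_pages.csv" and its "URL" column are placeholder names for an ahrefs export.
    import csv
    import requests

    with open("old_pages.csv", newline="") as f:
        for row in csv.DictReader(f):
            url = row["URL"]
            try:
                resp = requests.head(url, allow_redirects=False, timeout=10)
                print(resp.status_code, url)
            except requests.RequestException as exc:
                print("ERROR", url, exc)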
Thanks for the help,
Robert
-
...a man can dream...
My favorite is taking the time to explain, followed by silence... not awkward at all!
-
Think about it... They built a site with poor UI/UX and few or no redirects from a fairly well-ranked, high DA/PA site... do you really think they would even consider a custom 404 page? I think we spend more time trying to explain to clients that not everyone gets good design, SEO, etc., no matter what the name of their company says.
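A quick way to check is to request a made-up URL and look at what comes back; a minimal Python sketch, with a placeholder domain:

    # Minimal soft-404 check: a made-up path should return a real HTTP 404.
    # The domain is a placeholder.
    import requests

    resp = requests.get("http://example.com/this-page-cannot-exist-xyz", timeout=10)
    print(resp.status_code)         # 200 here would suggest a "soft 404"
    print(len(resp.text), "bytes")  # a tiny generic body hints at no custom 404 page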
Best
-
Linda,
I think the point about it being anecdotal really is what I was looking for. There is no clear direction from the search engines on this, and that is one of the things that makes Moz so strong: good SEOs sharing anecdotal and other evidence and ideas.
Thanks so much,
Robert
-
Very good point, Ash, very good. I have seen Google continue to crawl old URLs for a year or more as well. Checking for the 404s as a comparison and then redirecting to fix them is a good explanation. Well done.
-
Tom,
Great points. I am not as concerned with content relevance, as we are fairly careful with that. The issue is that the new site used the old content, but they did not set up redirects. I am going to give it a try with what I find to be the most relevant pages from the old site, though not with all of them, as I do not want to "overdo" it.
Thanks
-
Great story.
That reminds me of one... I know of a small AdSense site that went offline without the owner realizing it. A few months went by before they noticed that the hosting was not responding. The site was brought back online, popped back into the SERPs, and resumed making money.
-
When I first started my current job, I found out that in the past there had been a separate, small website for one of the products, which had been abandoned a couple of years before that. (The site, not the product.)
The site still seemed to have some good links pointing to it, so for the heck of it I 301'd it to the main page on the current site for that product. That page quickly grew to be one of the strongest pages on the current site.
This is just one anecdotal data point, but based on my experience, if it's not a huge amount of work, I'd try redirecting, at least for the pages with the best links.
-
Really hope that they had a custom 404 page at least!
-
What? You callin me a fruit picker???
There are worse things
-
I've been in a similar situation. My recommendation is to look in Google Webmaster Tools under 'Crawl > Crawl Errors'; if it is reporting them as 404 pages, what's the harm in redirecting them?
Google can keep crawling old URLs that 404 or return another error for years (as in my case, where it was a website redesign from A LONG time ago).
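If it helps, here is a rough sketch of turning a Crawl Errors export into Apache 301 rules; the CSV filename, its "URL" column, and the old-to-new mapping are all hypothetical stand-ins you would fill in by hand:

    # Rough sketch: turn a Crawl Errors export into Apache "Redirect 301" rules.
    # "crawl_errors.csv", its "URL" column, and the mapping are hypothetical.
    import csv
    from urllib.parse import urlparse

    new_homes = {
        "/old-widgets.html": "http://example.com/widgets/",
        "/old-about.html": "http://example.com/about/",
    }

    with open("crawl_errors.csv", newline="") as f:
        for row in csv.DictReader(f):
            path = urlparse(row["URL"]).path
            if path in new_homes:
                print("Redirect 301 " + path + " " + new_homes[path])

Each printed line can be pasted into an .htaccess file (on Apache; other servers use their own redirect syntax).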
-
Hi Robert,
I would go to archive.org and take a look at the old site structure as best you can that way. Then I would figure out whether there are any links that are valuable to the site and relevant to the pages that still exist.
It is still very risky, though. I have a friend who changed his domain and redirected 35,000 URLs to a site of half a million URLs; however, the links from the old domain, which had very high PageRank, still did not benefit the site very much at all.
I would export the linked URLs from Ahrefs, upload them to DeepCrawl, and check the similarities between the old URLs' content and the new URLs, given that the links have already been checked and confirmed not to be bad.
http://moz.com/blog/how-to-fix-crawl-errors-in-google-webmaster-tools
http://www.seoconsultants.com/tools/check-server-headers-tool/
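If you would rather script the header check than run URLs through that tool one at a time, a small Python sketch along these lines does the same job (the URL is a placeholder):

    # Small sketch in the spirit of the header-checker tool above:
    # print each redirect hop's status and Location header, then the final response.
    import requests

    resp = requests.get("http://example.com/some-old-page", allow_redirects=True, timeout=10)
    for hop in resp.history:
        print(hop.status_code, hop.url, "->", hop.headers.get("Location"))
    print(resp.status_code, resp.url)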
I hope this is of some help,
Tom
-
I don't know the answer to your original question... but I would be jumping to redirect anything from April 2014.
Nobody really "knows" the answer to this... but I think there is a good chance that Google will continue to crawl those old connections, and some of them may still be good.
-
Actually, I am pondering it quite a bit. It is really a shame these people did this to them. And... they did a bad job on the site, as we must rebuild.
Thanks Andy.
-
Hi Robert,
I have never gone back that far myself (30-40 days max), but I can see no reason why this isn't worth a shot. There could still be a lot of potential hanging around out there for the grabbing. Grab any low-hanging fruit with both hands!
-Andy
Related Questions
-
Which technical website changes can SEOs confidently ignore? Google's perspective!
Hi community members, I look after SEO at our company, and there are lots of changes happening to our website, especially technical changes. It's hard for me to review every deployment to the website, like a change of server location, etc. We generally agree that the SEO team must be notified of every website change so we can understand ranking fluctuations and how search engines receive them. I just wonder which technical deployments of a website I could confidently ignore, to save time and give the technical team a go-ahead without interrupting them or making them wait for my approval. Thanks
Web Design | vtmoz
-
Do we still have PageRank / link juice / link equity and this dilution concept?
Hi all, As per the traditional or standard SEO rules, we have this link juice and dilution concept. Many websites have changed their linking structure because of it, with the belief that "the more pages you link to, the more the PR gets diluted." Many websites then avoided linking to a large number of pages from the homepage to avoid link juice dilution; we followed the same approach. But I wonder whether Google still handles websites and rankings the same way with respect to links. Many websites even avoid a large number of second-tier pages to avoid link dilution, yet I have looked at competitors who employ a lot of second-tier pages from the top level and still do well in rankings. Please share your views and suggestions on this. Thanks
Web Design | vtmoz
-
Is there anything wrong with having a large number of internal links pointing to the homepage, including links from subdomains or subdirectories?
Hi all, Generally, the largest number of internal links on a site will point to the homepage. But I have seen some recent suggestions that too many internal links to the homepage are not good. I'm just wondering whether having most internal links point to the homepage may hurt. Also, we have subdomains; can we point a link from every page of a subdomain or subdirectory to the homepage? Usually the answer here is about users. Of course, the content is about the same product across all pages. Thanks
Web Design | vtmoz
-
I want to make changes to my site's visual appearance
As we are getting more users to our site, we decided to improve its visual appearance. As of now, our site ranks around positions 1-5 in Google. Do visual changes affect SEO rankings, and what about adding subdomains?
Web Design | FhyzicsBCPL
-
Is it necessary to remove 301 redirects from WordPress after removing the 404 URLs from Google Webmaster Tools?
Google Webmaster Tools found many 404 URLs on my site. I've redirected these URLs to the relevant URLs with 301 redirects in WordPress. After that, I removed these 404 URLs from the Google index through Webmaster Tools. Should I clean up these 301 redirects from WordPress or not? Help needed.
Web Design | SangeetaC
-
Competitive Analysis: Links & Keywords
I'm noticing that for some key local search terms our company is not ranking in SERPs as I would expect, considering its size relative to the local sites that are ranking. I subscribed to SEOmoz to get a better understanding of what's going on, and I haven't figured it out yet. Our site is higher in almost every metric than the sites we're competing with, but our competition consistently ranks higher in organic results for industry-standard keywords. The few metrics we're being outranked in are "Linking C-Blocks" and "Page MozTrust" (we're very close to the leader in MozTrust). Are these two metrics enough to account for our company's poor SERP performance, or do I need to pay attention to something else?
Web Design | thinkWebstoreSEO
-
Redirecting 301 Redirects -- Will Search Engines Notice?
Hello Mozzers, We're currently evaluating a client site where the previous web developer redesigned the site and got lazy, 301 redirecting hundreds of pages to the home page instead of to their respective new URLs. Ugh. In any case, we will probably fix this for the sake of implementing best practices. But I am curious how search engines treat 301'd URLs, as they are supposed to be permanent redirects. Will search crawlers ever visit the old URLs again to find that we've re-redirected them? Or have they written them off as moved to the home page for good, meaning that there's no way to direct the authority of the previous URLs to their rightful targets? Thanks!
Web Design | SEOTeamSF
-
How is link juice split between duplicated navigation links?
Hey All, I am trying to understand link juice as it relates to duplicate navigation. Take, for example, a site whose main navigation is contained in dropdowns with 50 links (fully crawlable and indexable); that navigation is then repeated in the footer of the page, so you have a total of 100 links with the same anchor text and URLs. For simplicity's sake, will the link juice be divided among those 100 links and passed to the corresponding pages, or does the "first link rule" still apply, so only half of the link juice is passed? What I am getting at: if there were only one navigation menu and the page passed 50 link juice units, each of the subpages would be passed 1 unit, right? But if the menu is duplicated, the available link juice is divided by 100, so only 0.5 units pass through each link. However, because there are two links pointing to the same page, is there a net of 1 unit? We have several sites that do this for UX reasons, but I am trying to figure out how badly this could be hurting us in page sculpting and passing juice to our subpages. Thanks for your help! Cheers.
Web Design | prima-253509
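-
For what it's worth, the arithmetic described in that last question, as a tiny Python sketch; the fixed-pool "juice" model is the speculative part, not anything Google has confirmed:

    # Tiny sketch of the arithmetic in the question, under the (speculative)
    # model that a page splits a fixed pool of "juice" across every link.
    juice_units = 50.0   # hypothetical amount the page can pass
    nav_targets = 50     # unique pages linked from the navigation
    copies = 2           # each target linked twice (dropdown + footer)

    per_link = juice_units / (nav_targets * copies)  # 0.5 units per link
    per_target = per_link * copies                   # net 1.0 unit per page
    print(per_link, per_target)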