Will "internal 301s" have any effect on PageRank or the way in which a search engine sees our site's interlinking?
-
We've been forced (for scalability) to completely restructure our website's hierarchy.
For example - the old structure :
country / city / city area
Where we had about 3500 nicely interlinked pages for relevant things like taxis, hotels, apartments etc in that city :
We needed to change the structure to be :
country / region / area / city / cityarea
So as part of the change we put in place lots of 301s for the permanent movement of pages to the new structure, and then we tried to change the physical on-page links too.
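For reference, a minimal sketch of what such 301 rules could look like in Apache (the paths below are hypothetical examples, not the real site's URLs; since the new region/area segments can't be derived from the old path, each old page generally needs its own explicit mapping):

```apache
# Hypothetical examples: old /country/city/cityarea paths mapping to the
# new /country/region/area/city/cityarea structure. The region cannot be
# inferred from the old URL, so each mapping is written out explicitly.
Redirect 301 /spain/barcelona/gothic-quarter /spain/catalonia/barcelones/barcelona/gothic-quarter
Redirect 301 /spain/barcelona/eixample /spain/catalonia/barcelones/barcelona/eixample

# For thousands of pages, a RewriteMap keeps the mappings in one file:
# RewriteEngine On
# RewriteMap oldtonew txt:/etc/apache2/old-to-new-urls.txt
# RewriteCond ${oldtonew:%{REQUEST_URI}} !=""
# RewriteRule ^ ${oldtonew:%{REQUEST_URI}} [R=301,L]
```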
Unfortunately we have left a good 600 or 700 links that still point to the old pages but are caught by the 301 redirects, so we're slowly working through them to ensure the links go to the new location directly (not via the 301).
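One way to speed up that manual cleanup is to scan each page's HTML for hrefs that still match an old, redirected URL. A minimal standard-library sketch (the redirect map and sample HTML here are hypothetical, and in practice the map would be loaded from the same source as the 301 rules):

```python
from html.parser import HTMLParser

# Hypothetical old-URL -> new-URL map, mirroring the site's 301 rules.
REDIRECT_MAP = {
    "/spain/barcelona/gothic-quarter":
        "/spain/catalonia/barcelones/barcelona/gothic-quarter",
}

class LinkCollector(HTMLParser):
    """Collects every href found in anchor tags."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)

def stale_links(html):
    """Return (old, new) pairs for links still pointing at redirected URLs."""
    parser = LinkCollector()
    parser.feed(html)
    return [(h, REDIRECT_MAP[h]) for h in parser.hrefs if h in REDIRECT_MAP]

page = '<a href="/spain/barcelona/gothic-quarter">Taxis</a> <a href="/about">About</a>'
print(stale_links(page))
```

Run over every page (or a crawl export), this produces a worklist of exactly which links still need updating, rather than hunting for them by hand.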
So my question is (sorry for long waffle) :
Whilst it must surely be "best practice" for all on-page links to go directly to the 'right' page, are we harming our own interlinking and even 'page rank' by being tardy in working through them manually?
Thanks for any help anyone can give.
-
Thanks Everett - sorry about delay in coming back to your response.
This 301 issue was one of the things we were worried about (along with a ton of others), so we can at least be a little self-assured that we're progressing on all fronts and not leaving a gaping problem that will continue to dog us.
Cheers
W
-
I'm just going to answer your question directly. This was your question:
"Whilst it must surely be "best practice" for all on-page links to go directly to the 'right' page, are we harming our own interlinking and even 'page rank' by being tardy in working through them manually?"
Short Answer: As long as you are working to update those internal links, and you have 301 redirects in place during the meantime, you should be fine.
Technically speaking, it is best practice to link directly to the page internally rather than relying on 301 redirects. Yes, it is true that a very small (very, VERY small, so as to be virtually undetectable) amount of PageRank is lost when redirecting, but it only becomes an issue when you begin adding redirect on top of redirect. Keeping your house clean, so to speak, by not relying on redirects to fix your broken internal links will keep this from happening - and that tiny amount of PageRank loss is said to exist precisely to discourage webmasters from relying on redirects to fix broken internal links, if you believe Matt Cutts.
With that said, you may indeed have many other issues to deal with, as do most sites that have a geotargeted, deep URL structure like the one you have outlined. Panda slammed a lot of sites like that pretty hard. But all of that is beyond the scope of this question.
I hope you find whatever is wrong and get your traffic back. Good luck!
-
Hi Chris
Thanks - I 'love' the loose MC videos - "it is - but it isn't an issue".
That was my gut feeling - that there may be a temporary loss of link juice, but it would re-adjust after a period. Which means we have other issues.
Cheers
W
-
Thanks for your advice - I've amended the question so it is simpler to read. Sorry about that.
Well, that's what I thought - but anecdotal evidence (as well as past experience) is making me wonder whether we're losing a significant amount of link juice. We put the 301s in place about 6 or 7 months ago, so any loss of link juice between pages should have come back by now.
Maybe we have some other issues?
W
-
Agree with Chris, thumbs up. I would just add that "ideally" you would have manually gone through all the links ahead of time and had the 301s in place prior to launch. That way there is no downtime/confusion for Google about what to do with these pages. If you think about it, you have 600 pages that are in limbo, and after a while Google will just say, "Well, I guess those pages are dead," and start to crawl them less often and eventually drop them.
I would make it a priority to go through those pages and set up the new 301s ASAP. Google will keep trying an old page for a while (a few months) if it 404s, or even if you have a 301. It knows that mistakes happen. So in the case of the 301, it will still crawl the old URL for a while even after it sees the 301 the first time, just to make sure the 301 really is permanent. You have a bit of a grace period, so take advantage of it to get things cleaned up quickly.
-
Hiya,
First off, let me post this video from Matt Cutts regarding 301 redirects: http://www.youtube.com/watch?v=Filv4pP-1nw
As long as the 301 points to either the same page or a page of equal value (content-wise), you should be good. Whilst going through them manually may lose you a bit of rank over time, at least you know you are directing to the correct pages.
Short answer:
Manual - short-term rank loss, long-term benefit.
Auto - vice versa.
Hope this helps
-
Hello,
I don't quite understand your question. If you are adding more category pages, you should have more pages instead of fewer; just make sure to 301 redirect every single old page and you shouldn't have a problem.
I had to do something similar with one of my sites about 3 months ago, and I did lose PageRank on some pages, but rankings got better, so I wouldn't worry much about PageRank.
Cheers