Will "internal 301s" have any effect on PageRank or the way in which a search engine sees our site's interlinking?
-
We've been forced (for scalability) to completely restructure our website's hierarchy.
For example, the old structure was:
country / city / city area
where we had about 3,500 nicely interlinked pages for relevant things like taxis, hotels, apartments etc. in each city.
We needed to change the structure to:
country / region / area / city / cityarea
So as part of the change we put in place lots of 301s for the permanent movement of pages to the new structure, and then we tried to actually change the physical on-page links too.
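To give a flavour of the mapping behind those 301s, here's a minimal sketch - Python rather than our real server config, and the lookup-table entry is made up:

```python
# Hypothetical sketch of the path mapping behind our 301s: expand the old
# three-level path into the new five-level one. The lookup entry is made up;
# the real site derives region/area from its database.
CITY_PLACEMENT = {
    ("uk", "london"): ("england", "greater-london"),  # hypothetical entry
}

def new_path(old_path):
    """Translate /country/city/cityarea/ into /country/region/area/city/cityarea/."""
    country, city, cityarea = old_path.strip("/").split("/")
    region, area = CITY_PLACEMENT[(country, city)]
    return f"/{country}/{region}/{area}/{city}/{cityarea}/"

print(new_path("/uk/london/soho/"))  # -> /uk/england/greater-london/london/soho/
```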
Unfortunately we have left a good 600 or 700 on-page links that still point to the old pages and are only caught by the 301 redirects, so we're slowly working through them to ensure the links go to the new locations directly (not via the 301).
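In case it helps anyone in the same boat, we're finding those stragglers with something roughly like this sketch (it assumes the `requests` and `beautifulsoup4` packages, and the start URL is a placeholder):

```python
# A rough sketch (not our actual tooling) of how to find on-page links that
# still resolve via a 301. Assumes the requests and beautifulsoup4 packages;
# the start URL is a placeholder.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse

START = "https://www.example.com/"  # hypothetical site root
DOMAIN = urlparse(START).netloc

seen, queue, stale_links = set(), [START], []
while queue:
    page = queue.pop()
    if page in seen:
        continue
    seen.add(page)
    html = requests.get(page, timeout=10).text
    for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
        link = urljoin(page, a["href"])
        if urlparse(link).netloc != DOMAIN:
            continue  # only audit internal links
        # A HEAD request without following redirects exposes the 301
        resp = requests.head(link, allow_redirects=False, timeout=10)
        if resp.status_code == 301:
            stale_links.append((page, link, resp.headers.get("Location")))
        elif link not in seen:
            queue.append(link)

for source, old, new in stale_links:
    print(f"{source} links to {old} -> should point at {new}")
```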
So my question is (sorry for the long waffle):
Whilst it must surely be "best practice" for all on-page links to go directly to the 'right' page, are we harming our own interlinking and even 'page rank' by being tardy in working through them manually?
Thanks for any help anyone can give.
-
Thanks Everett - sorry about the delay in coming back to your response.
This 301 issue was one of the things we were worried about (along with a ton of others), so we can at least be a little reassured that we're progressing on all fronts and not leaving a gaping problem that will continue to dog us.
Cheers
W
-
I'm just going to answer your question directly. This was your question:
"Whilst it must surely be "best practice" for all on-page links to go directly to the 'right' page, are we harming our own interlinking and even 'page rank' by being tardy in working through them manually?"
Short Answer: As long as you are working to update those internal links, and you have 301 redirects in place in the meantime, you should be fine.
Technically speaking, it is best practice to link directly to the page internally rather than relying on 301 redirects. Yes, it is true that a very small (very, VERY small, so as to be virtually undetectable) amount of PageRank is lost when redirecting, but it only becomes an issue when you begin stacking redirect on top of redirect. Keeping your house clean, so to speak, by not relying on redirects to fix your broken internal links will keep this from happening; in fact, that tiny amount of PageRank loss is said to exist precisely to discourage webmasters from relying on redirects to fix broken internal links - if you believe Matt Cutts.
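If you ever want to check whether a given URL has started to accumulate hops, a quick check along these lines will do it - just a sketch, assuming Python's `requests` package, with a placeholder URL:

```python
# A quick sketch of the "redirect on top of redirect" problem: follow the
# Location headers manually and count the hops. Assumes the requests
# package; the URL below is a placeholder.
import requests
from urllib.parse import urljoin

def redirect_hops(url, limit=10):
    hops = []
    while len(hops) < limit:
        resp = requests.head(url, allow_redirects=False, timeout=10)
        if resp.status_code not in (301, 302, 307, 308):
            break
        url = urljoin(url, resp.headers["Location"])  # Location may be relative
        hops.append(url)
    return hops

chain = redirect_hops("https://www.example.com/uk/london/soho/")
if chain:
    print(f"{len(chain)} hop(s): " + " -> ".join(chain))
```

Anything longer than one hop is a chain worth collapsing into a single direct 301.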
With that said, you may indeed have many other issues to deal with, as do most sites that have a geotargeted, deep URL structure like the one you have outlined. Panda slammed a lot of sites like that pretty hard. But all of that is beyond the scope of this question.
I hope you find whatever is wrong and get your traffic back. Good luck!
-
Hi Chris
Thanks - I 'love' the loose Matt Cutts videos - "it is - but it isn't an issue".
That was my gut feeling: there may be a temporary loss of link juice, but it would readjust after a period. Which means we have other issues.
Cheers
W
-
Thanks for your advice - I've amended the question so it's simpler to read. Sorry about that.
Well, that's what I thought - but anecdotal evidence (as well as past experience) is making me wonder whether we're losing a significant amount of link juice. We put the 301s in place about six or seven months ago, so any loss of link juice between pages should have come back by now.
Maybe we have some other issues?
W
-
Agree with Chris, thumbs up. I would just add that "ideally" you would have manually gone through all the links ahead of time and had the 301s in place prior to launch. That way there is no downtime or confusion for Google about what it is supposed to do with these pages. If you think about it, you have 600 pages in limbo, and after a while Google will just say, "Well, I guess those pages are dead," and start to crawl them less often and eventually drop them.
I would make it a priority to go through those pages and set up the new 301s ASAP. Google will keep trying an old page for a while (a few months) if it 404s, or even if you have a 301 - it knows that mistakes happen. So in the case of a 301, it will still crawl the old URL for a while even after it sees the 301 the first time, just to make sure the 301 really is permanent. You have a bit of a grace period, so take advantage of it to get things cleaned up quickly.
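If it helps with the cleanup, a spot check along these lines will confirm each old page returns a single 301 straight to its intended new home - a sketch only, assuming Python's `requests` package and a hypothetical CSV of old-URL/new-URL pairs:

```python
# A sketch of a redirect spot check: each old URL should return exactly one
# 301, pointing straight at its intended replacement. Assumes the requests
# package and a hypothetical redirect_map.csv of "old_url,new_url" rows.
import csv
import requests

with open("redirect_map.csv", newline="") as fh:
    for old_url, new_url in csv.reader(fh):
        resp = requests.head(old_url, allow_redirects=False, timeout=10)
        location = resp.headers.get("Location", "")
        if resp.status_code != 301:
            print(f"WARN {old_url}: expected 301, got {resp.status_code}")
        elif location != new_url:
            print(f"WARN {old_url}: 301 points at {location}, not {new_url}")
```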
-
Hiya,
First off, let me post this video from Matt Cutts regarding 301 redirects: http://www.youtube.com/watch?v=Filv4pP-1nw
As long as the 301 points to either the same page or a page of equal value (content-wise), you should be good. Whilst going through them manually may lose you a bit of rank in the short term, at least you'll know you are directing to the correct pages.
Short answer:
Manual - short-term rank loss, long-term benefit
Auto - vice versa
Hope this helps
-
Hello,
I don't quite understand your question. If you are adding more category pages, you should have more pages instead of fewer. Just make sure to 301 redirect every single old page and you shouldn't have a problem.
I had to do something similar on one of my sites about three months ago, and I did lose PageRank on some pages, but rankings got better, so I wouldn't worry much about PageRank.
Cheers