Does page "depth" matter
-
Would it have a negative effect on SEO to have a link from the home page to this page...
http://www.website.com/page1deep/page2deep
rather than to this page
http://www.website.com/page1deep
I'm hoping that made some sense. If not I'll try to clarify.
Thanks,
Mark
-
I had a quick scan through the article and it looks like they are talking from a usability aspect; I am talking from a link juice aspect. PageRank passes only 85% through a link, so if the home page passes a PR of 1 through each of its links, a page one click away only gets 0.85, two clicks 0.72, three clicks 0.61, four clicks 0.52.
It gets a bit more complicated when the pages link back to the home page, though they don't pass back as much either.
I have a simple explanation here:
http://thatsit.com.au/seo/tutorials/a-simple-explanation-of-pagerank
And here is a calculator that you can try, to see how it works out.
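As a quick check of the arithmetic above, here is a minimal sketch (not from the original post) that applies an 0.85 damping factor per click of depth, assuming a simple chain of links from the home page:

```python
# Rough illustration of how much PageRank "flows" to a page N clicks
# from the home page, assuming a simple chain and an 0.85 damping factor.
DAMPING = 0.85

def pagerank_passed(clicks_from_home: int, home_pr: float = 1.0) -> float:
    """PR reaching a page that is `clicks_from_home` links away from the home page."""
    return home_pr * (DAMPING ** clicks_from_home)

for depth in range(1, 5):
    print(f"{depth} click(s) away: {pagerank_passed(depth):.2f}")
# 1 click(s) away: 0.85
# 2 click(s) away: 0.72
# 3 click(s) away: 0.61
# 4 click(s) away: 0.52
```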
-
"if page domain.com/rootpage.htm takes 4 clicks to get to it from the home page then you have a problem."
I've always believed it's best to keep the clicks down, but I recently read some research that shows many clicks are not necessarily a problem: http://www.uie.com/articles/three_click_rule/ - the research is from 2003, and I'd say people have become more impatient and expect more instant results since then, but it's still interesting and shows that 3 clicks doesn't have to be a hard rule.
There's also a useful article about "the scent of information" here: http://searchengineland.com/seo-and-the-scent-of-information-26206
-
Thank you Geoff, Casey and Alan.
Great answers and exactly what I needed to know.
-
It's not how many folders deep your page is, but how many clicks from the home page it is.
If the page domain.com/deep/deep/deep/deeppage.htm is linked from the home page, then that's OK.
If the page domain.com/rootpage.htm takes 4 clicks to get to from the home page, then you have a problem.
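Click depth can be measured with a simple breadth-first crawl from the home page. Below is a rough sketch (not from the original answer) using the third-party requests and BeautifulSoup libraries; the URL in the final comment is just an example:

```python
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

def internal_links(page_url, site_netloc):
    """Fetch `page_url` and return the internal links (same host) it contains."""
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    links = set()
    for a in soup.find_all("a", href=True):
        url = urljoin(page_url, a["href"]).split("#")[0]
        if urlparse(url).netloc == site_netloc:
            links.add(url)
    return links

def click_depths(home_url):
    """Breadth-first crawl: clicks needed to reach each discovered page from the home page."""
    site = urlparse(home_url).netloc
    depths = {home_url: 0}
    queue = deque([home_url])
    while queue:
        page = queue.popleft()
        for link in internal_links(page, site):
            if link not in depths:          # first time this URL is reached
                depths[link] = depths[page] + 1
                queue.append(link)
    return depths

# Pages more than 3 clicks deep:
# {url: d for url, d in click_depths("http://www.example.com/").items() if d > 3}
```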
-
I might have misunderstood the question, but what really matters is how you get to the content, not necessarily the URL structure (relevancy required - no spam please).
Given the "freshness" aspect, i.e. recent links and social shares will improve the chances of this particular page appearing in the results (provided the other SEO boxes are "ticked"), the "extra" folders will not matter.
-
Hi Mark,
This will not hurt your SEO at all, here is a video from Matt Cutts explaining the issue: http://www.youtube.com/watch?v=l_A1iRY6XTM
Hope that helps!
Related Questions
-
Page Title Length
Hi Gurus, I understand that it is good practice to use 50-60 characters for a page title. Google appends my brand name (15 characters including spaces) to the end of each title it indexes. Do I need to count what Google adds as part of the maximum recommended length? I.e. is the maximum 50-60 characters plus the 15-character brand name Google adds to the end of the title, or 50-60 characters including the addition? Many thanks! Lev
On-Page Optimization | SunnyMay
-
Will it upset Google if I aggregate product page reviews up into a product category page?
We have reviews on our product pages and we are considering averaging those reviews out and putting them on specific category pages, in order for the average product ratings to be displayed in search results. Each averaged category review would cover only the products within its category, and all reviews are from users of the site, no 3rd-party reviews. For example, averaging the reviews from all of our boxes product pages and listing that average review on the boxes category page. My question is, will this be doing anything wrong in the eyes of Google, and if so, how so? -Derick
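Purely as an illustration of the aggregation being described (not from the original question), a minimal sketch with made-up review data that rolls product-level ratings up to a category average:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical review data: (category, product, rating out of 5)
reviews = [
    ("boxes", "small-box", 4), ("boxes", "small-box", 5),
    ("boxes", "large-box", 3), ("tape", "clear-tape", 5),
]

def category_averages(reviews):
    """Average all user ratings for the products within each category."""
    by_category = defaultdict(list)
    for category, _product, rating in reviews:
        by_category[category].append(rating)
    return {cat: round(mean(ratings), 1) for cat, ratings in by_category.items()}

print(category_averages(reviews))   # {'boxes': 4.0, 'tape': 5.0}
```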
On-Page Optimization | Deluxe
-
"translation" of code in htaccess file
Hi everyone! I am a newbie to the whole SEO and html thing and I am trying to get a better understanding of the "behind the scenes" part of my website. I hope I can find someone here who can translate a piece of code for me that I have in my htaccess file: Options -Multiviews
On-Page Optimization | | momof4
Options +FollowSymLinks
rewritecond $1 !^(index.php|public|tmp|robots.txt|template.html|favicon.ico|images|css|uploads)
rewritecond %{REQUEST_FILENAME} !-f
rewritecond %{REQUEST_FILENAME} !-d
rewriterule ^(.*)$ index.php?link=$1 [NC,L,QSA] I know that something is getting redirected to the index file, but what (or when) exactly? Does the word "robots"mean that search engine crawlers are getting redirected here? And is this good or bad (in terms of SEO)? Or is this redirecting people who try to get to my robots/ template or image files?? Thanks in advance for any answers!0 -
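Reading the rules above (a hedged interpretation, not a reply from the original thread): any request that is not an existing file or directory, and whose path does not start with one of the whitelisted names (index.php, robots.txt, images, css, and so on), is internally rewritten to index.php?link=<path>; the "robots" entry simply keeps robots.txt out of that rewrite rather than redirecting crawlers. A rough Python sketch of the same decision, with a made-up document root:

```python
import os

# Names excluded by the RewriteCond above.
EXCLUDED_PREFIXES = ("index.php", "public", "tmp", "robots.txt",
                     "template.html", "favicon.ico", "images", "css", "uploads")

def route(request_path, docroot="/var/www/html"):
    """Approximate, in Python, the decision the .htaccess rules make for one request."""
    path = request_path.lstrip("/")
    on_disk = os.path.join(docroot, path)
    if path.startswith(EXCLUDED_PREFIXES):
        return request_path                   # whitelisted: left alone (e.g. /robots.txt)
    if os.path.isfile(on_disk) or os.path.isdir(on_disk):
        return request_path                   # a real file or directory: served as-is
    return f"/index.php?link={path}"          # everything else goes to the front controller

print(route("/robots.txt"))         # /robots.txt
print(route("/some/pretty-url"))    # /index.php?link=some/pretty-url
```
-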
Will "internal 301s" have any effect on page rank or the way in which an SE see's our site interlinking?
We've been forced (for scalability) to completely restructure our website in terms of setting out a hierarchy. For example - the old structure : country / city / city area Where we had about 3500 nicely interlinked pages for relevant things like taxis, hotels, apartments etc in that city : We needed to change the structure to be : country / region / area / city / cityarea So as patr of the change we put in place lots of 301s for the permanent movement of pages to the new structure and then we tried to actually change the physical on-page links too. Unfortunately we have left a good 600 or 700 links that point to the old pages, but are picked up by the 301 redirect on page, so we're slowly going through them to ensure the links go to the new location directly (not via the 301). So my question is (sorry for long waffle) : Whilst it must surely be "best practice" for all on-page links to go directly to the 'right' page, are we harming our own interlinking and even 'page rank' by being tardy in working through them manually? Thanks for any help anyone can give.
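As a purely illustrative sketch (not from the original question) of how the remaining old links could be found programmatically, the snippet below fetches a page with the third-party requests and BeautifulSoup libraries and reports any internal link that still answers with a 301, together with the URL it redirects to:

```python
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

def links_hitting_301(page_url):
    """Return (link, redirect_target) pairs for internal links on `page_url` that answer with a 301."""
    host = urlparse(page_url).netloc
    soup = BeautifulSoup(requests.get(page_url, timeout=10).text, "html.parser")
    stale = []
    for a in soup.find_all("a", href=True):
        url = urljoin(page_url, a["href"])
        if urlparse(url).netloc != host:
            continue                                   # only check internal links
        resp = requests.get(url, allow_redirects=False, timeout=10)
        if resp.status_code == 301:
            stale.append((url, resp.headers.get("Location")))
    return stale

# for link, target in links_hitting_301("http://www.example.com/country/city/"):
#     print(f"{link} -> should point directly at {target}")
```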
On-Page Optimization | TinkyWinky
-
Duplicate content when using "visibility classes" in responsive design layouts - an SEO problem?
I have text in the right column of my responsive layout which will show up below the principal content on small devices. To do this I use visibility classes for DIVs. So I have a DIV with unique text that is visible only at large screen sizes, and I copied the same text into another DIV which shows up only on small devices, while the first DIV is hidden. Technically I have the same text twice on my page, so might this be detected as duplicate content or spam? I'm concerned because hidden text on a page (via expandable/collapsible text blocks) will be read by bots, and in my case they will detect it twice. Does anybody have experience with this issue?
Best, Holger
On-Page Optimization | inlinear
-
Is there an SEO penalty for multiple links on the same page going to the same destination page?
Hi, just a quick note. I hope you are able to assist. To cut a long story short, on the page below http://www.bookbluemountains.com.au/ -> Features Specials & Packages (middle column) we have 3 links per special going to the same page:
1. The header is linked.
2. Clicking on the image links too - currently with a nofollow.
3. 'More info' under the description paragraph is linked too - currently with a nofollow.
Two arguments are as follows:
1. The reason we do not follow all 3 links is to reduce too many links, which may appear spammy to Google.
2. Counter argument: the point above has some validity. However, using nofollow is basically telling the search engines that the webmaster "does not trust or doesn't take responsibility" for what is behind the link, something you don't want to do within your own website. There is no penalty as such for having too many links; the search engines will generally not worry after a certain number - nothing that would concern this business though. I would suggest changing the nofollow links a.s.a.p.
Could you please advise thoughts. Many thanks, Dave Upton [long signature removed by staff]
On-Page Optimization | daveupton
-
E-commerce product pages that have multiple SKUs with unique pages
Hey guys, with the recent Farmer/Panda update from Google I'm at a crossroads as to how I should optimize product pages for a project I'm working on for a client. My client sells tires, and one particular tire brand can have up to 15 models, and each model can have up to 30 sizes. E.g. the 'Michelin Pilot Sport Cup' comes in 15 different sizes. Each size will have its own unique product page and description, which brings me to my question: should I use the same description on every size? I do plan on writing unique content for each tire model, however I'm not sure if I should do it for every size. After all, the tire model description is the same for every size; each size doesn't carry any unique characteristics that I can describe. Thanks in advance!
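Not part of the original question, but one common way to avoid literally identical text across sizes is to generate each size's description from the hand-written model copy plus the attributes that actually differ per SKU. A minimal sketch with made-up data and field names:

```python
# Hypothetical data: one hand-written description per tire model,
# plus the attributes that actually vary per size/SKU.
MODEL_COPY = {
    "michelin-pilot-sport-cup": (
        "A track-focused tire with excellent dry grip and precise steering response."
    ),
}

def size_description(model_slug, width_mm, aspect_ratio, rim_in, load_index, speed_rating):
    """Build a per-SKU description from the shared model copy and size-specific specs."""
    base = MODEL_COPY[model_slug]
    return (
        f"{base} This {width_mm}/{aspect_ratio}R{rim_in} fitment carries a load index of "
        f"{load_index} and a {speed_rating} speed rating."
    )

print(size_description("michelin-pilot-sport-cup", 245, 35, 19, 93, "Y"))
```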
On-Page Optimization | MikeDelaCruz77
-
Avoiding "Duplicate Page Title" and "Duplicate Page Content" - Best Practices?
We have a website with a searchable database of recipes. You can search the database using an online form with dropdown options for: Course (starter, main, salad, etc)
On-Page Optimization | | smaavie
Cooking Method (fry, bake, boil, steam, etc)
Preparation Time (Under 30 min, 30min to 1 hour, Over 1 hour) Here are some examples of how URLs may look when searching for a recipe: find-a-recipe.php?course=starter
find-a-recipe.php?course=main&preperation-time=30min+to+1+hour
find-a-recipe.php?cooking-method=fry&preperation-time=over+1+hour There is also pagination of search results, so the URL could also have the variable "start", e.g. find-a-recipe.php?course=salad&start=30 There can be any combination of these variables, meaning there are hundreds of possible search results URL variations. This all works well on the site, however it gives multiple "Duplicate Page Title" and "Duplicate Page Content" errors when crawled by SEOmoz. I've seached online and found several possible solutions for this, such as: Setting canonical tag Adding these URL variables to Google Webmasters to tell Google to ignore them Change the Title tag in the head dynamically based on what URL variables are present However I am not sure which of these would be best. As far as I can tell the canonical tag should be used when you have the same page available at two seperate URLs, but this isn't the case here as the search results are always different. Adding these URL variables to Google webmasters won't fix the problem in other search engines, and will presumably continue to get these errors in our SEOmoz crawl reports. Changing the title tag each time can lead to very long title tags, and it doesn't address the problem of duplicate page content. I had hoped there would be a standard solution for problems like this, as I imagine others will have come across this before, but I cannot find the ideal solution. Any help would be much appreciated. Kind Regards5
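A minimal illustration (not from the thread) of the "dynamic title" idea mentioned above: build the title and a self-referencing canonical URL from whichever search filters are present, in a fixed order, so every combination of parameters gets one stable title and URL, with pagination left out of the canonical. Parameter names follow the example URLs above; the base URL and title wording are made up:

```python
from urllib.parse import urlencode

# Order and labels for the search filters described above.
FILTERS = [("course", "Course"), ("cooking-method", "Cooking Method"),
           ("preperation-time", "Preparation Time")]

def title_and_canonical(params, base="http://www.example.com/find-a-recipe.php"):
    """Build a descriptive <title> and a canonical URL from the active search filters.
    Pagination ("start") is deliberately dropped from the canonical."""
    active = [(key, params[key]) for key, _label in FILTERS if key in params]
    if active:
        title = "Recipes: " + ", ".join(value for _key, value in active)
        canonical = base + "?" + urlencode(active)
    else:
        title, canonical = "Find a Recipe", base
    return title, canonical

print(title_and_canonical({"course": "main", "preperation-time": "30min to 1 hour", "start": "30"}))
# ('Recipes: main, 30min to 1 hour',
#  'http://www.example.com/find-a-recipe.php?course=main&preperation-time=30min+to+1+hour')
```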