Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies. More details here.
Does page "depth" matter?
-
Would it have a negative effect on SEO to have a link from the home page to this page...
http://www.website.com/page1deep/page2deep
rather than to this page
http://www.website.com/page1deep
I'm hoping that made some sense. If not I'll try to clarify.
Thanks,
Mark
-
I had a quick scan through the article, and it looks like they are talking from a usability aspect; I am talking from a link juice aspect. PageRank passes only 85% through a link, so if the home page passes a PR of 1 through each of its links, then a page one click away only gets 0.85, two clicks 0.72, three clicks 0.61, four clicks 0.52.
It gets a bit more complicated when the pages link back to the home page, though they don't pass back as much either.
I have a simple explanation here:
http://thatsit.com.au/seo/tutorials/a-simple-explanation-of-pagerank
And here is a calculator that you can try to see how it works out.
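The decay described above can be sketched in a few lines. This is a toy illustration only: it assumes a single chain of links with the classic 0.85 damping factor, and ignores links back to the home page and PR being split among multiple outgoing links, which the full PageRank algorithm accounts for.

```python
# Toy sketch: PageRank passed to a page N clicks from the home page,
# assuming a single chain of links and the classic 0.85 damping factor.
DAMPING = 0.85

def pr_passed(clicks_from_home: int, home_pr: float = 1.0) -> float:
    """PR reaching a page N clicks from home in the single-chain case."""
    return home_pr * DAMPING ** clicks_from_home

for clicks in range(1, 5):
    print(f"{clicks} click(s) away: {pr_passed(clicks):.2f}")
# 1 click(s) away: 0.85
# 2 click(s) away: 0.72
# 3 click(s) away: 0.61
# 4 click(s) away: 0.52
```

These match the 0.85 / 0.72 / 0.61 / 0.52 figures quoted above.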
-
"if page domain.com/rootpage.htm takes 4 clicks to get to it from the home page then you have a problem."
I've always believed it's best to keep the clicks down, but I recently read some research showing that many clicks are not necessarily a problem: http://www.uie.com/articles/three_click_rule/. The research is from 2003, and I'd say people have become more impatient and expectant of instant results since then, but it's still interesting and shows that three clicks doesn't have to be a rule.
There's also a useful article about "the scent of information" here: http://searchengineland.com/seo-and-the-scent-of-information-26206
-
Thank you Geoff, Casey and Alan.
Great answers and exactly what I needed to know.
-
It's not how many folders deep your page is, but how many clicks from the home page it is.
If the page domain.com/deep/deep/deep/deeppage.htm is linked from the home page, then that's OK.
If the page domain.com/rootpage.htm takes 4 clicks to get to from the home page, then you have a problem.
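The distinction above is that click depth is a property of the link graph, not of the URL. A hypothetical sketch, assuming a site represented as a dict mapping each page to the pages it links to, that computes clicks-from-home with a breadth-first search:

```python
# Hypothetical sketch: compute click depth (clicks from the home page)
# for every page reachable in an assumed site link graph.
from collections import deque

def click_depth(links: dict[str, list[str]], home: str) -> dict[str, int]:
    """Breadth-first search from the home page; depth = minimum clicks."""
    depth = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

# Assumed example graph mirroring the answer: a deep URL linked from
# home, and a root-level URL buried four clicks down.
site = {
    "/": ["/deep/deep/deep/deeppage.htm", "/a"],
    "/a": ["/b"],
    "/b": ["/c"],
    "/c": ["/rootpage.htm"],
}
depths = click_depth(site, "/")
print(depths["/deep/deep/deep/deeppage.htm"])  # 1: deep URL, but one click
print(depths["/rootpage.htm"])                 # 4: root-level URL, four clicks
```

The deep URL ends up at depth 1 while the root-level URL sits at depth 4, which is exactly the point being made.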
-
I might have misunderstood the question, but what really matters is how you get to the content, not necessarily the URL structure (relevancy required - no spam please).
Given that freshness (i.e. recent links and social shares) will enhance the opportunities for this particular page to appear in the results (assuming the other SEO boxes are ticked), the "extra" folders will not matter.
-
Hi Mark,
This will not hurt your SEO at all. Here is a video from Matt Cutts explaining the issue: http://www.youtube.com/watch?v=l_A1iRY6XTM
Hope that helps!