Does PR/link juice get transferred from directory to subdirectory?
-
If website.com links to website.com/utilities, does website.com/utilities/business.html benefit in PageRank from that link?
The reason I ask is, I have the option to have:
1) website.com/utilities/business.html
or
2) website.com/utilities-business
My boss would like to keep the first example and link directory --> subdirectory --> file.
I prefer the second example because its potential traffic is a source of revenue, and I want it to get link power from the homepage directly.
So, unless link juice is transferred horizontally across subdirectories, 2) would be better SEO-wise. Is this correct?
Thanks a lot.
-
Hi Jorge,
Good question! If I understand correctly, you have a choice between two file structures - is that right? With option 1, is there anything that would actually live at website.com/utilities/, or is it simply a URL directory with no actual webpage?
If that's the case, both options 1 and 2 would pass the same amount of link juice, assuming you link directly to each target. That said, it's usually desirable to have:
- A flat architecture - meaning you keep your structure as flat as possible, using the fewest directories you can
- Shorter URLs, which generally correlate with better rankings and click-through rates
For these reasons, if you are linking directly to each target, I would choose the second option if possible, although the difference it makes probably isn't that great.
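The point that link equity follows the link graph rather than URL depth can be illustrated with a toy PageRank calculation. This is a minimal sketch, not how Google actually computes things, and the page paths are just the hypothetical URLs from the question: as long as the homepage links directly to each candidate page, both URL choices end up with the same score.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Simplified PageRank over a dict of page -> list of outgoing links."""
    pages = set(links)
    for targets in links.values():
        pages.update(targets)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Each page starts with the teleport share, then collects
        # an equal split of every linking page's rank.
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, targets in links.items():
            if targets:
                share = damping * rank[page] / len(targets)
                for t in targets:
                    new[t] += share
        rank = new
    return rank

# Homepage links directly to both candidate URLs; each links back home.
graph = {
    "/": ["/utilities/business.html", "/utilities-business"],
    "/utilities/business.html": ["/"],
    "/utilities-business": ["/"],
}
ranks = pagerank(graph)
# The two pages receive identical link equity despite different URL depth.
print(ranks["/utilities/business.html"] == ranks["/utilities-business"])
```

The URL path never enters the computation - only the links do. Any real-world difference between the two options comes from the secondary factors above (URL length, click-through rate), not from link juice itself.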
Hope this helps! Best of luck with your SEO!
-
Hi Jorge,
Option 1 would be good if you have a lot of subpages that can be classified under the "utilities" section. It would also help the search engines index and classify your site's pages better, benefiting you on the SEO side.
Option 2 would be recommended if you don't have a lot of content that can be classified under "utilities".
Think about the user first and what would benefit them. To answer your question directly: in my opinion, the closer a page's link is to the home page, the better it is from an SEO perspective.