Do more pages work against a site's Google SEO ranking when there is only one URL that other sites will link to?
-
Say I have a coupon site in a major city, and assume there are 20 main regions (suburb cities) in that city.
Assume that all external links to my site will point only to the home page (www.site.com). Assume also that my website business has no physical location.
Which scenario is better?
1. One home page that serves up dynamic results based on the user's location cookie, but mentions all 20 locations in its content. Google indexes 1 page only, and all external links point to it.
2. One home page that redirects the user to their region page (one of 20), so the site has 20 pages--one for each region, optimized for that region. Google indexes 20 pages and there are internal links to the other 19 pages, BUT all external links still point only to the main home page.
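For concreteness, the redirect decision in scenario 2 could be sketched like this. This is a minimal, framework-free sketch; the cookie name ("region") and the region slugs are hypothetical examples, not part of the actual site:

```python
# Sketch of scenario 2's redirect logic. The cookie name and the
# region slugs below are made-up examples for illustration.
REGIONS = {"downtown", "northside", "lakeview"}  # ...plus 17 more in practice

def redirect_target(cookies):
    """Return the region page to redirect to, or None to serve the home page.

    Using a 302 (temporary) redirect keeps the home page itself indexable;
    a 301 would tell Google a region page permanently replaces it.
    """
    region = cookies.get("region")
    if region in REGIONS:
        return "/{}/".format(region)
    return None  # no valid cookie: serve the generic home page

print(redirect_target({"region": "downtown"}))  # /downtown/
print(redirect_target({}))                      # None
```

The same logic would sit behind whatever server or framework the site actually uses; the point is only that the home page URL stays the single entry point that all external links target.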
Thanks.
-
Thanks Marc. Sorry for the slow response--I came down with a bug last night.
Here is the basis for my thinking that link juice is about PageRank rather than the resulting search rank, and that PageRank may no longer be a big deal for overall search rank--and so my concern about dilution is overblown. Regardless of the truth of the matter, I appreciate your advice about content and relevancy:
http://blog.hubspot.com/blog/tabid/6307/bid/5535/Why-Google-Page-Rank-is-Now-Irrelevant.aspx
Thanks again, Ted
-
Take a look at the screenshot; it's taken from this URL:
http://moz.com/search-ranking-factors
So the one thing you misunderstood is the idea that link juice has nothing to do with search rank... it is a ranking factor, so you should think about how you can use it more effectively. On the other hand, websites with only a few pages, or with thin content, will have their own ranking problems. BUT everybody (including Google, I guess) would prefer a smaller site that provides good content over a big site full of nonsense.
If you can ensure a certain quality and uniqueness for every single (sub)page of your site, then go ahead and use this scenario... if you can only create (partial) duplicate content (DC): hands off!
-
Hi Marc,
Yeah, I may not be explaining my understanding correctly, or I may not understand correctly. What I have read is that link juice is only connected to PageRank, not search rank. So if I have no backlinks to my subpages, then I don't lose any home page juice. Why have subpages at all, then, if they get no backlinks? Because of search rank: queries can still lead people to my subpages. In fact, I've read that PageRank is hardly even a factor in search rank anymore, which implies that no one should be concerned about link juice dilution at all! I'd like to believe it, because I will potentially have plenty of pages with unique content and would like to build backlinks to at least some of them besides the home page.
Does it sound like I've misunderstood this issue?
-
Maybe I didn't understand you correctly, but to avoid mistakes, please have a look at the attached graphic (link juice)... it works the way I've explained. I mean, it's not really bad to add several subpages and pass some of your overall link juice to them, but there is no real advantage in the first place. Let's say you want to do a really, really good job: then you have to create absolutely unique subpages (20 of them, in your case) for more or less the same topic. Terrific if you can do so... then use the subpage model. It's not an indisputable fact that your site won't rank if it's just one page; chances might rise if you have additional subpages, but only if you are able to fill each page with unique content. I think there is a potential risk that you just create DC (duplicate content) or partial DC and pass some of your link juice to those imperfect subpages. So if you think you are able to create 20 unique subpages, then choose this scenario... if it's more or less a copy of the main site, then it wouldn't make any sense.
-
Hi Marc,
Thank you... I've heard this, but here is why I find this issue so perplexing: I have read that link juice is ONLY associated with inbound links, so if in both scenarios above all inbound links go to the home page only, then there is no decrease in link juice if I have 20 internal pages, YET I get the benefit of having 20 more pages indexed that might show up in a user query. I guess I'm trying to confirm that my understanding is correct before I have the programmer (me) set up 20 internal pages... I don't want to lose any more link juice from the home page than I have to.
Yesterday the SEO guy I'm thinking of hiring wrote this:
"If you only have the home page indexed, you will never rank. If you only have incoming links to the home page, you will never rank." I don't really understand this..it is in the context of a coupon site that offers coupons for all regions in all cities and of course they will be categorized by some 30 categories and 200 subcategories...
Any further input..really do appreciate it..
-
Link juice (link power) is a term that comes up in the scenarios you describe.
You have to imagine that every single external link gives your site this link power/link juice. Accordingly, keeping it within one page would be the better decision. If your main page has several additional local pages behind/under it, the link juice will be passed on to those pages.
You don't have to be a genius at mathematics to see that this would split the link juice across 20 pages (in the scenario you describe)...
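Marc's "divided by 20" point can be made concrete with the classic, simplified PageRank model, in which a page splits the equity it can pass evenly across its outbound links. All numbers below are illustrative, not real values:

```python
# Simplified PageRank arithmetic behind "split across 20 pages": a page
# divides its passable equity evenly among its outbound links.
# Every number here is made up for illustration.
home_equity = 1.0        # equity the home page earned from external backlinks
damping = 0.85           # the standard PageRank damping factor
internal_links = 20      # one internal link to each region page

passed_per_region_page = home_equity * damping / internal_links
print(passed_per_region_page)  # ~0.0425: each region page gets ~4% of it
```

Real ranking is far more complicated than this single-iteration arithmetic, but it shows why, in this model, each additional internal link shrinks the slice passed to every subpage.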
Related Questions
-
Closed Location Pages - 301 to open locations?
I work with several thousand local businesses and have a listing page for each on my site. Recently a large chunk of these locations closed, and a number of these pages rank well for localized keywords. I'm trying to figure out the best course of action.
Local Website Optimization | Andrew_Mac
What I've done so far is make a note on each of the closed location pages that says something to the effect of "This location is currently closed. Here are some nearby options" and provide links to the location pages of 3 open places nearby. The closed location pages are continuing to rank well, but conversion rates from visitors landing on these pages have dropped.
What I'm considering doing is 301ing these pages to the nearest open location page. I'm hoping this will preserve the page's rankings for keywords for which the nearby location is still relevant, while not hurting user experience by serving up a closed location. As a second step, I'm also thinking of creating new pages (with slightly altered URLs) for the closed listings. They won't rank as well, obviously, but if someone searches for the address or even the street of the closed location, my hope is that I could still capture some of that traffic and convert it when someone clicks through to an open location from there.
I spoke with someone about this second step and he thought it sounded spammy. My thinking is that, combined with the 301, I'm telling Google that the page currently ranking well no longer has the importance it once did and that the page I'm 301ing to does, but that the content on the page I'm creating for the closed location still has enough value to justify the newly created page. I'd really appreciate thoughts from the community on this. Thanks!
-
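The 301 mapping proposed in the closed-locations question above could be sketched like this. The paths and the closed-to-open mapping are made-up examples, not the asker's actual URLs:

```python
# Sketch of the proposed fix: 301 each closed location page to its
# nearest open location. All paths here are hypothetical.
CLOSED_TO_NEAREST_OPEN = {
    "/locations/elm-st-diner": "/locations/oak-ave-diner",
    "/locations/main-st-gym": "/locations/fifth-ave-gym",
}

def resolve(path):
    """Return (status_code, path_to_serve) for an incoming request."""
    if path in CLOSED_TO_NEAREST_OPEN:
        # 301 (permanent): signals that the closed page's equity
        # should transfer to the open location's page.
        return 301, CLOSED_TO_NEAREST_OPEN[path]
    return 200, path

print(resolve("/locations/elm-st-diner"))  # (301, '/locations/oak-ave-diner')
```

The same table could equally live in server config (e.g. redirect rules) rather than application code; the point is one permanent redirect per closed page, each to its single most relevant open page.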
Is my competitor doing something blackhat? Can only access pages via SERPs, not from website navigation/search
Hi Mozzers, One of my competitors uses a trick whereby they have a number of different sitemaps containing location-specific URLs for the most popular categories in their eCommerce store. It's quite obvious that they are trying to rank for "keyword + location", and from what I can see, you can't get to any of these pages from their website navigation or search. It's as if these pages are separated from the main site in terms of access, though it works the other way round: if you land on one of these pages from the SERPs, you can reach the main website pages/navigation from there. I know that Google doesn't really like anything you can't access from the main website, but would you class this as blackhat/cheating? They tend to rank quite well for a lot of these pages, and it doesn't seem to have affected the rankings of the pages on their main website. I'm just wondering if it's worth us doing something similar, since Google doesn't appear to have penalised them. Thanks, Pete
Local Website Optimization | PeteC12
-
Map Files for Branches and SEO
Dear All, We have an XML and an image sitemap, but we currently don't have separate GEO sitemap / map files for our branches. I am wondering if such a thing exists and, if so, whether it is something we should be doing to help our branches rank locally on Google Maps etc. We have Google local listings for our branches and we already use schema.org markup for them. Any thoughts on this would be appreciated. Thanks, Peter
Local Website Optimization | PeteC12
-
Search Result Discrepancy: Keyword "Dresses" shows international sites in the search results of Google.co.in.
Hi All, What would be the reason that Google shows international websites on the first page of results while there are huge local players available? E.g. "Dresses" is a keyword for which almost all the results come from international websites, whereas the big local players in the same category are not shown. This is not the case for other keywords like "Women dresses", "Clothing", "Shoes", etc. Is it a bug, or are there particular reasons? Thanks,
Local Website Optimization | Myntra
-
Can a Find Us Link suffice as the NAP in footer of site?
I understand the need for NAP in the website for citation sourcing / local ranking purposes, etc. Is it possible to use a linking anchor text such as "Find Us" that can link to the Contact Page of the site that does list the street address? Or should it link to the google places listing? The client basically wants to "hide" the NAP, but keep the power of the local listing. Can this be done? Any suggestions? Or an example of website that does this successfully?
Local Website Optimization | cschwartzel
-
Feedback on different SEO Tools
Can anyone give me their opinion on these different tools?
1. Moz vs. Ahrefs (I'm a happy Moz subscriber, but I would like feedback)
2. Whitespark vs. BrightLocal
3. Optimizely vs. Visual Website Optimizer
4. Hootsuite vs. ??? (can't think of another one)
5. Weebly vs. WordPress (to build websites)
In each case, feel free to suggest any other tool you'd recommend instead. Lastly, please recommend any other tools you find helpful for SEO, local SEO, or social management. Thanks.
Local Website Optimization | mrodriguez1440
-
Launching Hundreds of Local Pages At Once or Tiered? If Tiered, In What Intervals Would You Recommend?
Greetings Mozzers, This is a long question, so please bear with me 🙂
We are an IT and management training company that offers over 180 courses on a wide array of topics. Our students can attend these courses in multiple ways, either in person or remotely via a technology called AnyWare. We've also opened AnyWare centers, where you can physically go to a location near you and log into a LIVE course that might be hosted in, say, New York, even if you're in, say, LA. You get all the in-class benefits and interaction with the students and the instructor as if you were in the classroom. Recently, we've opened 43 AnyWare centers, giving our website excellent localization opportunities for queries like "sharepoint training in new york" (or whatever city we are located in). Each location has a physical address, phone number, and an employee working there, so we meet the standards for a Google Places listing (which I've set up).
So, why all this background? We'd like to get as much visibility as possible for queries of the form "course topic area we offer" followed by "city we offer it in." We offer 22 course topic areas and, as I mentioned, have 43 locations across the US. Our IS team has created custom pages for each city and course topic area using a UI. I won't get into detailed specifics, but doing some simple math (22 topic areas multiplied by 43 locations), we get over 900 new pages that will eventually need to be crawled and added to our site. As a test, we launched the pages for DC and New York 3 months ago and have seen great increases in visibility (44 local pages are live right now). For example, here are the two pages for SharePoint training in DC and NY:
http://www2.learningtree.com/htfu/usdc01/washington/sharepoint-training
http://www2.learningtree.com/htfu/usny27/new-york/sharepoint-training
So, now that we've seen the desired results, my next question is: how do we launch the remaining hundreds of pages in a "white hat" manner? I'm a big fan of white hat techniques and of not annoying Google. Given the scale of the project, we did our best to make the content as unique as possible; yes, there are many similarities, but the courses and addresses do differ from location to location. After watching Matt Cutts's video here: http://searchengineland.com/google-adding-too-many-pages-too-quickly-may-flag-a-site-to-be-reviewed-manually-156058 about adding too many pages at once, I'd prefer to proceed cautiously, even if the example he uses in the video involves tens of thousands to hundreds of thousands of pages. We truly aim to deliver the right content to people searching in their area, so nothing black hat about it 🙂 But I still don't want to be reviewed manually, lol. So, at what interval should we launch the remaining pages without raising any red flags? For example, should we launch 2 cities a week? 4 cities a month? I'm assuming the slower the better, of course, but I have some antsy managers I'm accountable to, and even with this warning in mind, I need to proceed the right way. Thanks again, and sorry for the detailed message!
Local Website Optimization | CSawatzky
-
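The tiered-rollout arithmetic from the question above can be sketched in a few lines. The batch size (4 cities per week) and the city names are made-up examples; nothing here is a recommendation from Google:

```python
# Hypothetical rollout schedule: release the remaining ~41 city pages
# in small weekly batches instead of all at once.
def batches(items, size):
    """Yield successive fixed-size batches from a list."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

cities = [f"city-{n}" for n in range(1, 42)]  # 41 remaining cities
schedule = list(batches(cities, 4))           # e.g. 4 cities per week

print(len(schedule))  # 11 weekly batches: ten of 4 cities and one of 1
```

Each batch here stands for one launch interval; shrinking the batch size stretches the rollout, which is the trade-off the asker is weighing against antsy managers.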
How slow can a website be and still be OK for visitors and SEO?
Hello all, my site http://www.allspecialtybuildings.com is a barn construction site, and our visitors are usually local. I am worried about page speed. I have been using Google PageSpeed Insights and GTmetrix. Although I cannot figure out how to leverage browser caching, I have a 79/93 Google score and 98/87 on GTmetrix. Load times vary between 2.13 and 2.54 seconds. What is acceptable? I want to make sure I get Google love for decent page speed, and to me these times seem great; bad times are 7 seconds and higher. I have thought about a CDN, yet I have read horror stories too. I have ZERO idea of how to use a CDN, or whether I need one. I just want a fast site that is both user- and Google-friendly. So my question is: what counts as a slow website? Is under 3 seconds OK, or is that bad for SEO? Any advice is greatly appreciated.
Local Website Optimization | asbchris