Does having more pages work against your Google ranking when there is only one URL that other sites will link to?
-
Say I have a coupon site in a major city, and assume there are 20 main location regions (suburb cities) in that city.
Assume that all external links to my site will be to the home page only (www.site.com). Assume also that my website business has no physical location.
Which scenario is better?
1. One home page that serves up dynamic results based on the user's cookie location, but mentions all 20 locations in the content. Google indexes only 1 page, and all external links point to it.
2. One home page that redirects to the user's region (one of 20 pages). The site therefore has 20 pages--one for each region, each optimized for that region. Google indexes 20 pages and there will be internal links to the other 19 pages, BUT all external links still point only to the main home page.
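To make the two setups concrete, here is a minimal sketch of the routing difference. Everything here is hypothetical (region names, URL paths, and response strings are illustrative, not from the actual site); it only shows that scenario 1 keeps one indexable URL while scenario 2 produces one indexable URL per region:

```python
# Hypothetical sketch of the two scenarios from the question.
# Each handler returns an (HTTP status, body-or-redirect-location) pair.

REGIONS = ["downtown", "northside", "eastside"]  # ...would be 20 in practice

def handle_request_scenario1(path, cookie_region):
    """Scenario 1: one URL; the cookie only changes the rendered content.
    A crawler with no cookie sees one page mentioning every region."""
    if path == "/":
        if cookie_region in REGIONS:
            return 200, f"Coupons near {cookie_region} (plus links to all regions)"
        return 200, "Coupons for: " + ", ".join(REGIONS)
    return 404, "not found"

def handle_request_scenario2(path, cookie_region):
    """Scenario 2: the home page redirects known visitors to a region URL;
    each region URL is a separately indexable, region-optimized page."""
    if path == "/" and cookie_region in REGIONS:
        return 302, f"/{cookie_region}/"
    if path.strip("/") in REGIONS:
        return 200, f"Region page optimized for {path.strip('/')}"
    if path == "/":
        return 200, "Home page linking to every region page"
    return 404, "not found"
```

Note that in scenario 2 the crawler (which carries no location cookie) lands on the home page and discovers the region pages through the internal links, rather than being redirected.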
Thanks.
-
Thanks Marc. Sorry for the slow response--I came down with a bug last night.
Here is the basis for my thinking that link juice is about PageRank rather than the resulting search rank, and that PageRank may actually not be a big deal for overall search rank anymore--and so my concern about dilution is overblown. Regardless of the truth of the matter, I appreciate your advice about content and relevancy:
http://blog.hubspot.com/blog/tabid/6307/bid/5535/Why-Google-Page-Rank-is-Now-Irrelevant.aspx
Thanks again, Ted
-
Take a look at the screenshot--it's taken from this URL:
http://moz.com/search-ranking-factors
Your only misunderstanding is thinking that link juice has nothing to do with search rank... it is a ranking factor, so you should think about how you can use it more effectively. On the other hand, websites with only a few pages or with thin content will also have their very own ranking problems. BUT everybody (including Google, I guess) would prefer a smaller site that provides good content over a big site full of nonsense.
If you are able to ensure a certain quality and uniqueness for every single (sub)page of your site, then go ahead and use this scenario... if you can only create (partial) DC (duplicate content): hands off!
-
Hi Marc,
Yeah, I may not be explaining my understanding correctly, or I may not understand correctly. What I have read is that link juice is connected only to PageRank, not to search rank. So if I have no backlinks to my subpages, then I don't lose any home-page juice. So why even have subpages if they get no backlinks? Because of search rank: queries can still lead people to my subpages. In fact, I've read that PageRank is hardly even a factor in search rank anymore, which implies that no one should be concerned about link juice dilution at all! I'd like to believe it, because I potentially will have plenty of pages with unique content and would like to build backlinks to at least some of them besides the home page.
Does it sound like I've misunderstood this issue?
-
Maybe I didn't understand you correctly, but to avoid mistakes, please have a look at the attached graphic (link juice)... it would be like I've explained... I mean, it's not really bad to add several subpages and pass some of your total link juice towards them, but there is no real advantage in the first place... let's say you want to do a really, really good job; then you have to create absolutely unique subpages (20 times in your case) for more or less the same topic... terrific if you can do so... then use the subpage model... It's not an indisputable fact that your site won't rank if it's just one page... chances might rise if you have additional subpages, but only if you are able to fill each page with unique content. I think there is a potential risk that you just create DC or partial DC and pass some of your link juice towards those imperfect subpages... so if you think you are able to create 20 unique subpages, then choose this scenario... if it's more or less a copy of the main site, then this wouldn't make any sense.
-
Hi Marc,
Thank you... I've heard this, but here is why I find this issue so perplexing: I have read that link juice is ONLY associated with inbound links, so if in both scenarios above all inbound links are to the home page only, then there is no decrease in link juice if I have 20 internal pages, YET I get the benefit of having 20 more pages indexed that might show up in a user query. I guess I'm trying to confirm that my understanding is correct before I have the programmer (me) set up 20 internal pages... I don't want to lose any more link juice from the home page than I have to.
Yesterday the SEO guy I'm thinking of hiring wrote this:
"If you only have the home page indexed, you will never rank. If you only have incoming links to the home page, you will never rank." I don't really understand this... it is in the context of a coupon site that offers coupons for all regions in all cities, and of course they will be categorized into some 30 categories and 200 subcategories...
Any further input? I really do appreciate it.
-
Link juice (link power) is a term that comes up in the scenarios you describe.
You have to imagine that every single external link gives your site this link power/link juice. According to that, keeping it within one page would be the better decision. If your main page has several additional local pages behind/under it, the link juice will be passed on to those pages.
You don't have to be a genius at mathematics to see that this would split the link juice across 20 pages (in the scenario you describe)...
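A simplified version of the arithmetic behind that claim, using the classic PageRank model in which a page divides the equity it passes evenly across its outgoing links (the 0.85 damping factor is the commonly cited textbook default, not a figure from this thread):

```python
# Simplified link-equity split per the classic PageRank model:
# each page passes (damping * its_score / number_of_outlinks) to each target.
DAMPING = 0.85  # commonly cited default damping factor (assumption)

def equity_passed_per_link(page_score, outlink_count):
    """Equity each linked-to page receives from a page with `page_score`."""
    return DAMPING * page_score / outlink_count

# Home page holding all external equity, linking to 20 region subpages:
home_score = 1.0
per_subpage = equity_passed_per_link(home_score, 20)
print(round(per_subpage, 4))  # -> 0.0425, i.e. 1/20th of the damped equity each
```

The real calculation is iterative and Google's actual weighting is not public, so treat this only as an illustration of why adding 20 internally linked subpages spreads, rather than destroys, the home page's equity.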