Does having more pages work against Google SEO ranking when there is only one URL that other sites will link to?
-
Say I have a coupon site in a major city, and assume there are 20 main regions (suburb cities) in that city.
Assume that all external links to my site will point only to the home page (www.site.com). Assume also that my website business has no physical location.
Which scenario is better?
1. One home page that serves dynamic results based on the user's location cookie but mentions all 20 locations in its content. Google indexes only one page, and all external links point to it.
2. One home page that redirects to the user's region (one of 20 pages). There will therefore be 20 pages, one for each region, each optimized for that region. Google indexes 20 pages and there will be internal links to the other 19 pages, BUT all external links still point only to the main home page.
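The two scenarios can be sketched as request-handling logic. This is a minimal, hypothetical sketch: the region slugs, the `region` cookie name, and the URL paths are illustrative assumptions, not details from the original post:

```python
# Hypothetical sketch of the two scenarios; region slugs are examples.
REGIONS = ["downtown", "northside", "lakeview"]  # ... up to 20 region slugs
DEFAULT_REGION = "downtown"

def scenario_one(region_cookie):
    """Scenario 1: a single URL whose content varies by cookie but
    mentions all regions. Google sees one indexable page at '/'."""
    region = region_cookie if region_cookie in REGIONS else DEFAULT_REGION
    return {"url": "/", "featured_region": region, "mentions": REGIONS}

def scenario_two(region_cookie):
    """Scenario 2: the home page redirects to one of 20 region pages,
    each a separate indexable URL optimized for that region."""
    region = region_cookie if region_cookie in REGIONS else DEFAULT_REGION
    return {"status": 302, "location": f"/{region}/"}
```

One design note on scenario 2: a 302 (temporary) redirect signals that the home page remains the canonical entry point, whereas a 301 (permanent) redirect would instead pass the home page's signals on to the region page.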
Thanks.
-
Thanks Marc. Sorry for the slow response; I came down with a bug last night.
Here is the basis for my thinking that link juice relates to PageRank and not so much to the resulting search rank, and that PageRank may no longer be a big factor in overall search rank, which would make my concern about dilution overblown. Regardless of the truth of the matter, I appreciate your advice about content and relevancy:
http://blog.hubspot.com/blog/tabid/6307/bid/5535/Why-Google-Page-Rank-is-Now-Irrelevant.aspx
Thanks again, Ted
-
Take a look at the screenshot; it's taken from this URL:
http://moz.com/search-ranking-factors
So your only misunderstanding is the idea that link juice has nothing to do with search rank... it is a ranking factor, so you should think about how you can use it more effectively. On the other hand, websites with only a few pages, or with little content, will have their own ranking problems. BUT everybody (including Google, I guess) would prefer a smaller site that provides good content over a big site full of nonsense.
If you can ensure a certain quality and uniqueness for every single (sub)page of your site, then go ahead and use this scenario... if all you can create is (partial) duplicate content (DC): hands off!
-
Hi Marc,
Yeah, I may not be explaining my understanding correctly, or I may not understand correctly. What I have read is that the issue of link juice is connected only to PageRank, not to search rank. So if I have no backlinks to my subpages, I don't lose any home page juice. Why even have subpages if they get no backlinks? Because of search rank: queries can still lead people to my subpages. In fact, I've read that PageRank is hardly even a factor in search rank anymore, which implies that no one should be concerned about link juice dilution at all! I'd like to believe it, because I will potentially have plenty of pages with unique content and would like to build backlinks to at least some of them besides the home page.
Does it sound like I've misunderstood this issue?
-
Maybe I didn't understand you correctly, but to avoid mistakes, please have a look at the attached graphic (link juice)... it works as I've explained. I mean, it's not really bad to add several subpages and pass some of your total link juice to them, but there is no real advantage in the first place. Let's say you want to do a really, really good job: then you have to create absolutely unique subpages (20 in your case) for more or less the same topic. Terrific if you can do so; then use the subpage model. It's not an indisputable fact that your site won't rank if it's just one page. Your chances might rise if you have additional subpages, but only if you can fill each page with unique content. I think there is a potential risk that you just create DC or partial DC and pass some of your link juice to those imperfect subpages. So if you think you can create 20 unique subpages, then choose this scenario... if it's more or less a copy of the main site, then this wouldn't make any sense.
-
Hi Marc,
Thank you. I've heard this, but here is why I find this issue so perplexing: first, I have read that link juice is associated ONLY with inbound links. So if in both scenarios above all inbound links go to the home page only, then there is no decrease in link juice if I have 20 internal pages, YET I get the benefit of having 20 more pages indexed that might show up in a user query. I guess I'm trying to confirm that my understanding is correct before I have the programmer (me) set up 20 internal pages... I don't want to lose any more link juice from the home page than I have to.
Yesterday the SEO guy I'm thinking of hiring wrote this:
"If you only have the home page indexed, you will never rank. If you only have incoming links to the home page, you will never rank." I don't really understand this... it is in the context of a coupon site that offers coupons for all regions in all cities, and of course they will be categorized into some 30 categories and 200 subcategories...
Any further input? I really do appreciate it.
-
Link juice (or link power) is a term that comes up in the scenarios you describe.
You have to imagine that every single external link gives your site this link power/link juice. From that perspective, keeping it within one page would be the better decision. If your main page has several additional local pages behind/under it, the link juice will be passed on to those pages.
You don't have to be a genius in mathematics to see that this would divide the link juice by 20 (in the scenario you describe)...
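That division can be illustrated with the classic simplified PageRank formula, where a page passes its score, scaled by a damping factor, split evenly across its outgoing links. This is a toy model: the 0.85 damping factor and the flat even split come from the original PageRank description and are not how Google weighs links today:

```python
def juice_passed_per_link(page_score, outbound_links, damping=0.85):
    """Simplified PageRank: each linked page receives an equal share
    of the linking page's score, scaled by the damping factor."""
    return damping * page_score / outbound_links

# A home page with score 1.0 linking to 20 region subpages
# passes 0.85 / 20 = 0.0425 to each subpage.
share = juice_passed_per_link(1.0, 20)
```

With 20 internal links, each subpage receives only a twentieth of the dampened home page score, which is the dilution Ted is worried about. Note, though, that in this model linking out does not subtract that amount from the home page's own score.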