PageRank 2 for home page, 3 for service pages
-
Hey guys,
I have noticed with one of our new sites, the home page is showing page rank two, whereas 2 of the internal service pages are showing as 3. I have checked with both open site explorer and yahoo back links and there are by far more links to the home page. All quality and relevant directory submissions and blog comments.
The site is only 4 months old, I wonder if anyone can shed any light on the fact 2 of the lesser linked pages are showing higher PR?
Thanks
-
Cool - have you figured out what the problem might be yet?
-
Hi Alex, I appreciate that answer loads. I will check all of the points you mentioned; however, I am travelling from the UK to Thailand over the next day or so, so I will take a look once I land and sleep! I appreciate both your and Nsauers' help loads on this.
-
Does your homepage show up in Google's index under different URLs, e.g. with and without www, or with /index.html on the end?
That would split the authority across the separate URLs. You should 301 redirect any variations to one standard URL, and all of your links should point to that exact same homepage URL if possible.
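A quick way to see whether variants collapse to one URL is to normalise them yourself. Here is a minimal Python sketch, with example.com standing in as a placeholder domain; the real fix is a 301 redirect at the server, not application code:

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_homepage(url, preferred_host="www.example.com"):
    """Collapse the common homepage URL variants onto one canonical form."""
    scheme, host, path, _query, _frag = urlsplit(url)
    host = host.lower()
    bare = host[4:] if host.startswith("www.") else host
    pref_bare = preferred_host[4:] if preferred_host.startswith("www.") else preferred_host
    if bare == pref_bare:
        host = preferred_host       # force the www/non-www choice
    if path in ("", "/", "/index.html", "/index.htm", "/index.php"):
        path = "/"                  # all of these are the homepage
    return urlunsplit((scheme, host, path, "", ""))
```

With this, http://example.com, http://www.example.com/ and http://www.example.com/index.html all normalise to http://www.example.com/; if Google indexes more than one of them, the link equity is being split.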
Is your homepage linked to on every page on your website? How about those service pages? Your internal link structure goes a long way towards showing which pages are the most important.
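To make that concrete: a sitewide link back to the homepage shows up clearly if you just count internal inbound links. A small sketch, using a hypothetical site structure rather than the poster's actual pages:

```python
from collections import Counter

# Hypothetical internal link graph: page -> pages it links to.
site = {
    "/":           ["/services/a", "/services/b", "/blog"],
    "/services/a": ["/"],
    "/services/b": ["/"],
    "/blog":       ["/", "/services/a"],
}

# Count how many internal links point at each page.
inbound = Counter(target for links in site.values() for target in links)
# Every page links back to "/", so the homepage has the most inbound links.
```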
PageRank updates infrequently. There's a possibility the value of your internal pages has been updated, but the homepage hasn't. Have you checked the Page Authority in SEOmoz PRO?
You say you have links from "relevant directory submissions and blog comments". As it's a new site, you might not have a varied link profile, so the link building could look unnatural. A spike of links from directories, for example, doesn't look natural, so Google might not give those links as much weight as it would if they had been built up over a longer period.
Do you have any/more outbound links on your homepage? Are you selling links?
It could just be down to the fact that the site is new. The authority of the homepage should naturally build over time if you continue to concentrate on building a relevant link profile from varied sources.
-
Yeah sure, that's a good point and actually makes total sense.
When I look at the site, one of the service pages only has links from the body text of the blog and the home page, yet it is PR 3. Do you think it makes sense to build more internal body-text links from the blog and service pages to the home page, to outweigh the blog comments?
-
Yeah, I have found and tested that Google has essentially devalued links in comments, footers, and sidebars. To what extent, I don't know. I wouldn't cease any natural blog commenting, as it is just that: natural.
My comment was merely regarding the value (weighting) of different types of links. A blog comment link is worth much less than a link in the body or content section of the page, and because of this your inside pages could end up being considered more authoritative.
It's not that the homepage is being penalized or looked upon negatively, it just doesn't have the same inbound link juice.
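A toy model of that weighting makes the point. The numbers below are invented for illustration (Google publishes no such values): a couple of in-content links can out-score a pile of comment links.

```python
# Illustrative placement weights; these are made-up numbers, not real ones.
LINK_WEIGHTS = {"body": 1.0, "sidebar": 0.3, "footer": 0.2, "comment": 0.1}

def page_link_score(placements):
    """placements: where each inbound link sits on its source page."""
    return sum(LINK_WEIGHTS[p] for p in placements)

homepage_links = ["comment"] * 10      # many low-value comment links
service_links = ["body", "body"]       # two in-content links
# The service page out-scores the homepage despite having far fewer links.
```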
-
You know, that is weird. Although it is a method spammers use, if the site is relevant to your own, it makes no sense for Google to devalue the link.
The comments on the blogs are mostly no follow, but they are comments made by an employee who is networking in the industry and is not concerned if the actions affect the SEO of the site. He is simply commenting on relevant blogs that are of interest.
Do you think I should ask them to stop? I really didn't think this kind of nofollow comment link counted negatively towards PR. As long as the site is relevant, it is natural behaviour.
-
Hi! It's highly possible that the homepage is not getting full link juice value, since you mention that it receives blog comment links. First, those are likely nofollow, and second, they have been devalued by Google since they are a known method of spammers.
If the other pages receive better quality links they will be considered more authoritative than the homepage. Just my thoughts!