Why would the PageRank for all of our websites show the same?
-
The last time I checked (early this year), the PageRank on the sites I manage varied, with the highest showing as 6. That made sense, as the PR6 site has loads of links and has been around for a long time, whereas the other sites hadn't been around as long and have far fewer links.
Now all of our websites are showing the same PageRank of 6, even one that launched recently and another that has barely any links or traffic pointing to it. I didn't check the PR of that one last time (I'd be surprised if it was even 2), but the sites now showing as 6 ranged from PR3 to PR6 back then.
We changed servers in February, so could this be something to do with all of the sites being hosted on the same server? It doesn't seem right, but it's the only thing I can think of.
At the moment, the Domain Authority for these six websites ranges from 27 to 62.
-
Yes. Every tool I know of that provides PR pulls from the toolbar PR data.
-
I don't use the Google Toolbar; I've used the PR checker on here and tried a couple of others elsewhere, including prchecker.com I think. Do they all use the same data?
-
The server shouldn't be an issue at all. Toolbar PR is very unreliable, and Google is soon going to stop supporting it, so I wouldn't worry about it at all. Domain Authority and Page Authority from the SEOmoz tools are much more reliable.
Related Questions
-
SEO QA automation of large websites
Can you share your experiences in managing SEO QA automation of large websites with millions of pages?
What are the things you regularly test for, besides the most obvious - hreflangs/canonicals, robots.txt, sitemap, non-200 status codes, redirect rules?
Do you use in-house developed tools or external tools? If external, which ones?
How do you run your QA automation scripts - on an external server or with online tools? On every release, or hourly/daily/monthly?
Intermediate & Advanced SEO | terentyev
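As an illustration of the kind of check such a QA script might automate, here is a minimal sketch, assuming Python with the requests library and a hypothetical hand-maintained URL list; in practice it would run against a full crawl export on every release or on a nightly schedule.

```python
# Minimal SEO QA sketch: flag non-200 status codes, missing/mismatched
# canonical tags, and robots.txt blocks for a list of URLs.
# The site and URLs below are placeholders, not taken from a real project.
import re
import urllib.robotparser

import requests

SITE = "https://www.example.com"
URLS = [f"{SITE}/", f"{SITE}/category/widgets", f"{SITE}/old-page"]

robots = urllib.robotparser.RobotFileParser()
robots.set_url(f"{SITE}/robots.txt")
robots.read()

# Naive canonical extractor; assumes rel appears before href in the tag.
CANONICAL = re.compile(
    r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)["\']', re.I
)

for url in URLS:
    resp = requests.get(url, allow_redirects=False, timeout=10)

    if resp.status_code != 200:
        print(f"{url}: non-200 status {resp.status_code}")
        continue

    match = CANONICAL.search(resp.text)
    if not match:
        print(f"{url}: no canonical tag found")
    elif match.group(1) != url:
        print(f"{url}: canonical points elsewhere ({match.group(1)})")

    if not robots.can_fetch("Googlebot", url):
        print(f"{url}: blocked by robots.txt")
```
-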
What is the best way to structure website URLs?
Hi, can anyone help me understand whether having a category folder in the URL matters or not? How does Google treat a URL? For example, I have the URL www.protoexpress.com/pcb/certification, but I'm not sure whether Google will treat it as a whole or in separate parts. If in separate parts, is it safe to use pcb/pcb-certification, or will it be considered keyword stuffing? Thank you in anticipation.
Intermediate & Advanced SEO | SierraPCB
-
Penalty for duplicate content on the same website?
Is it possible to get a penalty for duplicate content on the same website? I have an old custom-built site with a large number of filters that are pre-generated for speed. Basically the only difference between them is the meta title and H1 tag, with a few text differences here and there. Obviously I could nofollow all the filter links, but it would take an enormous amount of work. The site is performing well in search. I'm trying to decide whether there is a risk of a penalty; if not, I'm loath to do anything in case it causes other issues.
Intermediate & Advanced SEO | seoman10
-
Ranking problems with international website
Hey there, we have some ranking issues with our international website. It would be great if any of you could share your thoughts on that. The website uses subfolders for country and language (i.e. .com/uk/en for the website of the UK branch in English). As the company has branches all over the world and also offers its content in many languages, the URL structure is quite complex. A recent problem we have seen is that in certain markets the website is not ranking with the correct country. Especially in the UK and the US, Google prefers the country subfolder for Ghana (.com/gh/en) over the .com/us/en and .com/uk/en versions. We have hreflang set up and should also have some local backlinks pointing to the correct subfolders, as we switched from many ccTLDs to one gTLD. What confuses me is that when I check for incoming links (Links to your site) in GWT, the subfolder (.com/gh/en) is listed quite high in the column (Your most linked content). However, the listed linking domains are not linking to this folder at all as far as I am aware. If I check them with a redirect checker, they all link to different subfolders. So I have no idea why Google gives such high authority to this subfolder over the specific country subfolders. The content is pretty much identical at this stage. Have any of you experienced similar behaviour and could point me in a promising direction? Thanks a lot. Regards, Jochen
Intermediate & Advanced SEO | Online-Marketing-Guy
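Purely as an illustration of what a reciprocal hreflang setup for country/language subfolders looks like, here is a sketch in Python; the domain, locales, and page path are assumptions, not taken from the site in question. The important property is that every locale version of a page carries the same complete set of alternates, including a reference to itself and an x-default.

```python
# Sketch: build the hreflang <link> elements every locale version of a
# page should carry when countries/languages live in subfolders.
# Domain, locales, and page path are placeholders.
BASE = "https://www.example.com"
LOCALES = {
    "en-gb": "/uk/en",   # UK branch, English
    "en-us": "/us/en",   # US branch, English
    "en-gh": "/gh/en",   # Ghana branch, English
}
PAGE = "/services/"

def hreflang_tags(x_default="en-gb"):
    """Return the alternate links that belong on every version of PAGE."""
    tags = [
        f'<link rel="alternate" hreflang="{lang}" href="{BASE}{folder}{PAGE}" />'
        for lang, folder in LOCALES.items()
    ]
    tags.append(
        f'<link rel="alternate" hreflang="x-default" '
        f'href="{BASE}{LOCALES[x_default]}{PAGE}" />'
    )
    return tags

for tag in hreflang_tags():
    print(tag)
```
-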
Best option for Affiliate links on your website?
Hello! I have a website which is completely affiliate based. What is the best option for the links on-page? Examples would be: affiliate.website.com/12901730?2=3532523=user12342901730?2=3532523=user?Whittie www.website.com/affiliate=user?Whittie=load-of-tracking=date=blah=blaH?blah and so on... which look ugly as sin when you hover over the anchor text. Ideally I would like a 301 redirect to mysite.com/goto/affiliatename, which would then have a rel nofollow. This way I could also track the exit pages via Analytics too, I guess, which I've not currently got set up and I'm desperate for it to be done. Does this method affect anything on search engines though? I've seen mixed reports, but going back to 2011, which is too long ago in the SEO world. Another option is to use the likes of Bit.ly, or use another domain and host 301s on there? The new bit.ly integration from Moz might come in handy here. Please advise on the subject; I really appreciate any help on this, as I'm at a brick wall. Thanks
Intermediate & Advanced SEO | Whittie
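A minimal sketch of the /goto/ approach described in the question, assuming a small Python/Flask handler; the slugs and destination URLs are placeholders. The on-page anchors would point at the /goto/ URLs with rel="nofollow", and the /goto/ path can be disallowed in robots.txt.

```python
# Sketch of a /goto/<name> affiliate redirect endpoint using Flask.
# Slugs and destination URLs are placeholders.
from flask import Flask, abort, redirect

app = Flask(__name__)

AFFILIATES = {
    "acme-hosting": "https://affiliate.example.net/12345?ref=user",
    "widget-store": "https://www.example.org/aff?id=67890&src=user",
}

@app.route("/goto/<name>")
def goto(name):
    target = AFFILIATES.get(name)
    if target is None:
        abort(404)
    # 301 as described in the question; this handler is also a natural
    # place to log exits if Analytics event tracking isn't set up yet.
    return redirect(target, code=301)

if __name__ == "__main__":
    app.run()
```
-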
Why are these results being shown as blocked by robots.txt?
If you perform this search, you'll see all m. results are blocked by robots.txt: http://goo.gl/PRrlI, but when I reviewed the robots.txt file: http://goo.gl/Hly28, I didn't see anything that blocks crawlers from these pages. Any ideas why these are showing as blocked?
Intermediate & Advanced SEO | nicole.healthline
-
301 Redirect Dilemma - Website redesign
Hi Guys, We are redesigning a client's ecommerce site. As part of the process, we're changing the URL structure to make it more friendly. I have put together a provisional 301 redirect plan, but I'm not sure just how far I need to go with it. So far I have extracted all the pages from the existing site that Google Webmaster Tools says have links pointing at them - this totals 93 pages. I have matched each page like for like to the new website structure. My next step was to pull the landing pages report from Google Analytics; I extracted the pages that received entrances over the last 6 weeks. This totals 553; less the redirects I have already done, and after cleaning up some Google Translate pages, I have circa 410 pages left. Many of these pages have more than 1 URL pointing to them. I'm debating how important it is that all of these remaining 410 pages have individual redirects set up for them one by one. I have to rule out regex because there is no pattern that makes sense, given that I have already set up redirects for the first 93 pages that have external links. My question therefore is: how important are 301 redirects on pages that have no external links and received fewer than 10 entrances over the previous 6 weeks? Do I need to 301 every single product on the old site to its corresponding page on the new site? Also, I'm not sure how to treat pages that have multiple URLs on the old site; the existing URL structure is such a mess that in some instances I have 5 URLs for one product page. I could feasibly create 5 separate redirects, but is this necessary? Also, what about speed considerations? The server is going to have to load these redirects, and it may slow the site down. I'm sitting at 100-odd so far. Any answers are most appreciated. Thanks, Derek.
Intermediate & Advanced SEO | pulseo
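One way to turn that kind of like-for-like mapping into flat redirect rules is sketched below, assuming Python and a hypothetical CSV with old_url and new_url columns; several old URLs can simply map to the same new page, one rule each. A few hundred flat rules like this are not normally a measurable performance concern.

```python
# Sketch: emit one Apache "Redirect 301" rule per old URL from a CSV
# mapping of old paths to new paths. The file name, column names, and
# the new domain are assumptions for illustration.
import csv

NEW_HOST = "https://www.example.com"   # placeholder for the new site

def build_rules(csv_path):
    """Read old_url,new_url pairs and return 'Redirect 301' lines."""
    rules, seen = [], set()
    with open(csv_path, newline="") as handle:
        for row in csv.DictReader(handle):
            old, new = row["old_url"].strip(), row["new_url"].strip()
            if old in seen:
                continue               # one rule per old URL
            seen.add(old)
            rules.append(f"Redirect 301 {old} {NEW_HOST}{new}")
    return rules

for rule in build_rules("redirect_plan.csv"):
    print(rule)
```
-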
Major Website Migration Recovery Ideas?
Since starting our business back in 2006 we've gone through a lot of branding and, as a result, URL and architectural migrations. This has always been driven by usability, brand awareness, and technical efficiency, while knowing that there would be SEO hits to take from it... but ultimately hoping to have a much stronger foundation from an SEO perspective in the long run. Having just gone through our most recent (and hopefully final) migration, we are now about 15% down on traffic (although more like 35% - 40% in real terms when seasonality is stripped out). Below is a timeline of our structural history:

2007 - 2009 = We operated as a network of individual websites, which started as 1, www.marbellainfo.com, but grew to 40, with the likes of www.thealgarveinfo.com, www.mymallorcainfo.com, www.mytenerifeinfo.com, www.mymaltainfo.com etc.

2009 - 2010 = We decided to consolidate everything onto 1 single domain, using a sub-domain structure. We used the domain www.mydestinationinfo.com and the subdomains http://marbella.mydestinationinfo.com, http://algarve.mydestinationinfo.com etc. All old pages were 301 redirected to like-for-like pages on the new subdomains. We took a 70% drop in traffic and SERPs disappeared for over 6 months. After 9 months we had recovered back to traffic levels and similar rankings to what we had pre-migration. Using this new URL structure, we expanded to 100 destinations and therefore 100 sub-domains.

2011 = In April 2011, having not learnt our lesson from before :(, we underwent another migration. We had secured the domain name www.mydestination.com and had developed a whole new logo and branding. With 100 sub-domains, we migrated to the new URL and used a sub-directory structure. So this time www.myalgarveinfo.com had gone to http://algarve.mydestinationinfo.com and was now www.mydestination.com/algarve. No content or designs were changed, and again we 301 redirected pages to like-for-like pages; we even made efforts to ask those linking to us to update their links to use our new URLs.

The problem: The situation we find ourselves in now is nowhere near as bad as what happened with our migration in 2009/2010; however, we are still down on traffic and SERPs, and it's now been 3 months since the migration. One thing we had identified was that our redirects were going through a chain of redirects, rather than pointing straight to the final URLs (something which has just been rectified). I fear that our constant changing of URLs has meant we have lost out in terms of the passing over of link juice from all the old URLs, and lost trust with Google for changing so much. Throughout this period we have grown the content on our site by almost 2x - 3x each year and now have around 100,000 quality pages of unique content (which is produced by locals on the ground in each destination).

I'm hoping that someone in the SEOmoz Community might have some ideas on things we may have slipped up on, or ways in which we can try and recover a little faster and actually get some growth, as opposed to working hard and waiting a while just for another recovery. Thanks, Neil
Intermediate & Advanced SEO | Neil-MyDestination
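Since the redirects were found to be going through chains rather than pointing straight at the final URLs, here is a minimal sketch of how such chains can be spotted, assuming Python with the requests library and placeholder URLs; anything reported with more than one hop is a candidate for pointing the original redirect straight at the final destination.

```python
# Sketch: report how many hops each legacy URL takes before it settles,
# so chained redirects can be collapsed into a single 301.
# The URLs below are placeholders, not the site's real ones.
import requests

LEGACY_URLS = [
    "http://old.example.com/page-one",
    "http://sub.example.com/page-two",
]

for url in LEGACY_URLS:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    hops = [r.url for r in resp.history]   # each intermediate redirect
    if len(hops) > 1:
        print(f"{url} chains through {len(hops)} hops before {resp.url}:")
        for hop in hops:
            print(f"  -> {hop}")
    elif hops:
        print(f"{url} -> {resp.url} (single redirect)")
    else:
        print(f"{url} did not redirect")
```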