PageRank from 5 to N/A
-
Hi there,
I do SEO for .. and it's a great site, full of valuable (and unique) information and resources. It was previously a PageRank 5, but recently (in the past week) it has gone to PR N/A across the entire website.
The website is still being indexed in Google and our rankings are still the same. Although I'm aware PageRank means very little these days, I'm a little concerned this may be the start of a decrease in rankings due to something being wrong on the website.
Have other people seen this?
Would love some input in case you see something that I'm not, or have an idea of what may be causing the issue.
Thank you for your time.
-
Absolutely, would love their insight too. It certainly has me stumped as to why it has happened, and I've only got a few potential causes, nothing definite. So would love to see as well!
-
Yes, potentially it could. It always used to be the case that it didn't matter who linked to you; you wouldn't be penalised for it. But that all changed with the Penguin update. Now who links to you, where they link from, and how they link are all very relevant.
I have seen these things sort of "fix themselves" in a few days, but it's a good idea to stay vigilant. If you ever need a hand, just drop me a line.
-
Hi Chris
That's a bit odd, to say the least. Normally a loss of PR that dramatic means that Google might be penalising the site, although a penalty usually takes it to 0 rather than N/A.
You're right in thinking that there are much better metrics out there, but I think this is certainly worth investigating. I'd advise you to keep a close eye on your traffic in analytics and on your rankings, as it may be penalty related. I've only briefly looked at your link profile, but there do seem to be one or two links that are a bit unnatural (such as this one). Like I say, though, I haven't checked fully, so they could just be the exception to the rule.
Has your hosting or domain expired or been renewed recently? I've seen PR fall because of that on occasion. Also, have a look in your webmaster tools for errors, specifically any crawl errors that may have occurred, such as 403s. Check for any spike there; if there is one, it may recur, so you'd obviously want to remedy the underlying problem. You might also want to check that your robots.txt hasn't had anything added to it by mistake.
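For reference, the sort of accidental rule to look out for in robots.txt is a blanket block like the one sketched here (a hypothetical example, not taken from your file):
User-agent: *
Disallow: /
A lone "Disallow: /" tells compliant crawlers to skip the whole site, whereas a healthy file normally only blocks specific areas, e.g. "Disallow: /admin/".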
Failing that, I can't really think what may have caused your toolbar PR to fall. My advice would be to watch your site very closely over the next week. Hopefully everything will be fine and the PR will return. The worst-case scenario, however, would be a penalty. Look for any sharp falls in traffic or rankings. If you have the time, you may want to go through your backlink profile and make a note of any poor-quality or suspicious links, just to give yourself a head start should you need to remove them.
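If it does come to cleaning up links and you end up using Google's disavow tool, the file it takes is just a plain-text list, one entry per line; a rough sketch with made-up domains:
# lines beginning with # are comments
# disavow every link from a whole domain
domain:spammy-directory-example.com
# or disavow a single linking page
http://low-quality-blog-example.com/paid-links.html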
Related Questions
-
What do you do with product pages that are no longer used? Delete/redirect to category/404 etc.?
We have a store with thousands of active items and thousands of sold items. Each product is unique, so there is only one of each. All products are pinned and pushed online ... and then they sell, and we are left with a product page for a sold item. All products are keyword researched and can often rank well for long-tail keywords. Would you:
1. Delete the page and let it 404 (we will get thousands)?
2. See if the page has decent PA, incoming links and traffic, and if so redirect it to a RELEVANT category page (again, there will be thousands)?
3. Reuse the page for another product - for example, a sold ruby ring gets replaced with a new ruby ring and we use the same page/URL for the new item?
Gemma
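For option 2, a per-URL 301 in Apache is a one-line rule; a hypothetical sketch (the paths are invented, and with thousands of sold items you would generate these rules rather than hand-write them), assuming an .htaccess file with mod_alias enabled:
# send a sold product page to its most relevant category
Redirect 301 /rings/vintage-ruby-ring-123 /rings/ruby-rings/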
Technical SEO | | acsilver0 -
Confused about repeated occurrences of URL/essayorg/topic/ showing up as 404 errors in our site logs
Working on a WordPress website, https://thedoctorwithin.com. Scanning the site's 404 errors, I'm seeing a lot of requests for URL/essayorg/topic, coming from Bingbot as well as other spiders (Google, Open Site Explorer). We get at least 200 of these irrelevant requests per week. It seems like each topic that follows /essayorg/ is unique, and some include typos: /dissitation/. I haven't yet done a verification to make sure the spiders are who they say they are. It almost seems like there are many links 'in the wild' intended for Essay.Org that are being directed towards the site I'm working on. I've considered redirecting any requests for URL/essayorg/ to our sitemap… figuring that might encourage further spidering of actual site content. Is redirection to our sitemap XML file a good idea, or might doing so have unintended consequences? I'm also interested in suggestions about why this might be occurring. Thank you.
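For what it's worth, one alternative to redirecting those requests anywhere is to tell crawlers the /essayorg/ URLs are permanently gone; a mod_rewrite sketch, assuming Apache with an .htaccess file (the pattern is illustrative only, and this isn't necessarily better than the sitemap idea):
# return 410 Gone for anything under /essayorg/
RewriteEngine On
RewriteRule ^essayorg/ - [G]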
Technical SEO | | linkjuiced0 -
Should I actively filter very similar questions (but not worded identically) and answers for my Q/A Website?
We're setting up a medical Q/A on our website. I anticipate a lot of questions being asked that we have already answered through other questions in the Q/A. For example, "How do I plan my pregnancy if I am diabetic?" and "I have been diabetic for the past 5 years and am planning to start a family. What should be my approach?" Should I:
1. Write two different answers for each question, even though they will eventually have the same meaning (some answers could end up looking almost identical)?
2. Redirect new questions to the old question by posting a link to the answer?
3. Merge the two questions, or delete one of the two?
What is the best solution for maximising SEO benefits and minimising the chances of a penalty?
Technical SEO | | Fmann0 -
Spam pages / content created due to hack. 404 cleanup.
A hosting company's server was hacked, and one of our customer's sites was injected with 7,000+ pages of fake, bogus, promotional content. The server was patched and the spammy content removed. Reviewing Google Webmaster Tools, we have all the hacked pages showing up as 404s and a severe drop in impressions, rank and traffic. GWT also says 'Some manual actions apply to specific pages, sections, or links'... What do you recommend for:
1. Cleaning up 404s to spammy pages? (I am not sure a redirect to the home page is the right thing to do - is it?)
2. Cleaning up links that were created off-site to the spam pages?
3. Getting rank back - what would you do in addition to the above?
Technical SEO | | GreenStone0 -
A/B testing entire website vs SEO issues
I'm familiar with A/B testing variations of a page, but I'd like to A/B test a new design of an e-commerce site. I'm wondering about the best way to test with SEO concerns in mind... this is what I have in mind right now, any suggestions?
1. Use parameters to make version B different from version A.
2. Redirect 50% of users with a 302 (or would JavaScript be a better way?).
3. Use noindex on the B pages.
4. Use rel=canonical on the B pages pointing to the A version.
5. In the end, 301 redirect all B pages to the A URLs.
PS: We can't use a subdomain, and I don't want to use the robots.txt file to protect the new design from competitors. I'd love any suggestions and tips about it - thanks folks 🙂
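For illustration, under that plan the head of a version-B page might carry something like the following (URLs are placeholders); this is a sketch of the plan as described, not a verdict on whether noindex and rel=canonical should be combined:
<!-- head of the B (test) version of a page -->
<meta name="robots" content="noindex, follow">
<link rel="canonical" href="https://www.example.com/category/product-page/">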
Technical SEO | | SeoMartin10 -
IIS 7.5 - Duplicate Content and Totally Wrong robots.txt
Well, here goes! My very first post to SEOmoz. I have two clients that are hosted by the same hosting company. Both sites have major duplicate content issues and appear to have no internal links. I have checked this both here with our awesome SEOmoz tools and with the IIS SEO Toolkit. After much waiting I have heard back from the hosting company, and they say that they have "implemented redirects in IIS 7.5 to avoid duplicate content" based on the following article: http://blog.whitesites.com/How-to-setup-301-Redirects-in-IIS-7-for-good-SEO__634569104292703828_blog.htm. In my mind this article covers things better: www.seomoz.org/blog/what-every-seo-should-know-about-iis. What do you guys think? Next issue: both clients (as well as other sites hosted by this company) have a robots.txt file that is not their own. It appears that they have taken one client's robots.txt file and used it as a template for the other client sites. I could be wrong, but I believe this is causing the internal links not to be indexed. There is also a sitemap, again not for each client, but rather for the client that the original robots.txt file was created for. Again, any input on this would be great. I have asked that the files just be deleted, but that has not occurred yet. Sorry for the messy post... I'm at the hospital waiting to pick up my bro and could be called to get him any minute. Thanks so much, Tiff
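For reference, if the host has used the IIS URL Rewrite module, redirect rules normally live in web.config; a minimal sketch of a canonical-host 301 (the hostnames are placeholders, and this is an assumption about their setup, not a copy of it):
<!-- inside <system.webServer> in web.config -->
<rewrite>
  <rules>
    <rule name="Redirect to www" stopProcessing="true">
      <match url="(.*)" />
      <conditions>
        <add input="{HTTP_HOST}" pattern="^example\.com$" />
      </conditions>
      <action type="Redirect" url="http://www.example.com/{R:1}" redirectType="Permanent" />
    </rule>
  </rules>
</rewrite>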
Technical SEO | | TiffenyPapuc0 -
How best to deal with www.home.com and www.home.com/index.html
Firstly, this is for an .asp site, and all my usual ways of fixing this (e.g. via htaccess) don't seem to work. I'm working on a site which has www.home.com and www.home.com/index.html, and both URLs resolve to the same page/content. If I simply drop a rel canonical into the page, will this solve my dupe content woes? The canonical tag would then appear on both www.home.com and www.home.com/index.html. If the above is OK, which version should I be going with? Thanks in advance folks,
James @ Creatomatic
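For illustration, the usual pattern is to treat the root URL as canonical and point index.html at it; a minimal sketch using the placeholder domain from the question:
<!-- in the <head> of both www.home.com/ and www.home.com/index.html -->
<link rel="canonical" href="http://www.home.com/">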
Technical SEO | | Creatomatic0 -
How to safely reduce the number of 301 redirects / should we be adding so many?
Hi All, We lost a lot of good rankings over the weekend with no obvious cause. Our top keyword went from p3 to p12, for example. Site speed is pretty bad (slower than 92% of sites!) but it has always been pretty bad. I'm on to the dev team to try and crunch this (beyond image optimisation), but I know that something I can affect is the number of 301 redirects we have in place. We have hundreds of 301s because we've been, perhaps incorrectly, adding one every time we find a new crawl error in GWT that isn't caused by a broken link on our site, or that comes from an external site where we can't track down the webmaster to fix the link. Is this bad practice, and should we just ignore 404s caused by external broken URLs? If we wanted to reduce these numbers, should we think about removing the ones that are only in place due to external broken URLs? Any other tips for safely reducing the number of 301s? Thanks, all! Chris
Technical SEO | | BaseKit0