www.oru.edu & oru.edu
-
Everywhere I check, the PageRank for both of those domains is either 0 or N/A. However, on this site oru.edu actually shows a PageRank, and it's the only source I can find that does. The subdomain www.oru.edu still shows a zero on this site. A historical checker shows oru.edu with an N/A ranking just a few days ago and back in June.
-
When I try that tool it completely malfunctions for me... it doesn't do anything.
I think it's a moot point though.
If you previously had PageRank and have now dropped to either 0 or N/A, there's a good chance you've got some kind of penalty.
Do you have any indication of whether traffic has dropped? If you don't have analytics, you may be able to get information from your hosting company and from your server logs. PageRank (PR) really doesn't mean much on its own, but a loss of PageRank plus a loss of traffic makes some type of penalty a strong possibility.
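If the only data you have is raw server logs, even a rough daily count of visits referred by Google can show whether search traffic fell off a cliff around a particular date. Here's a minimal sketch, assuming Python, an Apache/Nginx combined log format, and a hypothetical access.log file name; the regex and the referrer test would need adjusting for your actual log format:
```python
import re
from collections import Counter

# Combined log format, roughly:
# 1.2.3.4 - - [10/Oct/2012:13:55:36 -0700] "GET /page HTTP/1.1" 200 2326 "http://www.google.com/search?q=..." "Mozilla/5.0"
LOG_LINE = re.compile(r'\[(\d{2}/\w{3}/\d{4}):[^\]]*\] "[^"]*" \d{3} \S+ "([^"]*)"')

def google_referred_hits_per_day(log_path):
    """Count hits per day whose referrer looks like it came from Google."""
    counts = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as f:
        for line in f:
            match = LOG_LINE.search(line)
            if not match:
                continue
            day, referrer = match.groups()
            if "google." in referrer:  # rough filter for Google-referred visits
                counts[day] += 1
    return counts

if __name__ == "__main__":
    # Note: sorting is lexicographic on the date string; fine for eyeballing a single month.
    for day, hits in sorted(google_referred_hits_per_day("access.log").items()):
        print(day, hits)
```
A sharp, sustained drop in those daily counts alongside the PageRank loss is what would point toward a penalty rather than a reporting glitch.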
The question is whether it is a manual penalty or an algorithmic one like Penguin or Panda.
I really would advise setting up Webmaster Tools today. You can either wait and see if a warning pops up, or, if you feel that traffic has dropped, you could file a reconsideration request. Google will respond within 3-14 days and let you know whether you actually do have a penalty. If so, the nature of the message should help us pin down the problem. If not, then we go looking at traffic data to determine whether this is Penguin or Panda.
OR... it's possible something else is going on, such as a robots.txt problem. But really, without some type of traffic information and without knowing whether there is a warning in Webmaster Tools, it's going to be hard to diagnose the problem.
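As a quick way to rule the robots.txt possibility in or out, here's a minimal sketch using Python's standard urllib.robotparser. It only answers whether Googlebot is allowed to fetch a URL; it says nothing about penalties:
```python
from urllib.robotparser import RobotFileParser

def googlebot_allowed(site, path="/"):
    """Return True if the site's robots.txt allows Googlebot to fetch the given path."""
    parser = RobotFileParser()
    parser.set_url(site.rstrip("/") + "/robots.txt")
    parser.read()  # fetches and parses the live robots.txt
    return parser.can_fetch("Googlebot", site.rstrip("/") + path)

if __name__ == "__main__":
    for site in ("http://oru.edu", "http://www.oru.edu"):
        print(site, "->", "allowed" if googlebot_allowed(site) else "blocked for Googlebot")
```
If either host comes back blocked, that's a crawling problem to fix before worrying about Penguin or Panda.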
-
Unfortunately I don't have access to that.
This tool is what's throwing me off for http://oru.edu: http://www.seomoz.org/toolbox/pagerank. Even though I'm sure it's a 0 or N/A, I can't figure out why this one tool comes up with a different result than everywhere else.
-
I see you've fixed your div issues - it looks much better now!
oru.edu redirects to www.oru.edu (as it should) for me, and I see both as PageRank N/A.
I'm not sure whether it was established in your previous message, but do you have Webmaster Tools set up, and if so, do you have any warnings in your messages there?
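For what it's worth, the redirect itself is easy to double-check with a short script. A minimal sketch, assuming Python with the third-party requests library installed; it simply prints each hop in the redirect chain and the final status code:
```python
import requests

def show_redirect_chain(url):
    """Print every hop in the redirect chain, then the final URL and status code."""
    response = requests.get(url, allow_redirects=True, timeout=10)
    for hop in response.history:
        print(hop.status_code, hop.url, "->", hop.headers.get("Location"))
    print(response.status_code, response.url)

if __name__ == "__main__":
    show_redirect_chain("http://oru.edu/")
```
A healthy setup here would show a single 301 from http://oru.edu/ to http://www.oru.edu/ followed by a 200.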
Related Questions
-
How can I get some of my keywords ranking on the second and third pages onto page one, 100% safely?
Hi, I want to know how I can get some of my keywords that are on the second and third pages on Google onto page one, 100% safely, so that it passes Penguin, Panda, etc., as quickly as possible. Kind regards
White Hat / Black Hat SEO | rodica70
-
Google messages & penalties
I just read the following comment in a response to someone else's question. The responder is an SEOmoz authority whose opinion I respect and have learned from (not sure if it's cool to mention names in a question), and it spurred my curiosity: "...Generally you will receive a warning from Google before your site is penalized, unless you are talking about just specific keywords." This is something I have been wondering about in relation to my own sudden ranking drop for two specific keywords, as I did not receive any warnings or notices. I have been proceeding as if I had overused these keywords on my home page, due to an initial smaller drop, but identifying the cause of the huge drop still seems useful for a number of reasons. Can anyone explain this further?
White Hat / Black Hat SEO | gfiedel0
-
Rollover design & SEO
After reading this article, http://www.seomoz.org/blog/designing-for-seo, some questions came up from my developers. In the article it says: "One potential solution to this problem is a mouse-over. Initially when viewed, the panel will look as it does on the left hand side (exactly as the designer wants it), yet when a user rolls over the image the panel changes into what you see on the right hand side (exactly what the SEO wants)." My developers say: "Having text in the rollovers is almost like hiding text, and everyone knows in SEO that you should never hide text." In the article he explains that it is not hidden text, since it's visible and readable by the engines. What are everyone's thoughts on this? Completely acceptable or iffy? Thanks
White Hat / Black Hat SEO | DCochrane0
-
Dust.js Client-side JavaScript Templates & SEO
I work for a commerce company and our IT team is pushing to switch our JSP server-side templates over to client-side templates using a JavaScript library called Dust.js. Dust.js is a JavaScript client-side templating solution that separates the presentation layer from the data layer. The problem with front-end solutions like this is that they are not SEO friendly, because all the content is being served up with JavaScript. Dust.js has the ability to render your client-side content server-side if it detects Googlebot or a browser with JavaScript turned off, but I'm not sold on this as being "safe".
Read about LinkedIn switching over to Dust.js: http://engineering.linkedin.com/frontend/leaving-jsps-dust-moving-linkedin-dustjs-client-side-templates and http://engineering.linkedin.com/frontend/client-side-templating-throwdown-mustache-handlebars-dustjs-and-more
Explanation of this: "Dust.js server side support: if you have a client that can't execute JavaScript, such as a search engine crawler, a page must be rendered server side. Once written, the same dust.js template can be rendered not only in the browser, but also on the server using node.js or Rhino."
Basically, what would be happening on the backend of our site is that we would be detecting the user-agent of all traffic, and once we found a search bot, we would serve our pages server-side to the bots instead of client-side so they can index our site. Server-side and client-side will be identical content and there will be NO black hat cloaking going on. The content will be identical. But this technique is cloaking, right?
From Wikipedia: "Cloaking is a SEO technique in which the content presented to the search engine spider is different from that presented to the user's browser. This is done by delivering content based on the IP addresses or the User-Agent HTTP header of the user requesting the page. When a user is identified as a search engine spider, a server-side script delivers a different version of the web page, one that contains content not present on the visible page, or that is present but not searchable."
Matt Cutts on cloaking: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=66355
Like I said, our content will be the same, but if you read the very last sentence from Wikipedia, it's the "present but not searchable" part that gets me. If our content is the same, are we cloaking? Should we be developing our site like this for ease of development and performance? Do you think client-side templates with server-side solutions are safe from getting us kicked out of search engines? Thank you in advance for ANY help with this!
White Hat / Black Hat SEO | Bodybuilding.com0
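The crux of that Dust.js question is the user-agent branch: identical content rendered server-side for crawlers and client-side for everyone else. Their stack is node.js with Dust.js, but the branching logic itself is stack-agnostic; here is a minimal, hypothetical Python sketch purely to make the branch being described concrete. The bot list and render functions are illustrative stand-ins, not any real Dust.js or framework API:
```python
# Hypothetical sketch of the user-agent branch described above. Both paths draw on the same
# page data, which is the core of the "is this still cloaking?" question.
BOT_SIGNATURES = ("googlebot", "bingbot", "slurp")  # illustrative, not exhaustive

def is_search_bot(user_agent):
    """Very rough user-agent sniffing for known crawlers."""
    ua = (user_agent or "").lower()
    return any(bot in ua for bot in BOT_SIGNATURES)

def render_server_side(page_data):
    """Stand-in for rendering full HTML on the server from the same template and data."""
    return "<h1>%s</h1><p>%s</p>" % (page_data["title"], page_data["body"])

def render_client_shell(page_data):
    """Stand-in for the JS shell that renders the same data in the browser."""
    return '<div id="app" data-page="%s"></div><script src="/templates.js"></script>' % page_data["title"]

def respond(user_agent, page_data):
    if is_search_bot(user_agent):
        return render_server_side(page_data)
    return render_client_shell(page_data)

if __name__ == "__main__":
    data = {"title": "Product page", "body": "Full product description."}
    print(respond("Mozilla/5.0 (compatible; Googlebot/2.1)", data))
    print(respond("Mozilla/5.0 (Windows NT 10.0) Chrome/33.0", data))
```
-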
Google Sitemaps & punishment for bad URLs?
Hoping y'all have some input here. This is a long story, but I'll boil it down: Site X bought the URL of Site Y. 301 redirects were added to direct traffic (and help transfer link juice) from URLs on Site X to relevant URLs on Site Y, but two days before a "change of address" notice was submitted in Google Webmaster Tools, an auto-generating sitemap somehow added URLs from Site Y to the sitemap of Site X, so essentially the sitemap contained URLs that did not belong to Site X. Is there any documentation out there that Google would punish Site X for having essentially unrelated URLs in its sitemap by downgrading organic search rankings, because it may view that mistake as a black hat (or otherwise evil) tactic? I suspect this because the site continues to rank well organically in Yahoo and Bing, yet is suddenly nonexistent on Google. Thoughts?
White Hat / Black Hat SEO | RUNNERagency0
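A practical aside on that sitemap situation: it's easy to check mechanically whether a sitemap contains URLs that don't belong to the site's own host. A minimal sketch, assuming Python's standard library and a hypothetical sitemap URL; it only handles a plain <urlset> sitemap, not sitemap index files:
```python
from urllib.request import urlopen
from urllib.parse import urlparse
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def foreign_urls(sitemap_url, expected_host):
    """Return sitemap <loc> entries whose host does not match the expected host."""
    with urlopen(sitemap_url) as response:
        tree = ET.parse(response)
    foreign = []
    for loc in tree.iter(SITEMAP_NS + "loc"):
        url = (loc.text or "").strip()
        if urlparse(url).netloc.lower() != expected_host.lower():
            foreign.append(url)
    return foreign

if __name__ == "__main__":
    # Hypothetical example values standing in for Site X
    bad = foreign_urls("http://www.example.com/sitemap.xml", "www.example.com")
    print("%d URLs in the sitemap do not belong to www.example.com" % len(bad))
```
-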
Link Wheel & Unnatural Links - Undoing Damage
Client spent almost a year with link wheels and mass link blasts; the end result was getting caught by Google. I have taken over, we've revamped the site, and I'm finishing up with on-site optimization. Would anyone have suggestions on how to undo the damage of the unnatural links and get back into Google's favour a little quicker? Or the best next steps to undo the damage?
White Hat / Black Hat SEO | ravynn0
-
What on-page/site optimization techniques can I utilize to improve this site (http://www.paradisus.com/)?
I use a Search Engine Spider Simulator to analyze the homepage, and I think my client is using black hat tactics such as cloaking. Am I right? Any recommendations on how to improve the top navigation under the Resorts pull-down? Each of the 6 resorts listed is part of the Paradisus brand, but each resort has its own subdomain.
White Hat / Black Hat SEO | Melia0