SEOBook RankChecker Works at one location but not another
-
I use SEOBook's RankChecker to do spot checking of keywords for clients or for potential clients. I like the tool quite a bit, but I've noticed that it recently stopped working at our office (instead of rankings, I just get a series of dashes) but works fine from my home computer. I'm thinking that it may have to do with our company's firewall.
Anyone have any thoughts?
-
Aaron,
Thanks for your response. I have installed what I believe is the latest version (1.8.21). I have tried to turn off Google Instant, although this isn’t as easy as I thought. They removed the ability to turn it off and replaced it with a box where you can check “never show instant results”. Even when you do this, you still see instant results.
I have the delay between queries set to 12 seconds, so I don’t think that is the issue. My next steps will be to check the firewall and then delete my current extensions in Firefox to see if any conflict.
Eric
-
- Do you have the most recent version of the extension installed at your work?
- Have you turned off Google Instant?
- Do Google results render fine, or is there a captcha on them? If captcha, then have you increased the delay between queries?
- It could also be an issue with the firewall or security software, but these tend to be much less common than problems caused by having no delay between searches or by Google Instant.
- Another issue could be another conflicting extension in Firefox.
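The "delay between queries" setting mentioned above is plain rate limiting: spacing automated searches far enough apart that Google serves normal results instead of a captcha. A minimal sketch of the idea (the `run_spaced` helper and the 12-second figure are illustrative, not RankChecker's actual code):

```python
import time

def run_spaced(tasks, delay_seconds):
    """Run callables with a fixed pause between them and return their
    results; spacing queries out is what keeps the captchas away."""
    results = []
    for i, task in enumerate(tasks):
        if i:  # no pause before the very first query
            time.sleep(delay_seconds)
        results.append(task())
    return results

# e.g. three rank checks spaced 12 seconds apart, as in the thread:
# run_spaced([check_kw1, check_kw2, check_kw3], 12)
```

If dashes appear even with a generous delay, the problem is more likely something between the tool and Google (firewall, proxy, or a conflicting extension) than the pacing itself.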
Related Questions
-
Using one robots.txt for two websites
I have two websites that are hosted in the same CMS. Rather than having two separate robots.txt files (one for each domain), my web agency has created one which lists the sitemaps for both websites, like this:

User-agent: *
Disallow:
Sitemap: https://www.siteA.org/sitemap
Sitemap: https://www.siteB.com/sitemap

Is this OK? I thought you needed one robots.txt per website which provides the URL for the sitemap. Will having both sitemap URLs listed in one robots.txt confuse the search engines?
Technical SEO | ciehmoz0
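For what it's worth, each domain still serves the file from its own root, and the Sitemap directive is not limited to the file's own host, so listing both sitemap URLs in one shared file body is generally fine. A quick way to sanity-check what crawlers will parse out of such a file, using Python's standard-library robot parser (`site_maps()` needs Python 3.8+; the file contents are reconstructed from the question above):

```python
from urllib.robotparser import RobotFileParser

# the combined robots.txt from the question
robots_txt = """\
User-agent: *
Disallow:

Sitemap: https://www.siteA.org/sitemap
Sitemap: https://www.siteB.com/sitemap
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.site_maps())  # both sitemap URLs are discovered
print(parser.can_fetch("*", "https://www.siteA.org/any-page"))  # empty Disallow = allow all
```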
Is there an easy way to hide one of your URLs from Google search, rather than redirecting?
We don't want to redirect to a different page, as some people still use it; we just don't want it to appear in search results.
Technical SEO | TheIDCo0
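The usual way to keep a live page out of search without redirecting it is a `noindex` directive, either in a robots meta tag or in an `X-Robots-Tag` response header. A small sketch of checking those two signals (the `is_noindexed` helper is illustrative, not any particular tool's API):

```python
def is_noindexed(x_robots_header, meta_robots_content):
    """True if either the X-Robots-Tag header or the robots meta tag
    carries a directive that keeps the page out of search results."""
    directives = []
    for value in (x_robots_header, meta_robots_content):
        if value:
            directives += [d.strip().lower() for d in value.split(",")]
    return "noindex" in directives or "none" in directives

# The page stays live for the people still using it, but asks search
# engines not to index it, via either of:
#   <meta name="robots" content="noindex">   (in the page's <head>)
#   X-Robots-Tag: noindex                    (as an HTTP response header)
```

Note that the page must stay crawlable (not blocked in robots.txt) for search engines to see the `noindex` and drop it.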
Exact keyword in domain - does it work?
Hi, a hypothetical question: let's say there is a fresh domain, travelnorway.com (of course a domain like that already exists; I am using it just as an example). Will it rank first on Google for the "travel norway" query? (I mean in a situation where 30 other companies are trying to rank for the same phrase.) Thanks!
Technical SEO | LeszekNowakowski0
Submitting a new sitemap index file. Only one file is getting read. What is the error?
Hi community, I am working to submit a new sitemap index file, where about five 50,000-SKU files will be uploaded. Google Webmaster Tools is accepting the index, however only the first file is getting read, and it reports that only 50k SKUs have been submitted. I have 2 warnings and need to know if they are the reason the remaining files are not getting uploaded:

1. Invalid XML: too many tags - too many tags describing this tag. Please fix it and resubmit.
2. Incorrect namespace - your Sitemap or Sitemap index file doesn't properly declare the namespace.

Here is the URL I am submitting: http://www.westmarine.com/sitemap/wm-sitemap-index.xml
Technical SEO | mm9161570
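The "incorrect namespace" warning usually means the `<sitemapindex>` root element is missing its `xmlns` declaration (`http://www.sitemaps.org/schemas/sitemap/0.9`), and "too many tags" typically means an element is repeated inside a `<sitemap>` entry where only one is allowed. A minimal sketch of generating a well-formed sitemap index with the standard library (the URLs are placeholders):

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap_index(sitemap_urls):
    """Return a sitemap index document with the namespace declared on the
    root and exactly one <loc> per <sitemap> entry."""
    root = ET.Element("sitemapindex", xmlns=SITEMAP_NS)
    for url in sitemap_urls:
        entry = ET.SubElement(root, "sitemap")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(root, encoding="unicode")

print(build_sitemap_index([
    "https://www.example.com/sitemap-products-1.xml",
    "https://www.example.com/sitemap-products-2.xml",
]))
```

Validating the live index file against this shape (one `<loc>` per entry, namespace present on the root) should confirm or rule out the two warnings as the cause.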
Help!!! Website won't index after taking it over from another IT Company
Hi, a while back we took over a website that was built in WordPress. We rebuilt it on another platform and switched the servers over whilst retaining the same domain. I had access to the old GA account, but so did the old IT company, so I created a new GA account and used that in the new website's pages. Recently we found the website had been blacklisted (previous to us taking it over), and now, after being crawled a lot, only 2 pages have been indexed (over a 2-month period). We have submitted a request for revision (to relist the website) but have had no movement. Just wondering: would an old, active account that was still linked to their old website affect our Google listing? Would dropping the old GA tracking code/script into the site and deleting the new account enable Google to index? Also, there is ample content, metadata and descriptions on the site. I welcome any help on this please!
Technical SEO | nimblerdigital0
Domain hacked and redirected to another domain
2 weeks ago my home page plus some others had a 301 redirect to another, cloned domain for about 1 week (due to a hack). The original pages were then de-indexed and the new bad domain was indexed and in effect stole my rankings. Then the 301 was removed/cleaned from my domain and the bad domain was fully de-indexed via a request I made in WMT (this was 1 week ago). Then my pages came back into the index but without any ranking power (as if it's just in the supplemental index). It's been like this for a week now and the algorithms have not been able to correct it. So how do I get this damage undone or corrected? Can someone at Google reverse/cancel the 301 ranking transfer, since the algorithms don't seem to be able to? I have the option to do a "Change of Address" in WMT from the bad domain to my domain, but I don't think this would work properly because it says I also need to place a 301 on the bad domain back to mine. Would a change of address still work without the 301? Please advise/help what to do in order to get my rankings back to where they were.
Technical SEO | Dantek0
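A hack like the one above is easier to catch early if you periodically crawl your key URLs without following redirects and flag any permanent redirect that leaves your domain. A hedged sketch of just the flagging logic (the crawl itself, and the sample chain, are illustrative):

```python
from urllib.parse import urlparse

def offsite_permanent_redirects(chain, my_host):
    """Given (status_code, target_url) pairs collected from a crawl,
    return the targets of 301/308 hops that point away from my_host -
    the hallmark of the hack described above."""
    return [
        target
        for status, target in chain
        if status in (301, 308) and urlparse(target).hostname != my_host
    ]

# example: one internal 301 (fine) and one off-site 301 (suspicious)
chain = [
    (301, "https://www.mysite.com/new-page"),
    (301, "https://cloned-bad-domain.com/page"),
    (200, "https://www.mysite.com/ok-page"),
]
print(offsite_permanent_redirects(chain, "www.mysite.com"))
# -> ['https://cloned-bad-domain.com/page']
```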
What do I do with an old site when a new one is built?
I have a customer that has a PR 4 website with over 3000 pages of content. He decided to build a brand new website with a new domain and it now has a PR of 2. Our question is, what do we do with the old site? Do we migrate all the content over to the new site and do redirects on all the pages? Do we keep the old site up and put links over to the new site? He was just planning on shutting it down but that seemed like a complete waste of PR and SEO. What is his best course of action? Thanks for your replies.
Technical SEO | smartlinksolutions0
Has anyone worked with Edgecast?
They have a caching system where they assign multiple IPs based on location, and we are curious how it affects SEO.
Technical SEO | DragonSearch1