From Number 4 in SERP to NA in SERP
-
Keyword : speed reading classes
URL : http://www.speedreadingclasses.org/
Earlier, the keyword's position was #4 on Google.com.
I did a 301 to this URL from a different PR4 site. I was expecting position 2 or 3, but it dropped to #9 last Wednesday. On Thursday it went to #10, and today it is not in the SERPs at all.
What might have gone wrong? Is there any way to find out?
Please help.
Regards
Ray
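One way to start answering "is there any way to find out" is to check, at the HTTP level, what the old domain actually returns for the redirect. A minimal Python sketch using the requests library; the old PR4 domain below is a hypothetical placeholder, since it is not named above:

```python
import requests

# Hypothetical placeholder for the old PR4 domain that was 301'd over
# (the real domain is not named in the question).
OLD_URL = "http://www.old-pr4-site-example.com/"
NEW_URL = "http://www.speedreadingclasses.org/"

# Request the old URL without following redirects, so we can inspect
# exactly what the server answers with.
resp = requests.head(OLD_URL, allow_redirects=False, timeout=10)

print("Status:", resp.status_code)                 # expect 301, not 302
print("Location:", resp.headers.get("Location"))   # expect the new URL

if resp.status_code == 301 and (resp.headers.get("Location") or "").startswith(NEW_URL):
    print("OK: single permanent redirect straight to the new site")
else:
    print("Check the redirect: it should be a one-hop 301 to", NEW_URL)
```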
-
USA.
-
Are you in India or the USA?
-
Checked Google.com again, using Firefox this time: #6.
-
You are #6 here in the US (checked in Chrome). Can you see when your rankings shifted? It might have been the Google AdSense penalty, but that seems highly unlikely to me. I think some minor on-page optimization can help you get back. Slow and steady wins the race.
-
I checked using the Bing Toolkit, but you can also see this with the developer tools in IE: press F12 and select the Network tab.
Here is an example:
The link to "http://www.speedreadingclasses.org/messages/" resulted in an HTTP redirection to "http://www.speedreadingclasses.org/wp-login.php?redirect_to=%2Fmessages%2F". Why would this redirect from a friendly URL to an ugly URL, anyway?
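For completeness, the same view the Network tab gives can be scripted: request the URL without following redirects and print the status code and Location header. A rough Python sketch, assuming the requests library is available:

```python
import requests

# The internal link reported as redirecting in the example above.
url = "http://www.speedreadingclasses.org/messages/"

# Do not follow the redirect; just report what the server answers,
# which is what the Network tab shows for the first request.
resp = requests.get(url, allow_redirects=False, timeout=10)
print(resp.status_code)                        # e.g. 301 or 302
print(resp.headers.get("Location", "(none)"))  # the "ugly" wp-login.php URL
```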
-
How did you check that? Through the search engine, or by using a tool?
Earlier it was at position 4 (7 days ago).
-
You rank #7 from here in Australia for "speed reading classes".
Having a look at your site, I see a few problems. I don't know if you have made any changes in the last few weeks; you did not say.
One problem I see is that you have very little content on the home page, and very little above the fold. Google now ranks pages lower if they have little content above the fold:
http://googlewebmastercentral.blogspot.com.au/2012/01/page-layout-algorithm-improvement.html
Apart from that, you have over 1,800 unnecessary redirects, meaning you have internal links that do not point directly at the correct URL but are redirected there. Redirects leak link juice, so you are losing a lot of it.
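As a rough illustration of how to find those internal redirects yourself (a crawler such as Screaming Frog or the Bing toolkit mentioned above will do this at scale), here is a single-page Python sketch using requests and BeautifulSoup; it only checks links found on the home page:

```python
import requests
from urllib.parse import urljoin, urlparse
from bs4 import BeautifulSoup

START = "http://www.speedreadingclasses.org/"
HOST = urlparse(START).netloc

# Fetch the home page and flag internal links that answer with a
# redirect instead of a direct 200. A real audit would crawl every page.
html = requests.get(START, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

seen = set()
for a in soup.find_all("a", href=True):
    link = urljoin(START, a["href"]).split("#")[0]
    if urlparse(link).netloc != HOST or link in seen:
        continue  # skip external links and duplicates
    seen.add(link)
    resp = requests.head(link, allow_redirects=False, timeout=10)
    if 300 <= resp.status_code < 400:
        print(resp.status_code, link, "->", resp.headers.get("Location"))
```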
Related Questions
-
Why isn't our complete meta title showing up in the Google SERPs? (cut off halfway)
We carry a product line, cutless bearings (for use on boats). For instance, we have one, called the Able, that has the following meta title (confirmed via View Page Source): BOOT 1-3/8" x 2-3/8" x 5-1/2" Johnson Cutless Bearing | BOOT Cutlass. However, if I search for it on Google by part number or name (boot cutless bearing, boot cutlass bearing), the meta title comes back with the whole first part chopped off, only showing this: "x 5-1/2" Johnson Cutless Bearing | BOOT Cutlass - Citimarine ..." Any idea why? Here's the URL in case it helps: https://citimarinestore.com/en/metallic-inches/156-boot-johnson-cutless-bearing-870352103.html All the products in the category are doing the same. Thanks!
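Not an answer to why the front of the title is dropped, but one quick check worth running is how long the rendered title actually is, since Google truncates titles based on available display width (roughly 600 pixels, often around 60 characters). A rough Python sketch using requests and BeautifulSoup, with the URL taken from the question:

```python
import requests
from bs4 import BeautifulSoup

# URL taken from the question above.
url = "https://citimarinestore.com/en/metallic-inches/156-boot-johnson-cutless-bearing-870352103.html"

soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
title = soup.title.get_text(strip=True) if soup.title else ""

print(len(title), "characters:", title)
if len(title) > 60:
    # Character count is only a rough proxy; Google truncates on pixel width.
    print("Likely too long to display in full in the SERP snippet")
```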
Intermediate & Advanced SEO | Citimarine
-
Mobile Site Panda 4.2 Penalty
We are an ecommerce company, and we outsource our mobile site to a service; the mobile site is m.ourdomain.com. We pass the Google mobile-ready test. Our product page content on the mobile site is woefully thin (typically less than 100 words), and it appears that we got hit with Panda 4.2 on the mobile site. Starting at the end of July, our mobile rankings have dropped, and our mobile traffic is now about half of what it was in July. We are working to correct the content issue, but it obviously takes time. So here's my question: if our mobile site got hit with Panda 4.2, could that have a negative effect on our desktop site?
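As a rough way to quantify the thin-content problem while the rewrite is in progress, here is a Python sketch that counts visible words on a sample of m-dot product pages; the product paths below are placeholders:

```python
import requests
from bs4 import BeautifulSoup

# Placeholder product URLs on the m-dot host mentioned in the question.
urls = [
    "http://m.ourdomain.com/product-1",
    "http://m.ourdomain.com/product-2",
]

for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    for tag in soup(["script", "style"]):
        tag.decompose()                      # drop non-visible text
    words = soup.get_text(" ", strip=True).split()
    flag = "THIN" if len(words) < 100 else "ok"
    print(len(words), "words", flag, url)
```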
Intermediate & Advanced SEO | AMHC
-
Substantial difference between Number of Indexed Pages and Sitemap Pages
Hey there, I am doing a website audit at the moment. I've noticed substantial differences between the number of pages indexed (Search Console), the number of pages in the sitemap, and the number I get when I crawl the site with Screaming Frog (see below). Would those discrepancies concern you? The website and its rankings seem fine otherwise.
Total indexed: 2,360 (Search Console)
About 2,920 results (Google search "site:example.com")
Sitemap: 1,229 URLs
Screaming Frog spider: 1,352 URLs
Cheers,
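A quick way to see where the gap actually is, rather than just the counts, is to diff the sitemap URLs against the crawler's URL list. A rough Python sketch; the sitemap URL and the crawl-export filename are placeholders:

```python
import requests
import xml.etree.ElementTree as ET

# Placeholders: the sitemap URL and a one-URL-per-line export from the crawler.
SITEMAP = "https://www.example.com/sitemap.xml"
CRAWL_EXPORT = "crawled_urls.txt"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(SITEMAP, timeout=10).content)
sitemap_urls = {loc.text.strip() for loc in root.findall(".//sm:loc", NS)}

with open(CRAWL_EXPORT) as f:
    crawled_urls = {line.strip() for line in f if line.strip()}

print("In sitemap but not crawled (orphaned/unlinked?):", len(sitemap_urls - crawled_urls))
print("Crawled but not in sitemap:", len(crawled_urls - sitemap_urls))
```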
Jochen
Intermediate & Advanced SEO | Online-Marketing-Guy
-
Why is my m-dot site outranking my main site in SERPs?
My client has a WP site and a Duda mobile site that we inherited. For some reason, their m-dot site is ranking on page 1 of Google for their top keywords instead of the main site, which is much more robust. The main site might rank beyond page 5 while the generic home page of their m-dot site appears on page 1. Does anyone have any idea why this might be happening?
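One thing worth verifying in a separate m-dot setup is whether the bidirectional annotations Google documents for separate mobile URLs are in place: rel="alternate" on the desktop page pointing at the m-dot URL, and rel="canonical" on the m-dot page pointing back. A rough Python check; both URLs below are placeholders since the client's domain is not given:

```python
import requests
from bs4 import BeautifulSoup

# Placeholder URLs for the same page on the main site and the m-dot site.
desktop = "http://www.example-client-site.com/"
mobile = "http://m.example-client-site.com/"

def link_tags(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    return soup.find_all("link")

# The desktop page should point to the mobile page with rel="alternate" ...
alternates = [l.get("href") for l in link_tags(desktop)
              if "alternate" in (l.get("rel") or [])]
print("rel=alternate on desktop page:", alternates)

# ... and the mobile page should point back with rel="canonical".
canonicals = [l.get("href") for l in link_tags(mobile)
              if "canonical" in (l.get("rel") or [])]
print("rel=canonical on mobile page:", canonicals)
```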
Intermediate & Advanced SEO | Etna
-
Improve / change how my Meta Description shows in the SERPs
I feel my meta descriptions are descriptive and fairly represent the info on each page of my site. However, Google frequently puts "20+ items" in front of the snippet. I run a job site, and each page lists 20 jobs. What if I include a bit of coding in the meta description to output "Latest Jobs Posted [today's date]", since the jobs listed on the page will include a date? On each page there is also an option to "Create Email Alert" and "Save Jobs"; maybe I should write about those as well? I have read all of Google's documentation on the importance of making the meta description relevant to the page, so any good insight into how to increase my chances of getting my meta description displayed in the SERP would be appreciated. Thank you, Kristian
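A minimal sketch of the kind of templating described above, with the current date injected into the description (the category name is a placeholder, and whether Google chooses to show the tag is still up to Google):

```python
from datetime import date

# Placeholder category; the point is only to show the date templating.
category = "Nursing Jobs in Copenhagen"
description = (
    f"Latest Jobs Posted {date.today():%d %B %Y} - browse {category}, "
    "create an email alert or save jobs for later."
)
print(f'<meta name="description" content="{description}">')
```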
Intermediate & Advanced SEO | knielsen
-
Googlebot found an extremely high number of URLs on your site
I keep getting the "Googlebot found an extremely high number of URLs on your site" message in GWT for one of the sites that I manage. The error is as below:
Googlebot encountered problems while crawling your site. Googlebot encountered extremely large numbers of links on your site. This may indicate a problem with your site's URL structure. Googlebot may unnecessarily be crawling a large number of distinct URLs that point to identical or similar content, or crawling parts of your site that are not intended to be crawled by Googlebot. As a result Googlebot may consume much more bandwidth than necessary, or may be unable to completely index all of the content on your site.
I understand the nature of the message: the site uses faceted navigation and is genuinely generating a lot of duplicate pages. However, in order to stop this from becoming an issue, we do the following:
No-index a large number of pages using the on-page meta tag.
Use a canonical tag where it is appropriate.
But we still get the error, and a lot of the example pages that Google suggests are affected by the issue are actually pages with the no-index tag. So my question is: how do I address this problem? I'm thinking that, as it's a crawling issue, the solution might involve the no-follow meta tag. Any suggestions appreciated.
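Before deciding between noindex, canonical, and nofollow, it can help to measure which facet parameters are actually generating the URL explosion. A rough Python sketch over a list of crawled URLs; the filename is a placeholder for whatever export you have:

```python
from collections import Counter
from urllib.parse import urlparse, parse_qsl

# Placeholder filename: a one-URL-per-line export from a crawl or log file.
with open("crawled_urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

# Count how many distinct URLs carry each query parameter, to see which
# facets are responsible for the URL explosion.
param_counts = Counter()
for url in urls:
    for name, _value in parse_qsl(urlparse(url).query):
        param_counts[name] += 1

for name, count in param_counts.most_common(10):
    print(count, "URLs carry parameter", repr(name))
```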
Intermediate & Advanced SEO | BenFox
-
NOINDEX content still showing in SERPs after 2 months
I have a website that was likely hit by Panda or some other algorithm change. The hit finally occurred in September of 2011. In December my developer set the following meta tag on all pages that do not have unique content: <meta name="robots" content="NOINDEX" />
It's been 2 months now and I feel I've been patient, but Google is still showing 10,000+ pages when I do a search for site:http://www.mydomain.com
I am looking for a quicker solution. Adding this many pages to the robots.txt does not seem like a sound option. The pages have been removed from the sitemap (for about a month now). I am trying to determine the best of the following options, or find better options.
1. 301 all the pages I want out of the index to a single URL based on the page type (location and product). The 301 worries me a bit because I'd have about 10,000 or so pages all 301ing to one or two URLs. However, I'd get some link juice to that page, right?
2. Issue an HTTP 404 code on all the pages I want out of the index. The 404 code seems like the safest bet, but I am wondering if that will have a negative impact on my site, with Google seeing 10,000+ 404 errors all of a sudden.
3. Issue an HTTP 410 code on all pages I want out of the index. I've never used the 410 code and, while most of those pages are never coming back, eventually I will bring a small percentage back online as I add fresh new content. This one scares me the most, but I am interested if anyone has ever used a 410 code.
Please advise, and thanks for reading.
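Whichever option you pick, it is worth spot-checking what those pages actually serve today: the status code, and whether the noindex is really present in the meta tag or an X-Robots-Tag header (if it is not reaching Google, nothing will drop out of the index). A rough Python sketch with placeholder paths on the domain mentioned above:

```python
import requests
from bs4 import BeautifulSoup

# Placeholder URLs on the domain mentioned in the question.
urls = [
    "http://www.mydomain.com/example-location-page/",
    "http://www.mydomain.com/example-product-page/",
]

for url in urls:
    resp = requests.get(url, timeout=10)
    header = resp.headers.get("X-Robots-Tag", "")
    meta = BeautifulSoup(resp.text, "html.parser").find("meta", attrs={"name": "robots"})
    meta_content = meta.get("content", "") if meta else ""
    noindex = "noindex" in (header + " " + meta_content).lower()
    print(resp.status_code, "noindex present" if noindex else "NO noindex found", url)
```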
Intermediate & Advanced SEO | NormanNewsome
-
Why is my site's 'Rich Snippets' information not being displayed in SERPs?
We added hRecipe microformat data to our site in April and then migrated to the Schema.org Recipe format in July, but our content is still not being displayed as rich snippets in search engine results. Our pages validate okay in the Google Rich Snippets Testing Tool. Any idea why they are not being displayed in SERPs? Thanks.
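For reference, a minimal sketch of the schema.org/Recipe vocabulary, here emitted as JSON-LD from Python. All values are placeholders and this is not the site's actual markup; which output formats are eligible for rich snippets depends on Google's current guidelines:

```python
import json

# All values are placeholders; this is not the site's actual markup.
recipe = {
    "@context": "http://schema.org",
    "@type": "Recipe",
    "name": "Example Chocolate Cake",
    "image": "http://www.example.com/cake.jpg",
    "author": {"@type": "Person", "name": "Jane Baker"},
    "prepTime": "PT20M",
    "cookTime": "PT30M",
    "recipeYield": "8 servings",
    "recipeIngredient": ["200g flour", "100g sugar", "50g cocoa"],
    "recipeInstructions": "Mix the dry ingredients, add the wet ones, bake at 180C.",
}

print('<script type="application/ld+json">')
print(json.dumps(recipe, indent=2))
print("</script>")
```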
Intermediate & Advanced SEO | Techboy