Traffic drop-off and new pages aren't indexed
-
In the last couple of weeks my impressions and clicks have dropped off to about half of what they used to be. I am wondering if Google is punishing me for something...
I also added two new pages to my site in the first week of June and they still aren't indexed. In the past it seemed like new pages would be indexed within a couple of days.
Is there any way to tell if Google is unhappy with my site? WMT shows 3 server errors, 3 access denied errors, and 122 not found errors. Could those not found pages be killing me?
Thanks for any advice,
Greg
-
Hi David,
I did add a ton of new web pages, and those are what caused the 404s. I've since cleaned them all up. I thought I had them cleaned up before my traffic fell, but there could be a lag there. I am a little bummed my PR is 2... a pretty marginal improvement over 0.
I will keep an eye on my traffic and hopefully it was the bad links.
Thank you for the thoughtful response!
-
I'm showing your homepage as PR 2, so you're definitely indexed. I also Googled a sentence from your homepage, and it was the first result. So you're good with the index.
Your problem is all of the errors. The bots won't crawl your site as frequently if you have a lot of 404 errors. Your server errors and access denied errors are also worrisome. Check your robots.txt and make sure it isn't blocking part of your site. Additionally, you need to track down the server errors and fix them. If you're using a commercial host like HostGator or GoDaddy, their customer service can help you with the server-side stuff.
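If you want a quick way to spot-check the robots.txt part, here's a minimal Python sketch (standard library only; the domain and paths are placeholders you'd swap for your own) that asks your robots.txt whether Googlebot is allowed to fetch a few key URLs:

```python
# Minimal sketch: does robots.txt block Googlebot from key sections of the site?
# The domain and paths below are placeholders.
from urllib import robotparser

SITE = "https://www.example.com"

rp = robotparser.RobotFileParser()
rp.set_url(SITE + "/robots.txt")
rp.read()  # fetch and parse the live robots.txt

for path in ["/", "/products/", "/blog/"]:  # swap in your own important sections
    allowed = rp.can_fetch("Googlebot", SITE + path)
    print(f"{path}: {'crawlable' if allowed else 'BLOCKED by robots.txt'}")
```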
Go back to the last time you changed your site's architecture or linking syntax; that's probably the source of all the 404 errors. Then it's just a matter of figuring out which pages on your site contain links to the pages that are gone and fixing those bad links. You can also ask Google to remove certain pages from the index through Webmaster Tools, which helps with the 404 errors.
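To take some of the manual work out of tracking down those bad links, here's a rough Python sketch (it assumes the third-party requests and beautifulsoup4 packages, and the start URL is a placeholder) that crawls your internal links and reports which pages still point at URLs that return 404:

```python
# Rough sketch: crawl internal links and report pages that link to 404s.
# Assumes the requests and beautifulsoup4 packages; the start URL is a placeholder.
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START = "https://www.example.com/"
DOMAIN = urlparse(START).netloc

to_visit, seen, bad_links = [START], set(), []

while to_visit:
    page = to_visit.pop()
    if page in seen:
        continue
    seen.add(page)
    resp = requests.get(page, timeout=10)
    if "text/html" not in resp.headers.get("Content-Type", ""):
        continue
    soup = BeautifulSoup(resp.text, "html.parser")
    for a in soup.find_all("a", href=True):
        link = urljoin(page, a["href"]).split("#")[0]
        if urlparse(link).netloc != DOMAIN:
            continue  # only check internal links
        status = requests.head(link, allow_redirects=True, timeout=10).status_code
        if status == 404:
            bad_links.append((page, link))  # this page links to a missing URL
        elif link not in seen:
            to_visit.append(link)

for page, link in bad_links:
    print(f"{page} links to missing page {link}")
```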
After the site gets cleaned up, the crawl rate should pick up again. If you want to goose the crawl rate a little bit, put a blog on your site. It's fairly easy to get a WordPress blog hooked into your site. Consistently fresh content always helps a sluggish crawl rate.
Related Questions
-
Removing indexed internal search pages from Google when they're driving lots of traffic?
Hi, I'm working on an e-commerce site and the internal search results page is our 3rd most popular landing page. I've also seen Google has often used this page as a "Google-selected canonical" in Search Console on a few pages, and it has thousands of these search pages indexed. Hoping you can help with the below:
1. To remove these results, is it as simple as adding "noindex/follow" to search pages?
2. Should I do it incrementally? There are parameters (brand, colour, size, etc.) in the indexed results, and maybe I should block each one of them over time.
3. Will there be an initial negative impact on results I should warn others about?
Thanks!
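(For illustration only, a minimal Python sketch of the "noindex, follow" idea mentioned above; the robots_meta_for helper, the /search path, and the q parameter are assumptions, and a real site would emit this from its template layer.)

```python
# Rough sketch of conditionally emitting "noindex, follow" on internal search pages.
# The helper name, /search path, and "q" parameter are hypothetical placeholders.
from urllib.parse import urlparse, parse_qs

def robots_meta_for(url: str) -> str:
    """Return the robots meta tag a page at this URL might emit."""
    parsed = urlparse(url)
    is_search = parsed.path.startswith("/search") or "q" in parse_qs(parsed.query)
    if is_search:
        # noindex drops the page from the index over time; follow keeps link equity flowing.
        return '<meta name="robots" content="noindex, follow">'
    return '<meta name="robots" content="index, follow">'

print(robots_meta_for("https://www.example.com/search?q=red+shoes&size=9"))
print(robots_meta_for("https://www.example.com/products/red-shoes"))
```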
Intermediate & Advanced SEO
-
Organic search traffic has been dropping since September
We own a large wiki site which allows people to make articles about their business and other things that Wikipedia would prohibit. To make our site richer and expand the pages people can link to from their pages, we scraped between 1 and 2 million pages from the English Wikipedia, pages such as “Los Angeles, CA” and “United States”, etc. We’ve been getting a steady supply of organic backlinks from users who create their own pages and cite their wikis on their websites, in news, etc. However, starting 2 months ago our organic traffic began slowly decaying, as if we have received some kind of algorithmic penalty. What could it be? Could it be duplicate content from the Wikipedia pages we imported and indexed? Could it be some kind of algorithmic hit from the Penguin update? We are just very confused why our organic search traffic would begin to drop at all, since every day we have organic users making quality pages, some of whom organically backlink their articles on their own websites, and these obviously add up over time.
Intermediate & Advanced SEO
-
Why isn't Google indexing this site?
Hello, Moz Community. My client's site hasn't been indexed by Google, although it was launched a couple of months ago. I've run down the checkpoints in this article https://moz.com/ugc/8-reasons-why-your-site-might-not-get-indexed without finding a reason why. Any sharp SEO eyes out there who can spot this quickly? The URL is: http://www.oldermann.no/ Thank you
Intermediate & Advanced SEO
-
How to check if a page is indexable by search engines?
Hi, I'm building an extension for Chrome which should show me the status of the indexability of the page I'm on. So, I need to know all the methods to check whether a page has the potential to be crawled and indexed by a search engine. I've come up with a few methods:
1. Check the URL against the robots.txt file (make sure it's not disallowed).
2. Check the page metas (make sure there is no noindex meta tag).
3. Check whether the page is the same for unregistered users (for pages only available to registered users of the site).
Are there any more methods to check if a particular page is indexable (or not closed to indexation) by search engines? Thanks in advance!
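(A rough Python sketch of the first two checks in that list, plus the X-Robots-Tag response header; it assumes the requests and beautifulsoup4 packages and doesn't cover canonicals, the logged-in/out comparison, or every signal search engines use.)

```python
# Minimal sketch: robots.txt, meta robots, X-Robots-Tag, and HTTP status checks.
# Assumes the requests and beautifulsoup4 packages; not an exhaustive indexability test.
from urllib import robotparser
from urllib.parse import urlparse

import requests
from bs4 import BeautifulSoup

def indexability_report(url: str, user_agent: str = "Googlebot") -> dict:
    parsed = urlparse(url)

    # 1. Is the URL disallowed in robots.txt?
    rp = robotparser.RobotFileParser()
    rp.set_url(f"{parsed.scheme}://{parsed.netloc}/robots.txt")
    rp.read()
    crawlable = rp.can_fetch(user_agent, url)

    # 2. Does the page or its response headers say noindex?
    resp = requests.get(url, timeout=10)
    header_noindex = "noindex" in resp.headers.get("X-Robots-Tag", "").lower()
    soup = BeautifulSoup(resp.text, "html.parser")
    meta = soup.find("meta", attrs={"name": "robots"})
    meta_noindex = bool(meta) and "noindex" in meta.get("content", "").lower()

    # 3. Does the URL resolve with a 200 after redirects?
    return {
        "allowed_by_robots_txt": crawlable,
        "no_meta_noindex": not meta_noindex,
        "no_x_robots_noindex": not header_noindex,
        "returns_200": resp.status_code == 200,
    }

print(indexability_report("https://www.example.com/some-page"))
```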
Intermediate & Advanced SEO
-
Removing pages from the index
My client is running 4 websites on the ModX CMS and using the same database for all the sites. Roger has discovered that one of the sites has 2050 302 redirects pointing to the client's other sites. The sitemap for the site in question includes 860 pages. Google Webmaster Tools has indexed 540 pages. Roger has discovered 5200 pages, and a site: query on Google reveals 7200 pages. Diving into the SERP results, many of the indexed pages point to the other 3 sites. I believe there is a configuration problem with the site, because the other sites do not have a huge volume of redirects when crawled. My concern is: how can we remove from Google's index the 2050 pages that are 302 redirecting to the other sites?
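(For diagnosis only, a minimal Python sketch, assuming the requests package and a placeholder sitemap URL, that walks the sitemap and flags any URLs that 302 redirect off to a different domain:)

```python
# Rough sketch: list sitemap URLs that 302 redirect to another domain.
# Assumes the requests package; the sitemap URL is a placeholder.
import xml.etree.ElementTree as ET
from urllib.parse import urlparse

import requests

SITEMAP = "https://www.example.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

home_domain = urlparse(SITEMAP).netloc
root = ET.fromstring(requests.get(SITEMAP, timeout=10).content)

for loc in root.findall(".//sm:loc", NS):
    url = loc.text.strip()
    resp = requests.get(url, allow_redirects=False, timeout=10)
    if resp.status_code == 302:
        target = resp.headers.get("Location", "")
        if urlparse(target).netloc not in ("", home_domain):
            print(f"{url} 302 redirects off-site to {target}")
```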
Intermediate & Advanced SEO
-
No matter what I do, my website isn't showing up in search results. What's happening?
I've checked for meta robots, all SEO tags are fixed, and I've requested reindexing with Google; basically everything, and it's not showing up. According to SEOMoz all looks fine; I am making a few fixes, but nothing terribly major. It's a new website, and I know it takes a while, but there is no movement here in a month. Any insights here?
Intermediate & Advanced SEO
-
Most painless way of getting duff pages out of the search engines' indexes
Hi, I've had a few issues on our website that were caused by our developers. Basically, we have a pretty complex method of automatically generating URLs and web pages on our website, and they have stuffed up the URLs at some point and managed to get tens of thousands of duff URLs and pages indexed by the search engines. I've now got to get these pages out of the search engines' indexes as painlessly as possible, as I think they are causing a Panda penalty. All these URLs have an additional directory level in them called "home" which should not be there, so I have www.mysite.com/home/page123 instead of the correct URL www.mysite.com/page123. All of these are totally duff URLs with no links going to them, so I'm gaining nothing by 301 redirects, so I was wondering if there was a more painless, less risky way of getting them all out of the indexes (i.e. after the stuff-up by our developers in the first place, I'm wary of letting them loose on 301 redirects in case they cause another issue!) Thanks
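(For illustration only, a small Python sketch with placeholder URLs, assuming the requests package, that picks the duff "/home/" URLs out of an indexed-URL export, checks what they currently return, and shows the URL each should have been; it's a diagnostic aid, not a fix.)

```python
# Rough sketch: from an export of indexed URLs, pick out the duff "/home/" ones,
# check their current status code, and show the intended URL for each.
# Placeholder URLs; assumes the requests package. Not a fix by itself.
import requests

indexed_urls = [
    "http://www.mysite.com/home/page123",
    "http://www.mysite.com/page123",
    "http://www.mysite.com/home/widgets/blue",  # hypothetical example
]

for url in indexed_urls:
    if "/home/" not in url:
        continue
    status = requests.head(url, allow_redirects=False, timeout=10).status_code
    intended = url.replace("/home/", "/", 1)
    print(f"duff: {url} (currently returns {status}) -> intended: {intended}")
```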
Intermediate & Advanced SEO
-
Removing URLs in bulk when directory exclusion isn't an option?
I had a bunch of URLs on my site that followed the form http://www.example.com/abcdefg?q=&site_id=0000000048zfkf&l= There were several million pages, each associated with a different site_id. They weren't very useful, so we've removed them entirely and they now return a 404. The problem is, they're still stuck in Google's index. I'd like to remove them manually, but how? There's no proper directory (i.e. /abcdefg/) to remove, since there's no trailing /, and removing them one by one isn't an option. Is there any other way to approach the problem or specify URLs in bulk? Any insights are much appreciated. Kurus
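(As an aside, a minimal Python sketch, assuming the requests package and made-up site_id values apart from the one quoted above, for spot-checking that a sample of the removed URLs really do return 404 now, since the index only drops them once they're re-crawled and seen to be gone:)

```python
# Rough sketch: spot-check that a sample of the removed site_id URLs now return 404.
# Assumes the requests package; all but the first site_id are made-up samples.
import requests

sample_ids = ["0000000048zfkf", "0000000051aaaa", "0000000077bbbb"]

for site_id in sample_ids:
    url = f"http://www.example.com/abcdefg?q=&site_id={site_id}&l="
    status = requests.head(url, allow_redirects=False, timeout=10).status_code
    print(f"{url} -> {status}")
```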
Intermediate & Advanced SEO