Traffic drop-off and pages aren't indexed
-
In the last couple of weeks my impressions and clicks have dropped to about half of what they used to be. I am wondering if Google is punishing me for something...
I also added two new pages to my site in the first week of June and they still aren't indexed. In the past it seemed like new pages would be indexed in a couple days.
Is there any way to tell if Google is unhappy with my site? WMT shows 3 server errors, 3 access denied errors, and 122 not found errors. Could those not-found pages be killing me?
Thanks for any advice,
Greg
-
Hi David,
I did add a ton of new web pages, and those are what caused the 404s. I've since cleaned them all up. I thought I had them cleaned up before my traffic fell, but there could be a lag there. I am a little bummed my PR is 2... pretty marginal improvement over 0.
I will keep an eye on my traffic and hopefully it was the bad links.
Thank you for the thoughtful response!
-
I'm showing your homepage as PR 2, so you're definitely indexed. I also Googled a sentence from your homepage, and it was the first result. So you're good with the index.
Your problem is all of the errors. The bots won't crawl your site as frequently if you have a lot of 404 errors, and your server errors and access-denied errors are worrisome. Check your robots.txt and make sure it isn't blocking part of your site. You also need to track down the server errors and fix them. If you're using a commercial host like HostGator or GoDaddy, their customer service can help you with the server-side stuff.
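If you'd rather script the robots.txt check than eyeball it, here's a rough sketch using Python's standard-library robotparser. The rules and URLs below are made-up placeholders, not your actual file:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents -- paste your real file here.
rules = """
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Check whether Googlebot is allowed to fetch a few key URLs.
for path in ("/", "/products/widget", "/private/draft"):
    allowed = parser.can_fetch("Googlebot", "http://www.example.com" + path)
    print(path, "allowed" if allowed else "BLOCKED")
```

Run that against your real file with the URLs of the sections that threw access-denied errors, and you'll know right away whether robots.txt is the culprit.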
Go back to the last time you changed your site's architecture or linking syntax; that's probably the source of all the 404 errors. Then it's just a matter of figuring out which pages on your site contain links to the pages that are gone and fixing those bad links. You can also request removal of certain URLs through Webmaster Tools, which helps with the 404 errors.
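Figuring out which pages link to the dead ones is easy to script once you have a crawl export. A minimal sketch, assuming you can dump each page's outlinks and a list of known-404 URLs (all the URLs below are hypothetical):

```python
def find_broken_link_sources(outlinks, dead_urls):
    """Map each dead URL to the pages that still link to it.

    outlinks: dict of page URL -> list of URLs it links to
    dead_urls: set of URLs known to return 404
    """
    sources = {}
    for page, links in outlinks.items():
        for link in links:
            if link in dead_urls:
                sources.setdefault(link, []).append(page)
    return sources

# Example with made-up URLs:
crawl = {
    "/index.html": ["/about.html", "/old-page.html"],
    "/about.html": ["/old-page.html", "/contact.html"],
}
print(find_broken_link_sources(crawl, {"/old-page.html"}))
# {'/old-page.html': ['/index.html', '/about.html']}
```

Any crawler that exports page-to-link pairs (Screaming Frog, Xenu, a homegrown script) will give you the input; the output is your fix-it list.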
After the site gets cleaned up, the crawl rate should pick up again. If you want to goose the crawl rate a little, put a blog on your site. It's fairly easy to hook a WordPress blog into your site, and consistently fresh content always helps a sluggish crawl rate.
Related Questions
-
Home page suddenly dropped from index!!
A client's home page, which has always done very well, has just dropped out of Google's index overnight!
Intermediate & Advanced SEO | | Caro-O
Webmaster Tools does not show any problem. The page doesn't even show up if we Google the company name. The robots.txt (the default Flywheel robots file) contains:
User-agent: *
Disallow: /calendar/action:posterboard/
Disallow: /events/action~posterboard/
The only unusual thing I'm aware of is some A/B testing of the page done with Optimizely - it redirects visitors to a test page, but it's not a 'real' redirect in that redirect-checker tools still see the page as a 200. Also, other pages that are being tested this way are not having the same problem. Other recent activity over the last few weeks/months includes linking to the page from some of our blog posts using the page topic as anchor text. Any thoughts would be appreciated.
-
Pages getting into Google Index, blocked by Robots.txt??
Hi all, So yesterday we set out to remove URLs that got into the Google index when they were not supposed to be there, due to faceted navigation... We searched for the URLs by using this in Google Search:
Intermediate & Advanced SEO | | bjs2010
site:www.sekretza.com inurl:price=
site:www.sekretza.com inurl:artists=
So it brings up a list of "duplicate" pages, and they have the usual: "A description for this result is not available because of this site's robots.txt - learn more." So we removed them all, and Google removed them all, every single one. This morning I did a check, and I found that more are creeping in. If I take one of the suspected dupes to the robots.txt tester, Google tells me it's blocked - and yet it's appearing in their index. I'm confused as to why a path that is blocked is able to get into the index. I'm thinking of lifting the robots.txt block so that Google can see that these pages also have a meta NOINDEX,FOLLOW tag on them - but surely that will waste my crawl budget on unnecessary pages? Any ideas? Thanks.
-
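One way to see the conflict in that question concretely: Google can only obey a meta noindex on a page it is allowed to crawl, so a URL that is both disallowed in robots.txt and tagged noindex is stuck - it can still get indexed from external links, and the noindex is never read. A small sketch of that check in Python (the rules, site, and URLs are placeholders):

```python
from urllib.robotparser import RobotFileParser

def noindex_conflicts(robots_lines, noindex_urls, site="http://www.example.com"):
    """Flag URLs whose meta noindex Google can never see,
    because robots.txt blocks the crawl."""
    parser = RobotFileParser()
    parser.parse(robots_lines)
    return [u for u in noindex_urls
            if not parser.can_fetch("Googlebot", site + u)]

# Hypothetical faceted-navigation rules and noindexed URLs:
robots = ["User-agent: *", "Disallow: /artists="]
print(noindex_conflicts(robots, ["/artists=smith", "/products/vase"]))
# ['/artists=smith']
```

Anything the function returns needs the robots.txt block lifted (or a removal request) before the noindex can do its job.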
Can't get page moving!
Hi all. I've been working on a page for months now and can't seem to make any progress. I'm trying to get http://www.alwayshobbies.com/dolls-houses onto the first page for the term 'dolls houses'. I've done the following:
Cleaned up the site's overall backlink profile
Built some new links to the page
Added 800 words of new copy
Reduced the number of keyword instances on the page to below 15
Any advice would be much appreciated. I don't think it's down to links, as the DA/PA isn't wildly different from its competitors. Thanks!
Intermediate & Advanced SEO | | Blink-SEO
-
Extra indexed pages from my blog in wordpress
I have a blog on my site which runs on WordPress. When you publish an article it creates several extra URLs, such as tag, author, category, and month archives. So when you look at indexed pages you see tons of pages for the blog. Does it hurt SEO? If yes, how can I sort it out?
Intermediate & Advanced SEO | | AlirezaHamidian
-
Any downsides of (permanently) redirecting 404 pages to more generic pages (category pages)?
Hi, We have a site which is somewhat like eBay: it has several categories and advertisements posted by customers/clients. These advertisements disappear over time and turn into 404 pages. We have the option to redirect the user to the corresponding category page, but we're afraid of any negative impact of this change. Are there any downsides, and is this really the best option we have? Thanks in advance!
Intermediate & Advanced SEO | | vhendriks
-
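For what it's worth, the redirect mapping in that question is simple to sketch if the ad URLs encode their category. A hypothetical example (the /category/ad-id structure is an assumption, not the real site):

```python
def redirect_target(dead_url):
    """Given an expired ad URL, return the category page to 301 to.
    Assumes a hypothetical /category/ad-id URL structure."""
    parts = dead_url.strip("/").split("/")
    if len(parts) >= 2:
        return "/" + parts[0] + "/"   # e.g. /bikes/12345 -> /bikes/
    return "/"                        # no category: fall back to homepage

print(redirect_target("/bikes/12345"))  # /bikes/
print(redirect_target("/orphan"))       # /
```

The same mapping rule can be dropped into whatever serves the 301s (rewrite rules, a CMS hook, an edge function).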
How can I see all the pages google has indexed for my site?
Hi Mozzers, In WMT Google says total indexed pages = 5080. If I do a site:domain.com command it says 6080 results. But I've only got 2000 pages on my site that should be indexed. So I would like to see all the pages they have indexed, so I can consider noindexing or 404ing them. Many thanks, Julian.
Intermediate & Advanced SEO | | julianhearn
-
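Once you have both lists exported - the URLs you expect (e.g. from your sitemap) and the URLs Google actually reports - a set difference surfaces the candidates to noindex or 404. A sketch with made-up URLs:

```python
# From your sitemap / CMS export (placeholder URLs):
expected = {"/", "/products/", "/about/"}
# Scraped or exported from site: results (placeholder URLs):
indexed = {"/", "/products/", "/about/", "/tag/x", "/print/about/"}

unexpected = indexed - expected   # candidates to noindex or 404
missing = expected - indexed      # pages Google hasn't picked up
print(sorted(unexpected))  # ['/print/about/', '/tag/x']
print(sorted(missing))     # []
```

Getting the indexed list out of Google is the fiddly part (paging through site: results or WMT exports); the comparison itself is two lines.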
Why isn't google indexing our site?
Hi, We have majorly redesigned our site. It is not a big site; it is a SaaS site, so it has the typical structure: Landing, Features, Pricing, Sign Up, Contact Us, etc. The main part of the site is behind a login, so out of Google's reach. Since the new release a month ago, Google has indexed some pages, mainly the blog, which is brand new, and it has reindexed a few of the original pages - I am guessing this because if I click Cached on a site: search it shows the new site. All new pages (of which there are 2) are totally missed. One is HTTP and one HTTPS - does HTTPS make a difference? I have submitted the site via Webmaster Tools and it says "URL and linked pages submitted to index", but a site: search doesn't bring up all the pages. What is going on here, please? What are we missing? We just want Google to recognise that the old site has gone and ALL the new site is here, ready and waiting for it. Thanks Andrew
Intermediate & Advanced SEO | | Studio33
-
Indexed nonexistent pages - problem appeared after we 301'd the url/index to the url.
I recently read that if a site has 2 pages that are both live, such as http://www.url.com/index and http://www.url.com/, they will come up as duplicates. I read that it's best to 301 redirect http://www.url.com/index to http://www.url.com/, which helps avoid duplicate content and keeps all the link juice on one page. We did the 301 for one of our clients, and we got about 20,000 errors for pages that do not exist. The errors are for pages that are indexed but do not exist on the server. We are assuming that these indexed (nonexistent) pages are somehow linked to http://www.url.com/index. The links are showing 200 OK. We took off the 301 redirect from the http://www.url.com/index page; however, now we still have 2 exact duplicate pages, http://www.url.com/index and http://www.url.com/. What is the best way to solve this issue?
Intermediate & Advanced SEO | | Bryan_Loconto
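For reference, the /index-vs-root duplicate in that last question collapses to a simple normalization rule. A sketch of the canonical mapping the 301 is meant to enforce (the list of suffixes is an assumption; adjust to the platform's actual index filenames):

```python
def canonical(url):
    """Return the canonical form of a URL, collapsing the /index
    duplicate onto the trailing-slash root."""
    for suffix in ("/index", "/index.html", "/index.php"):
        if url.endswith(suffix):
            return url[: -len(suffix)] + "/"
    return url

print(canonical("http://www.url.com/index"))  # http://www.url.com/
print(canonical("http://www.url.com/about"))  # http://www.url.com/about
```

Whatever implements the redirect (rewrite rules, the application router) should apply exactly this mapping, so every /index variant answers with a 301 to the root rather than a 200.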