Some pages of my website http://goo.gl/1vGZv have stopped being crawled by Google
-
Hi, I have a five-year-old website, and some pages of my website http://goo.gl/1vGZv have stopped being indexed in Google. I have asked Google Webmaster Tools to remove low-quality links via the disavow tool. What should I do?
-
In addition to what Moosa has said, I would advise you to look at your historical rankings, see when you were hit, and compare that with recent Google updates to determine whether an update is responsible - that sounds like what most likely happened here.
I'd also advise searching Google for your URL and checking that your site appears; if it does, it's indexed and your issue is not indexation.
-
Before you go with the link disavow tool, it's better to check your robots.txt file to see whether these pages are blocked from crawling, or whether the page code contains a noindex tag, which stops Google from indexing the page. This usually happens by mistake when a plugin or some code on the website is changed.
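To illustrate the noindex check described above, here's a minimal sketch (Python standard library only; the sample HTML is hypothetical, for illustration) that detects a robots meta tag blocking indexation in a page's source:

```python
# Minimal sketch: detect a <meta name="robots" content="noindex"> tag
# in page HTML. In practice you would fetch the live page source first.
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the directives of any <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            self.directives.extend(
                d.strip().lower() for d in (attrs.get("content") or "").split(","))

def is_noindexed(html):
    parser = RobotsMetaParser()
    parser.feed(html)
    return "noindex" in parser.directives

sample = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
print(is_noindexed(sample))  # True
```

If this returns True for the affected pages, that - not links - is why they dropped out of the index.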
But if you think the pages lost their positions or dropped out of the index because of low-quality links and content, then you should remove the low-quality links first and, after some time, use the disavow tool. As a whole, this process takes time - possibly more than a month.
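For reference, the file you upload through Google's disavow tool is plain text with one URL or domain per line; the domains below are made up for illustration:

```text
# Lines starting with # are comments.
# Disavow a single spammy URL:
http://spammy-directory.example.com/listing/123
# Disavow every link from an entire domain:
domain:spammy-directory.example.com
```

Prefer the domain: form when a whole site is low quality, rather than listing its URLs one by one.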
Related Questions
-
Should I noindex shop page and blog page for SEO?
I have about 15 products in my store. Should I noindex the shop and blog pages for SEO? The reason I ask is that I have seen someone suggest noindexing archive pages. The shop page is a product archive and the blog page is an archive too, so should I choose index or noindex? Thanks!
White Hat / Black Hat SEO | Helloiamgood
Duplicate content warning: Same page but different urls???
Hi guys, I have a friend with a site, and I noticed once I tested it with Moz that there are 80 duplicate content warnings. For instance, page 1 is http://yourdigitalfile.com/signing-documents.html and the warning page is http://www.yourdigitalfile.com/signing-documents.html. Another example: page 1 is http://www.yourdigitalfile.com/ and the second page is http://yourdigitalfile.com. I noticed that nearly every page on the website has another version at a different URL. Any ideas why the dev would do this? Also, the pages that received the warnings are not redirected to the newer pages - you can go to either one. Thanks very much.
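The usual fix for this www/non-www duplication is a sitewide 301 redirect to one canonical host. Assuming the site runs on Apache with mod_rewrite enabled (an assumption - check the actual server first), an .htaccess sketch would look like:

```apache
# Redirect the non-www host to the www host with a 301,
# so only one version of each URL gets crawled and indexed.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^yourdigitalfile\.com$ [NC]
RewriteRule ^(.*)$ http://www.yourdigitalfile.com/$1 [R=301,L]
```

Adding a rel="canonical" tag pointing at the preferred version is a common belt-and-braces addition.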
White Hat / Black Hat SEO | ydf
Strategies to recover from a Google Penalty?
Two years ago we took over a client who had a hacked site and had also signed up with a black-hat SEO team that set up 50 spammy directory links back to the site. Since then we have cleaned up the hacks, had the site reviewed by Google and re-added to the search index, and disavowed all the directory links through GWT. Over the last two years, we've encouraged the client to create new content and have developed a small but engaged social following. The website is www.fishtalesoutfitting.com/. The site's domain authority is 30, but it struggles to rank higher than 20 for even uncompetitive long-tail keywords. Other sites with much lower domain authorities outrank the site for our primary keywords. We are now overhauling the site design and content. We are considering creating an entirely new URL for the primary domain. We would then use 301 redirects from the old URL to the new. We'd welcome insight into why the current site may still be penalized, as well as thoughts on our strategy or other recommendations to recover from the events of two years ago. Thank you.
White Hat / Black Hat SEO | mlwilmore
Why have bots (including googlebot) categorized my website as adult?
How do bots decide whether a website is adult? For example, I have a gifting portal, but strangely, it is categorized as 'adult'. Also, my Google AdSense application to run ads on my site was rejected - I have a feeling this is because Googlebot categorized my site as adult. And there's a good chance that other bots also consider it an adult website rather than a gifting website. Can anyone please go through the site and tell me why this is happening? Thanks in advance.
White Hat / Black Hat SEO | rahulkan
Does Google crawl and index dynamic pages?
I've linked a category page (static) to my homepage and linked a product page (dynamic) to the category page. I tried to crawl my website starting from my homepage URL with Screaming Frog, using Googlebot 2.1 as the user agent. Based on the results, it can crawl the product page, which is dynamic. Here's a sample product page, which is a dynamic page (we're using product IDs instead of keyword-rich URLs for consistency): http://domain.com/AB1234567. Here's a sample category page: http://domain.com/city/area. Here's my full question: does the spider result (from Screaming Frog) mean Google will properly crawl and index the product pages even though they are dynamic?
White Hat / Black Hat SEO | esiow2013
Penalised by Google - Should I Redirect to a new domain?
Last month my rankings dropped a couple of pages on Google, and I'm no longer receiving as many visits from Google as I used to. It's coming up to summer, which is when my business naturally picks up, yet I can't fix this problem. I have a crazy idea of redirecting my established site to a new domain in the hope that the penalty would be removed. I have tried removing any manipulative links, yet my rankings are not coming back. Has anyone had success redirecting to a new domain?
White Hat / Black Hat SEO | penn73
Shadow Pages for Flash Content
Hello. I am curious to better understand what I've been told are "shadow pages" for Flash experiences. For example, go here: http://instoresnow.walmart.com/Kraft.aspx#/home. View the page as Googlebot and you'll see an HTML page. It is completely different from the Flash page.
1. Is this OK?
2. If I make my shadow page mirror the Flash page, can I put links in it that lead the user to the same places that the Flash experience does?
3. Can I put "Pinterest" pin-able images in my shadow page?
4. Can I create a shadow page for a video that has the transcript in it? Is this the same as closed captioning?
Thanks so much in advance, -GoogleCrush
White Hat / Black Hat SEO | mozcrush
Pages For Products That Don't Exist Yet?
Hi, I have a client that makes accessories for other companies' popular consumer products. Their products rank on their website for other companies' product names - for a made-up example, "2011 Super Widget" plus my client's product, "Charger." So "2011 Super Widget Charger" might be the type of term my client would rank for. Everybody knows the 2012 Super Widget will be out in a few months, and then my client's company will offer the 2012 Super Widget Charger. What do you think of launching pages now for the 2012 Super Widget Charger, even though it doesn't exist yet, in order to give those pages time to rank while the terms are half as competitive? By the time the 2012 is available, these pages would have greater authority/age and rank, instead of being a little late to the party. The pages would be like "coming soon" pages, but still optimized for the main product search term. About the only negative I see is that they'll have a higher bounce rate/lower time on page, since the 2012 doesn't even exist yet. That seems like less of a negative than the jump start on ranking. What do you think? Thanks!
White Hat / Black Hat SEO | 945010