Link Removal Request Sent to Google, Bad Pages Gone from Index But Still Appear in Webmaster Tools
-
On June 14th the number of indexed pages for our website in Google Webmaster Tools increased from 676 to 851. Our ranking and traffic have taken a big hit since then.
The increase in indexed pages is linked to a design upgrade of our website, made on June 6th. No new URLs were added; a few forms were changed, the sidebar and header were redesigned, and Google Tag Manager was added to the site.
My SEO provider, a reputable firm endorsed by Moz, believes the extra 175 pages indexed by Google, pages that offer little content, may be causing the ranking decline.
My developer submitted a page removal request to Google via Webmaster Tools around June 20th. Now when a Google search is done for site:www.nyc-officespace-leader.com, 851 results display. Would these extra pages cause a drop in ranking?
After the removal request, the number in the Google search results appeared to drop to 451 for a few days, but now it is back up to 851. In Google Webmaster Tools it is still listed as 851 pages. My ranking drops more and more every day.
At the end of the displayed Google search results for site:www.nyc-officespace-leader.com, very strange URLs are displaying, like: www.nyc-officespace-leader.com/wp-content/plugins/...
If we can get rid of these issues, should ranking return to what it was before? I suspect this is an issue with sitemaps and robots.txt. Are there any firms or coders who specialize in this? My developer has really dropped the ball.
Thanks everyone!! Alan
-
Hi Marie:
Thanks for your quite detailed response to my question. Some of the possibilities you mention probably don't apply for the following reasons:
1. None of the URLs changed, so it cannot be that.
2. Page titles did not change, so it's not that.
3. As for unnatural links, these have existed for several years. In fact, we succeeded in getting 28 out of 100 removed and submitted a disavow request to Google for the other 80 toxic links. While the link profile is weak, it is not worse than it was before.
When the upgrade was launched in early June, WordPress was upgraded to the latest version. I wonder if some issue developed at that time with robots.txt or noindex tags. I find it very curious that after the removal request was made for the 175 URLs in June, the number of indexed pages went down for a few days and is now back to 851.
My developer may be a little shy about accepting responsibility for this issue. Is there any source, a guru of sorts, who could check the WordPress installation to see if it is the source of the 175 pages appearing on Google that should not be there? Some way to eliminate any doubt about what is causing this issue?
Thanks, Alan
-
The extra pages could possibly affect your site negatively in the eyes of Panda... but there are many other possibilities.
Regarding the URL removal tool: it will only work permanently if the pages either have a noindex tag (best) or are blocked by robots.txt. Perhaps that is the issue?
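For reference, the two mechanisms mentioned above look roughly like this (a minimal sketch; the path shown is only an example, not taken from your site):

```text
# In robots.txt (stops crawling, though already-indexed URLs can linger):
User-agent: *
Disallow: /example-thin-section/

# Or, preferably, a noindex tag in the <head> of each thin page:
<meta name="robots" content="noindex, follow">
```

The noindex tag is the safer choice here, because a robots.txt block prevents Google from recrawling a page and seeing that it should drop out of the index.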
Any time that rankings drop after a design upgrade, I would look first of all to on-site issues such as accidental noindexing or accidental blocking by robots.txt. Have your URLs changed at all? If so, then the appropriate redirects need to be put in place. Also, have all your page titles stayed the same? I recently saw a site whose owners thought it had been pummelled by Panda; it turned out that during a redesign their home page title changed from a good, keyword-rich title to "Home".
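Those accidental-blocking checks can be spot-checked with a short script (a minimal sketch using only the Python standard library; the sample robots.txt rules, domain, and paths below are hypothetical, not taken from the site):

```python
import urllib.robotparser

def is_blocked(robots_txt: str, url: str, agent: str = "Googlebot") -> bool:
    """Return True if the given robots.txt rules block the agent from crawling url."""
    rp = urllib.robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return not rp.can_fetch(agent, url)

def has_noindex(html: str) -> bool:
    """Crude check for a robots noindex meta tag in a page's HTML."""
    html = html.lower().replace("'", '"')
    return 'name="robots"' in html and "noindex" in html

# Hypothetical robots.txt resembling a typical WordPress setup
robots = """User-agent: *
Disallow: /wp-admin/
Disallow: /wp-content/plugins/
"""

print(is_blocked(robots, "https://www.example.com/wp-content/plugins/foo/"))  # True
print(is_blocked(robots, "https://www.example.com/listings/"))                # False
print(has_noindex('<meta name="robots" content="noindex, follow">'))          # True
```

Running each indexed-but-thin URL through checks like these would show quickly whether the removal requests failed because neither condition (noindex or robots.txt block) was actually in place.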
You've also got a good number of unnatural links and this can affect your rankings as well. Here are some examples:
http://niresource.com/detail/Business/Business_Travel/
http://www.londovor.com/Business/Real-Estate/?s=A&p=1161
I'm guessing that the solution is probably not as simple as just removing those pages from the index unfortunately.