Sudden Increase in Number of Pages Indexed by Google Webmaster Tools When No New Pages Added
-
Greetings MOZ Community:
On June 14th Google Webmaster Tools indicated an increase in the number of indexed pages, from 676 to 851. No new pages had been added to the domain in the previous month. The number of pages blocked by robots.txt also increased during that period, from 332 (June 1st) to 551 (June 22nd), yet the number of indexed pages still rose to 851.
The following changes occurred between June 5th and June 15th:
-A redesigned version of the site was launched on June 4th, with some links to social media and the blog removed on some pages, but with no new URLs added. The platform was and is WordPress.
-Google Tag Manager (GTM) code was added to the site.
-An exception to ModSecurity was made by our hosting company on our server (for iframes) to allow GTM to function.
In the last ten days my web traffic has declined about 15%; more importantly, the quality of traffic has declined enormously, and the number of new inquiries we receive is off by around 65%. Pages per visit have declined from about 2.55 to about 2.
Obviously this is not a good situation.
My SEO provider, a reputable firm endorsed by MOZ, believes the extra 175 pages indexed by Google, pages that do not offer much content, may be causing the ranking decline.
My developer is examining the issue. They think there may be some tie-in with the installation of GTM. They have also noticed an additional issue: the site's Contact Us form will not work when the GTM script is enabled. They find it curious that both issues appeared around the same time.
Our domain is www.nyc-officespace-leader.com. Does anyone have any idea why these extra pages are appearing and how they can be removed? Has anyone had experience with GTM causing issues like this?
Thanks everyone!!!
Alan -
Yes, and I appreciate it!
Alan -
I did what you asked me to do.
-
- in my first post and repeated frequently.
-
Hi Egol:
How did you locate this duplicate or re-published content?
Obviously what you have pointed out is a major source of concern, so I ran a Copyscape search this afternoon for duplicate content and did not locate any of the URLs you mention in the "this" and "this" links above. It appears you entered the URL of the blog post in Google's search bar. Would that work? That method would be pretty slow going with 600 URLs.
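One alternative to feeding 600 URLs into Copyscape one at a time is a rough self-serve check for duplicated text between your own pages. The sketch below (names and thresholds are my own, and it assumes you already have each page's text, e.g. exported from a crawler) compares pages by word-shingle overlap:

```python
# Rough internal duplicate check: compare pages by 5-word shingle overlap.
# Assumes page text is already extracted; fetching is left out for brevity.
import re

def shingles(text, k=5):
    """Return the set of k-word shingles for a block of text."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity of two shingle sets (1.0 = identical)."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

def near_duplicates(pages, threshold=0.8):
    """pages: dict of url -> page text. Returns (url, url, score) pairs."""
    urls = list(pages)
    sets = {u: shingles(pages[u]) for u in urls}
    return [(u, v, jaccard(sets[u], sets[v]))
            for i, u in enumerate(urls)
            for v in urls[i + 1:]
            if jaccard(sets[u], sets[v]) >= threshold]
```

This only finds duplication within your own site; for content scraped by other domains you would still need Copyscape or manual spot checks.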
Thanks,
Alan -
Those are the 448 URLs from your website that have been filtered.
You should find garbage in them, like what is shown below.
Have you done what I have suggested three times above? Do that if you want to identify the problem pages.
-
www.nyc-officespace-leader.com/wp-content/plugins/...
A description for this result is not available because of this site's robots.txt – learn more.
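Snippets like "a description for this result is not available because of this site's robots.txt" are the classic sign of URLs that are blocked from crawling but still indexed: robots.txt stops Googlebot from fetching a page, not from indexing its URL. One common fix, sketched here for Apache (the file extensions and placement are assumptions; adjust to the actual setup), is to serve an `X-Robots-Tag: noindex` header on the plugin URLs instead, which requires removing the matching `Disallow` lines from robots.txt so Google can actually see the header:

```apacheconf
# Sketch: place in wp-content/plugins/.htaccess (assumes mod_headers).
# Google must be able to crawl these URLs to see the header, so the
# corresponding Disallow rules in robots.txt have to be removed first.
<IfModule mod_headers.c>
    <FilesMatch "\.(php|txt|css|js)$">
        Header set X-Robots-Tag "noindex, nofollow"
    </FilesMatch>
</IfModule>
```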
-
-
Hi Egol:
Thanks for the suggestion.
When I click on "repeat the search with the omitted results included" I get 448 results, not the entire 859. That seems very strange. Some of these URLs have light content, but I don't believe they are duplicates. I don't see any content from outside our website when I click this.
Am I doing something wrong? I would think the total of 859 would appear, not 448 URLs.
Thanks!!
Alan -
I don't know. You should ask someone who knows a lot about canonicalization.
Did you drill down through all of those indexed pages to see if you can identify all of them?
I've suggested it twice.
-
Hi Egol:
In the context of launching an upgraded site, could the canonicalization have been implemented incorrectly? That could account for the sudden appearance of 175 new pages, as the thin content has been there for some time.
I am particularly suspicious of canonicalization because there was an issue involving multi-page URLs of property listings when the site was migrated from Drupal to WordPress last summer.
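One quick way to test that suspicion is to pull the `rel="canonical"` tag out of each page and check it against the URL it lives on. A minimal sketch using only the standard library (it assumes you can feed in each page's HTML, e.g. from a crawl export):

```python
# Sketch: extract the rel=canonical href from a page's HTML so you can
# audit canonicals across the whole URL list after a migration.
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Records the href of the first <link rel="canonical"> seen."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and (a.get("rel") or "").lower() == "canonical":
            self.canonical = a.get("href")

def find_canonical(html):
    parser = CanonicalFinder()
    parser.feed(html)
    return parser.canonical
```

Pages with no canonical at all, or paginated listing pages whose canonicals point at themselves rather than the main listing, would be consistent with extra URLs suddenly becoming indexable.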
Thoughts?
Thanks, Alan
-
Apparently infitter24.rssing.com/chan-13023009/all is poaching my content, taking my original content and adding it to their site. I am not quite sure what to do about that.
You can have an attorney demand that they stop, or you can file DMCA complaints. Be careful.
**However it does not explain the sudden appearance of the 175 pages in Google's index.**
-
Do this query: site:www.nyc-officespace-leader.com
-
Start drilling down the SERPs. One page at a time. Look for content that you didn't make. Look for duplicates.
-
Get a spreadsheet that has all of your URLs. Drill down through the SERPs, checking every one of them. Can you account for your pagination? You have a lot of it, and that type of page is usually rubbish in the index. Combine, canonicalize, or get rid of them.
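The URL spreadsheet for that drill-down can be generated straight from the XML sitemap rather than typed by hand. A sketch (the `sitemap.xml` location is an assumption; WordPress SEO plugins often use `sitemap_index.xml` with child sitemaps instead):

```python
# Sketch: dump every <loc> URL from a sitemap into a CSV checklist
# with blank columns to fill in during the site: drill-down.
import csv
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Return the list of <loc> URLs in a sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

def write_checklist(urls, path="urls.csv"):
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["url", "indexed?", "notes"])
        for u in urls:
            writer.writerow([u, "", ""])
```

Anything Google has indexed that is NOT in this list (plugin files, tag pages, scraped copies) is exactly the "garbage behind the omitted-results link" being discussed.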
-
-
Hi Egol:
Thanks so much for taking the time for your thorough response!!
Apparently infitter24.rssing.com/chan-13023009/all is poaching my content, taking my original content and adding it to their site. I am not quite sure what to do about that.
You have pointed out something very useful and I appreciate it and will act upon it. However, it does not explain the sudden appearance of the 175 pages in Google's index that did not appear at the end of May and somehow coincided with the upload of the new version of our website in early June. Any ideas???
Thanks,
Alan -
-
Do this query: site:www.nyc-officespace-leader.com
-
Start drilling down the SERPs. One page at a time. Look for content that you didn't make. Look for duplicates.
-
When you drill down about 44 pages you will find this...
In order to show you the most relevant results, we have omitted some entries very similar to the 440 already displayed.
If you like, you can repeat the search with the omitted results included.
The bad stuff is usually behind that link. Google doesn't want to show that stuff to people. It could be thin, it could be duplicate, it could be spammy; they just might not like it.
- Find out what is in there.
Possible problems that I see....
I see dupe content like this and this. Either your guys are grabbin' somebody else's content or they are grabbin' yours. That can get you in trouble with Panda. You need original and unique. Anything that is not original and unique should be deleted, noindexed, or rewritten.
A lot of these pages are really skimpy. Thin content can get you into trouble with Panda. Anything that is skimpy should be deleted, noindexed, or beefed up.
I see multiple links to tags on lots of these posts. That can cause duplicate content problems.
The tag pages are paginated with just a few pages on each. These can generate extra pages that are low value, suck up your linkjuice or compound duplicate content problems.
You have archive pages, category pages, and more pagination problems.
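A quick way to quantify how much of the index is this kind of tag/category/archive/pagination noise is to bucket the URL list by path pattern. The patterns below are WordPress defaults and are an assumption; adjust them to the site's actual permalink structure:

```python
# Sketch: classify URLs into low-value buckets (pagination, tag,
# category, date archive, plugin file) vs. real content pages.
import re

# Order matters: /tag/x/page/2/ should count as pagination, not tag.
PATTERNS = [
    ("plugin-file", re.compile(r"/wp-content/plugins/")),
    ("pagination", re.compile(r"/page/\d+")),
    ("tag", re.compile(r"/tag/")),
    ("category", re.compile(r"/category/")),
    ("date-archive", re.compile(r"/20\d{2}/\d{2}/?$")),
]

def classify(url):
    for label, pat in PATTERNS:
        if pat.search(url):
            return label
    return "content"

def summarize(urls):
    counts = {}
    for u in urls:
        label = classify(u)
        counts[label] = counts.get(label, 0) + 1
    return counts
```

If the non-"content" buckets together come to roughly the 175 extra indexed pages, that would support the thin-pagination explanation over the GTM one.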
-