WMT only showing half of a newly submitted XML site map
-
After upgrading the design and theme on a relatively high-traffic WordPress site, I created an XML sitemap with Yoast SEO, since WP Engine didn't allow the old XML sitemap plugin I was using.
A site:www.mysite.com search shows Google is indexing about 1,100 pages on my site, yet the XML sitemap I submitted shows "458 URLs submitted and 467 URLs indexed."
These numbers are about half of what they should be. My old sitemap had about 1,100 URLs with 965 or so indexed (I used noindex on some low-value pages).
Any ideas as to what may be wrong?
-
I just did a site: search for your domain and it looks like 1,140 pages are indexed, so I'm assuming this sorted itself out?
Congrats! Marking as answered.
-
You won't get a duplicate content penalty; having duplicate content is not a crime unless you are doing some large-scale spamming. Duplicate content won't help, but it won't hurt either. Noindexing, on the other hand, will hurt; even with "follow" you still lose some link equity. Use a canonical tag to fix your problem, not noindex.
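For reference, the canonical hint Alan is suggesting is a single link element in the head of the duplicate page pointing at the preferred URL (the URLs below are placeholders, not the poster's actual site):

```html
<!-- Placed in the <head> of the duplicate or archive page. -->
<!-- example.com is a placeholder URL for illustration only. -->
<link rel="canonical" href="https://www.example.com/preferred-page/" />
```

Unlike noindex, this keeps the page crawlable and consolidates ranking signals onto the preferred URL.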
As for the sitemap, my suspicion is that not all of the child sitemaps are being read. I don't know much about Yoast sitemaps; I always use the standard XML format.
Bing and Google also offer their own sitemap-generation tools that you can use to build your sitemap for you.
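For context on the "not all the maps are being read" theory: Yoast SEO publishes a sitemap index rather than one flat sitemap file, and Webmaster Tools has to fetch each child sitemap listed in the index separately. If some child files fail to fetch, the "URLs submitted" count comes up short. A minimal index looks roughly like this (file names and dates are illustrative, not taken from the poster's site):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Each <sitemap> entry is a child file the search engine must fetch on its own. -->
  <sitemap>
    <loc>https://www.example.com/post-sitemap.xml</loc>
    <lastmod>2013-05-01T12:00:00+00:00</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/page-sitemap.xml</loc>
    <lastmod>2013-05-01T12:00:00+00:00</lastmod>
  </sitemap>
</sitemapindex>
```

A quick sanity check is to open each child sitemap URL in a browser and confirm it loads and that the per-file URL counts add up to the expected total.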
-
Thanks Alan,
Sure, here is the site map: http://www.nationalbankruptcyforum.com/sitemap_index.xml
As far as noindexing pages is concerned, I always use noindex,follow, but I choose to noindex category and author archive pages because I think they can cause duplicate content / Panda issues.
John
-
Can we see your sitemap.xml to look for any problems?
I would not be too concerned, as sitemaps are not much help for sites with good internal linking. According to Duane Forrester of Bing, a sitemap should not include all of your links, only the main pages.
What is a concern is the noindexing of pages you mention. Any links pointing to non-indexed pages waste their link juice; there is nothing to gain by noindexing pages, but a lot to lose. If you really must noindex a page, use the meta tag noindex,follow so the search engine still follows the links and you get some of the link juice back.
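The tag described above goes in the head of the page to be excluded; the "follow" directive tells crawlers to keep following the page's links even though the page itself stays out of the index:

```html
<!-- In the <head> of the page to be kept out of the index. -->
<!-- "follow" lets link equity continue to flow through the page's links. -->
<meta name="robots" content="noindex,follow" />
```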