WMT only showing half of a newly submitted XML site map
-
After upgrading the design and theme on a relatively high-traffic WordPress site, I created an XML sitemap with Yoast SEO, since WP Engine didn't allow the old XML sitemap plugin I was using.
A site:www.mysite.com search shows Google is indexing about 1,100 pages on my site, yet the XML sitemap I submitted shows "458 URLs submitted and 467 URLs indexed."
These numbers are about half of what they should be. My old sitemap had about 1,100 URLs with 965 or so indexed (I used noindex on some low-value pages).
Any ideas as to what may be wrong?
-
I just did a site: search for your domain and it looks like 1,140 pages are indexed, so I'm assuming this sorted itself out?
Congrats! Marking as answered.
-
You won't get a duplicate content penalty; having duplicate content is not a crime unless you are doing some large-scale spamming. Duplicate content won't help, but it won't hurt either. Noindexing will hurt; even with follow you still lose some link juice. Use a canonical tag to fix your problem, not noindex.
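For example, a canonical tag is just one line in the <head> of the duplicate page; this is a minimal sketch with a placeholder URL, not anything pulled from your site:

```html
<!-- The duplicate page stays crawlable, but ranking signals are
     consolidated onto the preferred URL named here. -->
<link rel="canonical" href="http://www.example.com/preferred-version/" />
```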
As for the sitemap, my suspicion is that not all of the maps are being read. I also don't know much about Yoast sitemaps; I always use the XML standard.
Bing and Google also have their own sitemap-generation tools that you can use to have your sitemap built for you.
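For what it's worth, Yoast doesn't output a single flat sitemap; it outputs an index that points to several child maps, and each child has to be fetched and read separately, so if one of those children isn't being read its URLs never show up in your submitted count. The sketch below is only illustrative; the child file names and dates are assumed Yoast-style defaults, not taken from your install:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Each <sitemap> entry is a separate file the search engine must fetch;
       a blocked or erroring child means its URLs are never counted. -->
  <sitemap>
    <loc>http://www.example.com/post-sitemap.xml</loc>
    <lastmod>2013-05-01T12:00:00+00:00</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/page-sitemap.xml</loc>
    <lastmod>2013-05-01T12:00:00+00:00</lastmod>
  </sitemap>
</sitemapindex>
```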
-
Thanks Alan,
Sure, here is the site map: http://www.nationalbankruptcyforum.com/sitemap_index.xml
As far as noindexing pages is concerned, I always use noindex,follow, but I choose to noindex category and author archive pages as I think they can cause duplicate content / Panda issues.
John
-
Can we see your sitemap.xml so we can look for any problems?
I would not be too concerned, as sitemaps are not much help for sites that have good internal linking; according to Duane Forrester of Bing, a sitemap should not include all of your URLs, only the main pages.
What is a concern is the noindexing of pages you mention. Any links pointing to noindexed pages are wasting their link juice; there is nothing to gain by noindexing pages, but a lot to lose. If you really must noindex a page, use the meta tag noindex,follow so the search engine follows the links and you get some of the link juice back.
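A minimal sketch of that tag, placed in the <head> of the page you want kept out of the index:

```html
<!-- Keeps this page out of the index while still letting crawlers follow
     its links, so some (though not all) of the link juice passes through. -->
<meta name="robots" content="noindex,follow" />
```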