Having Problems Indexing All URLs in the Sitemap
-
Hi all again! Thanks in advance! My client's site is having trouble getting all of its pages indexed. I even bought the full version of the XML Sitemaps extension, and the number of URLs in the sitemap increased, but we still can't get all of them indexed.
What could the reasons be? The robots.txt is open to all robots; we only prohibit users and spiders from entering our intranet. I've read that duplicate content and 404s can be causes. Anything else?
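One quick thing worth verifying is that the robots.txt really does allow everything except the intranet. A minimal sketch using Python's standard-library `urllib.robotparser`, with a hypothetical domain and paths standing in for the real site (the rules mirror the setup described above):

```python
# Sanity-check robots.txt rules without any network access by
# parsing them from a string. Domain and paths are hypothetical.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /intranet/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

def googlebot_can_fetch(path):
    """Return True if Googlebot is allowed to crawl the given path."""
    return parser.can_fetch("Googlebot", f"https://example.com{path}")

print(googlebot_can_fetch("/products/widget"))  # public page: True
print(googlebot_can_fetch("/intranet/login"))   # blocked: False
```

Running the same check against the live robots.txt (via `parser.set_url(...)` and `parser.read()`) for a sample of sitemap URLs can catch an overly broad Disallow rule early.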
-
As already answered above, it's quite hard to reach a 100% indexation rate for your web pages. What's your current indexation rate? If it's below 90%, you might still have issues somewhere.
-
Hi there
According to Google:
"...we don't guarantee that we'll crawl all of the pages of a particular site. Google doesn't crawl all the pages on the web, and we don't index all the pages we crawl. It's perfectly normal for not all the pages on a site to be indexed."
Google also provides tips and resources to help get your site indexed properly and, possibly, more fully. You can check that resource out here. Kissmetrics has a few other tips as well.
To Andy's point, Google indexes what it wants, so don't be discouraged if your entire site isn't indexed in WMT. If you know how many pages are on your site (which you definitely should), I would try the "site:" operator to get a better idea of what actually is indexed in Google.
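The comparison above has two halves: how many URLs you publish, and how many Google reports for `site:yourdomain.com`. The first half is easy to automate by counting `<loc>` entries in the sitemap. A minimal sketch with Python's standard-library `xml.etree`, using an inline example sitemap (in practice you would fetch your real sitemap.xml):

```python
# Count the URLs declared in a sitemap so the total can be compared
# against a manual "site:" search. The sitemap below is a tiny
# inline example with a hypothetical domain.
import xml.etree.ElementTree as ET

sitemap_xml = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about</loc></url>
  <url><loc>https://example.com/products/widget</loc></url>
</urlset>"""

# Sitemap files use the sitemaps.org namespace, so findall needs it.
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(sitemap_xml)
urls = [loc.text for loc in root.findall(".//sm:loc", ns)]

print(f"{len(urls)} URLs in sitemap")
print("Compare against a manual search for: site:example.com")
```

The `site:` count itself is only a rough estimate and there's no supported API for it, so the second half of the comparison stays manual.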
Hope this helps! Good luck!
-
I have yet to see Google index every page of a site. It tends not to index pages it doesn't think meet its criteria, so unless something like 90% of your pages weren't being indexed, I wouldn't worry about it. There are many reasons Google won't index a page.
You will also find that over time, as Google attributes more trust to the site, more pages will be indexed.
Of course, you can do things to improve your chances, such as making sure Google can crawl all pages, checking that there are no bottlenecks anywhere, and the big one: making sure your content is amazing. As long as the site is the best it can be, the number of indexed pages will increase over time.
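The "check for bottlenecks" step above usually starts with crawling the sitemap URLs yourself and triaging the status codes: 404s and redirect chains are common reasons pages drop out of the index. A small sketch of that triage; the `(url, status)` pairs here are illustrative stand-ins for results you would collect by actually requesting each sitemap URL:

```python
# Triage crawled (url, status_code) pairs into ok / redirect / broken.
# Sample data uses a hypothetical domain; real results would come
# from requesting each URL listed in the sitemap.

def triage(crawl_results):
    """Split (url, status) pairs by how Google is likely to treat them."""
    report = {"ok": [], "redirect": [], "broken": []}
    for url, status in crawl_results:
        if status == 200:
            report["ok"].append(url)
        elif 300 <= status < 400:
            report["redirect"].append(url)
        else:
            report["broken"].append(url)
    return report

sample = [
    ("https://example.com/", 200),
    ("https://example.com/old-page", 301),
    ("https://example.com/missing", 404),
]
report = triage(sample)
print(report["broken"])   # candidates for removal from the sitemap
print(report["redirect"]) # sitemap should list final URLs, not redirects
```

Sitemap best practice is to list only final, 200-status URLs, so anything in the "redirect" or "broken" buckets is worth fixing or removing.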
Remember - the sitemap is not a guarantee that pages will be indexed. It just helps Google crawl your site.
-Andy
Related Questions
-
URL Indexing with Keyword
Hi, my web page's URL is indexed in Google but doesn't show when searching for the main keyword. How can I get it indexed for that keyword? It should show in the SERPs when the keyword is searched. Any suggestions?
Technical SEO | green.h
-
Which URL is better?
Hi everyone, could you please help me pick the right URL for my company's website? We are MoonCreate and we make beautiful clothes. Unfortunately, the domain mooncreate.com is not available and I have to choose between mooncreatebrand.com and mooncreatewear.com. Which one is better, in your opinion? Looking forward to receiving your suggestions! Thank you! 🙂
Technical SEO | kirupa
-
Updating content on URL or new URL
Hi Mozzers, We are an event organisation. Every year we produce around 350 events, and all of them are on our website. A lot of these events are held every year, so I have a URL like www.domainname.nl/eventname. So what would you do? This URL has some inbound links, some social mentions, and so on. If the event is held again in 2013, would it be better to update the content on this URL or create a new one? I would keep this URL and update it because of the link value, and because it is already indexed and ranking for the desired keyword for that event. Cheers, Ruud
Technical SEO | RuudHeijnen
-
Noob 101 - Sitemaps
Hi guys, looking for some sitemap help. I'm running two separate systems, so my auto-generated sitemap on the main system has a few holes in it. I'd like to submit this to Webmaster Tools anyway, and then plug the holes by adding the missing pages via 'Fetch as Google'. Does that make sense, or will Google ignore one of them? Many thanks, Idiot
Technical SEO | uSwSEO
-
How to find original URLS after Hosting Company added canonical URLs, URL rewrites and duplicate content.
We recently changed hosting companies for our ecommerce website. The hosting company added some functionality such that duplicate content and/or mirrored pages appear in the search engines. To fix this problem, the hosting company created both canonical URLs and URL rewrites. Now, we have page A (which is the original page with all the link juice) and page B (which is the new page with no link juice or SEO value). Both pages have the same content, with different URLs. I understand that a canonical URL is the way to tell the search engines which page is the preferred page in cases of duplicate content and mirrored pages. I also understand that canonical URLs tell the search engine that page B is a copy of page A, but page A is the preferred page to index. The problem we now face is that the hosting company made page A a copy of page B, rather than the other way around. But page A is the original page with the seo value and link juice, while page B is the new page with no value. As a result, the search engines are now prioritizing the newly created page over the original one. I believe the solution is to reverse this and make it so that page B (the new page) is a copy of page A (the original page). Now, I would simply need to put the original URL as the canonical URL for the duplicate pages. The problem is, with all the rewrites and changes in functionality, I no longer know which URLs have the backlinks that are creating this SEO value. I figure if I can find the back links to the original page, then I can find out the original web address of the original pages. My question is, how can I search for back links on the web in such a way that I can figure out the URL that all of these back links are pointing to in order to make that URL the canonical URL for all the new, duplicate pages.
Technical SEO | CABLES
-
Keyword and URL
I have a client who has a popular name (like 'Joe Smith'). His blog URL has only his first name and the name of his company in it, like joe.company.com. His blog doesn't rank well at all in the first 3-4 Google SERPs. I was thinking of advising him to change the URL of his blog to joesmith.company.com, and having his webmaster do 301 redirects from the old URL to the new one. Do you think this is a good strategy, or would you recommend something else? I realize ranking isn't just about the URL, it's about links, etc. But I think making his URL more specific to his name could help. Any advice greatly appreciated! Jim
Technical SEO | JamesAMartin
-
Blank Canonical URL
So my devs have the canonical URL loaded onto pages automatically, and in most cases this gets done correctly. However, we ran across a bug that left some of these blank. Does anyone know what effect that would have? I am trying to assign a priority to this so I can say "FIX IT NOW" or "Fix it after the other 'FIX IT NOW' items". Let me know if you have any ideas. I just want to be sure I am not telling Google that all of these pages are like the home page. Thanks!
Technical SEO | SL_SEM
-
Are URLs with a trailing slash seen as two different URLs?
Hello, are http://www.example.com and http://www.example.com/ seen as two different URLs, just as with www vs. non-www? Or does it make no difference?
Technical SEO | seoug_2005