Webmaster Tools Data Discrepancies/Anomalies
-
Hi
I'm looking at the GWT account for a client of mine and can see that in the Index Status area 400+ pages are indexed, so all seems OK there! But then in the Sitemaps area 111 pages have been submitted but only 1 is indexed!
Any ideas what's going on here?
Cheers
Dan
-
the moz bar says they are 301'd, so I presume so, but will double check with dev
i suppose if all are definitely 301'd for sure, then just wait and see what happens after the sitemap is updated correctly and take it from there
-
It's likely they used 301 then if that's what they said. You can check using an HTTP header tool. Uppercase in the Google results is odd as well. You'll want to double check this stuff and get the sitemap changed to help prompt the use of lowercase instead.
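If you'd rather not rely on a browser-based header tool, here's a minimal sketch of the same check using only the Python standard library. It fetches a URL without following redirects, so you can see the raw status code and the `Location` target (the URL and helper names are illustrative, not part of any specific tool):

```python
import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Suppress automatic redirect following so we see the raw 3xx status."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

def check_redirect(url):
    """Return (status_code, Location header) for a single request."""
    opener = urllib.request.build_opener(NoRedirect())
    try:
        resp = opener.open(url, timeout=10)
        return resp.status, resp.headers.get("Location")
    except urllib.error.HTTPError as e:
        # With redirects suppressed, urllib raises HTTPError for 3xx responses
        return e.code, e.headers.get("Location")

def is_permanent(status):
    """301 and 308 are permanent redirects; 302, 303 and 307 are temporary."""
    return status in (301, 308)
```

Running `check_redirect("http://example.com/Some-Page")` against the uppercase URLs would tell you immediately whether dev used a 301 or a 302.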
-
Re 302 vs 301: do you mean there's a possibility dev rewrote via a 302 instead of a 301? (They had told me they used a 301.)
Re Mozscape not updating: I just checked the Google search results and the uppercase versions are indexed there, and the last cache date was 12 March.
-
Possibly due to 302 vs 301 redirects. Those pages still load, and Mozscape hasn't updated those pages from its previous crawls yet. Fixing the sitemap will help, and further internal and external links pointing to the lowercase URLs rather than the uppercase ones will help too.
-
Great, thanks!!
Sorry, one final question, since I just noticed another related anomaly.
The URLs have been redirected to lowercase for over a month now, but the URLs showing in ranking reports are still mainly the uppercase versions. How can that be?
Cheers
Dan
-
Yup. Update the sitemap with your current and accurate URLs and you should be on your way.
-
thanks for confirming that Ryan!
So use a 301 as the method for 'rewriting' uppercase to lowercase, i.e. the 301 is the rewrite?
They had told me they had already applied 301s from uppercase to lowercase, so I guess it's probably just a case of updating the sitemap URLs to resolve this?
all best
dan
-
It certainly could. You'd also want dev to use 301 redirection when applying a change like that. The sitemap isn't a hard-and-fast rule for what you want indexed in the search engines, but rather an aid to help them crawl your site. If it's in error they'll still be able to crawl and index more than what's listed in the sitemap. Ideally, though, your sitemap is an accurate reflection of the pages of your site.
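For what it's worth, the server-side logic dev would apply is conceptually simple. This is a rough sketch of the decision (not anyone's actual implementation; in practice this would live in Apache/nginx rewrite rules or app middleware): if the requested path contains uppercase characters, 301 to the lowercase version, otherwise serve the page. Only the path is lowered, since query-string values can legitimately be case-sensitive.

```python
from urllib.parse import urlsplit, urlunsplit

def lowercase_redirect_target(url):
    """If the URL path contains uppercase characters, return the
    lowercase URL to 301 to; otherwise return None (serve as-is).
    The query string and fragment are left untouched."""
    parts = urlsplit(url)
    lower_path = parts.path.lower()
    if lower_path == parts.path:
        return None  # already lowercase, no redirect needed
    return urlunsplit(
        (parts.scheme, parts.netloc, lower_path, parts.query, parts.fragment)
    )
```

So a request for `/Golf-Balls` gets a single 301 to `/golf-balls`, and the lowercase URL itself never redirects (avoiding a loop).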
-
thanks Ryan
I will do that, but just to confirm re the contradictory messages: is that because there are 400+ pages indexed (hence that info showing in Index Status), BUT in the Sitemaps area only the home page is showing because all the other URLs in the sitemap have probably changed?
I had instructed dev to rewrite uppercase URLs to lowercase and amend the sitemap. If they have done one but not the other, could this be the sort of thing causing this discrepancy in the GWT messaging?
All Best
Dan
-
I would double check the sitemap file to see if it's still accurate or if the site has changed and the URLs in the sitemap no longer correspond to the structure of the site. My guess is that the root domain is the only one of the 111 that is correctly listed. Cheers!
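A quick way to do that double check is to pull every `<loc>` out of the sitemap file and flag any entries that are still uppercase, since those are exactly the submitted-but-redirecting URLs that would explain 111 submitted / 1 indexed. A minimal stdlib sketch (the example URLs are placeholders):

```python
import xml.etree.ElementTree as ET

# Standard sitemap protocol namespace
SITEMAP_NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Extract every <loc> value from a standard XML sitemap."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.findall(".//sm:loc", SITEMAP_NS)]

def stale_uppercase_urls(urls):
    """Flag URLs still containing uppercase characters, a sign the
    sitemap wasn't regenerated after the lowercase rewrite."""
    return [u for u in urls if u != u.lower()]
```

If the root domain is the only all-lowercase entry, that would match Ryan's guess about the 1 indexed URL.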