Low Index: 72 pages submitted and only 1 Indexed?
-
Hi Mozers,
I'm pretty stuck on this and wondering if anybody can give me a heads-up on what might be causing the issue.
I have 3 top-level domains: NZ, AU, and USA. For some odd reason I seem to be having a real issue with these pages indexing, and also with the sitemaps, and I'm considering hiring someone to get the issue sorted, as neither I nor my developer can seem to find the problem.
I have attached an example of the sitemap_au.xml file. As you can see, only 1 page has been indexed out of the 72 that were submitted. Because we host all of our domains on the same server, I was told last time that our sitemaps were possibly being overwritten, hence the reason we have sitemap_au.xml (and the same for sitemap_nz.xml and sitemap_us.xml). I also originally had a sitemap.xml for each.
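For reference, the intention with the renamed files is that each domain points only at its own sitemap, e.g. via that domain's robots.txt (layout assumed; adjust to your setup):

```
# robots.txt served on www.zenory.com.au (assumed layout;
# the .co.nz and .com robots.txt files would each point at their own file)
User-agent: *
Sitemap: https://www.zenory.com.au/sitemap_au.xml
```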
Another issue I'm having is that the meta description for each home page on the USA and AU sites is showing the New Zealand meta description, but when you look at the .com and .com.au source code, the meta description tags are all different, as you can see here http://bit.ly/1KTbWg0 and here http://bit.ly/1AU0f5k
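In case it helps anyone reproduce that check, here is a rough sketch of how the meta description can be pulled out of each homepage's HTML (Python standard library; the helper names and the stub page are just for illustration — in practice you'd feed it the HTML actually fetched from each domain):

```python
from html.parser import HTMLParser

class MetaDescriptionParser(HTMLParser):
    """Collects the content of <meta name="description"> from an HTML page."""
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        if tag == "meta" and self.description is None:
            d = dict(attrs)
            if d.get("name", "").lower() == "description":
                self.description = d.get("content")

def meta_description(html):
    """Return the page's meta description, or None if there isn't one."""
    parser = MetaDescriptionParser()
    parser.feed(html)
    return parser.description

# In practice you'd fetch each homepage first, e.g.
#   urllib.request.urlopen("https://www.zenory.com.au/").read().decode()
# and compare the results across the .com, .com.au, and .co.nz domains.
# Here a stub page stands in for the fetched HTML:
page = '<html><head><meta name="description" content="AU psychics"></head></html>'
print(meta_description(page))  # AU psychics
```

If the servers really are returning different descriptions, the mismatch you see in search results may just be what Google has cached, not what the source currently contains.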
Any advice around this would be so much appreciated!
Thanks
Justin
-
Hi,
Yes, you've got it spot on: 301s are there to keep old URLs pointing to the new ones, but only the new URLs should be in the sitemap.
When you've crawled the live site and are ready to make your sitemap, you can manually right-click and remove any URL you don't want in there before generating it.
Kind Regards
Jimmy
-
Hey Jimmy,
Wow thanks so much for your great feedback, much appreciated!
Just want to clarify your answer about the 301s. So it is okay to create the 301s to direct our users to the new URLs, but not good to include the old URLs in the sitemap? Am I correct in saying this, or am I totally off track?
I think what's also happened is that the sitemap from Screaming Frog included some old URLs as well as new ones. I'm now seeing two of our contact pages indexed for the com.au site: one is the older URL and the other is the new URL.
Let me know your feedback
Cheers again Jimmy
-
Hi Justin,
Yes, as long as WMT is specifically watching the HTTPS website, then unfortunately the problem is not in the WMT setup.
As hectormainar says, check your sitemap in Screaming Frog:
1. Go to your sitemap.xml and save it to your computer.
2. Change the frog to list mode.
3. Open your sitemap and run the crawl. All the links in the sitemap should report 200, and any 301s should be swapped for the direct versions.
The 301 is good to maintain backwards compatibility and allow backlinks and old users to navigate to your new content, but it shouldn't be used as major navigation.
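If you'd rather script that check, a minimal sketch (Python standard library only; the sample sitemap and helper names are just for illustration) would pull every <loc> out of the sitemap and then ask each URL for its status without following redirects, so that 301s stand out:

```python
import urllib.request
import urllib.error
import xml.etree.ElementTree as ET

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Return every <loc> value in a sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(NS + "loc")]

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from silently following 301/302s."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

def status_of(url):
    """HTTP status without following redirects: a 301 here means the
    sitemap entry should be replaced by its Location target."""
    opener = urllib.request.build_opener(NoRedirect)
    req = urllib.request.Request(url, method="HEAD")
    try:
        return opener.open(req).status
    except urllib.error.HTTPError as e:
        return e.code

# A stand-in sitemap; you'd read the real sitemap_au.xml instead.
sample = """<?xml version="1.0"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.zenory.com.au/psychic-readings</loc></url>
</urlset>"""

for url in sitemap_urls(sample):
    print(url)  # every one of these should come back 200 from status_of(url)
```

Anything that status_of reports as 301 is an entry to swap for the URL it redirects to before resubmitting the sitemap.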
Kind Regards
Jimmy
-
Hey Jimmy,
Thanks for the heads up! Yes, I have been watching this via WMT. I also used Screaming Frog to generate the sitemaps and gave them to my developer, who then gave me the URL to submit to Google.
I also used HTTPS. I hope that helps?
Let me know if you have any further questions
Cheers Jimmy thanks again
-
Hi Hectormainar,
I understand what you're saying. Yes, we had https://www.zenory.com.au/psychic-readings/psychic-readings before we updated the URLs to https://www.zenory.com.au/psychic-readings
After doing this we were told to add 301 redirects, so I'm a little confused now as to why it should not be done, since our visitors would otherwise land on the old URLs?
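For context, the redirects we added are simple one-liners, roughly like this (Apache .htaccess assumed; my developer handles the actual server config):

```apache
# Old nested URL permanently redirected to the new flat one
Redirect 301 /psychic-readings/psychic-readings https://www.zenory.com.au/psychic-readings
```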
I used Screaming Frog to generate the sitemaps, and I think it may have included the old URLs, though I'm not sure exactly which ones it included. Is there a way to check this?
Thanks for your help
Justin
-
Which URLs do you include in your sitemap? Could you check whether you are trying to index
https://www.zenory.com.au/psychic-readings/psychic-readings or https://www.zenory.com.au/psychic-readings ? The first one is the URL you link to in your menus, but it 301-redirects to the second format (and the same goes for the rest of the main options). That is quite a bad idea. Please make sure you include the final address in the sitemap, not the 301-redirected one. That could be what's causing Google Webmaster Tools not to show that page as indexed in your sitemap: although the final page is properly indexed in Google (as you can check by searching site:www.zenory.com.au), GWT is not able to match the two addresses.
-
Hi Justin,
It is hard to tell from your screenshot, but which website are you watching in Webmaster Tools? As you are using HTTPS, the website to track would have to be the HTTPS one, as a recent WMT update now classifies these separately.
Having crawled your sites with Screaming Frog, I don't see any smoking guns as to why the pages would not be indexed.
Let me know about the WMT account
Kind Regards
Jimmy
-
Hi Michael,
Thanks for your response! I have also done a site:yourdomain search, and it is also showing quite low numbers compared to the amount of pages submitted: USA is showing 10 pages indexed, AU slightly more, and NZ a lot more.
-
Webmaster Tools is not always a current, accurate reflection of what is actually indexed.
A search in Google for site:yourdomain.com will show the accurate information.