Low Index: 72 pages submitted and only 1 Indexed?
-
Hi Mozers,
I'm pretty stuck on this and wondering if anybody can point me in the right direction on what might be causing these issues.
I have 3 top-level domains: NZ, AU, and USA. For some odd reason I seem to be having a real issue with these pages indexing, and with the sitemaps too, and I'm considering hiring someone to get the issue sorted, as neither I nor my developer can seem to find the cause.
I have attached an example of the sitemap_au.xml file. As you can see, only 1 page has been indexed out of the 72 that were submitted. Because we host all of our domains on the same server, I was told last time that our sitemaps were possibly being overwritten, which is why we have sitemap_au.xml, and the same goes for sitemap_nz.xml and sitemap_us.xml. I also originally had a sitemap.xml for each.
Another issue I'm having is that the meta description for the USA and AU home pages is showing the New Zealand one, yet when you look at the meta descriptions in the .com and .com.au source code they are all different, as you can see here http://bit.ly/1KTbWg0 and here http://bit.ly/1AU0f5k
Any advice around this would be so much appreciated!
Thanks
Justin
-
Hi,
Yes, you've got it spot on: 301s are there to keep old URLs pointing to the new ones, but only the new URLs should be in the sitemap.
When you've crawled the live site ready to make your sitemap, you can manually right-click and remove any URLs you don't want in there before generating it.
Kind Regards
Jimmy
-
Hey Jimmy,
Wow thanks so much for your great feedback, much appreciated!
Just want to clarify your answer about the 301s. So it's okay to create the 301s to direct our users to the new URLs, but not good to include them in the sitemap? Am I correct in saying this, or am I totally off track?
I think what's also happened is that the sitemap from Screaming Frog has picked up old URLs as well as new ones. I'm now seeing two of our contact pages indexed for the .com.au site: one is the older URL and the other is the new URL.
Let me know your feedback
Cheers again Jimmy
-
Hi Justin,
Yes, as long as WMT is specifically watching the HTTPS website, then unfortunately the problem is not in WMT.
As hectormainar says, check your sitemap in Screaming Frog:
- go to your sitemap.xml and save it to your computer
- change the Frog to list mode
- open your sitemap and run the crawl
All the links in the sitemap should report 200; any 301s should be swapped for the direct versions. The 301 is good to maintain backwards compatibility and allow backlinks and old users to navigate to your new content, but it shouldn't be used as your main navigation.
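If you'd rather script that check than eyeball it, here's a rough Python sketch. The sitemap content below is a made-up example; in practice you'd load your own sitemap_au.xml and fetch each URL with redirects disabled to record the real status codes:

```python
# Sketch: pull every <loc> out of a sitemap and flag entries that don't
# return 200. The example sitemap and the hard-coded statuses are
# illustrative stand-ins for a real fetch.
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Return every <loc> URL listed in a sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

def flag_redirects(statuses):
    """Given {url: http_status}, return URLs that should be swapped
    for their final destination (anything other than 200)."""
    return [url for url, code in statuses.items() if code != 200]

example = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/psychic-readings</loc></url>
  <url><loc>https://www.example.com/psychic-readings/psychic-readings</loc></url>
</urlset>"""

urls = sitemap_urls(example)
# In practice you would request each URL with redirects disabled and
# record the status code; here the statuses are hard-coded.
statuses = {urls[0]: 200, urls[1]: 301}
print(flag_redirects(statuses))
```

Anything this flags should be replaced in the sitemap with the URL it redirects to.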
Kind Regards
Jimmy
-
Hey Jimmy,
Thanks for the heads up! Yes, I have been watching this via WMT, and I used Screaming Frog to generate the sitemaps and gave them to my developer; he then gave me the URL to submit to Google.
I also used HTTPS. I hope that helps?
Let me know if you have any further questions
Cheers Jimmy thanks again
-
Hi Hectormainar,
I understand what you're saying. Yes, we had https://www.zenory.com.au/psychic-readings/psychic-readings before we updated the URLs to https://www.zenory.com.au/psychic-readings
After doing this we were told to add 301 redirects, so I'm a little confused now as to why it should not be done, since our visitors would still go to the old URLs?
I used Screaming Frog to generate the sitemaps, and I think it may have included the old URLs? I'm not too sure exactly which ones it included. Is there a way to check this?
Thanks for your help
Justin
-
Which URLs do you include in your sitemap? Could you check whether you are trying to index
https://www.zenory.com.au/psychic-readings/psychic-readings or https://www.zenory.com.au/psychic-readings ?
The first one is the URL you link to in your menus, but it has a 301 redirect to the second URL format (and the same goes for the rest of the main options). That is quite a bad idea. Please make sure you include the correct address in the sitemap, not the 301-redirected one. That could be what's causing Google Webmaster Tools not to show that page in your sitemap as indexed: although the final page is properly indexed in Google (as you can check by searching for site:www.zenory.com.au), GWT is not able to match the two addresses.
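To make the mismatch concrete, here's a small Python illustration. The redirect map is a made-up stand-in for the server's 301 rules, not the actual configuration:

```python
# The sitemap lists a URL that 301s elsewhere, so GWT can't match it to
# the indexed final URL. Following the redirect chain gives the address
# that actually belongs in the sitemap.
redirects = {
    "https://www.zenory.com.au/psychic-readings/psychic-readings":
        "https://www.zenory.com.au/psychic-readings",
}

def final_url(url, redirect_map, max_hops=10):
    """Follow 301 entries until reaching a URL that no longer redirects."""
    for _ in range(max_hops):
        if url not in redirect_map:
            return url  # this is what belongs in the sitemap
        url = redirect_map[url]
    raise ValueError("too many redirects, possible loop")

print(final_url("https://www.zenory.com.au/psychic-readings/psychic-readings", redirects))
```

The sitemap should list the URL this returns, not the entry that 301s to it.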
-
Hi Justin,
It is hard to tell from your screenshot, but which website are you watching in Webmaster Tools? As you are using HTTPS, the website to track would have to be the HTTPS one, since a recent WMT update now classifies these differently.
Having crawled your sites with Screaming Frog, I don't see any smoking guns as to why the pages would not be indexed.
Let me know about the WMT account
Kind Regards
Jimmy
-
Hi Michael,
Thanks for your response! I have also done a site:yourdomain search, and this is also showing up quite low compared to the number of pages submitted. The USA site shows 10 pages indexed, AU slightly more, and NZ a lot more.
-
Webmaster Tools is not always a current, accurate reflection of what is actually indexed.
A search in Google for site:yourdomain.com will show the accurate information.
Related Questions
-
Multiregional site indexing problems
Hello there! I have a multiregional site and am dealing with some indexing problems: Google has only indexed our USA site. We have:
- set up hreflang tags
- set up specific subdirectories: https://www.website.com/ (the en-us site and our main site), https://www.website.com/en-gb, https://www.website.com/en-ca, https://www.website.com/fr-ca, https://www.website.com/fr-fr, https://www.website.com/es-es, ...
- set up automatic GEO-IP redirects (301 redirects)
- created a sitemap index and a separate sitemap for each regional site
- created a Google Webmaster Tools profile for each targeted country
- created translations for each language and added canonicals to the US site when reusing English content
The problem is that Google is not indexing our regional sites. I think Google is using a US bot when spidering the site, so it will always be redirected to the US version by a 301 redirect. I have used Fetch as Google with some of our regional folders and asked for "Indexing requested for URL and linked pages", but am still waiting. Any ideas? Changing the 301s to 302s? I really don't know what to do. Thank you so much!
International SEO | Alejandrodurn
DMoz, can I submit 3 top level domains?
Hi guys, I have 3 top-level domains; does anyone know if it is okay to submit all 3? They all cover different countries: NZ, AUS and USA. The NZ one has been submitted, but our main site is the .com (USA); after running a few AdWords campaigns we decided to work on the .com instead! Does anyone know the terms or guidelines around this?
International SEO | edward-may
Duplicate Page Content due to Language and Currency
Hi folks, hoping someone can help me out please. I have a site that I'd like to rank in France and the UK, but I'm getting a stack of duplicate content errors due to English and French pages and GBP and EUR prices. Below is an example of how the home page is duplicated:
http://www.site.com/?sl=en?sl=fr
http://www.site.com/?sl=fr?sl=fr
http://www.site.com
http://www.site.com/?currency=GBP?sl=fr
http://www.site.com/?currency=GBP?sl=en
http://www.site.com/?sl=fr?sl=en
http://www.site.com/?currency=EUR?sl=fr
http://www.site.com/?currency=EUR?sl=en
http://www.site.com/?currency=EUR
http://www.site.com/?sl=en&currency=EUR
http://www.site.com/?sl=en&currency=GBP
http://www.site.com/?sl=en
http://www.site.com/?currency=GBP
http://www.site.com/?sl=en?sl=en
Each page has code in the <head> that updates according to the page you are on. How do I simplify this, and what's the correct approach?
International SEO | Marketing_Today
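One common fix for parameter duplication like this is a rel="canonical" pointing at the URL with the presentation-only parameters stripped. A rough Python sketch, with the parameter names sl and currency taken from the URLs listed above:

```python
# Sketch: derive one canonical URL by dropping query parameters that only
# change presentation (language, currency), not content. The assumption is
# that the canonical tag is then emitted with this cleaned URL.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

PRESENTATION_PARAMS = {"sl", "currency"}

def canonical_url(url):
    """Drop presentation-only query parameters, keep everything else."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k not in PRESENTATION_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), parts.fragment))

print(canonical_url("http://www.site.com/?sl=en&currency=EUR"))
```

All the language/currency variants of the home page would then declare the same canonical, collapsing the duplicates.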
2 Domains, 2 Languages, but 1 WP Install?
I've got a client who wants one English website on one domain targeting Hawaii/USA (bodywellnesshawaii.com) and a Spanish-speaking one (bodywellnesschile.cl) targeting Chile/South America. What's the best way to go about this? Just clone the current bodywellnesshawaii.com site, translate it, and have it live on a separate WP install? Or is there a way to use just one WP install with multiple languages and have each language live on a separate domain? Not sure whether that's even possible, but it would be easier to add content to and maintain. Is either one better for SEO? Thanks in advance.
International SEO | stephanwb
Any practical examples of ranking 1 domain in multiple countries?
Hi, I've done a fair amount of research on international SEO, including here on Moz, but was hoping some fellow Mozzers might have practical examples of how they have got 1 domain to rank in multiple countries, ideally US & UK. I'm possibly looking at getting a high-authority domain that ranks great in the US into the UK engines as well. I want to keep to the 1 domain to benefit from the high authority and for logistical reasons. Thanks in advance, Andy
International SEO | AndyMacLean
Multilingual Ecommerce Product Pages Best Practices
Hi Mozzers, We have a marketplace with 20k+ products, most of which are written in English. At the same time we support several different languages. This changes the chrome of the site (nav, footer, help text, buttons, everything we control) but leaves all the products in their original language. This resulted in all kinds of duplicate content (pages, titles, descriptions) being detected by SEOmoz and GWT. After doing some research we implemented the on-page rel="alternate" hreflang="x", seeing as our situation almost perfectly matched the first use case listed by Google on this page: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=189077. This ended up not helping at all: Google still reports duplicate titles and descriptions for thousands of products, months after setting this up. We are thinking about changing to the sitemap implementation of rel="alternate" hreflang="x", but are not sure if this will work either. Other options we have considered include noindex or robots.txt blocks when the product language is not the same as the site language; that way the feature is still open to users while removing the duplicate pages for Google. So I'm asking for input on best practice for getting Google to correctly recognize one product with 6 different language views of that same product. Can anyone help? Examples:
(Site in English, Product in English) http://website.com/products/product-72
(Site in Spanish, Product in English) http://website.com/es/products/product-72
(Site in German, Product in English) http://website.com/de/products/product-72
etc...
International SEO | sedwards
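For the sitemap implementation mentioned in the question, each language version of a product lists every alternate, including itself, via xhtml:link elements. A rough Python sketch, with the URL pattern and locale codes mirroring the examples above:

```python
# Sketch: generate sitemap <url> blocks where every language version of a
# product cross-links all alternates with hreflang annotations. The base
# URL, path and locale prefixes follow the question's examples.
locales = {"en": "", "es": "/es", "de": "/de"}

def hreflang_sitemap_entry(base, product_path, locales):
    """Return one <url> sitemap block per locale, each listing all alternates."""
    alternates = "\n".join(
        '    <xhtml:link rel="alternate" hreflang="{}" href="{}{}{}"/>'
        .format(lang, base, prefix, product_path)
        for lang, prefix in locales.items())
    entries = []
    for lang, prefix in locales.items():
        entries.append("  <url>\n    <loc>{}{}{}</loc>\n{}\n  </url>".format(
            base, prefix, product_path, alternates))
    return "\n".join(entries)

print(hreflang_sitemap_entry("http://website.com", "/products/product-72", locales))
```

The output would be wrapped in a urlset that also declares the xhtml namespace; the key point is that every version points to every other version, including itself.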
What is the best way to make a country-specific IP redirect for only product pricing pages?
My website has 3 services, and their prices will differ between US/EU/developed-world countries and Asian/African countries. Apart from the pricing page, everything else stays the same. I want to use an IP-based redirect, but I've heard this sort of thing is called cloaking and is used by black-hat guys. What kind of instructions should I give my web developer so the site looks right to Google/search bots while correctly showing visitors the intended prices? Are there any cautions to take? Thanks for your time.
International SEO | RyanSat
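One non-cloaking pattern is to serve the same page to everyone, bots included, and only vary the displayed price via a country-to-tier lookup. A sketch, with made-up tier names, country lists and prices:

```python
# Sketch: identical page content for all visitors; only the price shown
# comes from a country-to-tier lookup, with a safe fallback for unknown
# countries (including bots whose country can't be determined).
TIER_BY_COUNTRY = {
    "US": "standard", "GB": "standard", "DE": "standard",
    "IN": "reduced", "NG": "reduced",
}

PRICES = {"standard": 49.00, "reduced": 19.00}

def price_for(country_code):
    """Look up the price tier; unknown countries get the standard price."""
    tier = TIER_BY_COUNTRY.get(country_code, "standard")
    return PRICES[tier]

print(price_for("IN"))  # reduced tier
print(price_for("FR"))  # not listed, falls back to standard
```

Because the page itself never changes per visitor, this avoids the redirect-based setup that reads as cloaking; only a single displayed value differs.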
Targeting specific geographic areas: use 1 large .com or several smaller country-specific TLDs?
Hi, I have a small number of exact-match domains, both country-specific TLDs and also the generic TLDs .com and .net. They are:
ExactMatch.com
ExactMatch.net
ExactMatch.co.uk
ExactMatch.ca
ExactMatch.co.nz
ExactMatch.co.za
We have already successfully launched our UK site using the exact-match .co.uk, and this is currently number 2 in the UK SERPs for Google, Yahoo and Bing. They are/will be niche-specific classified-ad sites, geographically targeted by country (to English speakers in the main), and each region is likely to have a minimum of 2,000 unique listings submitted over the course of a year or so.
My question (finally) is this: am I better to build one large global site (growing to approx. 12,000 listings) using ExactMatch.com, with the .com targeting US users and geo-targeted subdirectories (ExactMatch.com/nz etc.) each targeted to the matching geographic area in Webmaster Tools? Or should I use the ccTLDs and host each site in its own country, with each site growing to approx. 2,000 listings? I could use the ccTLDs just for marketing/branding and redirect them to the matching subdirectory of the .com site.
I am aware that there is one main ccTLD I cannot get, .com.au (as I am not a resident of Australia and it is already in use), so I was wondering if the single site with .com/au/ etc. might help me better target that country. If I use each ccTLD as a separate site, I suppose I could use the largely redundant .net to target Australia?
Your thoughts and advice would be most welcome. Thanks!
An additional bit of information (or two): the .com is circa 2004. The product advertised is reasonably bulky (perhaps 6 kg boxed), so the seller is unlikely to want to ship globally. Will this make them shy away from a global site, even one divided into geographic sub-sections? FYI, sellers can specify "Will ship to ..." in their listings. I would be open to using the front page of the .com site as a page where visitors select the country they wish to buy/sell in (if the general consensus is that it is better to create one large site).
Please also consider how the end user is likely to perceive the benefits of one LARGE SITE versus a TARGETED SITE. I know the .com would be divided into geographic subdirectories, but I am not sure they won't see an additional benefit in the ccTLD: does it add a degree of reassurance and relevance that a .com cannot provide? I suppose I am biased by the fact that eBay uses ccTLDs. Thanks again, and please forgive my tone, which may suggest I am playing devil's advocate here; I am very torn on this issue.
International SEO | Hurf