More sitemap issues: help
-
Hey Guys,
Seems I'm having more sitemap issues. I just checked my WMT and found that for my .com.au and .com sites, the .com.au site shows only 2 pages indexed out of 72 web pages submitted.
For the .com, when I look under Sitemaps it doesn't show any results for how many pages have been indexed; instead it gives me this error warning: "Your Sitemap appears to be an HTML page. Please use a supported sitemap format instead."
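For context, my understanding is that a supported sitemap is a plain XML file along these lines, rather than an HTML page (a minimal sketch with a placeholder entry, not my actual file):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- One <url> entry per page; <loc> is the only required child element -->
      <url>
        <loc>https://www.zenory.com/</loc>
      </url>
    </urlset>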
All 3 sites are listed here:
Any advice would be much appreciated here!
Thanks guys
-
Hi Patrick,
Thanks for your response. All sites have sitemaps; I submitted the 3 sitemaps below through WMT
https://www.zenory.com.au/sitemap_us.xml
https://www.zenory.com/sitemap_au.xml
https://www.zenory.com/sitemap_nz.xml
so that they would not be overwritten, as all top-level domains are on the same server. After doing this I still have no idea why they are not showing up correctly. I have limited the cross-referencing between the sites to better differentiate them, so Google could pick up that each one is relevant to its specified country. For the .au it says 74 were submitted and only 2 were indexed.
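In case it helps with diagnosis, a quick way to confirm whether each submitted sitemap URL actually returns XML (rather than an HTML page, as the .com error suggests) is something like this sketch, assuming Python 3 with the requests library installed:

    # Sketch: verify each submitted sitemap URL serves XML, not HTML.
    import requests

    SITEMAPS = [
        "https://www.zenory.com.au/sitemap_us.xml",
        "https://www.zenory.com/sitemap_au.xml",
        "https://www.zenory.com/sitemap_nz.xml",
    ]

    for url in SITEMAPS:
        resp = requests.get(url, timeout=10)
        body_start = resp.text.lstrip()[:60]
        looks_like_html = body_start.lower().startswith(("<!doctype html", "<html"))
        print(url)
        print("  status:", resp.status_code)
        print("  content-type:", resp.headers.get("Content-Type"))
        print("  starts with:", body_start)
        print("  HTML instead of XML?", looks_like_html)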
Any advice around this would be awesome
Thanks in advance
-
Hi there
I am not seeing sitemaps for...
https://www.zenory.com.au/sitemap.xml
https://www.zenory.co.nz/sitemap.xml
Do you have sitemaps for these sites? Have you added each as its own profile in Search Console and submitted a sitemap? Google has a resource to help diagnose problems with your sitemap - have you looked into that at all to see if Google found any errors?
Let me know if this helps! Good luck!
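Also, since your sitemaps don't sit at the default /sitemap.xml path, one option is to declare them in each domain's robots.txt so crawlers can discover them (a sketch using one of the filenames from your post; adjust per domain):

    # In https://www.zenory.com.au/robots.txt
    Sitemap: https://www.zenory.com.au/sitemap_us.xml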
-
Hi Oleg,
All 3 sites have the hreflang tags, they also have geo-targeting set for all 3 countries, and they are all indexed for the right countries. However, the number of pages being indexed is quite low considering I have submitted about 70 pages.
-
Your sites won't get indexed because they are all exact duplicates of one another. Hence, Google indexes the .com but not the others.
What you need to do is link all of the websites up with hreflang tags... this tells Google that all 3 of these sites carry the same content, and that the correct domain should be displayed depending on where the searcher is from.
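As a sketch (assuming the three domains in this thread and English-language pages), each version of a page would carry the same block in its head, listing all versions including itself:

    <!-- Identical block on the equivalent page of all three sites -->
    <link rel="alternate" hreflang="en-us" href="https://www.zenory.com/" />
    <link rel="alternate" hreflang="en-au" href="https://www.zenory.com.au/" />
    <link rel="alternate" hreflang="en-nz" href="https://www.zenory.co.nz/" />
    <link rel="alternate" hreflang="x-default" href="https://www.zenory.com/" />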