Sitemap issues
-
Hi ALL
Okay, I'm a bit confused here. It says I have submitted 72 pages (I'm assuming) and it's reporting that only 2 pages have been indexed?
I submitted a new sitemap for each of my 3 top-level domains, checked it today, and it's showing the attached result.
We are still having issues with meta tags showing up in the incorrect country.
If anyone knows how I can deal with this nightmare, it would be much appreciated, lol
-
Awesome response Dirk! Thanks again for your endless help!
-
Hey again!
...I can't believe I didn't think of the simplicity of this earlier...
"it's even faster if you sort the url's in alphabetical order & delete the rows containing priority / lastmod /.. - then you only need to do a find/replace on the <loc>/</loc> "
100,000 spreadsheets and I don't even think of sorting for this task. Unreal. I laughed at myself aloud when I read it.
Thank you per usual my friend!
-
Hi Patrick,
Your method is a good one - I use more or less the same trick to retrieve URLs from a sitemap in Excel (it's even faster if you sort the URLs in alphabetical order & delete the rows containing priority / lastmod / etc. - then you only need to do a find/replace on the <loc> / </loc>)
It's just that in this specific case, since the sitemap was generated in Screaming Frog, it's easier to eliminate these redirected URLs upfront.
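For anyone who'd rather script it: the same extraction can be done in a few lines of Python with the standard library, skipping the spreadsheet entirely. This is only a sketch - the sample sitemap below is made up for illustration.

```python
# Pull every <loc> value out of a sitemap, ignoring lastmod/changefreq/priority.
# Sketch only: the sample XML and domain are hypothetical.
import xml.etree.ElementTree as ET

def extract_urls(sitemap_xml: str) -> list[str]:
    """Return all <loc> values from a sitemap document."""
    root = ET.fromstring(sitemap_xml)
    # Sitemaps use the sitemaps.org namespace, so tags look like
    # "{http://www.sitemaps.org/schemas/sitemap/0.9}loc" - match on the suffix.
    return [el.text.strip() for el in root.iter() if el.tag.endswith("loc")]

sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/</loc><priority>1.0</priority></url>
  <url><loc>https://www.example.com/blog/</loc><lastmod>2015-01-01</lastmod></url>
</urlset>"""

print(extract_urls(sample))
```

In practice you'd read the real sitemap file (or fetch it over HTTP) instead of the `sample` string.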
Dirk
-
Thanks so much Dirk - this is great. I was speaking to how I found the specific errors. Thanks for posting this for the sitemap - definitely left a big chunk out on my part!
-
Hi Justin
Patrick's how-to is correct - but as you are generating your sitemap using Screaming Frog, there is really no need to go through this manual processing.
If you only need to create the sitemap:
Go to Configuration > Spider:
- Basic tab: uncheck everything apart from "Crawl Canonicals" (unchecking Images/CSS/JS/External Links is not strictly necessary, but it speeds up the crawl)
- Advanced tab: check "Always Follow Redirects" / "Respect Noindex" / "Respect Canonical"
After the crawl, generate the sitemap - it will now only contain the "final" URLs, after the redirects.
Hope this helps,
Dirk
PS: Try to avoid internal links which are redirected - better to replace these links with links to the final destination
-
Hi Justin
Probably the easiest way to eliminate these cross references is to ask your programmer to put all links as relative links rather than as absolute links. Relative links have the disadvantage that they can generate endless loops if something is wrong with the HTML - but this is something you can easily check with Screaming Frog.
If you check the .com version - for example https://www.zenory.com/blog/tag/love/ - it's calling zenory.co.nz for plenty of links (just check the source & search for .co.nz) - both the http & the https versions.
You can check all these pages by hand - but I guess your programmer must be able to do this in an automated way.
It is also the case the other way round - on the .co.nz version you'll find references in the source to the .com version.
In Screaming Frog, the links with "NZ" are the only ones which should stay absolute - as they point to the other version
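If it helps, the automated check could be sketched like this in Python - a rough, hypothetical helper (not anything Screaming Frog does) that scans a page's source for absolute links pointing at a sibling TLD:

```python
# Scan HTML source for href/src attributes that reference a "foreign" domain.
# The domains come from the thread; the helper itself is an illustrative sketch.
import re

def cross_domain_links(html: str, foreign_domains: list[str]) -> list[str]:
    """Return absolute href/src values that reference any of the foreign domains."""
    urls = re.findall(r'(?:href|src)=["\'](https?://[^"\']+)["\']', html)
    return [u for u in urls if any(d in u for d in foreign_domains)]

page = ('<a href="https://www.zenory.co.nz/blog/tag/love/">love</a>'
        '<img src="https://www.zenory.com/img/logo.png">')

# On the .com site, any .co.nz or .com.au reference is a cross-reference.
print(cross_domain_links(page, [".co.nz", ".com.au"]))
```

Run over the fetched source of each page, this gives the programmer the list of cross-references to fix.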
Hope this clarifies
Dirk
-
Wow thanks Patrick, let me run this and see how I go, thanks so much for your help!!!
-
Dirk, thanks so much for your help!
Could you tell me how to identify the URLs that are cross-referencing? I tried using Screaming Frog: I looked under the External tab and clicked on inlinks and outlinks. But what's really caught my eye is that a lot of the links are from the blog with the same anchor text "name"; others are showing up with a different name as well. Some are showing NZ NZ or AU AU as the anchor text, and I think this has to do with the flag drop-down to change the top-level domains.
For eg:
FROM: https://www.zenory.co.nz/blog/tag/love/
TO: https://www.zenory.com.au/categories/love-relationships
Anchor Text: Twinflame Reading
-
Hi Justin
Yep! I use ScreamingFrog, here's how I do it:
- Go to your /sitemap.xml
- Select all + copy
- Paste into Excel column A
- Select column A
- Turn "Wrap Text" off
- Delete rows 1 through 5
- Select column A again
- "Find and Replace" the following, one at a time (hit "Replace All" after each):
  - <lastmod></lastmod>
  - <changefreq></changefreq>
  - daily
  - whatever the date is
  - the priority numbers, usually 0.5 to 1.0
- Leave "Replace With" empty - nothing, no spaces
- With column A still selected, hit F5
- Click "Special"
- Click "Blanks" and "OK"
- Right-click in the spreadsheet
- Select "Delete" and "Shift Cells Up"
Voilà! You have your list. Now copy this list and open Screaming Frog. Click "Mode" up top and select "List". Click "Upload List", then "Paste". Paste your URLs in there and hit Start.
Your sitemap will be crawled.
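If you'd rather script this step too, a rough Python equivalent of the list-mode crawl would request each URL without following redirects and bucket the responses. This is only a sketch; the example statuses below are illustrative.

```python
# Check each sitemap URL's HTTP status without following redirects,
# then group the URLs by response type. Illustrative sketch only.
import urllib.request
import urllib.error

def status_of(url: str) -> int:
    """Fetch a URL and return its HTTP status code, without following redirects."""
    class NoRedirect(urllib.request.HTTPRedirectHandler):
        def redirect_request(self, *args, **kwargs):
            return None  # returning None makes urllib raise HTTPError on 301/302
    opener = urllib.request.build_opener(NoRedirect)
    try:
        return opener.open(url, timeout=10).status
    except urllib.error.HTTPError as e:
        return e.code  # 301, 503, etc. surface here

def bucket(statuses: dict[str, int]) -> dict[str, list[str]]:
    """Group URLs by the kind of response they returned."""
    out = {"ok": [], "redirect": [], "error": []}
    for url, code in statuses.items():
        key = "ok" if code < 300 else "redirect" if code < 400 else "error"
        out[key].append(url)
    return out

# In practice: bucket({url: status_of(url) for url in urls_from_sitemap})
print(bucket({"https://www.zenory.com/blog/": 200,
              "https://www.zenory.com/blog/chat-psychic-readings/": 301}))
```

The "redirect" bucket is what you'd remove (or update) in the sitemap; the "error" bucket catches the intermittent 503s.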
Here are URLs that returned 301 redirects:
https://www.zenory.com/blog/chat-psychic-readings/
https://www.zenory.com/blog/online-psychic-readings-private/
https://www.zenory.com/blog/live-psychic-readings/
https://www.zenory.com/blog/online-psychic-readings/
Here are URLs that returned 503 Service Unavailable codes twice, but 200s now:
https://www.zenory.com/blog/spiritually-love/
https://www.zenory.com/blog/automatic-writing-psychic-readings/
https://www.zenory.com/blog/author/psychic-nori/
https://www.zenory.com/blog/zodiac-signs/
https://www.zenory.com/blog/soul-mate-relationship-challenges/
https://www.zenory.com/blog/author/zenoryadmin/
https://www.zenory.com/blog/soulmate-separation-break-ups/
https://www.zenory.com/blog/how-to-find-a-genuine-psychic/
https://www.zenory.com/blog/mind-body-soul/
https://www.zenory.com/blog/twin-flame-norishing-your-flame-to-find-its-twin/
https://www.zenory.com/blog/tips-psychic-reading/
https://www.zenory.com/blog/tips-to-dealing-with-a-broken-heart/
https://www.zenory.com/blog/the-difference-between-soul-mates-and-twin-flames/
https://www.zenory.com/blog/sex-love/
https://www.zenory.com/blog/psychic-advice-break-ups/
https://www.zenory.com/blog/author/ginny/
https://www.zenory.com/blog/chanelling-psychic-readings/
https://www.zenory.com/blog/first-release-cycle-2015/
https://www.zenory.com/blog/psychic-shaman-readings/
https://www.zenory.com/blog/chat-psychic-readings/
https://www.zenory.com/blog/psychic-medium-psychic-readings/
https://www.zenory.com/blog/author/trinity/
https://www.zenory.com/blog/psychic-readings-karmic-relationships/
https://www.zenory.com/blog/can-psychic-readings-heal-broken-heart/
https://www.zenory.com/blog/guidance-psychic-readings/
https://www.zenory.com/blog/mercury-retrograde-effects-life/
https://www.zenory.com/blog/online-psychic-readings-private/
https://www.zenory.com/blog/psychics-mind-readers/
https://www.zenory.com/blog/angel-card-readings-psychic-readings/
https://www.zenory.com/blog/cheating-relationship/
https://www.zenory.com/blog/long-distance-relationship/
https://www.zenory.com/blog/soulmate-psychic-reading/
https://www.zenory.com/blog/live-psychic-readings/
https://www.zenory.com/blog/psychic-readings-using-rune-stones/
https://www.zenory.com/blog/psychic-clairvoyant-psychic-readings/
https://www.zenory.com/blog/psychic-guidance-long-distance-relationships/
https://www.zenory.com/blog/author/libby/
https://www.zenory.com/blog/online-psychic-readings/
I would check on that when you can. Check in Webmaster Tools to see if any issues have shown up there as well.
Hope this helps! Good luck!
-
Thanks so much Patrick! Can you recommend how I would go about finding the URLs that are redirecting in the sitemap? I'm assuming Screaming Frog?
-
Hi Justin
Google doesn't seem to be figuring out (even with the correct hreflang in place) which site should be shown for each country.
If you look at the cached versions of your .com.au & .com sites, it's always the .co.nz version which is cached - this is probably also the reason why the meta description is wrong (it's always coming from the .co.nz version) and why the percentage of URLs indexed for each sitemap (for the .com & .com.au versions) is so low.
Try to rigorously eliminate all cross-references in your site - to make it more obvious for Google that these are 3 different sites:
-
in the footer - the links in the second column are pointing to the .co.nz version (latest articles) - change these links to relative ones
-
on all sites there are elements you load from the .com domain (see latest blog entries - the images are loaded from the .com domain for all TLDs)
As long as you send these confusing signals to Google - Google will mix up the different versions of your site.
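As a rough illustration of the footer fix, here's a hypothetical Python helper (not a tool from the thread) that strips the domain prefix from internal href/src attributes so the links become relative:

```python
# Rewrite absolute internal links to relative ones so each TLD stops
# referencing its siblings. Hypothetical sketch; domains are from the thread.
DOMAINS = ["https://www.zenory.com", "https://www.zenory.co.nz",
           "https://www.zenory.com.au", "http://www.zenory.com",
           "http://www.zenory.co.nz", "http://www.zenory.com.au"]

def make_relative(html: str) -> str:
    """Strip any zenory domain prefix from href/src attributes."""
    for d in DOMAINS:
        html = html.replace(f'href="{d}/', 'href="/')
        html = html.replace(f'src="{d}/', 'src="/')
    return html

print(make_relative('<a href="https://www.zenory.co.nz/blog/">blog</a>'))
```

Only the deliberate country-switcher links (the NZ/AU flag drop-down) would be excluded from a rewrite like this, since those are meant to point at the other version.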
rgds,
Dirk
-
Hi there Justin
Everything looks fine from here - there are a couple of URLs that need to be updated in your sitemap, as they are redirecting.
Google takes time to index, so give this a little more time. You could ask Google to recrawl your URLs, but that's not necessary at the moment; just something to note.
I would make sure your internal links are all good to go and "follow" so that crawlers can at least find URLs that way.
I did a quick site: search on Google, so far you have 58 pages indexed. You should be okay.
Hope this helps! Good luck!
-
Hi Justin,
A similar question was asked in this post: http://moz.com/community/q/webmaster-tools-indexed-pages-vs-sitemap
Hope this helps you.
Thanks