Using Site Maps Correctly
-
Hello
I'm looking to submit a sitemap for a post-driven site with over 5,000 pages.
The site doesn't have a sitemap yet, but it is indexed by Google - will submitting a sitemap make a difference at this stage?
Also, most free sitemap tools only cover up to 5,000 pages, and I'd like to try a free version of a tool before I buy one. If my site has 5,500 pages but I only submit a sitemap for 5,000 (I have no control over which pages get included), would this have a negative effect on the pages that were left out?
Thanks
-
Submitting a sitemap in Search Console is a good idea at any stage. If your website's URLs are already crawled and indexed, there will be no negative impact, and as you add more pages over time a sitemap will definitely help.
If you are using a CMS like WordPress, Joomla, or Zen Cart, they all have extensions and plugins in their directories that will generate a sitemap of your current site and add new links as soon as you publish more pages.
Peter covers everything else in detail, including what to do if you have URL issues or problems with crawling and indexing.
If you have a custom CMS, I'd seriously consider Peter's suggestion, as this is something you'll need on a regular basis anyway!
Hope this helps!
-
It's hard to tell without seeing your URL architecture.
First, there are two specific terms you should never, ever forget: crawling and indexing. Once you prepare a sitemap and submit it (or reference it in robots.txt), all bots get a map of your site and start crawling pages based on the crawl budget allocated to your site. During crawling they MAY find new pages that aren't included in the map and crawl them too. Again, this happens within your crawl budget.
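The robots.txt reference mentioned here is a one-line directive; a minimal illustration (the sitemap URL is a placeholder - use your own domain):

```text
# robots.txt - the Sitemap directive tells all bots where to find the sitemap
Sitemap: https://www.example.com/sitemap.xml
```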
So when you submit a sitemap, the bot gets, within seconds, a list of 5,000 not-yet-crawled pages and starts crawling them. It can then discover the missing 500 pages and crawl those too. The tricky part is that when you update the sitemap, the bot detects the changes quickly and recrawls those pages, but for the missing 500 it has to revisit your site on its own to check for changes, which also counts against your crawl budget. If those pages don't change often, it isn't a big deal.
So you shouldn't worry about a negative impact here. The only negative impact can happen if you have serious URL architecture issues and messy URLs; submitting a partial sitemap can then obscure those issues and leave some of your URLs uncrawled.
Technically, in Search Console you can see sitemap statistics such as submitted and indexed counts. In a perfect world the numbers should be almost equal, with only a small difference. If you see a huge gap between them, you're in trouble. For example, on one site I have a sitemap with 44,950 pages submitted, of which only 29,643 are indexed. That's a clear sign of crawling or sitemap trouble, because a third of all pages aren't indexed at all.
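The gap described above is easy to quantify; a quick check using the submitted/indexed numbers from this answer:

```python
# Sitemap coverage check: indexed vs. submitted, per Search Console stats.
submitted = 44950
indexed = 29643

coverage = indexed / submitted
print(f"{coverage:.1%} of submitted URLs are indexed")  # roughly 65.9%
```

Anything far below ~100% coverage is worth investigating before blaming the partial sitemap itself.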
PS: I forgot - you should generate the sitemap with your CMS's own plugin. Even if your CMS is custom-made, you should write (or hire someone to write) a plugin for it. It's roughly 20-30 lines of your favorite language (PHP/Python/Perl/Ruby) and no big deal. Such a plugin avoids the crawl time a third-party sitemap generator needs, because the CMS already has all the information internally; it just needs to be exported to XML.
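A minimal sketch of such a plugin in Python, following the sitemaps.org XML format. Here `get_post_urls()` is a hypothetical stand-in for your CMS's own data layer (e.g. a `SELECT url, updated FROM posts` query):

```python
from datetime import date
from xml.sax.saxutils import escape

def get_post_urls():
    # Hypothetical stand-in: replace with a query against your CMS database.
    return [
        ("https://www.example.com/post-1", date(2024, 1, 5)),
        ("https://www.example.com/post-2", date(2024, 2, 9)),
    ]

def build_sitemap(urls):
    # Emit a urlset per the sitemaps.org 0.9 schema, one <url> per page.
    lines = [
        '<?xml version="1.0" encoding="UTF-8"?>',
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">',
    ]
    for loc, lastmod in urls:
        lines.append("  <url>")
        lines.append(f"    <loc>{escape(loc)}</loc>")
        lines.append(f"    <lastmod>{lastmod.isoformat()}</lastmod>")
        lines.append("  </url>")
    lines.append("</urlset>")
    return "\n".join(lines)

sitemap_xml = build_sitemap(get_post_urls())
print(sitemap_xml)
```

Write the output to `sitemap.xml` at the site root and regenerate it on publish; note a single sitemap file is capped at 50,000 URLs, so 5,500 pages fits comfortably in one file.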
-
It would definitely be better to submit a complete sitemap. If your site is built on WordPress, Joomla, Magento, or another standard CMS, it should be able to generate a full sitemap; plugins like Yoast or Google XML Sitemaps can help, depending on the site.
Otherwise, you can probably get any pro SEO or agency to create a full 5,500+ page sitemap for you for $100 or so. PM me if you need more help.