Sitemap Help!
-
Hi Guys,
Quick question regarding sitemaps. I am currently working on a huge site that has masses of pages.
I am looking to create a sitemap. How would you guys do this? I have looked at some tools, but they say they will only handle roughly 30,000 pages. The site is so large it would be impossible to do this myself... any suggestions?
Also, how do I find out how many of my site's pages are indexed and how many are not?
Thank You all
Wayne
-
The problem I have with CMS-side sitemap generators is that they often build entries from whatever pages currently exist and are linked to. If you have links to pages that are no longer there, as is often the case with dynamic content, then you'll be imposing 404s on yourself like crazy.
Just something to watch out for but it's probably your best solution.
-
Hi! With this file, you can create a Google-friendly sitemap for any given folder almost automatically. There are no limits on the number of files. Please note that the code is courtesy of @frkandris, who generously helped me out when I had a similar problem. I hope it will be as helpful to you as it was to me.
- Copy / paste the code below into a text editor.
- Edit the beginning of the file: where you see seomoz.com, put your own domain name there
- Save the file as getsitemap.php and ftp it to the appropriate folder.
- Write the full URL in your browser: http://www.yourdomain.com/getsitemap.php
- The moment you do it, a sitemap.xml will be generated in your folder
- Refresh your ftp client and download the sitemap. Make further changes to it if you wish.
=== CODE STARTS HERE ===
<?php
define('DIRBASE', './');
define('URLBASE', 'http://www.seomoz.com/');

$isoLastModifiedSite = "";
$newLine = "\n";
$indent = "  ";
$rootUrl = "http://www.seomoz.com";

$xmlHeader  = "<?xml version=\"1.0\" encoding=\"UTF-8\"?>$newLine";
$urlsetOpen = "<urlset xmlns=\"http://www.google.com/schemas/sitemap/0.84\" "
            . "xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\" "
            . "xsi:schemaLocation=\"http://www.google.com/schemas/sitemap/0.84 "
            . "http://www.google.com/schemas/sitemap/0.84/sitemap.xsd\">$newLine";
$urlsetValue = "";
$urlsetClose = "</urlset>$newLine";

function makeUrlString($urlString) {
    return htmlentities($urlString, ENT_QUOTES, 'UTF-8');
}

function makeIso8601TimeStamp($dateTime) {
    if (!$dateTime) {
        $dateTime = date('Y-m-d H:i:s');
    }
    if (is_numeric(substr($dateTime, 11, 1))) {
        // full date-time: "YYYY-MM-DD HH:MM:SS" -> "YYYY-MM-DDTHH:MM:SS+00:00"
        $isoTS = substr($dateTime, 0, 10) . "T" . substr($dateTime, 11, 8) . "+00:00";
    } else {
        // date only
        $isoTS = substr($dateTime, 0, 10);
    }
    return $isoTS;
}

function makeUrlTag($url, $modifiedDateTime, $changeFrequency, $priority) {
    global $newLine, $indent, $isoLastModifiedSite;

    $urlOpen         = "$indent<url>$newLine";
    $urlClose        = "$indent</url>$newLine";
    $locOpen         = "$indent$indent<loc>";
    $locClose        = "</loc>$newLine";
    $lastmodOpen     = "$indent$indent<lastmod>";
    $lastmodClose    = "</lastmod>$newLine";
    $changefreqOpen  = "$indent$indent<changefreq>";
    $changefreqClose = "</changefreq>$newLine";
    $priorityOpen    = "$indent$indent<priority>";
    $priorityClose   = "</priority>$newLine";

    $urlTag   = $urlOpen;
    $urlValue = $locOpen . makeUrlString($url) . $locClose;
    if ($modifiedDateTime) {
        $urlValue .= $lastmodOpen . makeIso8601TimeStamp($modifiedDateTime) . $lastmodClose;
        if (!$isoLastModifiedSite) {
            // last modification of the web site as a whole
            $isoLastModifiedSite = makeIso8601TimeStamp($modifiedDateTime);
        }
    }
    if ($changeFrequency) {
        $urlValue .= $changefreqOpen . $changeFrequency . $changefreqClose;
    }
    if ($priority) {
        $urlValue .= $priorityOpen . $priority . $priorityClose;
    }
    $urlTag .= $urlValue;
    $urlTag .= $urlClose;
    return $urlTag;
}

function rscandir($base = '', &$data = array()) {
    $array = array_diff(scandir($base), array('.', '..')); // remove . and .. from the listing
    foreach ($array as $value) {                 // loop through the entries at this level
        if (is_dir($base . $value)) {            // if this is a directory
            $data[] = $base . $value . '/';      // add it to the $data array
            $data = rscandir($base . $value . '/', $data); // recurse into the subdirectory
        } elseif (is_file($base . $value)) {     // if this is a file
            $data[] = $base . $value;            // just add it to the $data array
        }
    }
    return $data;
}

function kill_base($t) {
    // turn a local file path into a public URL
    return URLBASE . substr($t, strlen(DIRBASE));
}

$dir = rscandir(DIRBASE);
$a   = array_map("kill_base", $dir);

foreach ($a as $key => $pageUrl) {
    $pageLastModified    = date("Y-m-d", filemtime($dir[$key]));
    $pageChangeFrequency = "monthly";
    $pagePriority        = 0.8;
    $urlsetValue .= makeUrlTag($pageUrl, $pageLastModified, $pageChangeFrequency, $pagePriority);
}

$current = "$xmlHeader$urlsetOpen$urlsetValue$urlsetClose";
file_put_contents('sitemap.xml', $current);
?>
=== CODE ENDS HERE ===
-
HTML sitemaps are good for users; having 100,000 links on a page though, not so much.
If you can (and certainly with a site this large), doing video and image sitemaps will help Google get around your site.
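For what it's worth, image sitemap entries just extend a normal XML sitemap with Google's image namespace. A minimal example (the URLs are placeholders, not from this thread):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>http://www.yourdomain.com/some-page</loc>
    <image:image>
      <image:loc>http://www.yourdomain.com/images/photo.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```

Video sitemaps work the same way with the video namespace, just with more required tags per entry.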
-
Is there any way i can see pages that have not been indexed?
Not that I can tell and using site: isn't going to be feasible on a large site I guess.
Is it more beneficial to include various site maps or just the one?
Well, the max sitemap size is 50,000 URLs or 10MB uncompressed (you can gzip them), so if you've more than 50,000 URLs you'll have to split them.
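If it helps, here's a rough sketch of the splitting logic (this is my own illustration, not one of the generators linked elsewhere in the thread; the file names and domain are placeholders): chunk the URL list into files of at most 50,000 entries, then emit a sitemap index pointing at each file.

```python
from xml.sax.saxutils import escape

MAX_URLS = 50000  # per-file limit from the sitemaps protocol

def build_sitemaps(urls, base="http://www.yourdomain.com/"):
    """Split urls into <=50,000-entry sitemap files and build an index.

    Returns (files, index) where files is a list of (filename, xml) pairs
    and index is the sitemap index XML referencing them.
    """
    files = []
    for i in range(0, len(urls), MAX_URLS):
        chunk = urls[i:i + MAX_URLS]
        body = "".join(
            "  <url><loc>%s</loc></url>\n" % escape(u) for u in chunk
        )
        xml = (
            '<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            "%s</urlset>\n" % body
        )
        files.append(("sitemap%d.xml" % (i // MAX_URLS + 1), xml))

    # the index file lists each sitemap part by its public URL
    index_body = "".join(
        "  <sitemap><loc>%s%s</loc></sitemap>\n" % (base, name)
        for name, _ in files
    )
    index = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        "%s</sitemapindex>\n" % index_body
    )
    return files, index
```

You'd then write each part plus the index to disk, gzip the parts if they're near the size limit, and submit only the index in Webmaster Tools.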
-
Is there any way i can see pages that have not been indexed?
Is it more beneficial to include various site maps or just the one?
Thanks for your help!!
-
Thanks for your help
Do you feel it is important to have HTML + video sitemaps as well? How does this make a difference?
-
How big we talking?
Probably best grabbing something server side if your CMS can't do it. Check out - http://code.google.com/p/sitemap-generators/wiki/SitemapGenerators - I know Google says they've not tested any (and neither have I) but they must have looked at them at some point.
Secondly you'll need to know how to submit multiple sitemap parts and how to break them up.
Looking at it, Amazon seems to cap theirs at 50,000 and eBay at 40,000, so I think you should be fine with numbers around there.
Here's how to set up multiple sitemaps in the same directory - http://googlewebmastercentral.blogspot.com/2006/10/multiple-sitemaps-in-same-directory.html
Once you've submitted your sitemaps Webmaster Tools will tell you how many URLs you've submitted vs. how many they've indexed.
-
Hey,
I'm assuming you mean XML sitemaps here: You can create a sitemap index file which essentially lists a number of sitemaps in one file (A sitemap of sitemap files if that makes sense). See http://www.google.com/support/webmasters/bin/answer.py?answer=71453
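For reference, a sitemap index file is just a small XML file that points at the individual sitemaps (the file names and dates below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.yourdomain.com/sitemap1.xml</loc>
    <lastmod>2012-01-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://www.yourdomain.com/sitemap2.xml</loc>
  </sitemap>
</sitemapindex>
```

You submit the index file once, and Google discovers the rest from it.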
There are automatic sitemap generators out there - if your site has categories with thousands of pages, I'd split them up and have a sitemap per category.
DD
-
To extract URLs, you can use Xenu Link Sleuth. Then you must make a hierarchy of sitemaps so that all sitemaps are efficiently crawled by Google.