SITEMAP.asp having 10,000 pages
-
A website has more than 10,000 pages. If, per the Google guideline, I restrict each sitemap.asp page to 100 links, then I have to generate 100 pages. Any idea how to shorten the process? Please advise.
-
I just want to verify that you're talking about a sitemap for users, and not the XML sitemap for search engines to crawl. A sitemap for search engines can contain up to 50,000 entries per file, per the specification at http://www.sitemaps.org/protocol.html.
One example Ryan Kent likes to use is Verizon's sitemap at http://www.verizonwireless.com/b2c/sitemap.jsp. They don't have every one of their pages listed, but you can easily find what you are looking for via their sitemap.
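For the search-engine side, the protocol caps each XML sitemap file at 50,000 URLs, and large sites work around that with a sitemap index that points at several sitemap files. Here is a minimal sketch of splitting a URL list that way (the file names and the base URL are assumptions for illustration, not anything from the site in question):

```python
# Split a list of URLs into XML sitemap files of at most 50,000 entries,
# then write a sitemap index that references each file.
from xml.sax.saxutils import escape

MAX_URLS = 50000  # per-file limit from the sitemaps.org protocol

def write_sitemaps(urls, base="https://www.example.com"):
    chunks = [urls[i:i + MAX_URLS] for i in range(0, len(urls), MAX_URLS)]
    for n, chunk in enumerate(chunks, start=1):
        with open(f"sitemap{n}.xml", "w") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for url in chunk:
                f.write(f"  <url><loc>{escape(url)}</loc></url>\n")
            f.write('</urlset>\n')
    # Index file listing every sitemap chunk, submitted in place of one huge file.
    with open("sitemap_index.xml", "w") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for n in range(1, len(chunks) + 1):
            f.write(f"  <sitemap><loc>{base}/sitemap{n}.xml</loc></sitemap>\n")
        f.write('</sitemapindex>\n')
```

You would submit only sitemap_index.xml to the search engines; they fetch the individual files from there.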
-
I have a site with 100K pages and our sitemap has about 100 entries I think.
In my opinion you should not shorten the process. It is important to do it correctly, and the right approach depends on the site at hand.
I do not know which website this is regarding, so it is hard to give advice about a specific case.
But what I would do is include the home page and all navigational sections (sub-section indexes). Beyond that, include the pages most important to your company and the central ones that change often.
You probably already know these things. I have not heard of a quick way to make a sitemap for a large site; it requires attention. But if you really want to do it quickly and do not want to put in the effort and thought to make a quality sitemap, you could probably just take the 100 most visited pages on your site over the last year. There you should find the most important pages.
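If you go the top-100-pages route, pulling them out of an analytics export is quick. A rough sketch, assuming a CSV export with url and visits columns (both the file layout and the column names are assumptions; adjust to whatever your analytics tool actually exports):

```python
# Pick the most-visited URLs from an analytics CSV export
# (columns assumed: url,visits) to seed a user-facing sitemap page.
import csv

def top_pages(csv_path, limit=100):
    with open(csv_path, newline="") as f:
        rows = list(csv.DictReader(f))
    # Sort by visit count, highest first, and keep the top entries.
    rows.sort(key=lambda r: int(r["visits"]), reverse=True)
    return [r["url"] for r in rows[:limit]]
```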
Regards,
Rasmus