SITEMAP.asp having 10,000 pages
-
Our website has more than 10,000 pages. If, per the usual Google guideline of keeping links per page to around 100, I restrict sitemap.asp to 100 links per page, then I have to generate 100 sitemap pages. Any idea how to shorten the process? Please advise.
-
I just want to verify that you're talking about a sitemap for users, and not the XML sitemap for search engines to crawl. A sitemap for search engines can have up to 50,000 entries per file, per the specification at http://www.sitemaps.org/protocol.html.
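If it is the XML kind, you may not need to generate 100 pages by hand at all: the same spec defines a sitemap index that ties multiple sitemap files together, and search engines fetch the index and then each file listed in it. A minimal sketch per that spec (the file names and dates are placeholders, not anything from your site):

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-pages-1.xml</loc>
    <lastmod>2012-01-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-pages-2.xml</loc>
    <lastmod>2012-01-01</lastmod>
  </sitemap>
</sitemapindex>

A single script can emit the index and all the files it points to.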
One example Ryan Kent likes to use is Verizon's sitemap at http://www.verizonwireless.com/b2c/sitemap.jsp. They don't have every one of their pages listed, but you can easily find what you are looking for via their sitemap.
-
I have a site with 100K pages, and our sitemap has about 100 entries, I think.
In my opinion you should not shorten the process. It is important to do it correctly, and what is correct depends on the site at hand.
Now, I do not know which website this is regarding, so it is hard to give advice about a specific case.
But what I would do is include the home page and all navigational sections (sub-section indexes). Beyond that, include the pages that are most important to your company and the central pages that change often.
These things you probably already know. I have not heard of a quick way of making a sitemap for a large site; it requires attention. But I guess that if you really want to do it quickly, and really do not want to put in the effort and thought to make a quality sitemap, you could probably just take the 100 most-visited pages on your site in the last year. There you should find the most important pages.
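If you do go that route, it is easy to script. A rough sketch in Python, assuming you can export your page statistics from your analytics tool to a CSV file (the file name and the url/title/visits column names are hypothetical):

import csv

# Load an analytics export with hypothetical "url", "title" and "visits" columns.
with open("top_pages.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

# Keep the 100 most-visited pages from the export.
rows.sort(key=lambda r: int(r["visits"]), reverse=True)
top = rows[:100]

# Write them out as a plain HTML list to paste into sitemap.asp.
with open("sitemap_fragment.html", "w", encoding="utf-8") as f:
    f.write("<ul>\n")
    for r in top:
        f.write(f'  <li><a href="{r["url"]}">{r["title"]}</a></li>\n')
    f.write("</ul>\n")

You would still want to check the result by hand, since the most-visited pages are not always the ones you most want linked.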
Regards,
Rasmus
Related Questions
-
[Organization schema] Which Facebook page should be put in "sameAs" if our organization has separate Facebook pages for different countries?
Technical SEO | Telsenome
We operate in several countries and have this kind of domain structure:
example.com/us
example.com/gb
example.com/au
For our schemas we've planned to add an Organization schema on our top domain and let all pages point to it. This introduces a problem: we have a separate Facebook page for every country. Should we put one Facebook page in the "sameAs" array? Or all of our Facebook pages? Or should we skip it altogether?
Only one Facebook page:
{
"@type": "Organization",
"@id": "https://example.com/org/#organization",
"name": "Org name",
"url": "https://example.com/org/",
"sameAs": [
"https://www.linkedin.com/company/xxx",
"https://www.facebook.com/xxx_us"
],
All Facebook pages:
{
"@type": "Organization",
"@id": "https://example.com/org/#organization",
"name": "Org name",
"url": "https://example.com/org/",
"sameAs": [
"https://www.linkedin.com/company/xxx",
"https://www.facebook.com/xxx_us"
"https://www.facebook.com/xxx_gb"
"https://www.facebook.com/xxx_au"
], Bonus question: This reasoning springs from the thought that we only should have one Organization schema? Or can we have a multiple sub organizations?0 -
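For reference, a complete, valid version of the second snippet, with the closing braces restored (the script wrapper and the "@context" line are added here; all values are the question's own placeholders):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "@id": "https://example.com/org/#organization",
  "name": "Org name",
  "url": "https://example.com/org/",
  "sameAs": [
    "https://www.linkedin.com/company/xxx",
    "https://www.facebook.com/xxx_us",
    "https://www.facebook.com/xxx_gb",
    "https://www.facebook.com/xxx_au"
  ]
}
</script>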
-
Overdynamic Pages - How to Solve It?
Technical SEO | JoaoCJ
Hi everyone, I'm running a classified real estate ads site where people can publish the apartment or house they want to sell, so we use multiple filters to help people find what they want. Lately we added multiple filters to the URL to make the search more precise, things like:
Prices (priceAmount=###)
Bedrooms (BedroomsNumber=2)
Bathrooms (BathroomsNumber=3)
Total area (totalArea=1_50)
Services (Elevator, CommonAreas, security)
Among other filters, so you see the picture. All these filters are in the URL so that people can share their search on social media, and that creates two problems in the Moz crawl:
Overdynamic URLs
Too long URLs
Now what would be a good solution for these two problems? Would a canonical to the original page before the "?" be OK? Example:
http://urbania.pe/buscar/venta-de-propiedades?bathroomsNumber=2&services=gas&commonAreas=solarium
The problem I have with this solution is that I also have a pagination parameter (page=2), and I'm using prev and next tags. If I use such a canonical, will it break the prev and next tags?
http://urbania.pe/buscar/venta-de-propiedades?bathroomsNumber=2&services=gas&commonAreas=solarium&page=2
I'm also thinking that adding a noindex on pages with parameters could be an option. Thanks a lot, I'm trying to address these issues.
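Canonicalizing every paginated URL to the page before the "?" would indeed conflict with prev/next, since it declares pages 2, 3, and so on to be duplicates of the unfiltered page. One common pattern instead (a sketch only, built from this thread's example URLs, not a verified fix for this site) is a self-referencing canonical on each paginated page alongside the prev/next annotations:

<!-- in the <head> of ...&page=2 -->
<link rel="canonical" href="http://urbania.pe/buscar/venta-de-propiedades?bathroomsNumber=2&services=gas&commonAreas=solarium&page=2" />
<link rel="prev" href="http://urbania.pe/buscar/venta-de-propiedades?bathroomsNumber=2&services=gas&commonAreas=solarium" />
<link rel="next" href="http://urbania.pe/buscar/venta-de-propiedades?bathroomsNumber=2&services=gas&commonAreas=solarium&page=3" />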
-
23,000 pages indexed - I think that's bad
Technical SEO | nickcargill
Thank you, thank you, Moz people! I have a successful vacation rental company that has terrible SEO, but it's getting better. When I first ran the Moz crawler and page grader, I had 35,000 errors and all F's: tons of problems with duplicate page content and titles from inconsistent page names (mainly capitalization), and also rel=canonical errors. With that said, I now see maybe 2 or 3 errors from time to time, and I fix them every other day.
1) The problem, maybe: my sitemap in Google Webmaster Tools shows 1,155 submitted and 1,541 indexed, but the Google crawl shows 23,000 pages, probably because of duplicate errors or possibly database-driven URL parameters. How bad is this, and how do I get it to be accurate? I have seen the Google removal tool, but I do not think that is right.
2) I have hired a full-time content writer, and I hope this works.
My site in Google was just domain.com, but I put in a 301 to www.domain.com because www had page authority where domain.com did not. In Webmasters I had just domain.com listed, so I changed the preferred domain to www.domain.com and asked Google to crawl www.domain.com for the first time. Anybody see any problems with this?
Thank you, Moz people, Nick
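On the www move: the 301 plus the preferred-domain setting is the standard combination. A minimal sketch of the non-www to www redirect, assuming an Apache server with mod_rewrite ("domain.com" stands in for the real domain):

# .htaccess - send all non-www requests to www with a permanent redirect
RewriteEngine On
RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]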
-
Why are my Duplicated Pages not being updated?
Technical SEO | ckroaster
I've recently changed a bunch of duplicated pages on our site. I did see a slight reduction in duplicated pages; however, some of the pages I've already fixed still show as duplicates according to Moz. Whenever I check the back end of each of these pages, I see that they've already been changed, and none of them are the same as far as the meta title tag is concerned. Can anyone suggest what I should do to get a more accurate result? Is there a process I'm missing?
-
Pages to be indexed in Google
Technical SEO | mtthompsons
Hi, we have 70K posts on our site, but Google has scanned 500K pages; the extra pages are category pages and user profile pages. Each category has a page and each user has a page, and since we have 90K users, Google has indexed 90K user pages alone. My question is: should we leave them as they are, or should we block them from being indexed? We get unwanted landings on those pages and a huge bounce rate. If we need to remove them, what needs to be done: a robots.txt block or noindex/nofollow? Regards
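The usual trade-off, sketched below with hypothetical /category/ and /user/ URL patterns: a robots.txt block stops crawling, but URLs that are already indexed can linger in the index, while a meta robots noindex lets the pages keep being crawled and drops them from the index. The noindex tag only works if the pages are not also blocked in robots.txt, since Google must fetch a page to see the tag.

# robots.txt - stop crawling of these sections entirely
User-agent: *
Disallow: /category/
Disallow: /user/

<!-- or, per page, in the <head> of each category/user page -->
<meta name="robots" content="noindex, follow" />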
-
Searching on root domain words = ranking on > page 10 in SERP
Technical SEO | jogo
Hello, our website wingmancondoms.com (a new condom brand) is not ranking in Google for the keywords "wingman condom", and I don't know why. In Yahoo and Bing everything is all right. I saw on this forum that it may be best to change my language URLs to wingmancondoms.com/nl, /de, and /fr instead of a direct URL like http://www.wingmancondoms.com/wingman-kondome (German translation). But is this our problem, or are there more problems? Google is indexing our pages well, no errors, etc. Any other possibilities?
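If the site does move to /nl, /de, and /fr folders, the standard companion step is hreflang annotations so Google knows which version targets which language. A sketch, assuming those hypothetical paths exist:

<link rel="alternate" hreflang="en" href="http://www.wingmancondoms.com/" />
<link rel="alternate" hreflang="nl" href="http://www.wingmancondoms.com/nl/" />
<link rel="alternate" hreflang="de" href="http://www.wingmancondoms.com/de/" />
<link rel="alternate" hreflang="fr" href="http://www.wingmancondoms.com/fr/" />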
-
Off-page SEO and on-page SEO improvements
Technical SEO | fkdpl242
I would like to know what off-page SEO and on-page SEO improvements can be made to one of our client websites, http://www.nd-center.com.
Best regards,
-
Wrong Page Ranking
Technical SEO | LCNetwork
A higher-level page with more power is getting pushed out by a weaker page in the SERPs for an important keyword. I don't care about losing the weaker page. Should I:
404 the weaker page and wait for Google to (hopefully) replace it with the stronger page?
301 the weaker page to the stronger page?
NOTE: Due to poor communication between the content team and myself, the weak and strong pages have similar title tags (i.e., "lawsuits" and "litigation").