Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not removing the content entirely (many posts will remain viewable), we have locked both new posts and new replies.
HTML Sitemap Pagination?
-
I'm creating an A-to-Z-style directory of internal pages within a site of mine; however, there are cases where there are over 500 links on a single page. I intend to use pagination (rel=next/prev) to avoid too many links on one page, but I'm worried about indexation issues. Should I be worried?
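For context, a minimal sketch of the rel=next/prev head markup being described, assuming hypothetical paginated URLs like /a-z/b?page=2 (the paths are illustrative, not from the thread):

    <!-- In the <head> of the second page of a paginated listing (hypothetical URL) -->
    <link rel="prev" href="https://www.example.com/a-z/b?page=1">
    <link rel="next" href="https://www.example.com/a-z/b?page=3">
    <!-- The first page of the series carries only rel="next"; the last page only rel="prev" -->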
-
This may be a good case for a private question, as the SEOmoz staff and associates are under NDA, so you can share a few more details.
-
To a certain extent, yes. The site will be very dynamic, built around the performance of particular pages, so this section is essentially there to give the site some structure. The pages will also be accessible through a categorisation section, but most of the time they will be reached via redirects triggered by certain keywords in the search function. It's quite a tough one to explain without giving too much away!
-
If it's to avoid orphaning, does this mean the pages otherwise have no internal links pointing to them, and that people are expected to go to a sitemap of 5,000 URLs to find them?
-
Well, it is the main mechanism for avoiding orphaning any of these pages, as there are approximately 5,000 of them. This A-Z section is not a map of the entire site, just one section of it, so I think all of the pages will need to be linked to, as they are, in essence, the site's priority pages in terms of SEO.
-
An alternative question is: do you need each and every page of your site in the HTML index? Consider how Verizon lays theirs out (http://www.verizonwireless.com/b2c/sitemap.jsp), where they show the main pages but not every individual page.
Related Questions
-
Is a sitemap required in my robots.txt?
Hi, I know that linking to your sitemap from your robots.txt file is good practice. OK, but... may I just submit my sitemap to Search Console and forget about adding it to my robots.txt? That's my situation: one multilingual platform, which means two sets of pages, one for each language, of course. But my CMS (Magento) only allows me to have one robots.txt file. So, again: may I have a robots.txt file with no sitemap and not suffer any potential SEO loss? Thanks in advance, Juan Vicente Mañanas Abad
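For reference, a single robots.txt file can list more than one Sitemap directive, so both language sitemaps can be referenced from the one file; a minimal sketch, with hypothetical sitemap URLs:

    User-agent: *
    Disallow:

    Sitemap: https://www.example.com/sitemap_en.xml
    Sitemap: https://www.example.com/sitemap_es.xml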
Technical SEO | Webicultors -
Good alternatives to Xenu's Link Sleuth and AuditMyPc.com Sitemap Generator
I am working on scraping title tags from websites with 1-5 million pages. Xenu's Link Sleuth seems to be the best option for this at the moment. Sitemap Generator from AuditMyPc.com seems to be working too, but it starts hanging when the sitemap file it is working on becomes too large, so it basically looks like it won't be good for websites of this size. I know that Scrapebox can scrape title tags from a list of URLs, but that isn't needed, since both of the tools mentioned above already do this. I also know about DeepCrawl.com, but that one is paid, and it would be very expensive with this number of pages and websites (5 million URLs is $1,750 per month; I could get a better deal on multiple websites, but that obviously does not make sense to me, as it needs to be free, more or less). SEO Spider from Screaming Frog is not good for large websites. So, in general, what is the best and most time-efficient way to work on something like this? Are there any other options? Thanks.
Technical SEO | blrs12 -
Is it important to include image files in your sitemap?
I run an ecommerce business that has over 4,000 product pages which, as you can imagine, branch off into thousands of image files. Is it necessary to include those in my sitemap for faster indexing? Thanks for your help! -Reed
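For reference, product images can be listed alongside their page URLs in an XML sitemap using the image sitemap extension; a minimal sketch with hypothetical URLs:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
      <url>
        <loc>https://www.example.com/product/blue-widget</loc>
        <image:image>
          <image:loc>https://www.example.com/images/blue-widget-front.jpg</image:loc>
        </image:image>
        <image:image>
          <image:loc>https://www.example.com/images/blue-widget-side.jpg</image:loc>
        </image:image>
      </url>
    </urlset>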
Technical SEO | IceIcebaby -
Correct linking to the /index of a site and subfolders: what's the best practice? Link to domain.com/ or domain.com/index.html?
Dear all, starting with my .htaccess file:

    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^www.inlinear.com$ [NC]
    RewriteRule ^(.*)$ http://inlinear.com/$1 [R=301,L]

    RewriteCond %{THE_REQUEST} ^.*/index.html
    RewriteRule ^(.*)index.html$ http://inlinear.com/ [R=301,L]

1. I redirect all URL requests with www. to the non-www version.
2. All requests for "index.html" are redirected to "domain.com/".

My questions are:

A) When linking from a page to my front page (home), is the best practice to link to "http://domain.com/" and NOT to "http://domain.com/index.php"?

B) When linking to the index of a subfolder, "http://domain.com/products/index.php", I should also link to "http://domain.com/products/" and not include the index.php, right?

C) When I define the canonical URL, should I also define it simply as "http://domain.com/products/", or should I in this case point to the actual file, "http://domain.com/products/index.php"?

Are A) and B) best practice? And C)? Thanks for all replies! 🙂
Holger
Technical SEO | inlinear -
Removing Redirected URLs from XML Sitemap
If I'm updating a URL and 301 redirecting the old URL to the new URL, Google recommends I remove the old URL from our XML sitemap and add the new URL. That makes sense. However, can anyone speak to how Google transfers the ranking value (link value) from the old URL to the new URL? My suspicion is this happens outside the sitemap. If Google already has the old URL indexed, the next time it crawls that URL, Googlebot discovers the 301 redirect and that starts the process of URL value transfer. I guess my question revolves around whether removing the old URL (or the timing of the removal) from the sitemap can impact Googlebot's transfer of the old URL value to the new URL.
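For reference, the kind of redirect being described might look like this in an Apache .htaccess file (the paths are hypothetical):

    Redirect 301 /old-page https://www.example.com/new-page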
Technical SEO | RyanOD -
Is it bad to have the same page listed twice in a sitemap?
Hello, I have found that in an HTML (not XML) sitemap of a website, a page has been listed twice. Is that okay, or will it be considered duplicate content? Both links use the same anchor text but different URLs that redirect to another (final) page. I thought the ideal approach is to use the final page in the sitemap (and in all internal linking), not the intermediate pages. Am I right?
Technical SEO | StickyRiceSEO -
Rel next/prev: should I nofollow pagination links?
Hi everyone. When implementing rel=next and rel=prev on pagination pages, should I make the pagination links themselves nofollowed? I have seen people saying yes and no, so I just want a final answer! Thanks
Technical SEO | Sayers -
Video Sitemaps: <video:content_loc> and <video:player_loc>
Hi guys, if I'm creating a video sitemap, do I need to use both <video:content_loc> and <video:player_loc>, or could I just use <video:content_loc>? Thanks
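For reference, a video sitemap entry can carry either element; Google's video sitemap format requires at least one of <video:content_loc> or <video:player_loc>. A minimal sketch with hypothetical URLs:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
      <url>
        <loc>https://www.example.com/videos/intro</loc>
        <video:video>
          <video:thumbnail_loc>https://www.example.com/thumbs/intro.jpg</video:thumbnail_loc>
          <video:title>Intro video</video:title>
          <video:description>Short introduction video.</video:description>
          <video:content_loc>https://www.example.com/media/intro.mp4</video:content_loc>
          <video:player_loc>https://www.example.com/player?video=intro</video:player_loc>
        </video:video>
      </url>
    </urlset>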
Technical SEO | Tug-Agency