HTML Sitemap Pagination?
-
I'm creating an A-to-Z directory of internal pages within a site of mine, but there are cases where a page contains over 500 links. I intend to use pagination (rel="next"/"prev") to avoid having too many links on a single page, but I'm worried about indexation issues. Should I be worried?
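For reference, a minimal sketch of the rel="next"/"prev" markup being described, with hypothetical URLs for the second page of the paginated A-Z directory:

```html
<!-- Hypothetical page 2 of the paginated A-Z directory -->
<head>
  <title>A-Z Directory | Page 2</title>
  <!-- Tell crawlers this page belongs to a paginated series -->
  <link rel="prev" href="https://www.example.com/a-z/page/1/">
  <link rel="next" href="https://www.example.com/a-z/page/3/">
</head>
```

The first page of the series would carry only rel="next", and the last page only rel="prev".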
-
This may be a good case for a private question; the SEOmoz staff and associates are under NDA, so you can share a few more details.
-
To a certain extent, yes. The site will be very dynamic, built around the performance of particular pages, so this section is essentially there to give the site some structure. The pages will also be accessible through a categorisation section of the site, but most of the time they will be reached via redirects triggered by certain keywords in the search function. It's quite a tough one to explain without giving too much away!
-
If it's to avoid orphaning, does this mean the pages otherwise have no internal links pointing to them, and that people are expected to go to a sitemap of 5,000 URLs to find them?
-
Well, it is the main mechanism for avoiding orphaning any of these particular pages, as there are approximately 5,000 of them. This A-Z is not a map of the entire site but just one section of it, so I think all of the pages will need to be linked to, as they are, in essence, the site's priority pages in terms of SEO.
-
An alternate question is: do you need each and every page of your site in the HTML index? Consider how Verizon lays theirs out at http://www.verizonwireless.com/b2c/sitemap.jsp, where they show the main pages but not each and every individual page.
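A hedged sketch of that tiered approach for an A-Z section, with hypothetical URLs: the top-level index links only to the letter pages, and each letter page lists (and, if necessary, paginates) the individual priority pages.

```html
<!-- Hypothetical top-level A-Z index: links to ~26 letter pages rather than all 5,000 URLs -->
<nav>
  <a href="/a-z/a/">A</a>
  <a href="/a-z/b/">B</a>
  <!-- ... -->
  <a href="/a-z/z/">Z</a>
</nav>

<!-- Hypothetical letter page (/a-z/a/): links to the individual pages beginning with "A" -->
<ul>
  <li><a href="/pages/example-page-one/">Example Page One</a></li>
  <li><a href="/pages/example-page-two/">Example Page Two</a></li>
  <!-- ...remaining "A" pages, paginated with rel="next"/"prev" if the list runs long -->
</ul>
```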
Related Questions
Customer Reviews on Product Page / Pagination / Crawl 3 review pages only
Hi experts. I present customer feedback (reviews, basically) on my website for the products that are sold, and with this comes the ability to read reviews, obviously with pagination to display them. I want users to be able to flick through and read the reviews to satisfy whatever curiosity they have. My only concern is that, with each click of the pagination, the page containing the reviews will present roughly the same content; the only thing that changes is the title tag and the page number shown in the H1. I'm thinking this could be duplication, but I have yet to be notified by Google in my Search Console. Should I block crawlers from crawling beyond page 3 of reviews? Thanks
Technical SEO | Train4Academy.co.uk
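Rather than blocking crawlers outright in robots.txt (a disallowed URL can still be indexed from links, just without its content), one common approach to the situation described above is to leave the first few review pages indexable and add a noindex,follow robots meta tag to the deeper ones. A minimal sketch, assuming a hypothetical ?review_page= URL structure:

```html
<!-- Hypothetical deep review page, e.g. /product-name?review_page=4 -->
<head>
  <title>Product Name Reviews | Page 4</title>
  <!-- Keep this page out of the index, but let crawlers follow its links -->
  <meta name="robots" content="noindex, follow">
</head>
```

Pages 1-3 would simply omit the meta robots tag.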
Good alternatives to Xenu's Link Sleuth and AuditMyPc.com Sitemap Generator
I am working on scraping title tags from websites with 1-5 million pages. Xenu's Link Sleuth seems to be the best option for this at the moment. Sitemap Generator from AuditMyPc.com seems to work too, but it starts hanging when the sitemap file it is working on becomes too large, so it probably won't be good for websites of this size. I know that Scrapebox can scrape title tags from a list of URLs, but that isn't needed since both of the tools above can do it already. I also know about DeepCrawl.com, but it is paid and would be very expensive at this number of pages and websites (5 million URLs is $1,750 per month; I could get a better deal across multiple websites, but that still doesn't make sense for me, as it needs to be free, more or less). Screaming Frog's SEO Spider is not good for very large websites. In general, what is the best, most time-efficient way to work on something like this? Are there any other options? Thanks.
Technical SEO | blrs12
302 redirect used, submit old sitemap?
The website of a partner of mine was recently migrated to a new platform. Even though the content on the pages mostly stayed the same, both the HTML source (divs, metadata, headers, etc.) and the URLs (removed index.php, removed capitalization, etc.) changed heavily. Unfortunately, the URLs of ALL forum posts (150K+) were redirected using a 302 redirect, which was only recently discovered and swiftly changed to a 301. Several other important content pages (150+) weren't redirected at all at first, but most now have a 301 redirect as well.

The 302 redirects and 404 content pages had been live for over 2 weeks at that point, and judging by the consistent day-over-day drop in organic traffic, I'm guessing Google didn't like the way this migration went. My best guess is that Google is currently treating all these content pages as 'new' (after all, the source code changed 50%+, most of the metadata changed, the URL changed, and a 302 redirect was used). On top of that, the large number of 404s encountered (40K+) probably also fueled the impression of a site no longer worthy of traffic. Given that some of these pages had been online for almost a decade, I would love Google to see that these pages are actually new versions of the old pages, and therefore pass on any link juice and authority.

My idea is to submit a sitemap containing the most important URLs of the old website (harvested from the Top Visited Pages report in Google Analytics, because no old sitemap was ever generated), thereby re-pointing Google to all these old pages, but this time presenting a nice 301 redirect instead, hopefully causing them to regain their rankings. To the best of your knowledge, would that help the problems I've outlined above? Could it hurt? Any other tips are welcome as well.
Technical SEO | Theo-NL
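The sitemap idea floated above would be a plain XML file listing the old, now 301-redirecting URLs, submitted temporarily so Googlebot recrawls them and follows the permanent redirects to the new locations. A minimal sketch with hypothetical URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical sitemap of OLD URLs that now return 301s to the migrated pages -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/Forum/Index.php?topic=12345</loc>
  </url>
  <url>
    <loc>http://www.example.com/About-Us/Index.php</loc>
  </url>
  <!-- ...the rest of the top-visited legacy URLs pulled from Google Analytics -->
</urlset>
```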
For an image which is in the CSS and not the HTML, can you add an alt tag?
I would like to improve SEO on a page with three big images, which are currently referenced from the CSS rather than the HTML. The sample I am working with is at http://xquisitevents.com/about-us/. When I put my cursor over the big picture of the wedding dress with bouquet and inspected the element, I saw this rule applied to a div: #upperleft { background-image: url(images/AboutTopLeft.jpg); } Can I add an alt tag to the CSS somehow, or should I have it added to the HTML? What is the best way to handle this so I can include keywords like "exquisite weddings" and "special event designs"?
Technical SEO | BridgetGibbons
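CSS has no alt attribute, so the two usual options are to move the image into the HTML as an img element with alt text, or to keep the CSS background and label the element with ARIA. A hedged sketch of both, reusing the filename from the question (the alt wording is purely illustrative):

```html
<!-- Option 1: move the image into the HTML so it can carry descriptive alt text -->
<div id="upperleft">
  <img src="images/AboutTopLeft.jpg"
       alt="Exquisite wedding gown and bouquet styled for a special event design">
</div>

<!-- Option 2: keep the CSS background image and label the element for assistive technology
     (useful for accessibility, though an img with alt is the more reliable choice for image SEO) -->
<div id="upperleft" role="img"
     aria-label="Exquisite wedding gown and bouquet styled for a special event design"></div>
```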
Is there a suggested limit to the number of links on a sitemap?
Currently, I have an error on my Moz dashboard indicating there are too many links on one of my pages. That page is the sitemap. It was my understanding that all internal pages should be linked to from the sitemap. Can any Mozzers help clarify the best practice here? Thanks, Clayton
Technical SEO | JorgeUmana
Duplicate title-tags with pagination and canonical
Some time back we implemented the Google recommendation for pagination (rel="next"/"prev"). GWMT now reports 17K pages with duplicate title tags (we have about 1.1 million products on our site and about 50 million pages indexed in Google). As an example, we have properties listed in various states, and the category title would be "Properties for Sale in [state-name]". A paginated search page, or browsing a category (see also http://searchengineland.com/implementing-pagination-attributes-correctly-for-google-114970), would then include the rel="next"/"prev" and canonical markup sketched below. The title for each page is the same, so to avoid the duplicate title-tag issue I would think one has the following options: (1) ignore what Google says; (2) change the canonical to http://www.site.com/property/state.html (which would then only show the first XX results); (3) append a page number to the title, e.g. "Properties for Sale in [state-name] | Page XX"; or (4) have all paginated pages use noindex,follow, which would then result in no category pages being indexed. Would you have the canonical point to the individual paginated page or to the base page?
Technical SEO | MagicDude4Eva
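A typical page-2 implementation of the markup this question refers to might look like the sketch below, following the hypothetical www.site.com example above: the page-numbered title addresses the duplicate-title warning, and the self-referencing canonical is one common setup alongside rel="next"/"prev".

```html
<!-- Hypothetical page 2 of a paginated state category -->
<head>
  <!-- Page number appended so every paginated page gets a unique title -->
  <title>Properties for Sale in [state-name] | Page 2</title>
  <!-- Self-referencing canonical (one common setup alongside rel=next/prev) -->
  <link rel="canonical" href="http://www.site.com/property/state.html?page=2">
  <link rel="prev" href="http://www.site.com/property/state.html?page=1">
  <link rel="next" href="http://www.site.com/property/state.html?page=3">
</head>
```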
Do I need an XML sitemap?
I have an established website that ranks well in Google. However, I have just noticed that no XML sitemap has been registered in Google Webmaster Tools, so the likelihood is that it hasn't been registered with the other search engines either. There is, however, an HTML sitemap listed on the website. Seeing as the website is already ranking well, do I still need to generate and submit an XML sitemap? Could there be any detriment to current rankings in doing so?
Technical SEO | pugh
Video Sitemaps: <video:content_loc> and <video:player_loc>
Hi guys, if I'm creating a video sitemap, do I need to use both <video:content_loc> and <video:player_loc>, or could I just use <video:content_loc>? Thanks
Technical SEO | Tug-Agency
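For reference, a hedged sketch of a single video sitemap entry with hypothetical URLs. Per Google's video sitemap schema, at least one of <video:content_loc> or <video:player_loc> is required, and supplying both is allowed:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>https://www.example.com/videos/sample-video-page.html</loc>
    <video:video>
      <video:thumbnail_loc>https://www.example.com/thumbs/sample-video.jpg</video:thumbnail_loc>
      <video:title>Sample video title</video:title>
      <video:description>Short description of the sample video.</video:description>
      <!-- Either of the next two tags satisfies the location requirement:
           content_loc points to the raw video file, player_loc to an embeddable player -->
      <video:content_loc>https://www.example.com/media/sample-video.mp4</video:content_loc>
      <video:player_loc>https://www.example.com/player?video=sample-video</video:player_loc>
    </video:video>
  </url>
</urlset>
```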