Sitemaps
-
Hi, I have a doubt about using sitemaps. My website is a news site and we have thousands of articles in every section. For example, we have a section called Technology, and we have articles going back to 1999! So the question is: how can I make the Google robot index them?

Months ago, when you entered the Technology section we had a paginator without limits, but we noticed that this query consumed a lot of CPU every time a user clicked. So we decided to limit it to 10 pages of 15 records each. Now it works great, BUT I can see in Google Webmaster Tools that our indexed pages decreased dramatically. The reason is simple: the bot has no way to reach the older technology articles, because we limited the query to 150 records total.

So, the question is: how can I fix this? Options: 1) leave the query without limits; 2) create a new "All tech news" button with a different query without a limit, but paginated with (for example) 200 records per page; 3) create a sitemap that contains all the tech articles. Any ideas? Thanks very much.
-
Hi Dana! I have already implemented the rel="prev" and rel="next" pagination tags, and I saw that video; it's great. What happens if I have a sitemap.xml but I omit some articles from it? Are sitemaps complementary, or once you upload a sitemap does Google use only that and nothing else? If they are complementary, I could update it monthly and append the new articles for each area of my site. Is that correct? Thank you, Dana
-
I would highly recommend submitting a sitemap. I am surprised you haven't already. Is it because there are so many articles? Is there any way to generate an automated list of your article URLs? If you can, then formatting a sitemap shouldn't be that difficult, even if there are a huge number of them. I also like your idea of adding the "View all" option. The company where I do in-house SEO is battling pagination issues at the moment, and we believe this is a good solution. It's one that is supported and even recommended by Google here: http://googlewebmastercentral.blogspot.com/2012/03/video-about-pagination-with-relnext-and.html
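If you can query the articles out of your CMS database, a short script can rebuild the sitemaps on a schedule (nightly via cron, for example). Below is a rough sketch in Python; the database name, the "articles" table, and its url/section/updated_at columns are placeholder assumptions, not your actual schema. It writes one sitemap file per section plus a sitemap index, so you only ever submit the index in Webmaster Tools:

```python
# Sketch: rebuild one sitemap per section plus a sitemap index.
# Assumes an "articles" table with url, section and updated_at (ISO date)
# columns -- adjust the query to whatever your CMS actually stores.
import sqlite3
from collections import defaultdict
from datetime import date
from xml.sax.saxutils import escape

MAX_URLS_PER_FILE = 50000          # hard limit of the sitemap protocol
BASE = "http://www.example.com"    # placeholder domain

conn = sqlite3.connect("news.db")
rows = conn.execute("SELECT url, section, updated_at FROM articles")

by_section = defaultdict(list)
for url, section, updated_at in rows:
    by_section[section].append((url, updated_at))

sitemap_files = []
for section, articles in sorted(by_section.items()):
    # Sections with more than 50,000 URLs get split into numbered files.
    for i in range(0, len(articles), MAX_URLS_PER_FILE):
        chunk = articles[i:i + MAX_URLS_PER_FILE]
        name = "sitemap-%s-%d.xml" % (section.lower(), i // MAX_URLS_PER_FILE + 1)
        with open(name, "w", encoding="utf-8") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for url, updated_at in chunk:
                f.write("  <url><loc>%s</loc><lastmod>%s</lastmod></url>\n"
                        % (escape(url), updated_at))
            f.write("</urlset>\n")
        sitemap_files.append(name)

# The index references every per-section file; this is the one URL you submit.
with open("sitemap_index.xml", "w", encoding="utf-8") as f:
    f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
    f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
    for name in sitemap_files:
        f.write("  <sitemap><loc>%s/%s</loc><lastmod>%s</lastmod></sitemap>\n"
                % (BASE, name, date.today().isoformat()))
    f.write("</sitemapindex>\n")
```

Each file can hold up to 50,000 URLs, so even the Technology archive back to 1999 should only need a handful of them. On the follow-up question: sitemaps are complementary to normal crawling, not a replacement for it. Google still follows links on the site, so leaving an article out of the sitemap doesn't stop it from being discovered another way.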
I hope that helps a bit!
Related Questions
-
Sitemap For Static Content And Blog
We'll be uploading a sitemap to Google Search Console for a new site. We have ~70-80 static pages that don't really change much (some may change as we modify a couple of pages over the course of the year), but we have a separate blog on the site which we will be adding content to frequently. How can I set up the sitemap to make sure that "future" blog posts will get picked up and indexed? I used a sitemap generator and it picked up the first blog post that's on the site, but I'm wondering what happens with future ones. I don't want to resubmit a new sitemap each time we publish a new blog post. (One way to handle this is sketched below.)
Technical SEO | vikasnwu -
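For the static-plus-blog setup described above, one common pattern (sketched here with placeholder file names) is to submit a single sitemap index once and let the blog half be regenerated automatically by the CMS or a publish hook; Google periodically re-fetches sitemaps it already knows about, so new posts get picked up without resubmitting anything:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- rarely changes; rebuild only when the ~70-80 static pages change -->
  <sitemap><loc>http://www.example.com/static-sitemap.xml</loc></sitemap>
  <!-- rewritten automatically every time a blog post is published -->
  <sitemap><loc>http://www.example.com/blog-sitemap.xml</loc></sitemap>
</sitemapindex>
```

Many blog platforms and plugins can already serve a sitemap like blog-sitemap.xml dynamically, so the file stays current without any manual step.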
Sitemaps, 404s and URL structure
Hi All! I recently acquired a client and noticed in Search Console over 1300 404s, all starting around late October this year. What's strange is that I can access the pages that are 404ing by cutting and pasting the URLs and via inbound links from other sites. I suspect the issue might have something to do with Sitemaps. The site has 5 Sitemaps, generated by the Yoast plugin. 2 Sitemaps seem to be working (pages being indexed), 3 Sitemaps seem to be not working (pages have warnings, errors and nothing shows up as indexed). The pages listed in the 3 broken sitemaps seem to be the same pages giving 404 errors. I'm wondering if auto URL structure might be the culprit here. For example, one sitemap that works is called newsletter-sitemap.xml, all the URLs listed follow the structure: http://example.com/newsletter/post-title Whereas, one sitemap that doesn't work is called culture-event-sitemap.xml. Here the URLs underneath follow the structure http://example.com/post-title. Could it be that these URLs are not being crawled / found because they don't follow the structure http://example.com/culture-event/post-title? If not, any other ideas? Thank you for reading this long post and helping out a relatively new SEO!
Technical SEO | DanielFeldman -
URL / sitemap structure for support pages
I am creating a site that has four categories housed in folders off of the TLD. Example:
example.com/category-1
example.com/category-2
example.com/category-3
example.com/category-4
Those category folders contain sub-folders that house the products inside each category. Example:
example.com/category-1/product-1
example.com/category-2/product-1
etc.
Each of the products has a corresponding support page with technical information, FAQs, etc. I have three options for how to structure the support pages' URLs:
Option 1 - Add a new sub-folder with "support" appended to the string: example.com/category-1/product-1-support
Option 2 - Add a second sub-folder off of the product sub-folder for support: example.com/category-1/product-1/support
Option 3 - Create a "support" folder with product sub-folders: example.com/support/product-1
Which of these three options would you choose? I don't like having one large /support folder that houses all products. It seems like this would create a strange crawling and UX situation. The sitemap would have a huge /support folder with all of my products in it, and the keywords in my category folders would be replaced with the word "support." Because I would rather have the main product pages ranking over any of the support pages (outside of searches containing the word "support"), I am leaning toward Option 2: example.com/category-1/product-1/support. I think this structure indicates to crawlers that the more important page is the product page, while the support page is secondary to that. It also makes it clear to users that this is the support page for that particular product. Does anyone have any experience or perspective on this? I'm open to suggestions, and if I'm overthinking it, tell me that too. Thanks, team.
Technical SEO | InterCall -
XML Sitemap Generators
I am looking to use a different sitemap generator that can do 5 thousand or more pages at once. Any recommendations? Thanks guys.
Technical SEO | Chenzo -
Image & Video Sitemaps - Submitted vs. Indexed
Hi Mozzers, I have read all the relevant blogs from media indexing experts like Phil Nottingham and have followed Google's best practices as well as advice from similar discussions on here. We have submitted video and image sitemaps to Webmaster Tools; the image sitemap has 33 indexed out of 720 submitted images, and the video sitemap 170 indexed out of 738 submitted. With the image sitemap, the indexed number (33) has remained steady while the submitted count has grown by over 100 in the last month. The video sitemap has shown signs of indexing new videos, but still not the number that were submitted. Thus far, I have followed Google's guidelines for sitemap structure. We are using CloudFront, so I have added and verified our CloudFront server in the same Webmaster Tools account. If anyone has any advice, it would be most appreciated. There is no duplicate content and the robots.txt is not blocking anything within the sitemap. Image sitemap: http://www.clowdy.com/sitemap.images.xml (a minimal example entry is included below for reference)
Technical SEO | Morrreau -
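For reference on the question above, a minimal valid image sitemap entry looks like the following (the page and image paths are placeholders); image:loc is the only required child element, and it can point at the CloudFront hostname, which is why Google recommends verifying that hostname in Webmaster Tools as described:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>http://www.clowdy.com/example-page</loc>
    <image:image>
      <image:loc>http://d1example.cloudfront.net/uploads/example.jpg</image:loc>
      <image:title>Example image title</image:title>
    </image:image>
  </url>
</urlset>
```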
Adding multi-language sitemaps to robots.txt
I am working on a revamped multi-language site that has moved to Magento. Each language runs off the core code, so there are no sub-directories per language. The developer has created sitemaps which have been uploaded to their respective GWT accounts. They have placed the sitemaps in new directories such as: /sitemap/uk/sitemap.xml and /sitemap/de/sitemap.xml. I want to add the sitemaps to the robots.txt but can't figure out how to do it (the lines needed are sketched below). Also, should they have placed the sitemaps in a single location with the file name identifying each language: /sitemap/uk-sitemap.xml and /sitemap/de-sitemap.xml? What is the cleanest way of handling these sitemaps, and can/should I get them into robots.txt?
Technical SEO | MickEdwards -
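On the robots.txt part of the question above: the Sitemap directive takes a full absolute URL, can appear anywhere in the file, and can be repeated, so the per-language files can be referenced exactly where the developer put them (the domain below is a placeholder):

```
Sitemap: http://www.example.com/sitemap/uk/sitemap.xml
Sitemap: http://www.example.com/sitemap/de/sitemap.xml
```

Because the directive requires absolute URLs anyway, it makes no difference to robots.txt whether the sitemaps sit in per-language folders or in a single folder with the language in the file name; either structure works.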
How can I create a sitemap with 1,000+ pages, and should I update the sitemap frequently?
My website has over 1,000 pages, but the sitemap creator tools I know of only handle a maximum of 500 pages. How can I create a sitemap that covers all of my pages?
Technical SEO | magician -
Removing Redirected URLs from XML Sitemap
If I'm updating a URL and 301 redirecting the old URL to the new URL, Google recommends I remove the old URL from our XML sitemap and add the new URL. That makes sense. However, can anyone speak to how Google transfers the ranking value (link value) from the old URL to the new URL? My suspicion is this happens outside the sitemap. If Google already has the old URL indexed, the next time it crawls that URL, Googlebot discovers the 301 redirect and that starts the process of URL value transfer. I guess my question revolves around whether removing the old URL (or the timing of the removal) from the sitemap can impact Googlebot's transfer of the old URL value to the new URL.
Technical SEO | RyanOD