Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will remain viewable), we have locked both new posts and new replies.
Should I include unnecessary pages in the sitemap.xml?
-
I have a lot of pages that I don't want Google to index. For most of them I used a canonical tag where they were duplicates, and noindex where I wanted the pages removed. The question is: should I include these pages in the sitemap.xml, or just the important pages?
Also, should I include them in order to get the changes indexed faster by Google?
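For reference, the two mechanisms mentioned above look like this in a page's `<head>` (the example.com URLs are placeholders, not from the original question):

```html
<!-- Duplicate page: point search engines at the preferred version -->
<link rel="canonical" href="https://example.com/preferred-page" />

<!-- Page to drop from the index entirely -->
<meta name="robots" content="noindex" />
```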
-
That clearly changes my ideas about this ;-). As we're talking about a couple of million pages, I wouldn't include them in the sitemaps then, and I'd make absolutely sure they're noindexed.
-
One of the main problems is that there are a lot of such pages (approx. 2-3 million) and my indexation rate is really slow for a site this big. The old sitemap structure was too complex, and I wanted to simplify it so Google will crawl only the important pages.
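At that scale a sitemap index helps: the sitemap protocol caps each file at 50,000 URLs, so millions of pages need many sitemap files referenced by one index. A minimal sketch of that structure, assuming you already have a list of only the indexable URLs (all names here are illustrative, not from the question):

```python
# Sketch: split a large list of indexable URLs into sitemap files of at
# most 50,000 URLs each (the sitemap protocol limit), plus a sitemap
# index referencing them. Noindexed/canonicalised pages are simply left
# out of the input list.
from xml.sax.saxutils import escape

SITEMAP_LIMIT = 50_000  # max URLs per sitemap file per the protocol


def chunk(urls, size=SITEMAP_LIMIT):
    """Yield successive slices of at most `size` URLs."""
    for i in range(0, len(urls), size):
        yield urls[i:i + size]


def build_sitemap(urls):
    """Render one <urlset> sitemap file for up to 50,000 URLs."""
    entries = "\n".join(
        f"  <url><loc>{escape(u)}</loc></url>" for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>"
    )


def build_sitemap_index(sitemap_urls):
    """Render the sitemap index that points at each sitemap file."""
    entries = "\n".join(
        f"  <sitemap><loc>{escape(u)}</loc></sitemap>" for u in sitemap_urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</sitemapindex>"
    )
```

Keeping the noindexed pages out of the input list also makes Search Console's "indexed vs. submitted" numbers meaningful for the pages you actually care about.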
-
Hi Silviu,
Hard question. Given your use case I would suggest not including them. On the other hand, it also shouldn't harm your performance, as the URLs in a sitemap are mainly meant to give search engines a full list of URLs they might otherwise miss. Leaving the unimportant pages out would also help you see what your indexation rate is for the pages that matter. Curious to see what other people think about this.