I want to resubmit my sitemap
-
I am making major changes to my website. Some of my old URLs I don't want indexed or included in the sitemap, some other old pages I want to keep, and there are new pages too. Can anyone give me hints on what I should do? Also, I have thousands of pages on my website and I don't want to submit all of them; I want to submit only my best pages to Google in the sitemap. That's why I want to resubmit new sitemaps.
-
Hi,
Thanks, but what if I have loads of pages, even millions of pages? That might affect Google's indexation; for example, Google will not show all my indexed pages in search results. That's why I am thinking about resubmitting sitemaps with just the important stuff. Also, I can't realistically disallow everything in robots.txt, because I have loads of pages I don't want to focus on. Can anybody tell me whether this is the right approach?
-
If you're looking to highlight your most important pages to Google, you could consider using the priority tag. There's more about this here: https://support.google.com/webmasters/answer/183668?hl=en&ref_topic=6080646&rd=1
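For reference, a minimal sketch of what sitemap entries with priority values look like (the URLs and values here are placeholders, not taken from the question):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- A page you want to highlight as important -->
      <url>
        <loc>https://www.example.com/important-page/</loc>
        <priority>0.9</priority>
      </url>
      <!-- A less important page -->
      <url>
        <loc>https://www.example.com/less-important-page/</loc>
        <priority>0.3</priority>
      </url>
    </urlset>

Keep in mind that priority is only a hint about how important a page is relative to your other pages; it doesn't guarantee crawling or better rankings.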
-
Hi,
Thanks a lot. Basically, I have a large number of pages and some of them don't appear in Google search results, so I thought I'd trim my sitemap down and submit just the important pages.
-
Hi there,
If you don't want Google to index certain pages on your site, then you should block them using robots.txt. You can also de-index them via Google Webmaster Tools. You can find more info on this here: http://moz.com/learn/seo/robotstxt
If necessary, you can then create a new sitemap and submit it via GWT.
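As a rough sketch of the robots.txt approach (the paths are placeholders, not from the question), blocking a couple of old sections while keeping the rest of the site crawlable could look like this:

    User-agent: *
    # Example paths for old sections you no longer want crawled
    Disallow: /old-section/
    Disallow: /archive/
    # Point crawlers at the current sitemap
    Sitemap: https://www.example.com/sitemap.xml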
I hope this helps.
-
If you don't want Google to index certain pages, you're best off adding them to your robots.txt file. The sitemap is just a guide, and leaving out certain pages does not necessarily mean Google won't index them.
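If the goal is specifically to keep pages out of Google's index (not just out of the sitemap), a complementary option is a robots meta tag with noindex on each of those pages, for example:

    <meta name="robots" content="noindex, follow">

For Google to see that tag, the page has to remain crawlable (i.e. not blocked in robots.txt), since the crawler needs to fetch the page to read it.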
Related Questions
-
Sitemap size's effect on SEO
So I've noticed that the sitemap I use has a capacity of 4500 URLs, but my website is much larger. Is it worth paying for a commercial sitemap that encompasses my entire site? I also notice that of the 4500 URLs which have been submitted, only 104 are indexed. Is this normal? If not, why is the index rate so low?
Technical SEO | moon-boots
-
Search Console rejecting XML sitemap files as HTML files, despite them being XML
Hi Moz folks,
We have launched an international site that uses subdirectories for regions and have had trouble getting pages outside of the USA and Canada indexed. Google Search Console accounts have finally been verified, so we can submit the correct regional sitemap to the relevant Search Console account. However, when submitting non-US and CA sitemap files (e.g. AU, NZ, UK), we are receiving a submission error that states, "Your Sitemap appears to be an HTML page," despite them being .xml files, e.g. http://www.t2tea.com/en/au/sitemap1_en_AU.xml. Queries on this suggest it's a W3 Cache plugin problem, but we aren't using WordPress; the site is running on Demandware. Can anyone guide us on why Google Search Console is rejecting these sitemap files? Page indexation is a real issue. Many thanks in advance!
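As a quick sanity check for this kind of "appears to be an HTML page" error, it can help to look at what the server actually returns for the sitemap URL, for example with curl (the expected output below is illustrative):

    curl -I http://www.t2tea.com/en/au/sitemap1_en_AU.xml
    # You'd hope to see something like:
    #   HTTP/1.1 200 OK
    #   Content-Type: application/xml
    # If the response is text/html, or a redirect to an HTML error page,
    # that would explain Search Console treating the file as HTML.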
Technical SEO | SearchDeploy
-
Sitemap question
Hello,
In your opinion, what is better for a root domain and micro-sites using sub-domains: a single sitemap for the root domain including all links to the sub-domains, or a separate sitemap for each sub-domain?
Thanks, Arnold
Technical SEO | arnoldwender
-
Should I Edit Sitemap Before Submitting to GWMT?
I use the XML sitemap generator at http://www.auditmypc.com/xml-sitemap.asp and use the filter that forces the tool to respect robots.txt exclusions. This generator allows me to review the entire sitemap before downloading it. Depending on the site, I often see all kinds of non-content files still listed in the sitemap. My question is: should I be editing the sitemap to remove every file listed except the ones I really want spidered, or should I just ignore them and let the Google spiderbot figure it all out after I upload and submit the XML?
Technical SEO | DonB
-
Will an XML sitemap override a robots.txt file?
I have a client that has a robots.txt file that is blocking an entire subdomain, entirely by accident. Their original solution, not realizing the robots.txt error, was to submit an XML sitemap to get their pages indexed. I did not think this tactic would work, as the robots.txt would take precedence over the XML sitemap. But it worked... I have no explanation as to how or why. Does anyone have an answer to this? Or any experience with a website that has had a clear Disallow: / for months, yet somehow has pages in the index?
Technical SEO | KCBackofen
-
Client wants to distribute web content to dealers - iFrame?
I have a client who sells a product through a network of nationwide dealers. He wants to provide updateable content to these dealers so they can create sections on their websites dedicated to the product, for example www.dealer.com/product_XYZ. The client is thinking he'd like to provide an iframe solution to the dealers, so he can independently update the content that appears on their sites. I know iframes are old, but are there any SEO concerns I should know about? Another option is to distribute the content as HTML that includes a rel=canonical tag, but then he loses the ability to centrally update all the distributed content. Are there other solutions he should consider? Thanks --
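For the HTML-distribution option mentioned above, the canonical reference is just a link element in the head of each dealer page pointing back at the master version; a sketch with placeholder URLs:

    <!-- On www.dealer.com/product_XYZ -->
    <link rel="canonical" href="https://www.client-example.com/product-xyz/">

A cross-domain canonical like this tells Google which copy to treat as the original, though it is a hint rather than a directive.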
Technical SEO | 540SEO
-
Sitemaps
Hi,
I have a doubt about using sitemaps. My site is a news site and we have thousands of articles in every section. For example, we have an area called Technology with articles going back to 1999! So the question is: how can I make the Google robot index them? Months ago, when you entered the Technology section we used to have a paginator without limits, but we noticed that this query consumed a lot of CPU per user every time it was clicked. So we decided to limit it to 10 pages of 15 records each. Now it works great, BUT I can see in Google Webmaster Tools that our index count decreased dramatically. The reason is easy to see: the bot has no way to get to older technology news articles because we limit the query to 150 records in total. Well, the question is: how can I fix this? Options: 1) leave the query without limits; 2) create a new "All tech news" button with a different query without a limit, but paginated with (for example) 200 records per page; 3) create a sitemap that contains all the tech articles. Any ideas? Really, thanks.
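On option 3: with thousands of articles, a single file can hit the protocol's limit of 50,000 URLs per sitemap, so a common approach is a sitemap index that points at several smaller sitemaps (for example, one per section or per date range). A sketch with placeholder URLs:

    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap>
        <loc>https://www.example.com/sitemaps/technology-1999-2005.xml</loc>
      </sitemap>
      <sitemap>
        <loc>https://www.example.com/sitemaps/technology-2006-2014.xml</loc>
      </sitemap>
    </sitemapindex>

The index file itself is what you'd submit in Webmaster Tools, and it lets the bot discover older articles that pagination no longer exposes.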
Technical SEO | informatica810
-
I want my Meta Description re-indexed fast!
We have an old meta description that advertises an old offer (FREE X if you buy Y) that we are no longer running on the site. I changed the meta description; now, what is the fastest way I can get Google to update the SERP with the new description?
Technical SEO | pbhatt