XML Sitemap After On-Page Changes
-
Hi everyone, could anyone please help me understand what to do with the XML sitemap after making on-page changes?
For example, a website has an existing XML sitemap and it's submitted to Google Search Console. We make changes to the website for optimisation - changed the URL structure, updated content, added new pages, 301 redirected broken links, etc. Is there anything we should do to change or update the current XML sitemap? Does it update itself automatically? Do we have to resubmit the sitemap to Search Console?
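For context, a sitemap is just an XML file listing the live URLs on the site, so "updating" it means making those entries match the new structure. A minimal entry looks like this (the domain, path, and date below are placeholders, not from the original thread):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- a newly added page would get its own <url> entry -->
    <loc>https://www.example.com/new-page/</loc>
    <!-- lastmod signals when the page last changed -->
    <lastmod>2023-01-15</lastmod>
  </url>
</urlset>
```

After a restructure, old URLs that now 301-redirect should be removed from the file and replaced by their new destinations, since a sitemap is meant to list only canonical, live URLs.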
Thanks!
-
Correct!
-
Awesome, thanks Martin!
-
Hey Nikki,
Yes, those are correct.
Cheers, Martin
-
Hi Martin, thank you for your reply. Just to clarify: if we uploaded the sitemap manually to the website, we'll have to upload a new one and then resubmit it to GSC.
If our sitemap is generated by a tool, say a WordPress plugin, then the sitemap gets updated automatically, and we just have to resubmit it to GSC.
Are those correct?
-
I second Martin; definitely update the XML sitemaps you have for your site.
-
Hey Nikki,
It depends on how the file was created. If you exported it manually, it won't change by itself, especially if you changed the URL structure, added redirects, and created new pages.
I'd recommend resubmitting the new sitemap.xml to Google Search Console, because the new one will probably differ quite a bit from the old one.
Hope that helps. Cheers, Martin
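To illustrate the manual workflow Martin describes, here is a minimal sketch of regenerating a sitemap from a list of current URLs, assuming you maintain that list yourself (the domain and pages are made up for illustration; a real site would crawl or query its CMS for live URLs):

```python
# Hypothetical sketch: rebuild sitemap.xml after site changes.
# Only *current* URLs belong in the list - drop old URLs that now 301.
from datetime import date
from xml.etree import ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return a sitemap XML string for the given absolute URLs."""
    # Register the sitemap namespace as the default (xmlns="...")
    ET.register_namespace("", SITEMAP_NS)
    urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")
    today = date.today().isoformat()
    for url in urls:
        entry = ET.SubElement(urlset, f"{{{SITEMAP_NS}}}url")
        ET.SubElement(entry, f"{{{SITEMAP_NS}}}loc").text = url
        # lastmod tells crawlers the page changed recently
        ET.SubElement(entry, f"{{{SITEMAP_NS}}}lastmod").text = today
    return ET.tostring(urlset, encoding="unicode")

if __name__ == "__main__":
    pages = [
        "https://www.example.com/",
        "https://www.example.com/new-page/",
    ]
    # Write the file, then resubmit the sitemap URL in Search Console
    print(build_sitemap(pages))
```

A plugin-generated sitemap (e.g. in WordPress) does this step for you on every change, which is why only the manual case needs a fresh export before resubmitting.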