How to handle new pages/posts with a sitemap
-
Hi,
I've created a sitemap with the Google XML Sitemaps WordPress plugin and submitted it to Google through Webmaster Tools.
Now I've created new pages and posts. My question is: do I have to recreate and resubmit another sitemap to Google, or can I just submit the new pages and posts with the 'Fetch as Google' option?
Thanks so much in advance.
-
OK, thanks for the advice.
About sitemaps: if I want my images to appear in Google, should I leave the media option unchecked in Yoast?
This Moz article suggests excluding media from sitemaps:
http://moz.com/blog/setup-wordpress-for-seo-success
What's your opinion on this and the other XML sitemap settings?
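(For reference: images don't need a separate media sitemap to be indexed. The sitemap protocol has an image extension that lists images inside each page's own entry, which is how Yoast typically handles images embedded in posts. A minimal sketch, with example.com and the file paths as placeholders:)

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <!-- The page that contains the image -->
    <loc>https://example.com/sample-post/</loc>
    <!-- The image itself, listed under the page's entry -->
    <image:image>
      <image:loc>https://example.com/wp-content/uploads/photo.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```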
-
Yes, that's exactly what I would do: remove the old sitemap URL from Google Webmaster Tools and add the new URL once the Yoast sitemap is ready.
-
So, given that I've already created and submitted a sitemap to Google Webmaster Tools with the XML Sitemaps plugin, what should I do now?
Can I deactivate and uninstall the XML Sitemaps plugin and resubmit a sitemap generated with Yoast?
What would you suggest at this point?
-
Always go with Yoast; it's the most widely used SEO plugin for WordPress.
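(If you do switch, one thing worth checking: Yoast exposes its sitemap as an index file, and you can point crawlers at it from robots.txt. A minimal sketch below; the path is Yoast's usual default, but confirm it on your own install, and example.com is a placeholder.)

```text
Sitemap: https://example.com/sitemap_index.xml
```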
-
Thanks so much, guys.
Another question: if I'm running the XML Sitemaps plugin, do I have to disable the Yoast sitemap option, or can I leave it on? Also, which would you suggest for sitemaps, XML Sitemaps or Yoast?
-
Hi,
As mentioned above, you can do both. Normally you would wait for Google to crawl your sitemap and find the new URLs there, but by using the Fetch as Google tool you can speed up the process and submit them directly for indexing. I'd recommend this if Google isn't crawling your site on a regular basis.
-
I can't see why it would hurt to submit articles with Fetch as Google, but in general, if a plugin is managing your XML sitemap, Google will be notified whenever something changes or something new is published. Fetch as Google just makes it happen faster than waiting for the bot to come and crawl the site.
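(As a rough sketch of the notification described above: at the time of this thread, sitemap plugins typically notified Google by requesting its sitemap "ping" endpoint with the sitemap URL as a parameter. Google has since deprecated that endpoint, so treat this purely as an illustration; the function name and example URLs are placeholders.)

```python
from urllib.parse import quote

def sitemap_ping_url(sitemap_url: str) -> str:
    """Build the legacy Google sitemap ping URL for a given sitemap.

    This is the mechanism sitemap plugins historically used to tell
    Google a sitemap had changed; the sitemap URL is percent-encoded
    and passed as a query parameter.
    """
    return "https://www.google.com/ping?sitemap=" + quote(sitemap_url, safe="")

print(sitemap_ping_url("https://example.com/sitemap.xml"))
# → https://www.google.com/ping?sitemap=https%3A%2F%2Fexample.com%2Fsitemap.xml
```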