How to handle new pages/posts with a sitemap
-
Hi,
I've created and submitted a sitemap to Google (through Webmaster Tools) using the Google XML Sitemaps WordPress plugin.
Now I've created new pages and posts. My question is: do I have to recreate and resubmit another sitemap to Google, or can I just submit the new pages and posts with the 'Fetch as Google' option?
Thanks so much in advance.
-
OK, thanks for the advice.
About sitemaps: if I want my images to appear in Google, do I have to leave the media option unchecked in Yoast?
In this Moz article they suggest excluding media from sitemaps:
http://moz.com/blog/setup-wordpress-for-seo-success
What's your opinion/suggestion about this and the other XML sitemap settings?
-
Yes, that is exactly what I would do: remove the old sitemap URL from Google Webmaster Tools and add the new URL once the Yoast sitemap is ready.
-
So, given that I have already created and submitted a sitemap to Google Webmaster Tools with the XML sitemap plugin, what should I do?
Can I deactivate and uninstall the XML sitemap plugin and resubmit a sitemap generated with Yoast?
What do you suggest is best at this point?
-
Always go with Yoast; it's the most widely used SEO plugin for WordPress.
-
Thanks so much, guys.
Another question I have: if I'm running the XML sitemap plugin, do I have to disable/uncheck the Yoast sitemap option, or can I leave it on? Also, which would you suggest is better for sitemaps, the XML sitemap plugin or Yoast?
-
Hi,
As mentioned above, you can do both. Normally you would wait for Google to crawl your sitemap and find the new URLs in it. By using the Fetch as Google tool you can speed up this process and submit the URLs directly for indexing. I would recommend this if Google is not crawling your site on a regular basis.
-
I cannot see why it would hurt to submit articles with Fetch as Google, but in general, if you have a plugin controlling your XML sitemap, Google will get updated whenever something changes or something new is posted. Fetch as Google will just make it happen faster than waiting for the bot to come and crawl the site.
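For what it's worth, the "plugin notifies Google" step above is just an HTTP request to Google's sitemap ping endpoint. A minimal sketch of building that request (the example.com sitemap URL is hypothetical, and the endpoint shown is the one Google documented at the time):

```python
from urllib.parse import urlencode

GOOGLE_PING = "https://www.google.com/ping"

def build_sitemap_ping_url(sitemap_url: str) -> str:
    """Build the ping URL that asks Google to re-fetch a sitemap."""
    return GOOGLE_PING + "?" + urlencode({"sitemap": sitemap_url})

# Example: notify Google after the plugin regenerates the sitemap.
ping = build_sitemap_ping_url("https://example.com/sitemap_index.xml")
print(ping)
```

Fetching that URL (e.g. with `urllib.request.urlopen(ping)`) is all the plugin does; Fetch as Google simply shortcuts the wait for the next crawl.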
Related Questions
-
Should I create a new site or keep the company on the parent company's subdomain?
I am working with a realty company that is hosted on a subdomain of the larger, parent realty company: [local realty company].[parent realty company].com. How important is it to ride on the DA of the larger company (only about a 40)? I'm trying to weigh that against the value of creating an entirely separate domain, for simplicity for the end user and for Google's bots: [local company].realtor. They don't have any substantial links to their subdomain, so it wouldn't be a huge loss. I have a couple of options:
1. Create an entirely new site on their current subdomain, leveraging the DA of the larger parent company.
2. Create an entirely new site on a new URL, starting from scratch (which doesn't hurt you as much as it seems it once did).
3. Create two sites: a micro site that targets a sector of their audience that they really want to reach, plus option (1) or (2).
Love this community!
Technical SEO | Gabe_BlueGuru
-
Nofollow/Noindex Category Listing Pages with Filters
Our e-commerce site currently has thousands of duplicate pages indexed, because category listing pages with all the different filters selected are indexed. So, for example, you would see indexed:
example.com/boots
example.com/boots/black
example.com/boots/black-size-small
etc. There is logic in place so that when more than one filter is selected, all the links on the page are nofollowed, but Googlebot is still getting to them, and the variations are being indexed. At this point I'd like to add 'noindex' or canonical tags to the filtered versions of the category pages, but many of these filtered pages are driving traffic. Any suggestions? Thanks!
Technical SEO | fayfr
-
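For the filtered-category question above, one common approach, if the canonical-tag route is chosen, is to point every filtered variation at its base category page. A hypothetical sketch, assuming the path structure from the question (`/boots/black-size-small`) where the first path segment is the category:

```python
from urllib.parse import urlsplit

def canonical_for(url: str, depth: int = 1) -> str:
    """Map a filtered category URL to its base category URL.

    Keeps only the first `depth` path segments, so /boots,
    /boots/black and /boots/black-size-small all canonicalize
    to /boots.
    """
    parts = urlsplit(url)
    segments = [s for s in parts.path.split("/") if s]
    base_path = "/" + "/".join(segments[:depth])
    return f"{parts.scheme}://{parts.netloc}{base_path}"

def canonical_tag(url: str) -> str:
    """Render the <link rel=canonical> tag for a (possibly filtered) URL."""
    return f'<link rel="canonical" href="{canonical_for(url)}" />'

print(canonical_tag("https://example.com/boots/black-size-small"))
```

The real site's filter URLs may encode filters differently (query strings, multiple segments), so `depth` and the segment logic are assumptions to adapt.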
Why is the Page Authority for posts on my blog so low?
I have noticed that the Page Authority for the posts on my blog is hovering around 1, while the rest of the pages on my website are around 20. The Domain Authority for my website is 16, and I think the Page Authority of my posts is negatively affecting my Domain Authority as I write more content. Any suggestions or recommendations as to why posts have such low Page Authority compared to similar pages? I have images, links, and great content in my posts, but they are considerably lower in Page Authority.
Technical SEO | JoeyGedgaud
-
Is the Authority of Individual Pages Diluted When You Add New Pages?
I was wondering if the authority of individual pages is diluted when you add new pages (in Google's view). Suppose your site had 100 pages and you added 100 new pages (without getting any new links). Would the average authority of the original pages significantly decrease and result in a drop in search traffic to the original pages? Do you worry that adding more pages will hurt pages that were previously published?
Technical SEO | Charlessipe
-
Micro-sites for Landing Pages?
We are working with a site that is difficult at best to update. The client intends to redo the site in 18 months or so but needs to start generating more traffic (and sales) now. What are your thoughts on creating landing pages as micro-sites that point to the current site's conversion page as a stopgap? Beyond not sharing authority, is there any known penalty? By the way, they don't have tremendous rankings right now (often bottom of page two) and the micro-sites won't duplicate any content.
Technical SEO | InformaticsInc
-
Does posting an article on multiple sites hurt SEO?
A client of mine creates thought-leadership articles and pitches multiple sites to host each article in order to reach different audiences. The sites that pick it up are places such as AdAge and MarketingProfs, and we do get link juice from these sources most of the time. Does having the same article on these sites as well as your own hurt your SEO efforts in any way? Could it be recognized as duplicate content? I know the links are great; I'm just wondering if there are any other side effects, especially when no links are provided! Thank you!
Technical SEO | Scratch_MM
-
Content loc and player loc tags for XML video sitemaps
I need a little help understanding how to create two of the required tags for a Google XML video sitemap:
1. <video:content_loc>
2. <video:player_loc>
Google explains their video XML sitemap requirements here:
www.google.com/support/webmasters/bin/answer.py?answer=80472
Using the example on that Google Webmaster Help page (where they explain all six of the required tags), here are examples of the two tags I need help with:
<video:content_loc>www.example.com/video123.flv</video:content_loc>
<video:player_loc allow_embed="yes" autoplay="ap=1">www.example.com/videoplayer.swf?video=12...</video:player_loc>
The video I am trying to optimize is located on a page on my site:
www.mountainbikingmaine.com/races/bradbury_hawk.html
This page has an embedded Vimeo video, so I don't have the video file on my own domain; it is on Vimeo. Here is the source code from my page that I think provides the information I need to create the two tags:
<iframe src="http://player.vimeo.com/video/24580638?title=0&byline=0&portrait=0" width="400" height="533" frameborder="0"></iframe>
"Bradbury Mountain Maine Hawk Migration Count" (vimeo.com/24580638) from dan sexton (vimeo.com/user3219915)
Using this source from my site, can you suggest what to put in the two tags? Thanks!
Dan
Technical SEO | dsexton
-
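Based on Google's video sitemap extension, a sketch of building the `<url>` entry for the page above. Since the video file lives on Vimeo rather than on the asker's domain, `<video:player_loc>` (pointing at the embeddable player URL from the iframe) is the relevant tag rather than `<video:content_loc>`; the title string is taken from the embed caption, and the required thumbnail/description tags are omitted for brevity:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
VIDEO_NS = "http://www.google.com/schemas/sitemap-video/1.1"

def video_url_entry(page_url: str, player_url: str, title: str) -> str:
    """Build one sitemap <url> entry using <video:player_loc>."""
    ET.register_namespace("", SITEMAP_NS)
    ET.register_namespace("video", VIDEO_NS)
    url = ET.Element(f"{{{SITEMAP_NS}}}url")
    ET.SubElement(url, f"{{{SITEMAP_NS}}}loc").text = page_url
    video = ET.SubElement(url, f"{{{VIDEO_NS}}}video")
    ET.SubElement(video, f"{{{VIDEO_NS}}}title").text = title
    player = ET.SubElement(video, f"{{{VIDEO_NS}}}player_loc")
    player.set("allow_embed", "yes")
    player.text = player_url
    return ET.tostring(url, encoding="unicode")

entry = video_url_entry(
    "http://www.mountainbikingmaine.com/races/bradbury_hawk.html",
    "http://player.vimeo.com/video/24580638",
    "Bradbury Mountain Maine Hawk Migration Count",
)
print(entry)
```

The entry would then be wrapped in a `<urlset>` root declaring both namespaces, as shown in Google's documentation.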
Backlinks pointing to the B page of an A/B test.
To rel-canonical or to 301, that is the question. We're frequently running an A/B split test on our home page to optimize conversion. As a result about 10,000 backlinks to our homepage point to the B page. (If we're running a test when a blog or newspaper checks us out, there's a 50% chance they're diverted to the B page. So when they copy our home page URL, they're unknowingly copying the B page link.) We can't contact all of these sites and ask for them to change their links. A lot of the links are from big organizations that aren't interested in tweaking the links of old articles. So should we rel-canonical or 301 the B page? We consistently use the same URL for our B page tests, so we'd only have to 'fix' one page. Thanks in advance!
Technical SEO | JoeNYC
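On the A/B question above: as I recall Google's website-testing guidance, rel=canonical from the B page to the A page is preferred while the test is live, and a 301 once the B variant is retired. The 301 itself is a one-liner in most stacks; a minimal WSGI-style sketch, where `/home-b` is a hypothetical path for the B variant:

```python
def redirect_b_to_a(environ, start_response):
    """Minimal WSGI app: permanently redirect the retired B page to the A page."""
    if environ.get("PATH_INFO") == "/home-b":
        # 301 consolidates the ~10,000 backlinks onto the canonical home page.
        start_response("301 Moved Permanently", [("Location", "/")])
        return [b""]
    start_response("200 OK", [("Content-Type", "text/html")])
    return [b"<!doctype html><title>Home (A)</title>"]
```

Because the same B URL is reused for every test, a single redirect (or a single rel=canonical tag in the B template) covers all the old links at once.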