Getting a Sitemap for a Subdomain into Webmaster Tools
-
We have a subdomain that is a WordPress blog, and it takes days, sometimes weeks, for most posts to be indexed. We are using the Yoast SEO plugin, which generates the sitemap.xml file.
The problem is that the sitemap.xml file lives at blog.gallerydirect.com/sitemap.xml, and Webmaster Tools will only accept a sitemap submitted under a path within the verified gallerydirect.com account.
Right now, we have the sitemap listed in the robots.txt file, but I really don't know if Google is finding and parsing the sitemap.
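For reference, the robots.txt entry in question is just an absolute-URL pointer; a minimal sketch of what blog.gallerydirect.com/robots.txt should contain (paths assumed from the question):

```text
# robots.txt served at http://blog.gallerydirect.com/robots.txt
User-agent: *
Allow: /

# The Sitemap directive must be an absolute URL
Sitemap: http://blog.gallerydirect.com/sitemap.xml
```

Crawlers that honor the sitemaps.org protocol will discover the sitemap this way, but discovery via robots.txt gives no reporting, which is exactly the visibility problem described here.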
As far as I can tell, I have three options, and I'd like to get thoughts on which of the three options is the best choice (that is, unless there's an option I haven't thought of):
1. Create a separate Webmaster Tools account for the blog
2. Copy the blog's sitemap.xml file from blog.gallerydirect.com/sitemap.xml to the main web server and list it as something like gallerydirect.com/blogsitemap.xml, then notify Webmaster Tools of the new sitemap on the gallerydirect.com account
3. Do an .htaccess redirect on the blog server, such as:
RewriteEngine On
RewriteRule ^sitemap\.xml$ http://gallerydirect.com/blogsitemap_index.xml [R=301,L]
Then notify Webmaster Tools of the new blog sitemap in the gallerydirect.com account.
Suggestions on which approach is the best way to be sure that Google finds and indexes the blog ASAP?
-
Thank you, Darin
-
Thanks, Wissam. I have done as you suggested and will watch WT closely.
-
Verify blog.gallerydirect.com as its own property in Google Webmaster Tools, then go ahead and submit the sitemap there.
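Before submitting, it can help to sanity-check that the sitemap parses as valid sitemap-protocol XML. A minimal sketch in Python (the sample entries are placeholders, not the blog's real sitemap; a real check would fetch the live file instead):

```python
import xml.etree.ElementTree as ET

# Namespace from the sitemaps.org protocol
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def count_sitemap_urls(xml_text: str) -> int:
    """Parse sitemap XML and return the number of <url> entries."""
    root = ET.fromstring(xml_text)
    return len(root.findall("sm:url", NS))

# Placeholder sitemap content for illustration
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://blog.gallerydirect.com/post-1/</loc></url>
  <url><loc>https://blog.gallerydirect.com/post-2/</loc></url>
</urlset>"""

print(count_sitemap_urls(sample))  # 2
```

The same function works on a Yoast sitemap index file if you swap the `sm:url` lookup for `sm:sitemap`.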
Related Questions
-
Will deindexing a subdomain negate the benefit of backlinks leading to that subdomain?
My client has a subdomain off their main site where their online waiver tool lives. Currently, all the waivers generated by users are creating indexed pages; I feel they should deindex that subdomain entirely. However, a lot of their backlinks come from their clients linking to their waivers. If they end up deindexing the subdomain, will they lose the SEO benefit of the backlinks pointing to it? Thanks! Jay
Intermediate & Advanced SEO | MCC_DSM
-
No Control Over Subdomains - What Will the Effect Be?
Hello all, I work for a university, and my small team is responsible for the digital marketing, website, etc. We recently had a big initiative on SEO and generating traffic to our website. The issue I am having is that my department only "owns" the www subdomain. There are lots of other subdomains out there. For example, a specific department can have its own subdomain at department.domain.com, and students can have their own webpage at students.domain.com, etc. I know the possibilities of domain cannibalization, but has anyone run into long-term problems with a similar situation, or had success in altering the views of a large organization? If I do get the opportunity to help some of these other subdomains, what is best for our overall domain authority? Should the focus be on removing content similar to the www subdomain, or on cleaning up errors? Some of these subdomains have hundreds of 4XX errors.
Intermediate & Advanced SEO | Jeff_Bender
-
How to Get Google to Recognize Your Pages Are Gone
Here's a quick background of the site and issue. The site lost half of its traffic over 18 months ago, and it's believed to be a Panda penalty. Many, many items were already taken care of and crossed off the list, but here's something that was recently brought up. There are 30,000 pages indexed in Google, but only about 12,000 active products. Many of the pages in the index are out-of-stock items. A site visitor cannot find them by browsing the site unless he/she had bookmarked an item before, was given the link by a friend, read about it, etc. If they get to an old product because they had a link to it, they will see an out-of-stock graphic and will not be allowed to make the purchase. So, about a month ago, efforts were made to 301 old products to something similar, if possible, or to 410 them. Google has not been removing them from the index. My question is: how do we make sure Google sees that these pages are no longer there and removes them from the index? Some of the items have links pointing to them, which will help Google find them, but what about the items with 0 external/internal links? Thanks in advance for your assistance.
Intermediate & Advanced SEO | ABK717
-
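For the 301-or-410 approach described in the question above, Apache's mod_alias can express both cases directly; a hedged sketch (the product paths are hypothetical, for illustration only):

```text
# .htaccess — hypothetical paths for illustration
# Replaced product: permanent redirect to its closest equivalent
Redirect 301 /products/old-widget /products/new-widget

# Discontinued product with no equivalent: serve 410 Gone
Redirect gone /products/discontinued-widget
```

A 410 is a stronger removal signal than a 404, though Google can still take weeks to drop URLs, especially ones with no remaining internal or external links.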
Subdomain and root domain
Hey everyone, our site has multiple subdomains, and I'm wondering how that affects search rankings today. I saw some discussion from almost a year ago, but I'm not sure if something has changed. We currently have our root domain "www.xyz.com" and started moving some pages over to a different subdomain, "web.xyz.com", because of usability and ease of adjusting content. How much will this affect our SEO? Thanks!
Intermediate & Advanced SEO | josh123
-
Linking Across Subdomains - Any Concerns?
I use two subdomains on my website (news.webhostinghero.com and www.webhostinghero.com) - I know www.webhostinghero.com is not really a subdomain... That said, both subdomains link to each other through menus and sometimes articles. Can this cause any problems? Does Google perceive these as links from different domains/websites?
Intermediate & Advanced SEO | sbrault74
-
301 Redirect: How to Get Those Juices Flowing
Hi guys, following on from my previous post, I have still not got my rankings back: http://www.seomoz.org/q/301-redirect-have-no-ranking. I am beginning to think that I do have an underlying issue in the site which is restricting me. My old site, www.economyleasinguk.co.uk, was moved to www.economy-car-leasing.co.uk. As mentioned, the 301 seemed to go really well and all pages updated within 48 hours; however, over 5 months on, the juice from the old site has still not been passed over and I hardly rank at all for anything. Here is a list of things I have tried:
1. Swapped the original 301, which was PHP, for an .htaccess rule
2. Added a canonical tag to all pages
3. Turned on internal links as per this post by Everett Sizemore: http://www.seomoz.org/blog/uncrawled-301s-a-quick-fix-for-when-relaunches-go-too-well
Number 3 was only done 5 days ago; initially bot traffic was immense, and it may need a bit more time to show results. I still think I have another underlying issue, for the reasons below:
1. PageRank on the home page is 1, but inner pages are sporadically a mixture of 1, 2, and 3
2. If I copy text from the home page and search for it, no results
3. Open Site Explorer still shows the old site with a PA of 60, compared to 42 for the new site
4. The server logs show Google is still visiting the old site
5. Header responses are all correct for the canonicals, and I see no chaining of the 301s
6. All pages are follow, with no robots restrictions
7. A site: search has only in the last few days removed the old site from the index
Naturally it could just be a matter of time; however, 5 months for a 301 is a very long time, and 80% traffic loss is immense. I would really appreciate it if someone could give the site a once-over and see if I have missed anything obvious. Thanks in advance
Intermediate & Advanced SEO | kellymandingo
-
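For a whole-domain move like the one described above, a standard mod_rewrite block on the old domain forwards every path in a single hop, which also avoids the redirect chaining the poster checked for. A sketch using the domains from the question (the exact rule depends on how the old site is hosted):

```text
# .htaccess on www.economyleasinguk.co.uk
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?economyleasinguk\.co\.uk$ [NC]
RewriteRule ^(.*)$ http://www.economy-car-leasing.co.uk/$1 [R=301,L]
```

Redirecting path-for-path (rather than sending everything to the new home page) is what lets page-level link equity follow each URL to its new equivalent.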
Is it bad to host an XML sitemap in a different subdomain?
Example: sitemap.example.com/sitemap.xml for pages on www.example.com.
Intermediate & Advanced SEO | SEOTGT
-
Subdomains - duplicate content - robots.txt
Our corporate site provides MLS data to users, with the end goal of generating leads. Each registered lead is assigned to an agent, essentially in round-robin fashion. However, we also give each agent a domain of their choosing that points to our corporate website. The domain can be whatever they want, but upon loading it is immediately redirected to a subdomain. For example, www.agentsmith.com would be redirected to agentsmith.corporatedomain.com. Finally, any leads generated from agentsmith.easystreetrealty-indy.com are always assigned to Agent Smith instead of the agent pool (by parsing the current host name). In order to avoid being penalized for duplicate content, any page viewed on one of the agent subdomains always has a canonical link pointing to the corporate host name (www.corporatedomain.com). The only content difference between our corporate site and an agent subdomain is the phone number and contact email address, where applicable. Two questions: 1. Can/should we use robots.txt or robots meta tags to tell crawlers to ignore these subdomains, but obviously not the corporate domain? 2. If the answer to question 1 is yes, would it be better for SEO to do that, or to leave it how it is?
Intermediate & Advanced SEO | EasyStreet
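If the decision for the question above were to keep the agent subdomains out of the index, one option alongside the existing canonical tags is an X-Robots-Tag header sent only for non-corporate hostnames. A sketch (hostnames are the question's placeholders; assumes Apache with mod_headers and mod_setenvif):

```text
# Flag any request whose Host header is not the corporate www
SetEnvIfNoCase Host ^(?!www\.corporatedomain\.com$).* agent_subdomain

# Send noindex only on agent subdomains; "follow" still lets link equity flow
Header set X-Robots-Tag "noindex, follow" env=agent_subdomain
```

A robots.txt disallow would be the wrong tool here: it blocks crawling entirely, so Google would never see either the canonical link or the noindex directive on those subdomains.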