How to Submit XML Sitemaps for a Site with More than 300 Subdomains?
-
Hi,
I am creating sitemaps for a site that has more than 500 subdomains. Page counts vary from 20 to 500 across the subdomains, and pages will keep being added in the coming months. I have seen sites that create a separate sitemap.xml for each subdomain and reference it in a separate robots.txt file, e.g. http://windows7.iyogi.com/robots.txt
Example XML sitemap for a subdomain: http://windows7.iyogi.com/sitemap.xml.gz
Currently my website has only one robots.txt file covering the main domain and all subdomains.
Please tell me: should I create a separate robots.txt and XML sitemap file for each subdomain, or one file? Creating a separate XML sitemap for each subdomain seems infeasible, as we would have to verify each one in Google Webmaster Tools separately.
Is there an automated way to do this, and do I have to ping the search engines separately when I add new pages to a subdomain?
Please advise me.
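(For context, the per-subdomain pattern mentioned above generally works like this: each subdomain serves its own robots.txt, and that file's Sitemap directive points at the subdomain's own sitemap. A minimal sketch -- the hostname is illustrative, not iyogi's actual file:)

    # robots.txt served at http://gmat.abc.com/robots.txt
    User-agent: *
    Disallow:
    Sitemap: http://gmat.abc.com/sitemap.xml.gz

(As I understand the sitemaps.org protocol, a sitemap can by default only list URLs from the host it lives on, which is why one robots.txt and one sitemap on the main domain cannot simply cover every subdomain.)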
-
Let me know how it goes. I'm sure it can be done. It just needs the right team.
-
Yes, in WordPress that option is available, but we are using the Ruby on Rails platform, so I am not sure whether we can do the same or not.
For example, http://windows7.iyogi.com/sitemap.xml.gz is generated with the WordPress CMS, and it's mentioned on the page that
"It was generated using the Blogging-Software WordPress and the Google Sitemap Generator Plugin by Arne Brachhold."
Anyway, thanks for your help. I will speak to my smart developers; let's see what they can do.
-
Okay, with this little bit of information it does sound like it might in fact be legitimate. If it is, then the best solution is to work with the development team to automate the creation of each sitemap.xml file and have them submitted to Google automatically. I know this is possible because I use the Google Sitemaps plug-in for WordPress, and it automatically submits to Google and Bing.
How it does that I do not know. That's up to smart web developers to figure out and replicate.
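(For what it's worth, the usual mechanism behind that kind of automatic submission is the public "ping" URL each engine exposes for sitemaps; a minimal sketch of the idea in Python, with an illustrative sitemap URL -- the developers would do the equivalent in Ruby:)

    import urllib.parse
    import urllib.request

    # Illustrative sitemap URL; substitute each subdomain's real sitemap.
    sitemap_url = "http://gmat.abc.com/sitemap.xml.gz"
    encoded = urllib.parse.quote(sitemap_url, safe="")

    # Notify Google and Bing that the sitemap has changed.
    for ping in ("http://www.google.com/ping?sitemap=",
                 "http://www.bing.com/ping?sitemap="):
        with urllib.request.urlopen(ping + encoded) as resp:
            print(ping, resp.getcode())  # 200 means the ping was accepted

(Run something like this after each sitemap regeneration, e.g. from a cron job, so new pages are picked up without manual resubmission.)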
-
Hi Alan, I recently joined this company and I can't change the whole structure.
I believe they have created virtual subdomains. Moreover, site traffic is growing at a great rate, so they can't consider changing the structure.
Last month it was ranked as the 20th most visited website in India, so things are pretty fine. Moreover, it's an education website, and students can easily remember a subdomain URL, e.g. http://gmat.abc.com. Direct traffic to these subdomains is also very high. So now, how should I solve the XML sitemap problem?
-
The more important, and URGENT, issue is this: why are there so many subdomains, and why are there going to be more? That has to be one of the most serious and potentially harmful things you can do to your SEO efforts, unless it's an extremely rare situation that justifies the tactic.
Related Questions
-
Similar pages on a site
Hi, I think it was at BrightonSEO where PI Datametrics were talking about how similar pages on a website can cause rankings to drop for your main page. This has got me thinking: if we have a category about jumpers (example.com/jumpers), but our blog also has a category about jumpers where we write all about them, which creates a category page at example.com/blog/category/jumpers, should these blog category pages have noindex put on them to stop them ranking in Google? Thanks in advance for any tips. Andy
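(For reference, the noindex being asked about is normally a meta robots tag in the head of each blog category page; a minimal illustrative sketch:)

    <!-- in the <head> of example.com/blog/category/jumpers -->
    <meta name="robots" content="noindex, follow">

(The "follow" keeps link equity flowing through the page even though the page itself stays out of the index.)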
Technical SEO | Andy-Halliday
-
URL Changes And Site Map Redirects
We are working on a site redesign which will change/shorten our URL structure. The primary domain will remain the same, but most of the other URLs on the site are getting much simpler. My question is how this should best be handled when it comes to sitemaps: since massive numbers of URLs will be redirected to new, shorter URLs, how should we handle our sitemaps? Should a new sitemap be submitted right at launch and the old sitemap removed later? I know that Google does not like having redirects in sitemaps. Has anyone done this on a large scale, 60k URLs or more, and have any advice?
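(For reference, the usual pattern is to 301-redirect each old URL to its shorter replacement at the server level and list only the new URLs in the sitemap submitted at launch; an illustrative Apache rule, with hypothetical paths:)

    # 301 an old, longer URL to its new, shorter equivalent
    Redirect 301 /category/subcategory/product-page /product-page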
Technical SEO | RMATVMC
-
Are mobile annotations in desktop XML sitemaps a replacement for mobile XML sitemaps?
These two links confused me as to what I should do: https://developers.google.com/webmasters/smartphone-sites/details and https://support.google.com/webmasters/answer/34648?hl=en
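(For reference, the smartphone annotation described in the first link is an xhtml:link alternate entry inside the desktop sitemap; a minimal illustrative sketch, with example.com hostnames:)

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:xhtml="http://www.w3.org/1999/xhtml">
      <url>
        <loc>http://www.example.com/page</loc>
        <xhtml:link rel="alternate"
                    media="only screen and (max-width: 640px)"
                    href="http://m.example.com/page"/>
      </url>
    </urlset>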
Technical SEO | JasonOliveira
-
Site not passing page authority...
Hi, this site, powertoolworld.co.uk, is not passing page authority. In fact, every page shows no links unless it has a link from an external source. Originally this site blocked Roger from crawling it, but that block was lifted over 6 months ago. I also ran a crawl test last night and it shows the same thing: PA of 1 and no links. I would like to point out that the problem seems to be the same for all sites on the same platform, which points me in the direction of code. For example, there is a display: none tag in the CSS which is used to style where the sidebar links are. It's a Blue Park platform. What could be causing the problem? Thanks in advance. EDIT: It turns out that blocking the ezooms crawler stopped the site from being included.
Technical SEO | PowerToolWorld
-
Multiple domains pointing to same site
Over the years, we have acquired a great number of variations of our domains, or industry-specific domains, to protect our brand. Currently, the majority of those domains are parked at the registrars. Would we do any harm to our rankings if we pointed the dormant domains to our website (www.ellsworth.com)? If not, are there any recommendations as to the best way to do this, or should we just point them to the same IP?
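(If the domains are pointed, the conservative approach is a server-level 301 to the canonical hostname rather than merely resolving them to the same IP, so they consolidate rather than duplicate; an illustrative Apache virtual host, with a hypothetical parked domain:)

    <VirtualHost *:80>
        # hypothetical defensively-registered domain
        ServerName ellsworth-adhesives-example.com
        Redirect 301 / http://www.ellsworth.com/
    </VirtualHost>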
Technical SEO | Ellsworth
-
Mobile site ranks on Google search instead of desktop site.
Hello, all SEOers~ Today I would like to hear your opinions on a mobile site and duplicate content issue. I have a mobile version of our website hosted on a subdomain ("m" instead of "www"). The site targets the UK, and it's essentially the same content, formatted differently: every URL on www also exists on the "m" subdomain with identical content (there are some differences, but I would say 90% or more of the content is the same). Recently I've noticed that search results are showing links to our mobile site instead of the desktop site (Google UK). I have a sitemap.xml for both sites, with the mobile sitemap defined separately. I didn't block Googlebot from the mobile site, and I also didn't block Googlebot-Mobile from the desktop site. I read the Google Webmaster Tools forum and watched a related video from Matt Cutts, and I found many opinions that this setup can cause a duplicate content issue and that I should do one of the following: 1. Block Googlebot from the mobile site. 2. Use a canonical tag on the mobile site that points to the desktop site. 3. Create and develop different content (needless to say...). Do you think a duplicate content issue caused my mobile site to rank in search results instead of my desktop site? And do you think those methods will help my desktop site show up instead? I also have multi-country sites with the same setup as described above, yet my other country sites are doing fine on Google. The only difference I have found is that my other country sites have different Title and Meta tags compared to their desktop sites, while my UK mobile site has the same Title and Meta tags as the desktop site. Do you think this also has something to do with the current problem? Please feel free to comment and share your opinions. Thanks for reading my long explanation.
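(For reference, option 2 above would place a tag like this in the head of each mobile page, pointing at its desktop equivalent; hostnames illustrative:)

    <!-- in the <head> of http://m.example.com/some-page -->
    <link rel="canonical" href="http://www.example.com/some-page">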
Technical SEO | Artience
-
Robots.txt blocking site or not?
Here is the robots.txt from a client site. Am I reading this right -- that the robots.txt is saying to ignore the entire site, but the #'s are saying to ignore the robots.txt command?

    # See http://www.robotstxt.org/wc/norobots.html for documentation on how to use the robots.txt file
    # To ban all spiders from the entire site uncomment the next two lines:
    # User-Agent: *
    # Disallow: /

Technical SEO | 540SEO
-
Why did our site drop in Google rankings?
My site's URL (web address) is: http://tinyurl.com/3svn2l9. Hi there, we operate a travel site that lists numerous tours, accommodation options and activities. Since 6th August 2011 we have dropped from top-10 SERP rankings for our pages to around result number 100 (page 10), and we are losing a massive number of visitors from Google Search. Our Yahoo and Bing rankings are still in the top 10. We need your advice, and quick! The last changes we made are the following: we redirected the non-www version to the www version on 1st August; about 2 months ago we bought advertising with a followed link in a sidebar that is populated across the site (4,000+ pages); and 2 weeks ago we added a blog to the website, with 2 posts to date. Additionally, our website structure allows visitors (and bots) to see the same listings via different URLs, which caused duplicate content. This has been the case since the launch of our website about 1 year ago. To prevent this duplicate content we have placed canonical tags on the individual listings pages. Why did our site all of a sudden plummet in the rankings?
Technical SEO | Robbern