WordPress multilanguage sitemaps
-
Hi, I have a multilingual WordPress site in Bulgarian and English, translated using qTranslate.
The XML sitemap for the two languages is a single file: all the links for the Bulgarian and English versions are listed together. (Our site is using this plugin:
http://wordpress.org/extend/plugins/google-xml-sitemaps-v3-for-qtranslate)
Do you have any idea how I can make a separate XML sitemap for each language? I'm asking here because you may have run into the same problem with your own multilanguage WordPress websites.
You can see the sitemap with both languages' links in one file here:
http://cholakovit.com/sitemap.xml (Cholakov IT)
I have read in this article that separate sitemaps are better practice and will also help with geo-targeting your website:
http://www.seomoz.org/blog/multiple-xml-sitemaps-increased-indexation-and-traffic
-
Hi, thanks for the idea. It can of course be done manually, but an automatically updating solution would be much easier, especially when there are many projects.
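For example, an automatic rebuild could be a small WP-Cron job along these lines. This is only a minimal sketch, not the qTranslate plugin's own code: it assumes qTranslate's language-prefix permalinks (/en/, /bg/), covers only published posts, and the hook and file names are made up for the example.

<?php
// Minimal sketch: regenerate one sitemap file per language on a daily
// WP-Cron schedule. Assumes qTranslate's pretty-permalink mode, where the
// translated URL is the default permalink with /en/ or /bg/ after the
// domain. The hook name and file names below are hypothetical.

add_action( 'regenerate_language_sitemaps', 'regenerate_language_sitemaps_cb' );

function regenerate_language_sitemaps_cb() {
    $languages = array( 'en', 'bg' );
    $posts     = get_posts( array( 'numberposts' => -1, 'post_status' => 'publish' ) );

    foreach ( $languages as $lang ) {
        $xml  = '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
        $xml .= '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n";
        foreach ( $posts as $post ) {
            // Turn http://example.com/my-post/ into http://example.com/bg/my-post/.
            $url  = str_replace( home_url( '/' ), home_url( "/$lang/" ), get_permalink( $post ) );
            $xml .= '  <url><loc>' . esc_url( $url ) . "</loc></url>\n";
        }
        $xml .= '</urlset>' . "\n";
        // Writes sitemap-en.xml / sitemap-bg.xml into the WordPress root.
        file_put_contents( ABSPATH . "sitemap-$lang.xml", $xml );
    }
}

// Schedule the daily rebuild if it isn't scheduled already.
if ( ! wp_next_scheduled( 'regenerate_language_sitemaps' ) ) {
    wp_schedule_event( time(), 'daily', 'regenerate_language_sitemaps' );
}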
-
There may be a better answer than this, but what I'm thinking would probably work.
You can have as many sitemaps as you want; just use different filenames (e.g. /sitemap-en.xml and /sitemap-bg.xml). You might need to build them manually, though.
Link to them from your robots.txt file (a sketch of such a robots.txt follows below) and submit the individual sitemaps in Google Webmaster Tools. Repeat every time you want Google to know about new files (if it doesn't know already).
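For example, the robots.txt for the site above could list both files (a sketch only, using the hypothetical filenames from this answer; the URLs must be absolute):

# robots.txt: one Sitemap line per language sitemap
Sitemap: http://cholakovit.com/sitemap-en.xml
Sitemap: http://cholakovit.com/sitemap-bg.xml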
I hope this helps!
Kevin Phelps
http://www.linkedin.com/in/kevinwphelps
Related Questions
-
Which pages should I index or have in my XML sitemap?
Hi there, my website is ConcertHotels.com, a site which helps users find hotels close to concert venues. I have a hotel listing page for every concert venue on my site, about 12,000 of them I think (and the same for nearby restaurants), e.g. https://www.concerthotels.com/venue-hotels/madison-square-garden-hotels/304484. Each of these pages lists the hotels near that concert venue. Users clicking on an individual hotel are brought through to a hotel (product) page, e.g. https://www.concerthotels.com/hotel/the-new-yorker-a-wyndham-hotel/136818. I made a decision years ago to noindex all of the /hotel/ pages, since they don't have a huge amount of unique content and aren't the pages I'd like my users to land on. The primary pages on my site are the /venue-hotels/ listing pages. I have similar pages for nearby restaurants, so there are approximately 12,000 venue-restaurants pages, again, one listing page for each concert venue. However, while all of these pages are potentially money-earners, in reality the vast majority of subsequent hotel bookings have come from a fraction of the 12,000 venues. I would say 2,000 venues are key money-earning pages, a further 6,000 have generated income at a low level, and 4,000 are yet to generate income. I have a few related questions:
1. Although there is potential for any of these pages to generate revenue, should I be brutal and simply delete a venue if it hasn't generated revenue within a time period, and just accept that, while it "could" be useful, it hasn't proven to be and isn't worth the link equity? Or should I noindex these poorly performing pages?
2. Should all 12,000 pages be listed in my XML sitemap? Or only the ones that are generating revenue, or perhaps just the ones that have generated significant revenue in the past and have proved to be most important to my business?
Thanks
Mike
Technical SEO | mjk260
-
Moving WordPress to its own server
Our company wants to remove WordPress from our current Windows OS server at provider 1 and move it to a new server at provider 2. GoDaddy handles our DNS. I would like to have it on the same domain without masking. I would like to make a DNS entry on GoDaddy so that our current server and our new server can use the same URL (i.e. sellstuff.com), but I only want the DNS to direct traffic to our current server. The goal here is to have the new server using the same URL as the old server, so nothing needs to be masked once traffic is redirected with a 301 rule in the .htaccess file; no traffic outside of the 301 rule will end up going to the new server. I would then like to edit the .htaccess file on our current server to redirect to the new server's IP address when someone goes to sellstuff.com/blog. Does this make sense, and is it possible?
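As a sketch of the /blog rule described here (assuming an Apache .htaccess with mod_rewrite enabled on the old server; the target IP below is a placeholder):

# .htaccess on the old server: send only /blog traffic to the new server
RewriteEngine On
RewriteRule ^blog(/.*)?$ http://203.0.113.10/blog$1 [R=301,L]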
Technical SEO | larsonElectronics0
-
Sitemap Rules
Hello there, I have some questions pertaining to sitemaps that I would appreciate some guidance on.
1. Can an XML sitemap contain URLs that are blocked by robots.txt? Logically, it makes sense to me not to include pages blocked by robots.txt, but I would like some clarity on the matter, i.e. will having pages blocked by robots.txt in a sitemap negatively impact the benefit of the sitemap?
2. Can an XML sitemap include URLs from multiple subdomains? For example, http://www.example.com/www-sitemap.xml would include the home page URL of two other subdomains, i.e. http://blog.example.com/ & http://blog2.example.com/
Thanks
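To illustrate question 2, such a sitemap would look like the sketch below (example.com placeholders). Note that the sitemaps.org protocol normally limits a sitemap to URLs on its own host; as far as I know, cross-host entries are only honored when the sitemap is referenced from each host's robots.txt or all the hosts are verified in Webmaster Tools.

<?xml version="1.0" encoding="UTF-8"?>
<!-- sketch of www-sitemap.xml listing URLs from three subdomains -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/</loc></url>
  <url><loc>http://blog.example.com/</loc></url>
  <url><loc>http://blog2.example.com/</loc></url>
</urlset>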
Technical SEO | SEONOW1230
-
Host sitemaps on S3?
Hey guys, I run a dynamic web service and I will start building static sitemaps for it pretty soon. The fact that my app lives on a multitude of servers doesn't make it easy to distribute frequently updated static files throughout the servers. My idea was to host the files in AWS S3 and point my robots.txt sitemap directive there. I'll use a sitemap index, so every other sitemap will be hosted on S3 as well. I could dynamically mirror the content of the files in S3 through my app, but that would be a little more resource-intensive than just serving the static files from a common place. Any ideas? Thanks!
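For what it's worth, the robots.txt directive described here is a single absolute URL, and it is allowed to point at another host, which is what makes the S3 idea workable. A sketch (the bucket URL is a placeholder):

# robots.txt on the app servers, pointing at the sitemap index in S3
Sitemap: http://my-bucket.s3.amazonaws.com/sitemap-index.xml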
Technical SEO | tanlup0
-
Hosted WordPress Blog creating Duplicate Content
In my first report from SEOmoz, I see that there are a bunch of "duplicate content" errors that originate from our blog hosted on WordPress. For example, it's showing that the following URLs all have duplicate content:
http://blog.kultureshock.net/2012/11/20/the-secret-merger/ys/
http://blog.kultureshock.net/2012/11/16/vendome-prize-website/gallery-7701/
http://blog.kultureshock.net/2012/11/20/the-secret-merger/sm/
http://blog.kultureshock.net/2012/11/26/top-ten-tips-to-mastering-the-twitterverse/unknown/
http://blog.kultureshock.net/2012/11/20/the-secret-merger/bv/
They all lead to the various images that have been used in various blog posts. I'm not sure why they are considered duplicate content, because they have unique URLs and the title meta tag is unique for each one, too. But even so, I don't want these extraneous URLs cluttering up our search results, so I'm removing all of the links that were automatically created when placing the images in the posts. Once I do that, will these URLs eventually disappear, or continue to be there? Because our blog is hosted by WordPress, I unfortunately can't add any of the SEO plugins I've read about, so I'm wondering how to fix this without special plugins. Thanks!
Tom
Technical SEO | TomHu
-
SEOmoz is showing duplicate page content for my WordPress blog
Hi everyone, my SEOmoz crawl diagnostics are indicating that I have duplicate content issues in the WordPress blog section of my site, located at: http://www.cleversplash.com/blog/ What is the best strategy to deal with this? Is there a plugin that can resolve this? I really appreciate your help, guys. Martin
Technical SEO | RogersSEO0
-
How to handle sitemap with pages using query strings?
Hi, I'm working to optimize a site that currently has about 5K pages listed in the sitemap. There are not in fact this many pages. Part of the problem is that one of the pages is a tool where each sort and filter button produces a query-string URL. It seems to me inefficient to have so many items listed that are all really the same page, not to mention wanting to avoid any duplicate content or low-quality issues. How have you found it best to handle this? Should I just noindex each of the links? Canonical links? Should I manually remove the pages from the sitemap? Should I continue as is? Thanks a ton for any input you have!
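As an illustration of the canonical option mentioned here (a sketch with placeholder URLs): each sorted or filtered variant of the tool page would carry a canonical tag pointing back at the base URL, so the query-string variants consolidate instead of needing their own sitemap entries.

<!-- in the <head> of e.g. /tool/?sort=price (placeholder URL),
     point every sort/filter variant back at the base page: -->
<link rel="canonical" href="http://www.example.com/tool/" />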
Technical SEO | 5225Marketing0