WordPress Blog Integrated into eCommerce Site - Should we use one XML sitemap or two?
-
Hi guys,
I wonder whether you can help me with a couple of SEO queries:
So we have an ecommerce website (www.exampleecommercesite.com) with its own XML sitemap, which we have submitted to Google Webmaster Tools. However, we recently decided to add a blog to the site for SEO purposes.
The blog is on a subdomain of the site: blog.exampleecommercesite.com.
(We wanted to have it as www.exampleecommercesite.com/blog but our server made it very difficult and it wasn't technically possible at the time)
1. Should we add the blog.exampleecommercesite.com as a separate property in the Google Webmaster tools?
2. Should we create a separate xml sitemap for the blog content or are there more benefits in terms of SEO if we have one sitemap for the blog and the ecommerce site?
I'd appreciate your opinions on the topic!
Thank you, and have a good start to the week!
-
Glad to help!
-
Thanks Logan! This makes perfect sense, but we just wanted to be 100% sure.
-
Since your blog is on a subdomain, yes, you will need to set up a separate WMT profile for it. The blog will also need its own robots.txt and XML sitemap, since, technically speaking, a subdomain is regarded as a separate site.
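For reference, a minimal sketch of what the blog subdomain's robots.txt could look like, assuming the WordPress sitemap is published at /sitemap.xml (the exact path depends on the sitemap plugin in use):

```
# Served at http://blog.exampleecommercesite.com/robots.txt
User-agent: *
Disallow:

# Point crawlers at the blog's own sitemap (the path is an assumption;
# your WordPress sitemap plugin may publish it under a different URL)
Sitemap: http://blog.exampleecommercesite.com/sitemap.xml
```

That sitemap would then be submitted under the blog.exampleecommercesite.com property in Webmaster Tools, separately from the main site's sitemap.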
Related Questions
-
By using interstitials (popups) on the website, will Google penalize rankings for both desktop and mobile?
We have implemented interstitials (pop-ups) on a website (a business articles website). The popups are basically used for getting leads from the site (via signup popups). Before the popup implementation the traffic was steady. After the implementation, traffic started to decay after a couple of weeks; due to the drop we disabled the popup and initiated a forced crawl, and within the next few weeks we observed traffic regaining its normal trend. Within these timelines, the drop in desktop traffic was larger while mobile traffic remained steady. As per Google's guidelines, interstitials are more likely to affect mobile than desktop, but in our case desktop traffic was hit harder than mobile. We carried out this experiment for 3 months and observed the traffic decay and regain. Are interstitials the only culprit here (as the drop is only on desktop), or can there be other reasons for the traffic drop as well?
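As a side note on the implementation itself, below is a hypothetical sketch (not from the original post) of a less intrusive lead-capture pattern: a dismissible banner that appears only after the reader has scrolled half the article and spent some time on the page, rather than a full-screen interstitial. The element IDs and thresholds are assumptions.

```typescript
// Hypothetical sketch: show a dismissible signup banner only after the reader
// has scrolled half the article and spent some time on the page, instead of
// covering the content with a full-screen interstitial.
// Element IDs and thresholds are assumptions, not from the original post.

const SCROLL_THRESHOLD = 0.5; // half the page scrolled
const DELAY_MS = 10_000;      // at least 10 seconds on the page

let delayElapsed = false;
let shown = false;

setTimeout(() => {
  delayElapsed = true;
  maybeShowBanner();
}, DELAY_MS);

window.addEventListener("scroll", maybeShowBanner, { passive: true });

function maybeShowBanner(): void {
  if (shown || !delayElapsed) return;
  if (sessionStorage.getItem("signupBannerDismissed")) return;

  const scrollable = document.documentElement.scrollHeight - window.innerHeight;
  const scrolled = scrollable > 0 ? window.scrollY / scrollable : 1;
  if (scrolled < SCROLL_THRESHOLD) return;

  const banner = document.getElementById("signup-banner"); // assumed element
  if (banner) {
    banner.style.display = "block"; // small banner; the article stays readable
    shown = true;
  }
}

// Let the user dismiss it and remember the choice for the session.
document.getElementById("signup-banner-close")?.addEventListener("click", () => {
  const banner = document.getElementById("signup-banner");
  if (banner) banner.style.display = "none";
  sessionStorage.setItem("signupBannerDismissed", "1");
});
```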
Algorithm Updates | | iQuanti1 -
Do we take an SEO hit for having multiple URLs on an infinite scroll page vs. a site with many pages/URLs? If we do take a hit, can you quantify how big it would be?
We are redesigning a preschool website which has over 100 pages. We are looking at two options and want to make sure we get the best user experience and SEO. Option 1 is to condense the site into perhaps 10 pages and window-shade the content: for instance, on the curriculum page there would be an overview, and each age-group program would open via window shade. Option 2 is to have an overview and then each age program links to its own page. Do we lose out on SEO if there are not unique URLs? Or is there a way, using meta tags or other programming, to have the same effect?
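Purely as an illustration of what "other programming" could look like under Option 1, here is a hypothetical sketch (element classes, IDs and URL paths are assumptions) of an accordion page where expanding a section also updates the URL, so each age-group program is deep-linkable. Note that this alone does not create separately indexable pages.

```typescript
// Hypothetical sketch: an accordion ("window shade") curriculum page where
// expanding a section also updates the URL via the History API, so each
// age-group program can be linked to directly. Classes, IDs and paths are
// assumptions; this does not by itself create separately indexable pages.

function showSection(sectionId: string): void {
  document.querySelectorAll<HTMLElement>(".program-section").forEach((el) => {
    el.hidden = el.id !== sectionId; // reveal only the requested program
  });
}

function expandSection(sectionId: string): void {
  showSection(sectionId);
  // Reflect the open section in the address bar, e.g. /curriculum/toddlers
  history.pushState({ sectionId }, "", `/curriculum/${sectionId}`);
}

function openFromLocation(): void {
  const sectionId = location.pathname.split("/").pop() || "overview";
  showSection(sectionId);
}

// Handle direct visits to a deep link and back/forward navigation.
window.addEventListener("DOMContentLoaded", openFromLocation);
window.addEventListener("popstate", openFromLocation);
```

Whether those URLs count as unique pages for SEO still depends on the server returning distinct, crawlable content for each one, which is essentially what Option 2 does.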
Algorithm Updates | | jgodwin0 -
Staging site - Treated as duplicate?
Last week (exactly 8 days ago, to be precise) my developer created a staging/test site to test some new features. The staging site duplicated the entire existing site on the same server. To explain this better: my site address is www.mysite.com, and the path of the new staging site was www.mysite.com/staging. I realized this only today and have immediately restricted it in robots.txt and put a noindex, nofollow on the entire duplicate server folder, but I am sure that Google would have indexed the duplicate content by now. So far I do not see any significant drop in traffic, but should I be worried? And what, if anything, can I do at this stage?
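One way this is often handled is an X-Robots-Tag header on the whole staging folder instead of a robots.txt block (if robots.txt disallows crawling, Google never fetches the pages and so can't see a noindex on them). A minimal sketch, assuming Apache with mod_headers and a .htaccess file inside /staging:

```apache
# Hypothetical sketch: .htaccess placed in /staging (assumes Apache + mod_headers).
# Every response from the staging copy is sent with a noindex/nofollow header,
# without having to edit the duplicated pages themselves.
<IfModule mod_headers.c>
    Header set X-Robots-Tag "noindex, nofollow"
</IfModule>

# Optionally restrict the folder to your own IP entirely (placeholder address):
# Require ip 203.0.113.10
```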
Algorithm Updates | | rajatsharma0 -
SEO results are down. Is my "All in One SEO Pack" to blame?
My website www.noobtraveler.com has shown a dip of 40% since Penguin's last update in November. I also transferred hosting at that time, but I was wondering if I'm over-optimizing with the All in One SEO Pack. I would appreciate it if someone could do a quick sweep and share their thoughts. Thanks!
Algorithm Updates | | Noobtraveler0 -
Does a large number of indexed thin-content pages affect overall site performance?
Hello Community, a question on the negative impact of many virtually identical calendar pages being indexed. We have a site for a B2B software product. There are about 150 product-related pages, and another 1,200 or so short articles on industry-related topics. In addition, we recently (~4 months ago) had Google index a large number of calendar pages used for webinar schedules. This boosted the indexed-pages number shown in Webmaster Tools to about 54,000. Since then, we "nofollowed" the links on the calendar pages that allow you to view future months, and added "noindex" meta tags to all future month pages (beyond 6 months out). Our number of indexed pages seems to be dropping, and is now down to 26,000. When you look at Google's report showing pages appearing in response to search queries, a more normal 890 pages appear. Very few calendar pages show up in this report. So, the question that has been raised is: does a large number of very thin pages in the search index (basically blank calendar months) hurt the overall site? One person at the company said that because Panda/Penguin targeted thin-content sites, these pages would cause the performance of this site to drop as well. Thanks for your feedback. Chris
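For illustration, here is a hypothetical sketch of the rule described above: any webinar-calendar month more than 6 months in the future gets a noindex robots meta tag. How the month is obtained and how the tag is emitted depends on the CMS in use; the function names are assumptions.

```typescript
// Hypothetical sketch of the rule described above: calendar months more than
// 6 months in the future get a noindex robots meta tag; nearer months get none.

const MAX_FUTURE_MONTHS = 6;

function monthsFromNow(year: number, month: number): number {
  const now = new Date();
  return (year - now.getFullYear()) * 12 + (month - (now.getMonth() + 1));
}

// Returns the robots meta tag to emit for a calendar page (month is 1-12),
// or an empty string when the month should stay indexable.
function robotsMetaForCalendarMonth(year: number, month: number): string {
  return monthsFromNow(year, month) > MAX_FUTURE_MONTHS
    ? '<meta name="robots" content="noindex">'
    : "";
}

// Example: the same month one year from now (12 months out) -> noindex
const today = new Date();
console.log(robotsMetaForCalendarMonth(today.getFullYear() + 1, today.getMonth() + 1));
```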
Algorithm Updates | | cogbox0 -
Why do in-site search result pages rank better than my product pages?
Maybe this is a common SERP for a generic product type, but I'm seeing it a lot more often. Here is an example SERP for "rolling stools". The top 4 results are dynamic in-site search pages from Sears, eBay and Amazon (among others). I understand their influence and authority, but why would a search return a dynamic in-site SERP instead of a solid product page? A better question would be: how do I get my in-site SERPs to rank, or how do I get my client's page to rise above the #5 spot it currently ranks at? Thanks
Algorithm Updates | | BenRWoodard0 -
Our Developer Site randomly drops 10+ places in Google searches for our Company Name. Why?
Hey everyone, At Betable, we have a player-facing site and a developer-facing site. We also have a developer-facing blog. We have this issue where our developer-facing site will randomly drop 10+ places in Google's Search results for the keyword "betable". This problem can be reproduced by others and in incognito mode, so it's not just one person's results. Furthermore, the developer-facing blog and our social media accounts all suddenly rank higher than the developer site. Even stranger, this problem randomly fixes itself after a few days. This has happened twice so far, and on each occasion there were no changes to the website that would have prompted a drop in rank. After the first drop, we did our best to neutralize any SEOMoz "red alerts" but to no avail, the drop happened again last week. Can someone help us understand what's going on? Are there ways to avoid this? Thanks, Tyler
Algorithm Updates | | Betable0 -
If the page-one results for a keyword are mostly directories, do I have a chance to rank for that keyword?
Although directories carry a lot of weight and links, I'd think my client would be able to gain a top position, since none of the other results are competitor pages, nor are the directories engaging.
Algorithm Updates | | randallseo0