Webmaster Tools Site Map Question
-
I have a TLD that has authority and a number of micro-sites built off of the primary domain. All sites relate to the same topic, as I am promoting a destination. The primary site and each micro-site have their own CMS installation, but the domains are mapped accordingly.
www.regionalsite.com/ <- primary
www.regionalsite.com/theme1/ <- theme 1
www.regionalsite.com/theme2/ <- theme 2
www.regionalsite.com/theme3/ <- theme 3

Question: Should my XML sitemap for Webmaster Tools list all of these sites under the primary domain's sitemap, or are there penalties for that?
Thanks.
-
That would be having it twice; I would not recommend it, though it's debatable. IMO, one mention of a keyword in the URL is plenty. For regional links, focus on building links locally (by area/regional physical location) to each page.
-
Yes, what you're describing is what I am doing. Thanks and sorry for the confusion around the terminology.
Do you think there would be a benefit to repeating the region name in the folder structure, i.e. www.REGION.com/REGION-topic?
-
Micro sites are separate from your primary domain. If you want to target Region Golf, Region Dining, and Region Events, your URLs should be submitted something like this:
This is one site.
www.regionalsite.com/ <- primary
www.regionalsite.com/Region-Golf <- page 1 with keyword (build diverse links using "regional golf")
www.regionalsite.com/Region-Dining <- page 2 with keyword
www.regionalsite.com/Region-Events <- page 3 with keyword

In this case you submit your sitemap to Google, listing all your pages.
This would be a micro site:
www.RegionGolf.com
The domain is completely different from your primary site. It has no association with www.regionalsite.com/, so it cannot be included in that sitemap.
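To make the single-sitemap approach above concrete, here is a minimal sketch that builds one sitemap.xml covering every page on the one domain. The URLs and the helper function are hypothetical, taken only from the thread's example; this is an illustration, not the poster's actual setup.

```python
# Sketch: one sitemap.xml listing all pages of the single domain.
# URLs and helper name are hypothetical, based on the thread's example.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Build a sitemap XML string listing every page on the one domain."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for page in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")

pages = [
    "http://www.regionalsite.com/",
    "http://www.regionalsite.com/Region-Golf",
    "http://www.regionalsite.com/Region-Dining",
    "http://www.regionalsite.com/Region-Events",
]
print(build_sitemap(pages))
```

Note that a true micro site like www.RegionGolf.com never appears in this list, since it lives on a different domain.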
-
Using your example, I currently do rank #1 for "Chocolate", but now I want to rank for a number of words related to chocolate. So the content silo on my site related to "Chocolate Fudge" (as an example) is a micro-site built out specifically around the topic of fudge. (My real examples are Region Golf, Region Dining, and Region Events.)
Really, the 'micro-sites' as I call them are each built in their own CMS, as they have different stakeholder groups. Everything is mapped off of the primary domain, though, as it all relates specifically to the region with a theme modifier.
My question is: does the primary domain's sitemap get submitted to Google with all of the site URLs (primary and themed micro-sites) included, since they are all off of the same domain?
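One way the multi-CMS setup described above is sometimes handled (this is my assumption for illustration, not something stated in the thread) is a sitemap index on the primary domain: each CMS installation exposes its own sitemap, and a single index file references all of them in one submission.

```python
# Hedged sketch (an assumption, not from the thread): a sitemap index
# on the primary domain pointing at each CMS installation's own sitemap.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap_index(sitemap_urls):
    """Build a sitemapindex XML string referencing each child sitemap."""
    index = ET.Element("sitemapindex", xmlns=SITEMAP_NS)
    for loc in sitemap_urls:
        entry = ET.SubElement(index, "sitemap")
        ET.SubElement(entry, "loc").text = loc
    return ET.tostring(index, encoding="unicode")

child_sitemaps = [
    "http://www.regionalsite.com/sitemap.xml",
    "http://www.regionalsite.com/theme1/sitemap.xml",
    "http://www.regionalsite.com/theme2/sitemap.xml",
    "http://www.regionalsite.com/theme3/sitemap.xml",
]
print(build_sitemap_index(child_sitemaps))
```

Since every child sitemap sits on the same domain, the whole index can be submitted in one go; the exact paths here are made up for the sketch.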
-
Micro sites each have a unique URL. They are built to target one keyword each.
The example you give is a standard site: domain/page (include a broad keyword here). You can make as many pages as you want. However, building these pages will spread your juice around, so choose your keywords carefully.
www.regionalsite.com/ <- primary
www.regionalsite.com/theme1/ <- theme 1
www.regionalsite.com/theme2/ <- theme 2
www.regionalsite.com/theme3/ <- theme 3

Example of a micro site:
Say I have a site for chocolate, www.chocolate.com, and I'm number 1 for "chocolate" in Google, which is the only keyword I am after. For this example, there are no more keywords to go after beyond "chocolate", but I want to rank twice for it. So I build a micro site targeting "chocolate"; the domain could be www.1chocolate.com.
I would not build another page called chocolate within my ranking domain.