Hi Moz peeps, Google said that we should have two sitemaps...
-
Hi Moz peeps, Google said that we should have two sitemaps...
one for Google and one for people. Now I know that having a sitemap to submit to Google for the first time is important for SEO, but does having a sitemap for people visiting the site help at all in terms of Google's bots crawling your site?
I know it might actually help human visitors navigate through your site; I just want to know whether having it (or not having it) affects on-page SEO at all.
Thanks, guys
-
David
I think what you are referring to is HTML sitemaps vs. XML Sitemaps and whether to include both. Google prefers XML, and people use HTML. From Google Webmaster Tools (GWMT):
Sitemaps are a way to tell Google about pages on your site we might not otherwise discover. In its simplest terms, an XML Sitemap—usually called Sitemap, with a capital S—is a list of the pages on your website. Creating and submitting a Sitemap helps make sure that Google knows about all the pages on your site, including URLs that may not be discoverable by Google's normal crawling process.
I agree with the others that an HTML sitemap can be helpful to visitors. For indexing, we use XML.
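To make that concrete, here is a minimal sketch of an XML Sitemap file (example.com, the URLs and the dates are placeholders; the changefreq and priority tags are optional hints):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- One <url> entry per page you want the engines to know about -->
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2012-01-15</lastmod>
        <changefreq>weekly</changefreq>
        <priority>1.0</priority>
      </url>
      <url>
        <loc>http://www.example.com/products/widgets</loc>
        <lastmod>2012-01-10</lastmod>
      </url>
    </urlset>

The file usually sits at the site root (e.g. /sitemap.xml) and gets submitted through GWMT.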
Best
-
Hi David,
For SEO to be effective, it is important that the search engine spiders and web crawlers can properly index a site. Based on your site's architecture and its links (both external and internal), the engines can build and weight a profile of how each page relates to the others, and map out which pages have the highest significance for their index.
One of the most effective ways of encouraging full and extensive visiting and indexing of your site by the spiders is to integrate a sitemap (or maps) into your development. Sitemaps are an easy way for webmasters to represent the structure of a site and inform search engines about the pages that are available for crawling. Human visitors also appreciate sitemaps as navigation.
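As a side note, besides submitting the XML Sitemap in GWMT, you can also point the spiders at it from robots.txt; a minimal sketch, assuming the sitemap sits at the site root (the domain is a placeholder):

    # robots.txt at the site root
    User-agent: *
    Disallow:
    # The Sitemap directive tells crawlers where to find the XML Sitemap
    Sitemap: http://www.example.com/sitemap.xml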
An HTML sitemap is an actual page on a site containing links to the website's most important pages. Any site with anything approaching complex navigation should consider an HTML sitemap for visitors. Whilst an HTML sitemap can be read and its links followed by spiders (so some passing of link juice is involved that might affect SEO), it has the added advantage of being readable by humans.
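For illustration, an HTML sitemap can be as simple as an ordinary page of grouped links; a rough sketch (the sections, page names and URLs are placeholders):

    <!-- sitemap.html: a normal page that both visitors and spiders can read -->
    <h1>Site Map</h1>
    <h2>Products</h2>
    <ul>
      <li><a href="/products/widgets">Widgets</a></li>
      <li><a href="/products/gadgets">Gadgets</a></li>
    </ul>
    <h2>Company</h2>
    <ul>
      <li><a href="/about">About Us</a></li>
      <li><a href="/contact">Contact</a></li>
    </ul>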
A visitor might refer to an HTML sitemap to help them navigate when they can't find a specific page easily. This usability benefit, letting users locate pages within a large site, can also give weight to your site in the SERPs (search engine results pages), as the engines tend to rank pages favourably for catering to users and being user-friendly.
HTML sitemaps, however, offer only a general guide to a site, an overview, and do not necessarily contain every link available for spiders to follow on a website. This is especially the case for websites that contain dynamically generated pages (shopping carts, etc.).
editor's note: for complete post, see http://www.seoconsult.com/on-page-search-engine-optimisation/what-effect-do-xml-and-html-sitemaps-have-in-relation-to-seo.html
I hope that your query has been solved.
-
Hi David,
If you have a good structure on your website, then there is no need for an HTML sitemap. By good structure I mean: is every important page on your site easily accessible within two clicks from any other page on your website? Are your menus SEO-friendly? Are your main products/services summarised on your home page? I do not use an HTML sitemap on any of my sites; I just make sure that Google can access every page that is important to me, no matter where it lands on my site.
-
Hello,
A sitemap for people will not only help visitors find your content more easily, but it will also help with SEO, because it improves the way your data is linked internally, increasing the number of internal inbound links to your pages.
Related Questions
-
Site Structure question?
Hey guys, Sorry for posting this again, but the last thread got a bit too wayward. I'll sum it up better here. We're producing a WordPress theme every 3-6 months. Each is differently niched (eg: ecommerce, restaurant, magazine, etc...). Which option is better for our products going forward (even the ones we've yet to launch... eg... which method will get future projects more "trust juice" from Google):
A: create a subfolder for each theme, eg: http://bigbangthemes.net/TicketLab_WP/wordpress-ticket-system & http://bigbangthemes.net/Showoff_WP/landing-page/ **This is currently what we're doing.**
B: have them all under bigbangthemes.net/wordpress-themes/, eg: bigbangthemes.net/wordpress-themes/wordpress-ticket-system & bigbangthemes.net/wordpress-themes/showoff-startup-agency-theme
Thanks for the help!
On-Page Optimization | andy.bigbangthemes0
-
If Product Pages Perform Well in Google, Is It Possible That a Category Page Can Perform Well in Google Too?
Hi All, For my ecommerce site I have optimized my product pages very nicely: good images, detailed information about products, good reviews, schema implemented for my products and reviews, and very solid on-page work. Now my query is: if my product pages are performing well in Google, is there a chance that my category pages can rank well in Google too? Thanks!
On-Page Optimization | wright3350
-
Working on this site...
and wondering what is wrong in terms of on-page SEO (basically just want some feedback on tips/changes to make): http://www.stevenholmesstudio.com/ I'm assuming that the title shouldn't be just the img file name... any suggestions for what it should be?
On-Page Optimization | callmeed0
-
Large Site - Advice on Subdomaining
I have a large news site - over 1 million pages (have already deleted 1.5 million). Google buries many of our pages, and I'm ready to try subdomaining: http://bit.ly/dczF5y

There are two types of content - news from our contributors, and press releases. We have had contracts with the big press release companies going back to 2004/5. They push releases to us by FTP or we pull from their server. These are then processed and published. It has taken me almost 18 months, but I have found and deleted or fixed all the duplicates I can find. There are now two duplicate-checking systems in place. One runs at the time the release comes in and handles most of them. The other one runs every night after midnight and finds a few, which are then handled manually. This helps fine-tune the real-time checker. Businesses often link to their release on the site because they like us. Sometimes Google likes this, sometimes not.

The news we process is reviewed by 1, 2 or 3 editors before publishing. Some of the stories are 100% unique to us. Some are from contributors who also contribute to other news sites. Our search traffic is down by 80%. This has almost destroyed us, but I don't give up easily. As I said, I've done a lot of projects to try to fix this. Not one of them has done any good, so there is something Google doesn't like and I haven't yet worked it out. A lot of people have looked and given me their ideas, and I've tried them - zero effect.

Here is an interesting and possibly important piece of information: most of our pages are "buried" by Google. If I search, even for a headline, even if it is unique to us, quite often the page containing it will not appear in the SERP. The front page may show up, an index page may show up, another strong page may show up if that headline is in the top 10 stories for the day, but the page itself may not show up at all - UNTIL I go to the end of the results and redo the search with the "duplicates" included. Then it will usually show up, on the front page, often in position #2 or #3.

According to Google, there are no manual actions against us. There are also no notices in WMT that say there is a problem that we haven't fixed. You may tell me to just delete all of the PRs - but those are there for business readers, as they always have been. Google supposedly wants us to build websites for readers, which we have always done. What they really mean is - build it the way we want you to, because we know best. What really peeves me is that there are other sites that consistently rank above us, that have all the same content as us, and seem to be 100% aggregators, with ads, with nothing really redeeming them as being different. So this is (I think) inconsistent and confusing, and it doesn't help me work out what to do next.

Another thing we have is about 7,000+ US military stories, all the way back to 2005. We were one of the few news sites supporting the troops when it wasn't fashionable to do so. They were emailing the stories to us directly, most with photos. We published every one of them, and we still do. I'm not going to throw them under the bus, no matter what happens. There were some duplicates, some due to screwups because we had multiple editors who didn't see that a story was already published. Also at one time, a system code race condition - entirely my fault, I am the programmer as well as the editor-in-chief. I believe I have fixed them all with redirects.

I haven't sent in a reconsideration for 14 months, since they said "No manual spam actions found" - I don't see any point, unless you know something I don't. So, having exhausted all of the things I can think of, I'm down to my last few ideas:
1. Split all of the PRs off into subdomains (I'm ready to pull the trigger later this week).
2. Do what the other sites do, which I believe creates little value: show only a headline and snippet and some related info, and link back to the original page on the PR provider website. (I really don't want to do this.)
3. Give up on the PRs and delete them all and lose another 50% of the income, which means releasing our remaining staff and upsetting all of the companies and people who linked to us (or find them all and rewrite them as stories - tens of thousands of them), and also throw all our alliances under the bus. (I really don't want to do this.)

There is no guarantee this is the problem, but Google won't tell me, the Google forums are crap, and nobody else has given me an idea that has helped. My thought is that splitting them off into subdomains will have a number of effects:
1. Take most of the syndicated content onto subdomains, so it's not on the main domain.
2. Shake up the Domain Authority.
3. Create a million 301 redirects.
4. Make it obvious to the crawlers what is our news and what is PRs.
5. Make it easier for Google News to understand.

Here is what I plan to do:
1. Redirect all PRs to their own subdomain: pn.domain.com for PRNewswire releases, bw.domain.com for Businesswire releases, etc.
2. Fix all references so they use the new subdomain.

Here are my questions - and I hope you may see something I haven't considered:
1. Do you have any experience of doing this?
2. What was the result?
3. Any tips?
4. Should I put PR index pages on the subdomains too? I was originally planning to keep them on the main domain, with the individual page links pointing to the actual release on the subdomain. Obviously, I want them only in one place, but there are two types of these index pages: a) all of the releases for a particular PR company - these certainly could be on the subdomain and not on the main domain; b) various category index pages - agriculture, supermarkets, mining, etc. These would have to stay on the main domain because they are a mixture of different PR providers.
5. Is this a bad idea?

I'm almost out of ideas. Should I add a condensed list of everything I've done already? If you are still reading, thanks for hanging in.
On-Page Optimization | loopyal0
-
Checking for content originality in a site
Two-part question on original content: How would you go about checking whether a site holds original content, other than running a long search query within Google? And also, if I find many sites carrying my content and I am the original source, should I replace the content? Thanks
On-Page Optimization | ciznerguy0
-
Numbers above actual site content
Most pages on my website contain many numbers above the actual text on the page. This is useful for users and looks good in an actual view of the page. However, when a bot reads the page, it appears as rows of numbers with a few sentences at the bottom of the page. Does having these numbers have a negative SEO effect? If so, should I change them to something such as an image so they aren't readable by search engines?
On-Page Optimization | theLotter0
-
Site Structure Advice - Keyword Dilemma
I am creating a new site and am looking for some advice on how to structure it. Using Google's keyword search tool, it seems I have a dilemma in that about 50% of the traffic is contained in 10 keyword pairs that are similar. The first two pairs have about 49% of the traffic and only differ between plural/singular; not quite sure how to handle that, or whether Google has a method to treat these as more or less synonymous. The last 8 pairs are roughly similar in distribution. As an example (not my case, just for visualization):
Mountain Bike Classes
Mountain Bike Instruction
Mountain Bike Workshops
Mountain Bike Training
Etc...
which all more or less give the same results (yes, some differences, but they all deal with learning how to ride a mountain bike; again, this is not my exact case, don't care a whit about mountain bikes 😉). I don't see giving each of those kinds of pairs their own page, since the content would be pretty much the exact same, and making it substantially different would also be problematic (if I am thinking about this correctly). I have a clean slate to work with from a site perspective, so I am wondering how people here would, or better yet have, handled similar situations.
On-Page Optimization | bThere0
-
Best site structure for SEO
Hi, I'm currently in the process of redesigning/rebuilding a well-ranking but dated-looking and dated-structured website. Using analytics info, I'm trying to put together an optimised sitemap plan for the site based on keywords. Currently the site is structured like this (a few examples) for some of its best-ranking keywords / landing pages:
www.companyname.co.uk/frames/software/companyname-software/keyword/overview.php
www.companyname.co.uk/frames/software/companyname-software/keyword/keyword.php
I'd like to simplify this as part of the rebuild so URLs look like this:
www.companyname.co.uk/companyname-software/softwarecategory/keyword
Obviously I would 301 the old URLs. My questions are: A. Is this a good idea? (From what I've read it is?) B. Is there any benefit from having the company name repeated in the URL (ie www.companyname.co.uk/companyname-software)? My thinking behind this is that companyname-software currently ranks well and brings a good amount of traffic. Or should I just go with www.companyname.co.uk/software/softwarecategory/keyword as opposed to www.companyname.co.uk/companyname-software/softwarecategory/keyword? Many thanks in advance!
On-Page Optimization | JamesJacobs0