Big site SEO: Maintain HTML sitemaps, or scrap them in the era of XML?
-
We have dynamically updated XML sitemaps which we feed to Google et al.
Our XML sitemap is updated constantly and takes minimal hands-on management to maintain.
However, we still have an HTML version (which we link to from our homepage), a legacy from the pre-XML days. As this HTML version is static, we're finding it contains a lot of broken links and is not of much use to anyone.
So my question is this: does Google (or any other search engine) still need both, or are XML sitemaps enough?
-
From a search engine's point of view, XML sitemaps are enough. If you have a large site, you may want to consider having more than one sitemap for different categories.
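For reference, the standard way to split a large site across several category sitemaps is a sitemap index file that points to each one (the filenames below are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <sitemap> entry per category sitemap; <lastmod> is optional -->
  <sitemap>
    <loc>https://www.example.com/sitemap-products.xml</loc>
    <lastmod>2012-01-15</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-categories.xml</loc>
  </sitemap>
</sitemapindex>
```

You submit the index file itself to the search engines, and they fetch the individual sitemaps from it.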
As Kieron suggested, HTML sitemaps are useful for people navigating your site, so it might be worthwhile writing some PHP to convert the XML into HTML and make your HTML sitemap a little more dynamic.
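The answer suggests PHP; as a language-agnostic sketch of the same idea (shown here in Python, against a made-up two-URL sitemap), you could parse the XML sitemap and emit an HTML link list:

```python
import xml.etree.ElementTree as ET

# The namespace defined by the sitemaps.org protocol
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_to_html(sitemap_xml: str) -> str:
    """Convert an XML sitemap string into a simple HTML list of links."""
    root = ET.fromstring(sitemap_xml)
    items = []
    for url in root.findall("sm:url", NS):
        loc = url.findtext("sm:loc", default="", namespaces=NS).strip()
        if loc:
            items.append(f'<li><a href="{loc}">{loc}</a></li>')
    return "<ul>\n" + "\n".join(items) + "\n</ul>"

# Hypothetical minimal sitemap for demonstration
example = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/</loc></url>
  <url><loc>https://www.example.com/about</loc></url>
</urlset>"""

print(sitemap_to_html(example))
```

Regenerating the HTML page from the same feed that builds the XML sitemap means the two can never drift apart, which is exactly the broken-link problem the question describes.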
-
Although users might not visit the sitemap very often, it is usually a very easy way to make sure some link juice is passed to all pages. Especially if the sitemap is linked to from many pages, it usually has a fair amount of juice to pass on. However, the sitemap should never be the only way you link to deeper pages.
-
The HTML sitemap should be there not for Google or Bing but for real people. If no one is using the HTML sitemap and it's causing lots of problems to maintain, then drop it.
But if visitors are using it, look to replace it with a dynamically generated HTML sitemap that helps visitors find what they are looking for. You may wish to consider something similar to Amazon's "Shop all departments" page, which allows visitors to drill into the categories.
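One rough way to build that kind of drill-down page is to group the URLs you already have (e.g. from the XML sitemap) by their first path segment. This is just an illustrative sketch with made-up URLs, not Amazon's actual approach:

```python
from collections import defaultdict
from urllib.parse import urlparse

def group_by_section(urls):
    """Group URLs by first path segment, one bucket per top-level category."""
    sections = defaultdict(list)
    for u in urls:
        path = urlparse(u).path.strip("/")
        section = path.split("/")[0] if path else "home"
        sections[section].append(u)
    return dict(sections)

urls = [
    "https://www.example.com/",
    "https://www.example.com/books/fiction",
    "https://www.example.com/books/travel",
    "https://www.example.com/electronics/cameras",
]
print(group_by_section(urls))
```

Each bucket then becomes a collapsible department on the sitemap page, so visitors drill down by category rather than scanning one flat list.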
Hope this helps.
K