Sitemap issue
-
How can I create XML as well as HTML sitemaps for my website (both eCommerce and non-eCommerce)? Is there any script or tool that helps me make a perfect sitemap? Please suggest.
-
If you have a website with over 10,000 pages that is partially e-commerce, then you should not be afraid to pay for a proper solution. Free is great when you are starting out, but there comes a time when you need to invest money (and it will be a small amount) in the proper tools to continue your site's growth.
-
But https://www.xml-sitemaps.com/ is only for websites with no more than 500 pages. We have more than 10,000 web pages.
-
Hi Ravi,
I personally use Screaming Frog to create the sitemaps. I create an advanced exclude/include list for the crawler, then run a crawl on my website and export the sitemap.
After the sitemap is created, I double-check it in an editor like Notepad++.
I hope this will work out for you too.
Keszi
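If you would rather script it than rely on a tool, the XML side is simple enough to generate yourself. A minimal sketch in Python using only the standard library; the URL list here is a placeholder, and in practice you would pull the URLs from your CMS database or a crawl export:

```python
# Minimal XML sitemap generator: builds a sitemap.xml document
# from a list of page URLs using only the standard library.
import xml.etree.ElementTree as ET

def build_sitemap(urls, changefreq="weekly"):
    """Return a sitemap XML string for the given URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
        ET.SubElement(entry, "changefreq").text = changefreq
    return ET.tostring(urlset, encoding="unicode")

# Placeholder URLs for illustration only.
sitemap = build_sitemap(["https://example.com/", "https://example.com/about"])
print(sitemap)
```

Note that the sitemap protocol caps a single file at 50,000 URLs, so a 10,000-page site fits in one file; beyond that limit you would need a sitemap index file pointing at several sitemap files.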
-
Hi Ravi,
You can create an XML sitemap using https://www.xml-sitemaps.com/, but for the HTML sitemap I would suggest you create it yourself or ask a developer to create it.
Thanks
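For reference, an HTML sitemap is just an ordinary page of categorized links that both users and crawlers can follow, so it is easy to hand-write or template. A minimal sketch (the URLs are placeholders):

```html
<!-- Minimal HTML sitemap: a plain page of categorized links. -->
<h1>Site Map</h1>
<ul>
  <li><a href="/">Home</a></li>
  <li><a href="/products/">Products</a>
    <ul>
      <li><a href="/products/widgets/">Widgets</a></li>
    </ul>
  </li>
  <li><a href="/contact/">Contact</a></li>
</ul>
```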
Related Questions
-
Sitemaps:
Hello, while doing an audit we found in our sitemaps the tag which at the time was used to indicate that a URL was mobile. In our case the URL is the same for desktop and mobile. Do you recommend leaving it or removing it? Thank you!
Technical SEO | romaro
-
How much of an issue is JS?
Hey folks, so I have two pages. Page A has a lot more content, but in a tabular format which uses JavaScript, and a title tag which is a synonym for our keyword, but not the actual keyword. Page B has less content, and its title tag is the exact keyword phrase we want to rank for. Page A has a bigger backlink profile (though not enormous by any extent). Page A ranks 30th. Page B ranks 7th. Importance of the title tag? Importance of JS? Both? Discuss! Cheers, Rhys
Technical SEO | SwanseaMedicine
-
Generating an XML sitemap?
Hi, what is everyone's preferred method of generating an XML sitemap? Just wondering if one piece of software is better than others?
Technical SEO | TheZenAgency
-
Sitemaps: do deleted URLs get cleared when they 404?
Hi, do sitemap entries get cleared when a URL returns a 404? We have a Drupal site and a sitemap with 60K links. Over these 4 years we have deleted hundreds of links; are they cleared from the sitemap automatically, or do we need to rebuild the sitemap? Thanks
Technical SEO | mtthompsons
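Sitemap files do not update themselves: unless whatever module generates the Drupal sitemap regenerates it, deleted URLs stay listed. One way to find stale entries is to parse the sitemap and status-check each URL. A minimal sketch of the parsing step (the sitemap content below is a placeholder, and the actual HTTP check is left as a comment):

```python
# Extract <loc> URLs from a sitemap so each one can be status-checked;
# any URL returning 404 should be dropped when the sitemap is rebuilt.
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Return the list of URLs listed in a sitemap XML document."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall("sm:url/sm:loc", NS)]

# Placeholder sitemap for illustration only.
sample = """<?xml version="1.0"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/live-page</loc></url>
  <url><loc>https://example.com/deleted-page</loc></url>
</urlset>"""

for url in sitemap_urls(sample):
    # In practice: issue a HEAD request for each URL
    # (e.g. with urllib.request) and flag any 404 responses.
    print(url)
```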
-
Duplicate Content Issues
We have a "?src=" parameter in some URLs which is treated as duplicate content in the crawl diagnostics errors. For example, xyz.com?src=abc and xyz.com?src=def are considered to be duplicate content URLs. My objective is to make my campaign free of these crawl errors. First of all, I would like to know why these URLs are considered to have duplicate content, and what's the best solution to get rid of this?
Technical SEO | RodrigoVaca
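The usual fix for parameter-only variants like these is a rel=canonical tag: every ?src= variant serves the same page, so each declares the clean URL as canonical. A sketch (the URL is a placeholder standing in for the xyz.com example above):

```html
<!-- Placed in the <head> of xyz.com?src=abc and xyz.com?src=def alike:
     tells search engines the parameter variants are one and the same page. -->
<link rel="canonical" href="https://xyz.com/" />
```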
-
Development Website Duplicate Content Issue
Hi, we launched a client's website around 7th January 2013 (http://rollerbannerscheap.co.uk). We originally constructed the website on a development domain (http://dev.rollerbannerscheap.co.uk) which was active for around 6-8 months (the dev site was unblocked from search engines for the first 3-4 months, but then blocked again) before we migrated dev --> live.
In late Jan 2013 we changed the robots.txt file to allow search engines to index the website. A week later I accidentally logged into the DEV website and also changed the robots.txt file to allow the search engines to index it. This obviously caused a duplicate content issue as both sites were identical. I realised what I had done a couple of days later and blocked the dev site from the search engines with the robots.txt file.
Most of the pages from the dev site had been de-indexed from Google apart from 3: the home page (dev.rollerbannerscheap.co.uk) and two blog pages. The live site has 184 pages indexed in Google. So I thought the last 3 dev pages would disappear after a few weeks. I checked back late February and the 3 dev site pages were still indexed in Google.
I decided to 301 redirect the dev site to the live site to tell Google to rank the live site and to ignore the dev site content. I also checked the robots.txt file on the dev site and this was blocking search engines too. But still the dev site is being found in Google wherever the live site should be found. When I do find the dev site in Google it displays this:
Roller Banners Cheap » admin dev.rollerbannerscheap.co.uk/ A description for this result is not available because of this site's robots.txt – learn more.
This is really affecting our client's SEO plan and we can't seem to remove the dev site or rank the live site in Google. In GWT I have tried to remove the sub domain. When I visit remove URLs, I enter dev.rollerbannerscheap.co.uk but then it displays the URL as http://www.rollerbannerscheap.co.uk/dev.rollerbannerscheap.co.uk. I want to remove a sub domain, not a page. Can anyone help please?
Technical SEO | SO_UK
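One caveat worth illustrating here: a robots.txt Disallow stops Google from re-crawling the dev pages, so it never sees the 301 redirects (which is exactly why the "description is not available because of this site's robots.txt" stub lingers). A common alternative is to allow crawling of the dev host and instead serve a noindex header on it. A hedged sketch for Apache (mod_headers is a standard module, but your server setup may differ, and this must only apply to the dev host, never the live one):

```apacheconf
# In the vhost for dev.rollerbannerscheap.co.uk only: let crawlers in
# so they can see the de-indexing signal, and tell them not to index.
<IfModule mod_headers.c>
    Header set X-Robots-Tag "noindex, nofollow"
</IfModule>
```

With the header in place and the robots.txt block lifted, Google can re-crawl the dev pages, see the noindex (or the 301s), and drop them from the index.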
-
Benefits to having an HTML sitemap?
We are currently migrating our site to a new CMS, and as part of this migration I'm getting push-back from my development team regarding the HTML sitemap. We have a very large news site with tens of thousands of pages. We currently have an HTML sitemap that greatly helps with distributing PR to article pages, but is not geared towards the user. The dev team doesn't see the benefit of recreating the HTML sitemap, despite my assurance that we don't want to lose all these internal links, since removing thousands of links could have a negative impact on our Domain Authority. Should I give in and concede the HTML sitemap since we have an XML one? Or am I right that we don't want to get rid of it?
Technical SEO | BostonWright
-
Domain Redirect Issues
Hi, I have a domain that is 10 years old; this is the old domain that used to be the website for the company. The company was bought approximately 7 years ago by another, which purchased a new domain that is 7 years old. The company did not do a 301 redirect as they were not aware of the SEO implications. They continued building web applications on the old domain while using the new domain for all marketing and for business partner links. They just put in a server-level redirect on the folders themselves to point to the new root.
I am on Tomcat; I do not have the option of a 301 redirect as the web applications are all hard-coded links (non-relative), and it would cost hundreds of thousands of dollars to recode. After beginning SEO, Google is seeing them as the same domain, and has replaced all results in Google with the old domain instead of the new one.
My question is: is it better to take the hit and just put a robots.txt to disallow all robots on the old domain? Or will that hurt my new domain as well, since Google is seeing them as the same? Or has Google already made the switch without a redirect to see these as the same, and I should just continue on? (Even the cache for the new site shows the old domain address.)
Old domain = www.floridahealthcares.com
New = www.fhcp.com
Update: after writing this I began changing index.htm to all non-relative links, so all links on the old domain homepage would point to fhcp.com, fixing the issue of the entire site being replicated under the old domain. I think this might "patch" my issue, but I would still love to get the opinion of others. Thanks, Shane
Technical SEO | Jinx14678
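For what it's worth, a 301 can often be issued at the Tomcat layer without touching the hard-coded application links, for example with the third-party Tuckey UrlRewriteFilter. A sketch of its urlrewrite.xml, assuming the filter is already registered in web.xml and using the two domains from the question; verify the exact rule syntax against the filter's documentation before deploying:

```xml
<urlrewrite>
    <rule>
        <name>Old domain to new domain</name>
        <!-- Only fire for requests arriving on the old hostname. -->
        <condition name="host" operator="equal">www.floridahealthcares.com</condition>
        <from>^/(.*)$</from>
        <!-- permanent-redirect issues an HTTP 301 to the new domain. -->
        <to type="permanent-redirect" last="true">http://www.fhcp.com/$1</to>
    </rule>
</urlrewrite>
```

Because the rule matches on the request host rather than the application's internal paths, the hard-coded links can stay as they are; any visitor (or crawler) hitting the old domain is simply 301'd to the same path on the new one.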