Sitemaps / Robots.txt etc.
-
Hi everyone
I have set up a sitemap using a WordPress plugin: http://lockcity.co.uk/site-map/
Can you please tell me if this is sufficient for the search engines? I am trying to understand the difference between this and having a robots.txt - or do I need both?
Many thanks, Abi
-
Ahh, I see. Thanks so much for your reply! Is it best to get an XML sitemap rather than the one I have?
Thanks, Abi
-
A sitemap helps search engines crawl your pages. That can be either a user-facing sitemap like the one you set up, or a sitemap.xml file, which is a streamlined way for Google to spider your site.
Robots.txt is used to tell the bots where NOT to go, for example if you don't want them crawling certain parts of your site such as your cgi-bin, js, or images folders.
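To make the difference concrete, here's a minimal sketch of a robots.txt file. The disallowed folders are just illustrative examples, and the Sitemap line assumes your plugin (or another one) generates an XML sitemap at the site root:

```text
# robots.txt - must live at the root of the domain
User-agent: *          # these rules apply to all bots
Disallow: /cgi-bin/    # keep bots out of these folders
Disallow: /js/
Disallow: /images/

# Optional: point crawlers at the XML sitemap
Sitemap: http://lockcity.co.uk/sitemap.xml
```

Note the two files do opposite jobs: the sitemap says "here's what to crawl," while robots.txt says "here's what not to crawl." You don't strictly need both, but they complement each other.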
Related Questions
-
On site issues after Magento 2 launch
We did a new site launch on Feb 7th this year - www.vesternet.com. It changed from Magento 1 to Magento 2. We had some launch issues around SEO, but we've now solved almost every crawler issue in Moz reporting; according to Moz we're in better shape on-site than ever. But our organic search is just dropping daily. We expected a drop after launch and then a return to normal, but over two months on, something just isn't right. A good example: on Google UK for the keyword 'home automation' we've always been around position 10, but now we're out of the top 50... Forget about off-site for now - what's wrong with our site itself to have caused this? Can anyone help with insights please, as this is killing our sales.
On-Page Optimization | | dbsmtec1 -
Creating a .cn site with the existing site content
Hi all, I'm planning to create a .cn site. If I simply translate the existing content on my site (.com.au) into Chinese, do you think Google will see the .cn site as a duplicate of the main site? Will this cause any duplicate content issues? Thanks
On-Page Optimization | | QuantumWeb620 -
Redirects for new site new urls?
If redoing a site and updating some of the URLs for SEO, should you do permanent redirects for the old site's URLs? Using WordPress. I saw that the Yoast Pro plugin allows you to do this inside WordPress - is this the best way? Suggestions? I know there are old articles out on the web pointing back to what will soon be the old URLs, so just wondering what's the best way to go about this. Thanks, Scott
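The Yoast redirect manager works; the same thing can also be done at the server level. A minimal sketch, assuming an Apache server and hypothetical old/new paths (yours will differ):

```apache
# .htaccess - permanent (301) redirects from old URLs to new ones
# so existing links and rankings carry over to the new pages.
# The paths below are made-up examples.
Redirect 301 /old-page/ /new-seo-friendly-page/
Redirect 301 /services.php /services/
```

Either approach is fine; the key point is that the redirects are 301 (permanent), since that's the signal that tells search engines to transfer the old URL's equity to the new one.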
On-Page Optimization | | scott3150 -
Internal Duplicate Content/Canonical Issue/ or nothing to worry about
Unfortunately, my developer cannot give me an answer to this so I really do hope someone can help. The homepage of my website is http://www.laddersfree.co.uk however I also have a page http://www.laddersfree.co.uk/index.php that has a page rank and essentially duplicates the home page. Does someone know what this is? Do I need to get my developer to do a 404? It is worrying that he has not come back to me. Thanks Jason
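For what it's worth, this is a common duplicate-homepage pattern: /index.php serves the same content as the root URL. A 404 would throw that page's equity away; a 301 redirect consolidates it instead. A sketch of the redirect, assuming Apache with mod_rewrite enabled:

```apache
# .htaccess - 301 requests for /index.php to the root URL
# so only one version of the homepage exists for search engines.
RewriteEngine On
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.php
RewriteRule ^index\.php$ http://www.laddersfree.co.uk/ [R=301,L]
```

A rel="canonical" tag on /index.php pointing at the root is an alternative if a redirect isn't possible, but the 301 is the cleaner fix.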
On-Page Optimization | | gymmad0 -
Google Maps API as primary navigation
Is it okay for SEO to have a google maps api as the primary source of navigation? For example, have people find locations on a map instead of links to them. I'm wondering how/if Google views this method, kinda like how Google can't read images. Will Google realize that these pages are linked to from the homepage gmaps API?
On-Page Optimization | | terran0 -
Site Structure
I'm confused about the best way for SEO to set up the site structure. I understand the examples of the pyramid diagrams and how link juice flows, however does this mean that global navigation is not good? It appears the pyramid structure leads to a designated number of category pages (we'll use five) and they lead to the 5 content pages etc., and some "superman pages" can be linked to from the home page. But is this global navigation or anchor text navigation, and is global navigation acceptable for content pages? Any input would be greatly appreciated. Thank you.
On-Page Optimization | | JulB0 -
Using meta robots 'noindex'
Alright, so I would consider myself a beginner at SEO. I've been doing merchandising and marketing for ecommerce sites for about a year and a half now and am just now starting to apply some intermediate SEO techniques to the sites I work on, so bear with me. We are currently redoing the homepage of our site and I am evaluating what links to have on it. I don't want to lose precious link juice to pages that don't need it, but there are certain pages that we need to have on the homepage that people just won't search for. My question is: would it be a good move to add the meta robots 'noindex' tag to these pages? Is my understanding correct that if the only link on the page is back to the homepage, it will pass back the link juice? Also, how many homepage links are too many? We have a fairly large ecommerce site with a lot of categories we'd like to feature, but don't want to overdo the homepage. I appreciate any help!
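For reference, the tag in question is a one-liner in the head of each page you want kept out of the index. A sketch:

```html
<!-- Placed in the <head> of a page you don't want indexed. -->
<!-- "noindex, follow" keeps the page out of search results but
     still lets bots follow its links, so link equity can flow
     back out through them (e.g. to the homepage). -->
<meta name="robots" content="noindex, follow">
```

"noindex" controls indexing only; the "follow" part is what preserves the flow of link equity through the page's links.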
On-Page Optimization | | ClaytonKendall0 -
Does Google respect User-agent rules in robots.txt?
We want to use an inline linking tool (LinkSmart) to cross link between a few key content types on our online news site. LinkSmart uses a bot to establish the linking. The issue: There are millions of pages on our site that we don't want LinkSmart to spider and process for cross linking. LinkSmart suggested setting a noindex tag on the pages we don't want them to process, and that we target the rule to their specific user agent. I have concerns. We don't want to inadvertently block search engine access to those millions of pages. I've seen googlebot ignore nofollow rules set at the page level. Does it ever arbitrarily obey rules that it's been directed to ignore? Can you quantify the level of risk in setting user-agent-specific nofollow tags on pages we want search engines to crawl, but that we want LinkSmart to ignore?
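A sketch of what user-agent-specific robots.txt rules look like, assuming for illustration that LinkSmart's bot identifies itself with the token "LinkSmart" (check their documentation for the actual user-agent string):

```text
# Block only LinkSmart's bot from the sections it shouldn't process;
# the catch-all rule below leaves all other bots, including
# Googlebot, with full access. The /archive/ path is a made-up example.
User-agent: LinkSmart
Disallow: /archive/

User-agent: *
Disallow:
```

Well-behaved bots match the most specific User-agent group that applies to them, so Googlebot would fall through to the permissive catch-all here. The risk is lower with robots.txt disallow rules like these than with user-agent-targeted meta tags, since a disallow for one named bot says nothing at all to the others.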
On-Page Optimization | | lzhao0