Robots.txt issue with indexation
-
Hello,
I have a problem with one of the rules in my robots.txt.
I have a multilingual version of the entire site at www.example.com/en/.
I want to make the main page at /en/ indexable (allow), but make everything else under /en/* non-indexable (disallow).
Please help me write the rule.
-
Well, put the rest of the content in a different directory and disallow that; that's the only other solution I can think of...
-
There is no option like
/en/index.html
The only address where you can reach the English main page is www.example.com/en/.
-
Name the page you want indexed something specific, and then you can use the following:
Disallow: /en/
Allow: /en/index.html
Always test your robots.txt in Google Webmaster Tools.
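If renaming the page is not practical, Google's robots.txt documentation also supports "$" as an end-of-URL anchor, which can allow the /en/ homepage itself without an index.html. A sketch (the "$" anchor is a Google extension to the original robots exclusion protocol, and other crawlers may ignore it):

```
User-agent: *
# "$" anchors the end of the URL, so this allows exactly /en/ and nothing deeper.
Allow: /en/$
Disallow: /en/
```

Under Google's longest-match precedence, /en/ matches the more specific Allow line, while every longer URL under /en/ matches only the Disallow.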
Hope that helps,
Keith
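For a quick local sanity check of rules like these, outside Webmaster Tools, Python's standard-library robotparser can evaluate a rule set against sample URLs. One caveat for this sketch: the stdlib parser applies rules in file order (first match wins) rather than Google's longest-match rule, and it does not understand the "*" and "$" wildcards, so the Allow line is listed first here.

```python
# Sanity-check the Disallow/Allow pair from the answer above using only
# the standard library.  urllib.robotparser applies rules in file order
# (first match wins), unlike Google's longest-match rule, so the Allow
# line is listed first for this local check.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Allow: /en/index.html
Disallow: /en/
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# The named page stays fetchable; everything else under /en/ is blocked.
print(rp.can_fetch("*", "https://www.example.com/en/index.html"))    # True
print(rp.can_fetch("*", "https://www.example.com/en/contact.html"))  # False
print(rp.can_fetch("*", "https://www.example.com/en/"))              # False
print(rp.can_fetch("*", "https://www.example.com/"))                 # True
```

Remember this only checks crawling, not indexing, and the example.com URLs are the placeholders from the question.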
Related Questions
-
How to check Robots.txt File
I want to know how to check that a website's robots.txt file is working properly. Please elaborate on the mechanism for this.
International SEO | seobac1
-
Issues with Baidu indexing
I have a few issues with one of my sites being indexed in Baidu and I'm not too sure how to resolve them:
1. Two subdomains were redirected to the root domain, but both (www. and another) subdomains are still indexed after ~4 months.
2. A development subdomain is indexed, despite no longer working (it was taken down a few months back).
3. There's conflicting information on the best approach to get HTTPS pages indexed in Baidu, and we can't find a good solution.
4. There are hundreds of variations of the home page (and a few other pages) on the main site, where Baidu has indexed lots of parameters. There doesn't appear to be anywhere in their webmaster tools to stop that happening, unlike with Google.
I'm not the one who deals directly with this site, but I believe that Baidu's equivalent of Webmaster Tools has been used where possible to correctly index the site. Has anyone else had similar issues and, if so, were you able to resolve them? Thanks
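On point 4, a common fallback when a search engine offers no parameter-handling controls is a canonical tag pointing every parameterized variation back at the clean URL. Baidu has stated that it supports rel=canonical, though reportedly less consistently than Google, so treat this as a sketch (the URL is a placeholder for the site in the question):

```html
<!-- Placed in the <head> of every parameterized variation of the home page. -->
<link rel="canonical" href="http://www.example.com/" />
```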
International SEO | jobhuntinghq0
-
Web Site Migration - Time to Google indexing
Soon we will migrate a website from .com.br to .com/pt-br. We will do the migration at a time when we have lower traffic. We are trying to follow Google's guidelines, applying 301 redirects, a sitemap, etc. I would like to know how long Google generally takes to transfer the relevance of .com.br to .com/pt-br/ via the 301 redirects.
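There is no fixed timeframe; Google's site-move guidance talks in terms of weeks to months, with heavily crawled pages picked up first. The redirect side, however, can be a single server rule. A minimal nginx sketch, using placeholder hostnames for the domains in the question:

```nginx
# Send every URL on the old .com.br domain to the same path under
# the new /pt-br subfolder, preserving the full request URI.
# (HTTPS on the old domain would also need its ssl_certificate directives.)
server {
    listen 80;
    server_name example.com.br www.example.com.br;
    return 301 https://www.example.com/pt-br$request_uri;
}
```

Keeping a one-to-one path mapping like this (old /foo to new /pt-br/foo) is what lets the 301s pass relevance page by page rather than dumping everything on the new homepage.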
International SEO | mobic0
-
Do I have duplicate content issues to be worried about?
Hey guys, we built a website http://www.cylon.com/ targeting different regions but with the same English language (Ireland, England and America). The content is for the most part the same, set up on 3 different subfolders:
http://www.cylon.com/ - targeting the United States in WMT
http://www.cylon.com/ie - targeting Ireland in WMT
http://www.cylon.com/uk - targeting the UK in WMT
Do I have duplicate content issues to be worried about? If so, how do I get around this issue? Also, is there any way of finding out whether Google has in some way penalised these pages for having the same content as other pages targeting different countries? I have not received any messages from Google in WMT saying there is duplicate content, so I'm not sure if this is an issue. Thanks, Rob
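Same-language regional subfolders are the textbook case for hreflang annotations, which tell Google the pages are intentional regional alternates rather than duplicates. A sketch of the head markup using the three subfolders from the question (the same block goes on every version, and x-default names the fallback for all other visitors):

```html
<!-- Each page lists all regional alternates, including itself. -->
<link rel="alternate" hreflang="en-us" href="http://www.cylon.com/" />
<link rel="alternate" hreflang="en-ie" href="http://www.cylon.com/ie" />
<link rel="alternate" hreflang="en-gb" href="http://www.cylon.com/uk" />
<link rel="alternate" hreflang="x-default" href="http://www.cylon.com/" />
```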
International SEO | daracreative0
-
Ranking issues for UK vs US spelling - advice please
Hi guys, I'm reaching out here about what may seem to be a very simple and obvious issue, but not something I can find a good answer for. We have a .com site hosted in Germany that serves our worldwide audience. The site is in English, but our business language is British (UK) English. This means that we rank very well for (e.g.) "optimisation software", but "optimization software" is nowhere to be found. The cause of this seems obvious to me: a robot reading those two phrases sees two distinct words. Nonetheless, having seen discussions of a similar nature around the use of plurals in keywords, it would seem that Google should have this sort of thing covered. Am I right or wrong here? If I'm wrong, what are my options? I really don't want to have to make a copy of the entire site; apart from the additional effort involved in content upkeep, I see this path as fraught with duplicate content issues. Any help is very much appreciated, thanks.
International SEO | StevenHowe0
-
Geo targeting issue and hosting
Hi guys and gals, this is not a problem per se, but an oddity that I would appreciate some insight on from the big juicy brains in this community. Our site was hosted in the US, and I was concerned that our relevance to our own country (Australia) was diminished because of it. For one of our main keywords we were a few spots behind a competitor on the first page for an Australian searcher, but when I searched the same keyword on Google.com with gl=us to show US-only results, we outranked the competitors by a few spots. On-page elements aside (if anything, we had more geo identifiers on the ranking page in question), I wanted to move hosts anyway and got hosting in Australia. The next week our search traffic jumped by 25%, but it was almost all US traffic; Australian traffic was unchanged. Any idea how this could happen? It's a .au domain, hosted in Australia, with on-page content clearly identifying Australia. I checked Webmaster Tools and our geo-targeting is properly set to Australia. I checked the keywords the traffic increased for and they are not geo-specific at all. Besides that, I don't know how else to pin this down. Thanks.
International SEO | Digital3600
-
Is duplicate content really an issue on different International Google engines?
i.e. Google.com vs. Google.co.uk. This relates to another question I have open on a similar issue. So if I open the same e-commerce site (virtually) on company.com and company.co.uk, does Google really view that as duplicate content? I would be inclined to think they have that figured out, but I haven't had much experience with international SEO...
International SEO | BlinkWeb0
-
De-Indexing URLs from a specific Locale
Is it possible to de-index a specific URL from showing up in a specific locale? For example, if I want to de-index http://www.example.com/category/product1 from http://www.google.co.uk but not http://www.google.com, is that possible?
International SEO | craigsmith3330