Robots.txt issue with indexation
-
Hello,
I have a problem with one of my robots.txt rules.
I have a multilingual version of the entire site at www.example.com/en/.
I want to make the main page at /en/ indexable (allow it),
but make everything else under /en/* non-indexable (disallow it).
Please help me write the rule.
-
Well, put the rest of the content in a different directory and disallow that; that's the only other solution I can think of...
-
There is no URL like
/en/index.html
The only address where you can reach the English main page is www.example.com/en/
-
Name the page you want indexed something specific, and then you can use the following:
User-agent: *
Disallow: /en/
Allow: /en/index.html
Always test your robots.txt in Google Webmaster Tools.
Hope that helps,
Keith
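One more note on how rules like these resolve: Google (though not every crawler) applies a longest-match-wins precedence between Allow and Disallow, and also supports a `$` end-of-URL anchor. So `Disallow: /en/` plus `Allow: /en/$` would permit exactly www.example.com/en/ while blocking everything deeper, even without an index.html URL. Below is a minimal sketch, in Python, of how Google-style matching resolves such a pair of rules. The `robots_allowed` helper is hypothetical, written only to illustrate the precedence logic; note that Python's built-in urllib.robotparser does not understand the `$` wildcard, which is why this is hand-rolled.

```python
import re

def robots_allowed(path, rules):
    """Illustrative Google-style robots.txt matching:
    the longest matching pattern wins, and Allow wins ties."""
    best = None  # (pattern length, directive) of the best match so far
    for directive, pattern in rules:
        # Translate robots.txt wildcards: '*' matches any run of
        # characters, a trailing '$' anchors to the end of the URL.
        regex = re.escape(pattern).replace(r"\*", ".*")
        if regex.endswith(r"\$"):
            regex = regex[:-2] + "$"
        if re.match(regex, path):
            if (best is None or len(pattern) > best[0]
                    or (len(pattern) == best[0] and directive == "Allow")):
                best = (len(pattern), directive)
    # No matching rule means the URL is crawlable by default.
    return best is None or best[1] == "Allow"

# The situation in this thread: block /en/... except /en/ itself.
rules = [("Disallow", "/en/"), ("Allow", "/en/$")]

print(robots_allowed("/en/", rules))         # True  - the English home page
print(robots_allowed("/en/contact", rules))  # False - everything else under /en/
```

Here `/en/` matches both patterns at equal length, so Allow wins the tie; `/en/contact` matches only the Disallow rule and is blocked.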