Search Engine blocked by robots.txt
-
I do that because I am using Joomla. Is that bad? Thanks
-
Yeah, the structure of the site can get confusing.
-
Oh yes, this is good for query strings; crawlers will not crawl the non-SEF URLs.
So you are good.
-
Yes, I have that.
Then put this to block all query strings:
Disallow: /*?
If I don't put that, the crawler indexes every page twice.
For example: right now I have 400 pages indexed. If I take that rule off, it will index around 800.
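If you want to sanity-check which URLs a Google-style wildcard rule like Disallow: /*? would catch, a rough Python sketch like the one below can help. The pattern-to-regex translation is only an approximation of Google's matcher, and the example URLs are made up for illustration.

import re

def google_pattern_to_regex(pattern: str) -> re.Pattern:
    # Rough approximation of Google-style robots.txt matching:
    # '*' matches any run of characters, a trailing '$' anchors the end,
    # and everything else is treated as a literal prefix.
    anchored = pattern.endswith("$")
    core = pattern[:-1] if anchored else pattern
    body = "".join(".*" if ch == "*" else re.escape(ch) for ch in core)
    return re.compile("^" + body + ("$" if anchored else ""))

rule = google_pattern_to_regex("/*?")  # the "block all query strings" rule

# Hypothetical Joomla URLs, for illustration only
for path in [
    "/some-article",                         # clean SEF URL
    "/index.php?option=com_content&id=42",   # non-SEF URL with a query string
    "/category/widgets?page=2",              # SEF URL with a query string
]:
    print(path, "->", "blocked" if rule.match(path) else "allowed")

Only the URLs containing a question mark match the rule, which is why removing it roughly doubles the number of indexed pages: the crawler starts picking up the query-string duplicates alongside the SEF versions.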
-
This is a sample robots.txt from a Joomla site that I have:
User-agent: *
Disallow: /administrator/
Disallow: /cache/
Disallow: /includes/
Disallow: /installation/
Disallow: /language/
Disallow: /libraries/
Disallow: /media/
Disallow: /plugins/
Disallow: /templates/
Disallow: /tmp/
Disallow: /xmlrpc/
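To check what a file like this actually blocks, Python's standard-library robots.txt parser can be run against it. This is only a quick sketch with a trimmed copy of the rules and made-up paths; note that this parser does plain prefix matching, so it is not a reliable way to test wildcard rules such as Disallow: /*?.

from urllib import robotparser

# A trimmed copy of the Joomla sample above
robots_txt = """\
User-agent: *
Disallow: /administrator/
Disallow: /cache/
Disallow: /templates/
Disallow: /tmp/
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# Hypothetical URLs, for illustration only
for url in [
    "http://example.com/some-article",
    "http://example.com/administrator/index.php",
    "http://example.com/templates/beez/css/template.css",
]:
    print(url, "->", "allowed" if rp.can_fetch("*", url) else "blocked")

Any path that is not under one of the disallowed folders stays crawlable.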
-
I just put this in robots.txt to block all query strings:
Disallow: /*?
It tells crawlers not to index pages with that string in the URL, so duplicate pages aren't generated.
Is that bad too?
Thanks
Regards
Gabo
-
If you need your website to be indexed and to get rankings, then yes, this is bad for your website.
It means you are telling every search engine not to index your website, so people won't find you in the search results.
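To illustrate the situation that answer describes, here is a small sketch with the same standard-library parser and a made-up domain: a blanket Disallow: / shuts every crawler that obeys robots.txt out of the whole site, which is very different from only excluding query-string duplicates.

from urllib import robotparser

blanket_block = """\
User-agent: *
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(blanket_block.splitlines())

# With "Disallow: /" nothing on the site is crawlable
for url in ["http://example.com/", "http://example.com/some-article"]:
    print(url, "->", "allowed" if rp.can_fetch("Googlebot", url) else "blocked")

If the goal is only to avoid duplicate non-SEF URLs, the narrower Disallow: /*? rule discussed above is the one to keep; Disallow: / is what takes a site out of the search results entirely.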
Related Questions
-
Spam URLs in search results
We built a new website for a client. When I do 'site:clientswebsite.com' in Google it shows some of the real, recently submitted pages. But it also shows many pages of spam URL results, like 'clientswebsite.com/gockumamaso/22753.htm', all of which then go to the site's 404 page. They have page titles and meta descriptions in Chinese or Japanese too. Some of the URLs are of real pages and link to the correct page, despite having the same Chinese page titles and descriptions in the SERPs. When I went to remove all the spammy URLs in Search Console (it only allowed me to temporarily hide them), a whole load of new ones popped up in the SERPs after a day or two. The site files themselves are all fine, with no errors in the server logs. All the usual stuff (robots.txt, sitemap, etc.) seems OK, and the proper pages have all been requested for indexing and are slowly appearing. The spammy ones continue, though. What is going on, and how can I fix it?
Technical SEO | Digital-Murph
-
Will it be possible to point different sitemaps to the same robots.txt file?
Will it be possible to point different sitemaps to the same robots.txt file? Please advise.
Technical SEO | nlogix
-
Is a sitemap required in my robots.txt?
Hi, I know that linking your sitemap from your robots.txt file is a good practice. OK, but... may I just send my sitemap to Search Console and forget about adding it to my robots.txt? That's my situation: one multilingual platform, which means two sets of pages, one for each language. But my CMS (Magento) only allows me to have one robots.txt file. So, again: may I have a robots.txt file with no sitemap AND not suffer any potential SEO loss? Thanks in advance, Juan Vicente Mañanas Abad
Technical SEO | Webicultors
-
Robots.txt and Magento
Hi, I am working on getting my robots.txt up and running and I'm having lots of problems with the file my developers generated: www.plasticplace.com/robots.txt. I ran it through a syntax-checking tool (http://www.sxw.org.uk/computing/robots/check.html), and this is what the tool came back with: http://www.dcs.ed.ac.uk/cgi/sxw/parserobots.pl?site=plasticplace.com. There seem to be many errors in the file. Additionally, I looked at our robots.txt in WMT, and it said the crawl was postponed because the robots.txt is inaccessible. What does that mean? A few questions: 1. Is there a need for all the lines that have the "#" before them? I don't think it's necessary, but correct me if I'm wrong. 2. Furthermore, why are we blocking so many things on our website? The robots can't get past anything that requires a password to access anyhow, but again, correct me if I'm wrong. 3. Is there a reason it can't just look like this: User-agent: * Disallow: /onepagecheckout/ Disallow: /checkout/cart/ I do understand that Magento has certain folders that you don't want crawled, but is all of this necessary, and why are there so many errors?
Technical SEO | EcomLkwd
-
High search volume and low CTR
I have a two-word domain with an exact-match keyword volume of over 1 million local monthly searches and a CTR of 0.2% at an average position of 6.6. Competition for this keyword is low. Shouldn't I be getting at least 5-10% at that position? Thanks, tb
Technical SEO | techbytes
-
Using Robots.txt
I want to block or prevent pages from being accessed or indexed by Googlebot. Please tell me whether Googlebot will NOT access any URL that begins with my domain name, followed by a question mark, followed by any string, if I use the robots.txt below. Sample URL: http://mydomain.com/?example
User-agent: Googlebot
Disallow: /?
Technical SEO | semer
-
Matching C Block
Hi guys, we have two sites that are in the same niche and competing for the same keywords. The sites are on separate domains; one is a UK domain and one is a .com. They have their own IPs; however, they both share the same C block. We have noticed that when the rankings for one site improve, the other's drop. Could the C block be causing this?
Technical SEO | EwanFisher
-
New Site Search Critique
Hi, I am a huge fan of the SEOmoz site and this great community, which has helped me learn the SEO skills I have now, though they are still very basic compared to the pros on the forum. I have tried to follow best practice for on-site and technical SEO when developing my new site, www.cheapfindergames.com, and I would really appreciate it if experts on the forum could spare a minute to critique the site from a search perspective. This will show which elements of on-site and technical SEO I have done well and which aspects still need work. I am currently trying to build quality links and social mentions to the site, which will take time, and the site has been designed around usability and conversions. Many thanks, Ian
Technical SEO | ocelot