Hi all,
I'm facing a strange issue with a website: www.foodmood.it. I've written this robots.txt (www.foodmood.it/robots.txt) using the same directives I've tested and used on other websites. The problem is that this website is not being indexed correctly, and my fear is that I've written the robots.txt the wrong way.
On this forum I've been told that the directive Disallow: /*? should be replaced with Disallow: /? in order to be sure crawlers don't index any URLs containing a question mark.
My question is: do you think this directive, modified as above, could be causing a problem for my website? Again, I've used it on several sites, and this is the first time I'm having this kind of problem.
Thanks in advance for your help.
A.
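For reference, Google-style robots.txt matching treats `*` as "any sequence of characters" and a plain rule as a prefix match from the root, so `Disallow: /*?` and `Disallow: /?` behave differently. A small illustrative matcher in Python (my own sketch, not Google's actual parser) makes the difference visible:

```python
import re

def robots_match(pattern: str, path: str) -> bool:
    """Check whether a robots.txt Disallow pattern matches a URL path,
    Google-style: '*' matches any character sequence, '$' anchors the
    end of the URL, and everything else is a literal prefix match."""
    regex = ""
    for ch in pattern:
        if ch == "*":
            regex += ".*"
        elif ch == "$":
            regex += "$"
        else:
            regex += re.escape(ch)
    return re.match(regex, path) is not None

# 'Disallow: /*?' blocks any URL containing a question mark anywhere:
print(robots_match("/*?", "/recipes?page=2"))   # True
# 'Disallow: /?' is a prefix match, so it only blocks paths that
# literally START with '/?':
print(robots_match("/?", "/recipes?page=2"))    # False
print(robots_match("/?", "/?sort=asc"))         # True
```

So swapping `/*?` for `/?` actually narrows the rule, which is worth keeping in mind when comparing the two. (Note also that robots.txt controls crawling rather than indexing as such.)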
Posts made by OptimizedGroup
-
Issue with a particular robots.txt directive! Help needed!
-
Dynamic Content Boxes: how to use them without getting a Duplicate Content penalty?
Hi everybody,
I am starting a project for a travel website that has some standard category pages like Last Minute, Offers, Destinations, Vacations, Fly + Hotel.
Each category contains many destinations, each with its own landing page, such as: Last Minute New York, Last Minute Paris, Offers New York, Offers Paris, etc.
My question is: to simplify my job, I'm thinking about writing some dynamic content boxes for Last Minute, Offers and the other categories, changing only the destination city (Rome, Paris, New York, etc.), repeated X times in X different combinations inside the content box.
This way I would greatly simplify the content writing for the main generic landing pages of each category, but I'm worried about being penalized for duplicate content.
Do you think my solution could work? If not, what is your suggestion? Is there a rule for classifying content as duplicate (for example, a certain number of identical words in a row)?
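As a rough illustration of how "sameness" between two generated boxes can be measured, here is a small Python sketch using overlapping n-word runs ("shingles") and their Jaccard similarity. The sample sentences are hypothetical, and this is just one common way of quantifying near-duplication, not any search engine's actual rule:

```python
def shingles(text: str, n: int = 5) -> set:
    # n-word "shingles": every overlapping run of n consecutive words
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

# Two hypothetical content boxes where only the destination changes:
box_a = "Book your Last Minute trip to Paris and save on hotels in Paris today"
box_b = "Book your Last Minute trip to Rome and save on hotels in Rome today"

a, b = shingles(box_a), shingles(box_b)
overlap = len(a & b) / len(a | b)  # Jaccard similarity of the 5-word runs
print(round(overlap, 2))  # ≈ 0.18
```

Changing even one word breaks every shingle that contains it, which is why swapping the city lowers the overlap more than you might expect; still, the longer the unchanged boilerplate runs are, the higher this kind of score climbs.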
Thanks in advance for your help!
A.
-
Help with robots.txt on Magento
Hi everybody,
I need your help to fix some HTML errors and crawling errors generated by Magento on my client's website www.casabiancheria.it.
I have some problems with duplicate meta information, due to the fact that there are a lot of links such as
-
/stampe-romagnole/tovaglie-con-tovaglioli/colore/beige,marrone,giallo,lilla/show/all.html
-
/stampe-romagnole/tovaglie-con-tovaglioli/colore/beige,marrone,lilla/show/all.html
that are generated by the /colore/ filter, so they carry duplicate content and duplicate meta information.
I activated canonicals on Magento, but this hasn't fixed the problem yet.
In the sitemap there is only one link per product, so it seems the canonicals are working, but both Google Webmaster Tools and SEOmoz are reporting duplicate content and duplicate meta information errors.
I would like to solve these problems by excluding from robots.txt all the URLs that contain filter parameters, such as /colore/, /price/, /dimensions/, etc. (take a look at the attachment).
I've tried different solutions to exclude these links via robots.txt, but I haven't been able to succeed.
Below you can find my current robots.txt. Can someone help me write the correct form of this file and finally exclude all these URLs generated by Magento's filters?
Finally, is it also worth excluding Magento's images? (Take a look at the final lines of the robots.txt below.)
Thank you very much for your help!
Alberto
User-agent: *
Disallow: /CVS
Disallow: /.svn$
Disallow: /.idea$
Disallow: /.sql$
Disallow: /.tgz$
Disallow: /w1nL1f3L0g1c/
Disallow: /app/
Disallow: /downloader/
Disallow: /errors/
Disallow: /includes/
Disallow: /lib/
Disallow: /pkginfo/
Disallow: /shell/
Disallow: /var/
Disallow: /404/
Disallow: /cgi-bin/
Disallow: /magento/
Disallow: /report/
Disallow: /scripts/
Disallow: /shell/
Disallow: /skin/
Disallow: /stats/
Disallow: /api.php
Disallow: /cron.php
Disallow: /cron.sh
Disallow: /error_log
Disallow: /get.php
Disallow: /install.php
Disallow: /LICENSE.html
Disallow: /LICENSE.txt
Disallow: /LICENSE_AFL.txt
Disallow: /README.txt
Disallow: /RELEASE_NOTES.txt
Disallow: /?dir
Disallow: /?dir=desc
Disallow: /?dir=asc
Disallow: /?limit=all
Disallow: /?mode*
Disallow: /index.php/
Disallow: /?SID=
Disallow: /checkout/
Disallow: /onestepcheckout/
Disallow: /customer/
Disallow: /customer/account/
Disallow: /customer/account/login/
Disallow: /catalogsearch/
Disallow: /catalog/product_compare/
Disallow: /catalog/category/view/
Disallow: /catalog/product/view/
Disallow: /cgi-bin/
Disallow: /cleanup.php
Disallow: /apc.php
Disallow: /memcache.php
Disallow: /phpinfo.php
Disallow: /control/
Disallow: /customize/
Disallow: /newsletter/
Disallow: /poll/
Disallow: /review/
Disallow: /sendfriend/
Disallow: /tag/
Disallow: /wishlist/
Disallow: /catalog/product/gallery/
Disallow: /?*
Disallow: //colore/
Disallow: //price/
Disallow: //misura/
Disallow: //marca/
Disallow: //sort-by/
Disallow: //combinazione/
Disallow: /*/seleziona-colore/
Disallow: /colore/
Disallow: /price/
Disallow: /misura/
Disallow: /marca/
Disallow: /sort-by/
Disallow: /combinazione/
Disallow: /seleziona-colore/
Disallow: /*colore/
Disallow: /*price/
Disallow: /*misura/
Disallow: /*marca/
Disallow: /*sort-by/
Disallow: /*combinazione/
Disallow: /*seleziona-colore/
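For what it's worth, under Google-style wildcard matching a rule without a leading `*` is a prefix match from the root of the path, which is why variants like `/colore/` and `//colore/` never catch a filter segment that sits in the middle of a URL. A small illustrative checker in Python (my own sketch, not Google's or Magento's parser; the sample URL mirrors the ones above) makes this visible:

```python
import re

def robots_match(pattern: str, path: str) -> bool:
    # Google-style: '*' = any character sequence, '$' = end anchor,
    # everything else is a literal prefix match from the start of the path.
    regex = "".join(".*" if c == "*" else ("$" if c == "$" else re.escape(c))
                    for c in pattern)
    return re.match(regex, path) is not None

url = "/stampe-romagnole/tovaglie-con-tovaglioli/colore/beige/show/all.html"

print(robots_match("/colore/", url))    # False: prefix match from the root only
print(robots_match("//colore/", url))   # False: literal double slash, never occurs
print(robots_match("/*colore/", url))   # True: '*' spans the leading segments
print(robots_match("/*/colore/", url))  # True
```

So of all the filter rules listed, only the `/*...` forms actually match these mid-path filter URLs; the others are dead weight.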
-
RE: Problem of possible duplicate title tag and description. Help me!
Thank you very much for your answer Andy!
-
Problem of possible duplicate title tag and description. Help me!
Hi everybody,
I'm optimizing a huge website that has a lot of identical categories for different locations.
I'm trying to find a smart way to write titles and descriptions for these categories, changing the location as a variable in the title and description phrasing. Here are some examples:
Title: Attractions in [CITY]. Sightseeing, monuments and museums in [CITY].
Description: Find travel ideas and suggestions for [CITY]. On [NAME OF THE WEBSITE] you can find a lot of attractions, monuments and sights off the beaten path in [CITY].
If I change only the name of the CITY in these titles and descriptions, am I running the risk of duplicate titles and descriptions?
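One way to see exactly what such a scheme produces is to fill the templates programmatically. The Python snippet below is only an illustration: the template text loosely mirrors the examples above, and the site name is a placeholder:

```python
title_tpl = "Attractions in {city}. Sightseeing, monuments and museums in {city}."
desc_tpl = ("Find travel ideas and suggestions for {city}. On {site} you can find "
            "a lot of attractions, monuments and sights off the beaten path in {city}.")

for city in ["Rome", "Paris", "New York"]:
    # Every page gets a technically unique title/description pair:
    # only the city token varies between them.
    print(title_tpl.format(city=city))
    print(desc_tpl.format(city=city, site="example-travel-site.com"))
```

Each generated tag is unique as a string, but the boilerplate-to-variable ratio is very high, which is exactly the duplication concern raised in the question.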
Thanks in advance for your help!
-
RE: Good rankings on Google but difficulty entering the Top 50 on Bing and Yahoo?
Thank you so much for your answers! I'll keep working on good links and good on page SEO.
-
Good rankings on Google but difficulty entering the Top 50 on Bing and Yahoo?
Hi guys, I'm facing a problem with some websites I'm trying to optimize for the search engines. I've worked on good on-page SEO and haven't had much difficulty getting good rankings on Google. The problem is that I'm not getting any results on Yahoo and Bing, even though I've submitted the sitemap to Bing Webmaster Tools and I'm monitoring possible crawl errors in both Google and Bing Webmaster Tools. Can someone please help me with an explanation?