Problems indexing a website built with Magento
-
Hi all
My name is Riccardo and I work for a web marketing agency. Recently we have been having some problems getting this website, www.farmaermann.it, which is built on Magento, indexed.
In particular, in Google Webmaster Tools the sitemap appears fine (without any errors) and correctly uploaded. However, only 72 of 1,772 URLs have been indexed; we submitted the sitemap in Google Webmaster Tools 8 days ago. We also checked the structure of the robots.txt against several Magento guides and it looks well structured.
In addition, we noticed that some pages in Google search results show different titles that do not match the page titles defined in the Magento backend. To conclude, we cannot tell whether these indexing problems are related to the sitemap, the robots.txt, or something else.
Has anybody had the same kind of problems? Thank you all for your time and consideration.
Riccardo
-
Hi Dan!
Thank you very much for your help and suggestions. I will try to follow your guidelines also.
Riccardo
-
Thank you Linda!
We will try and we will see what happens.
Riccardo
-
However, you should allow Google to crawl your JavaScript and CSS, which are currently blocked. There's some background info on that in Google's webmaster guidelines.
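Magento 1's commonly circulated robots.txt templates disallow the /js/, /skin/ and /media/ directories that serve the theme's scripts, stylesheets and images. Assuming that is what is blocking Google here (the live file isn't quoted in this reply), a minimal sketch of the fix is to stop disallowing those paths, or to allow them explicitly within the existing group:
User-agent: *
# Hypothetical sketch: let crawlers fetch the directories a Magento 1 theme
# serves its JavaScript, CSS and images from, so Google can render the pages.
# Adjust these paths to whatever the live robots.txt actually disallows.
Allow: /js/
Allow: /skin/
Allow: /media/
For Googlebot, an Allow rule at least as specific as the matching Disallow takes precedence, though simply deleting the offending Disallow lines is the cleaner route.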
-
Hi Riccardo
Yes, to confirm: the site is indexed and crawlable. Checking how many URLs from a sitemap are indexed isn't the most reliable way to see whether your content is indexed. Doing a site: search on your domain in Google (for example, site:farmaermann.it) is probably one of the most reliable ways. Also, you can try just crawling the site with a tool like Screaming Frog SEO Spider - if the tool can crawl everything, there may just be a delay on Google's end. But in your case, all looks good now!
-Dan
-
Hi Riccardo,
Since I do not know which pages exist on your site, I cannot be 100% sure. You can remove this from your robots.txt, though, and see what happens (in Google Search Console & Bing Webmaster Tools):
Allow: /*?p=
Allow: /catalog/seo_sitemap/category/
Allow: /catalogsearch/result/
Good luck!
-
Hi Linda!
Unfortunately we didn't develop the website, but we have to work on its optimization. You are probably right about the robots.txt, because the sitemap looks OK. I will try to remove the crawl delay. On the other hand, which disallow rules should I remove, or which changes should I make in particular?
Thank you very much for your help!
Riccardo
-
Hi Josh!
Thank you very much for your help!
So there is probably a delay in the Webmaster Tools data. Unfortunately we didn't develop the site; we only work on its optimization, so we are a little bit confused by this data.
-
Hi Riccardo,
Your home page is indexed.
It is most likely that your problems are caused by the robots.txt -> http://www.farmaermann.it/robots.txt
1. You set a crawl delay of 10 seconds for all bots, which is quite long.
User-agent: *
Crawl-delay: 10
2. Some of your pages are not allowed to be crawled, like these in your menu: http://www.farmaermann.it/integratori.html and http://www.farmaermann.it/contraccettivi-e-gravidanza.html
Allow: /*?p=
Allow: /catalog/seo_sitemap/category/
Allow: /catalogsearch/result/
My advice is to modify your robots.txt: remove the crawl delay (and check whether your server can handle that) and make sure the pages in your menu can be crawled.
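For context, a 10-second crawl delay caps a crawler at six requests a minute, roughly 8,640 a day, so a 1,772-URL site takes a while to work through even before Google decides what to index. The exact Disallow rules in the live file aren't quoted above, so the following is only an illustrative sketch of the kind of revision suggested here, not a drop-in replacement:
User-agent: *
# Crawl-delay removed; only reinstate it if the server struggles under crawl load.
# Hypothetical page-level Allow rules to make the two blocked menu pages crawlable,
# kept alongside the Allow lines already present in the live file.
Allow: /integratori.html
Allow: /contraccettivi-e-gravidanza.html
Allow: /*?p=
Allow: /catalog/seo_sitemap/category/
Allow: /catalogsearch/result/
Because Google applies the most specific matching rule, a page-level Allow like the two added here overrides a broader Disallow pattern; it is worth verifying the result in Search Console's robots.txt Tester before deploying.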
Related Questions
-
Why do I have so many extra indexed pages?
Stats:
Webmaster Tools indexed pages: 96,995
site: search results: 97,800
Sitemap submitted: 18,832
Sitemap indexed: 9,746
I went through the search results up to page 28 and every item it showed was correct. How do I figure out where these extra 80,000 items are coming from? I tried crawling the site with Screaming Frog a while back but it locked up because of so many URLs. The site is a Magento site so there are a million URLs, but I checked and all of the canonicals are set up properly. Where should I start looking?
Intermediate & Advanced SEO | | Tylerj0 -
Check website update frequency?
Are there any tools out there that can check how frequently a website is updated with new content/products? I'm trying to do an SEO analysis comparing two websites. Thanks in advance, Richard
Intermediate & Advanced SEO | | seoman100 -
Product Pages not indexed by Google
We built a website for a jewelry company some years ago, and they've recently asked for a meeting; one of the points on the agenda will be why their product pages have not been indexed. Example: http://rocks.ie/details/Infinity-Ring/7170/ I've taken a look but I can't see anything obvious that is stopping pages like the above from being indexed. It has an 'index, follow all' tag along with a canonical tag. Am I missing something obvious here, or is there any clear reason why product pages are not being indexed at all by Google? Any advice would be greatly appreciated. Update: I was told 'that each of the product pages on the full site has a corresponding page on mobile. They refer to each other via canonical / alternate tags... could be an angle as to why product pages are not being indexed.'
Intermediate & Advanced SEO | | RobbieD910 -
Website not ranking
Firstly, apologies for the long-winded question. I'm 'newish' to SEO. We have a website built on Magento, www.excelclothing.com. We have been online for 5 years and had reasonable success. Having used a few SEO companies in the past, we found ourselves under a 'partial manual penalty' early last year. By July we were out of the penalty. We have been gradually working our way through getting rid of 'spammy' links. Currently the website ranks for a handful of non-competitive keywords, looking at the domain in SEMrush. This has dropped drastically over the last 2 years. Our organic traffic over the last 2-3 years has seen no 'falling off a cliff' and has maintained a similar pattern. I've been told so many lies by SEO companies trying to get into my wallet that I'm not sure who to believe. We have started to add content to all our category pages to make them more unique, although most of our meta descriptions are a 'boilerplate' template. I'm wondering: Am I still suffering from Penguin? Am I trapped by Panda, and if so how can I know that? Do I need more links removed? How can I start to rank for more keywords? I have a competitor online with the same DA, PA and virtually the same number of links, but they rank for 3,500 keywords in the top 20. Would welcome any feedback. Many thanks.
Intermediate & Advanced SEO | | wgilliland1 -
Google Indexing of Images
Our site is experiencing an issue with indexation of images. The site is real estate oriented. It has 238 listings with about 1,190 images. The site submits two versions (different sizes) of each image to Google, so there are about 2,400 images. Only several hundred are indexed. Can adding microdata improve the indexation of the images? Our sitemap is submitting images that are on no-index listing pages to Google. As a result more than 2,000 images have been submitted but only a few hundred have been indexed. How should the sitemap deal with images that reside on no-index pages? Do images that are part of pages that are set up as "no-index" need a special "no-index" label or special treatment? My concern is that so many images that are not indexed could be a red flag signalling poor-quality content to Google. Is it worth investing in correcting this issue, or will correcting it result in little to no improvement in SEO? Thanks, Alan
Intermediate & Advanced SEO | | Kingalan10 -
Effects of having both http and https on my website
You are able to view our website as either http or https on all pages. For example: you can type "http://mywebsite.com/index.html" and the site will remain on http: as you navigate the site. You can also type "https://mywebsite.com/index.html" and the site will remain on https: as you navigate the site. My question is: if you can view the entire site using either http or https, is this seen as duplicate content/pages? Does the same hold true for "www.mywebsite.com" and "mywebsite.com"? Thanks!
Intermediate & Advanced SEO | | rexjoec1 -
Remove content that is indexed?
Hi guys, I want to delete an entire folder whose content is indexed. How can I tell Google that the content no longer exists?
Intermediate & Advanced SEO | | Valarlf0 -
How does a competing website with clearly black-hat-style SEO tactics have a far higher domain authority than our website, which only uses legitimate link-building tactics?
Through the SEOmoz link analysis tools, we looked at a competing website's external followed links and discovered a large number of links going to blog pages with domain authorities in the 90s (their blog page authorities were between 40 and 60). However, the single blog post written by this website was exactly the same in every instance and had been posted in August 2011. Some of these blog sites had 160 or so links linking back to this competing website, whose domain authority is 49 while ours is 28; their MozTrust is 5.43 while ours is 5.18. Examples of the blogs that link to the competing website are: http://advocacy.mit.edu/coulter/blog/?p=13 and http://pest-control-termite-inspection.posterous.com/ However, many of these links are "nofollow" and yet still show up in Open Site Explorer as some of this competing website's top linking pages. Admittedly, they have 584 linking root domains while we have only 35, but if most of them are the kind of websites posted above, we don't understand how Google is rewarding them with a higher domain authority. Our website is www.anteater.com.au. Are these tactics now the only way to get ahead?
Intermediate & Advanced SEO | | Peter.Huxley590