Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Problems in indexing a website built with Magento
-
Hi all
My name is Riccardo and I work for a web marketing agency. Recently we have been having some problems indexing the website www.farmaermann.it, which is based on Magento.
According to Google Webmaster Tools, the website sitemap is OK (without any errors) and correctly uploaded. However, only 72 of 1,772 URLs have been indexed; we submitted the sitemap in Google Webmaster Tools 8 days ago. We also checked the structure of the robots.txt against several Magento guides, and it looks well structured.
In addition, we noticed that some pages in Google search results show different titles that do not match the page titles defined in the Magento backend. In short, we cannot tell whether these indexing problems are related to the website sitemap, the robots.txt or something else.
Has anybody had the same kind of problems? Thank you all for your time and consideration.
Riccardo
-
Hi Dan!
Thank you very much for your help and suggestions. I will try to follow your guidelines as well.
Riccardo
-
Thank you Linda!
We will try it and see what happens.
Riccardo
-
However, you should allow Google to crawl your JavaScript and CSS (which is now blocked). Here's some background info on that:
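For illustration, unblocking those assets usually means adding Allow rules to the existing User-agent: * group of the robots.txt. A minimal sketch, assuming the default Magento 1 directory layout (/js/, /skin/, /media/) - check which rules in the live file are actually doing the blocking:
Allow: /js/
Allow: /skin/
Allow: /media/
For rules of equal length, Google lets Allow win over Disallow, so these override a blanket Disallow on the same directories.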
-
Hi Riccardo
Yes, to confirm: the site is indexed and crawlable. Checking the number of URLs from a sitemap that are indexed isn't the most reliable way to see if your content is indexed. Running a site: search on your domain in Google (for example, site:farmaermann.it) is probably one of the most reliable ways. You can also try just crawling the site with a tool like Screaming Frog SEO Spider - if the tool can crawl everything, there may simply be a delay on Google's end. But in your case, all looks good now!
-Dan
-
Hi Riccardo,
Since I do not know which pages exist on your site, I cannot be 100% sure. You can remove this from your robots.txt, though, and see what happens (in Google Search Console & Bing Webmaster Tools):
Allow: /*?p=
Allow: /catalog/seo_sitemap/category/
Allow: /catalogsearch/result/
Good luck!
-
Hi Linda!
Unfortunately we didn't develop the website, but we have to work on its optimization. You are probably right about the robots.txt, because the sitemap looks OK. I will try removing the crawl delay. Beyond that, which disallow rules should I remove, or which changes should I make in particular?
Thank you very much for your help!
Riccardo
-
Hi Josh!
Thank you very much for your help!
So there is probably a delay in the Webmaster Tools data. Unfortunately we didn't develop the site - we only work on its optimization - so we are a little confused by these data.
-
Hi Riccardo,
Your home page is indexed.
It is most likely that your problems are caused by your robots.txt: http://www.farmaermann.it/robots.txt
1. You set a crawl delay of 10 seconds for all bots, which is quite long:
User-agent: *
Crawl-delay: 10
2. Some of your pages are not allowed to be crawled, such as these two in your menu: http://www.farmaermann.it/integratori.html and http://www.farmaermann.it/contraccettivi-e-gravidanza.html
Allow: /*?p=
Allow: /catalog/seo_sitemap/category/
Allow: /catalogsearch/result/
My advice is to modify your robots.txt: remove the crawl delay (and check whether your server can handle that) and make sure the pages in your menu can be crawled.
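For illustration only, here is a minimal sketch of what a cleaned-up robots.txt could look like for a typical Magento 1 store (the disallowed paths are common examples, not the site's actual rules - verify everything with the robots.txt tester in Google Webmaster Tools before deploying):
User-agent: *
Disallow: /checkout/
Disallow: /customer/
Disallow: /catalogsearch/
Allow: /catalogsearch/result/
There is deliberately no Crawl-delay line, and no pattern matches menu pages such as /integratori.html, so they remain crawlable.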
Related Questions
-
Google indexed "Lorem Ipsum" content on an unfinished website
Hi guys. So I recently created a new WordPress site and started developing the homepage. I completely forgot to disallow robots to prevent Google from indexing it and the homepage of my site got quickly indexed with all the Lorem ipsum and some plagiarized content from sites of my competitors. What do I do now? I’m afraid that this might spoil my SEO strategy and devalue my site in the eyes of Google from the very beginning. Should I ask Google to remove the homepage using the removal tool in Google Webmaster Tools and ask it to recrawl the page after adding the unique content? Thank you so much for your replies.
Intermediate & Advanced SEO | | Ibis150 -
Check website update frequency?
Are there any tools out there that can check how frequently a website is updated with new content/products? I'm trying to do an SEO comparison between two websites. Thanks in advance, Richard
Intermediate & Advanced SEO | | seoman100 -
How to fully index big ecommerce websites (that have deep catalog hierarchy)?
When building very large ecommerce sites, the catalog data can have millions of product SKUs and a massive number of hierarchical navigation layers (say 7-10) to get to those SKUs. On such sites, it can be difficult to get everything indexed. The issue doesn't appear to be with product page content. The concern is around the 'intermediate' pages -- the many navigation layers between the home page and the product pages that are necessary for a user to funnel down and find the desired product. There are a lot of these intermediate pages and they commonly contain just a few menu links and thin/no content. (It's tough to put fresh, unique, quality content on all the intermediate pages that serve the purpose of helping the user navigate a big catalog.) We've played with NOINDEX, FOLLOW on these pages. But structurally it seems like a site with a lot of intermediate pages containing thin content can result in issues such as shallow site indexing, weak PageRank, crawl budget problems, etc. Any creative suggestions on how to tackle this?
Intermediate & Advanced SEO | | AltosDigital-10 -
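For reference, the NOINDEX, FOLLOW approach mentioned in the question above is normally implemented with a robots meta tag in the head of each intermediate page, along these lines:
<meta name="robots" content="noindex, follow">
The page itself is kept out of the index while crawlers can still follow its menu links down to the product pages; whether that alone resolves the crawl-budget and PageRank concerns is a separate question.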
Best way to remove full demo (staging server) website from Google index
I've recently taken over an in-house role at a property auction company; they have a main site on the top-level domain (TLD) and 400+ agency subdomains! company.com, agency1.company.com, agency2.company.com... I recently found that the web development team keep a demo domain per site, found on a subdomain of the original domain and mirroring the site. The problem is that they have all been found and indexed by Google: demo.company.com, demo.agency1.company.com, demo.agency2.company.com... Obviously this is a problem (duplicate content and so on), so my question is: what is the best way to remove the demo domain / subdomains from Google's index? We are taking action to add a noindex tag into the header (of all pages) on the individual domains, but this isn't going to get them removed any time soon! Or is it? I was also going to add a robots.txt file into the root of each domain, just as a precaution, and disallow all within it. The final course of action (which I'm holding off on in the hope someone comes up with a better solution) is to add each demo domain / subdomain into Google Webmaster Tools and remove the URLs individually. Or would it be better to go down the canonical route?
Intermediate & Advanced SEO | | iam-sold0 -
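As a side note, one way to noindex an entire demo subdomain without touching every template is an X-Robots-Tag HTTP header set at the server level; a sketch assuming the demo sites run on Apache with mod_headers enabled:
Header set X-Robots-Tag "noindex, nofollow"
One caveat often raised about the robots.txt idea above: if crawling is disallowed, Google may never re-fetch the pages and so never sees the noindex, which can leave the URLs in the index longer.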
Infinite Scrolling: how to index all pictures
I have a page where I want to upload 20 pictures that are in a slideshow. The idea is that pictures will only load when users scroll down the page (otherwise the page loads too heavily). I see documentation on how to make this work and ensure search engines index all content. However, I do not see any documentation on how to make this work for 20 pictures in a slideshow. It seems impossible to get search engines to index all such pictures when they only show as users scroll down a page. This is the documentation I am already familiar with, and which does not address my issue:
http://googlewebmastercentral.blogspot.com/2014/02/infinite-scroll-search-friendly.html
http://www.appelsiini.net/projects/lazyload
http://luis-almeida.github.io/unveil/
thank you
Intermediate & Advanced SEO | | khi5
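One commonly suggested pattern with the lazyload plugin linked above is a noscript fallback, so crawlers that don't trigger the scroll-based JavaScript still find a plain image tag; a sketch with placeholder image paths:
<img class="lazy" data-original="/images/slide-01.jpg" width="600" height="400" alt="Slide 1">
<noscript><img src="/images/slide-01.jpg" width="600" height="400" alt="Slide 1"></noscript>
Repeated for each of the 20 slides, this gives every picture a crawlable image URL even though the slideshow only reveals them on scroll.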
Credit Links on Client Websites
I know several people have asked this before, but a lot of those questions were from back in 2012, before many of the Google changes. My question is the same, though: with all the changes to Google's algorithm, is it okay to put your link at the bottom of your clients' websites, like "Web Design by...", etc.? Part of the reason is to drive traffic, but also, if someone is actually interested in who designed the website, they will click it. But now, reading about how bad links can hurt you tremendously, it makes me second-guess whether this is okay. My gut feeling says no.
Intermediate & Advanced SEO | | blackrino0 -
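For context, the compromise most often suggested in these discussions is to keep the credit link but mark it rel="nofollow" so it passes no PageRank; a sketch with a placeholder agency URL:
<a href="http://www.example-agency.com/" rel="nofollow">Web Design by Example Agency</a>
The link still drives referral traffic and satisfies curious visitors, but it no longer reads to Google as a sitewide followed footer link.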
Different domains for multilingual website
Hey guys, A site that I'm currently working on has a different domain for each website language. So for example: word1word2.com for the English version, word3word4.com for the French version, word5word6.com for the Spanish version... Is it better to move all of the different languages to the same domain and use subfolders for each language (/fr/...)? Please note that the domains being used bring in organic traffic, and they are EMDs. Thank You.
Intermediate & Advanced SEO | | BruLee0 -
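Whichever structure is chosen, the standard way to tell Google that separate URLs are language versions of the same content is hreflang annotations in the head of each page; a sketch using the placeholder domains from the question:
<link rel="alternate" hreflang="en" href="http://word1word2.com/" />
<link rel="alternate" hreflang="fr" href="http://word3word4.com/" />
<link rel="alternate" hreflang="es" href="http://word5word6.com/" />
Each language version should carry the full set of annotations, including a self-referencing entry.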
How important is the number of indexed pages?
I'm considering making a change to using AJAX filtered navigation on my e-commerce site. If I do this, the user experience will be significantly improved, but the number of pages that Google finds on my site will go down significantly (in the tens of thousands). It feels to me like our filtered navigation has grown out of control and we spend too much time worrying about its URL structure - in some ways it's paralyzing us. I'd like to be able to focus on the pages that matter (explicit Category and Sub-Category pages) and then just let AJAX take care of filtering products below those levels. For customer usability this is smart. From the perspective of manageable code and long-term design this also seems very smart - we can't continue to worry so much about filtered navigation. My concern is that losing so many indexed pages will have a large negative effect (however, we will reduce duplicate content and be able to provide much better category and sub-category pages). We probably should have thought about this a year ago before Google indexed everything :-). Does anybody have any experience with this or insight on what to do? Thanks, -Jason
Intermediate & Advanced SEO | | cre80
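A companion technique often paired with this kind of change is a canonical tag on any filtered URL that remains reachable, pointing back to its parent category so stray filter pages consolidate rather than compete; a sketch with a placeholder URL:
<link rel="canonical" href="http://www.example.com/category/sub-category/" />
Seen that way, moving filtering into AJAX mostly removes thin, near-duplicate URLs from the index rather than pages that were earning rankings on their own.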