Problem of indexing
-
Hello, sorry, I'm French and my English may not be perfect.
I have a problem with indexing in Google.
Only the home page is indexed: http://bit.ly/yKP4nD.
I have been looking into this for several days but I do not understand why.
I looked at:
-
The robots.txt file is ok
-
The sitemap, although it is in ASP, is valid with Google
-
No spam, no hidden text
-
I made a reconsideration request via Google Webmaster Tools and the site has no penalties
-
We do not have any noindex tags
So I'm stuck and I'd like your opinion.
Thank you very much
A.
-
-
Hello Rasmus,
I think it's OK now.
Indexing is better: http://bit.ly/yKP4nD
Thank you so much.
Take care
A.
-
Hi,
Very interesting, good idea!
I think you're right.
I will let you know.
Best regards
A.
-
Ah!
I've found it!
You have the same canonical link on every page:
<link rel="canonical" href="http://www.syrahetcompagnie.com/Default.asp" />
This is not good, as it appears on both http://www.syrahetcompagnie.com/vins-vallee-du-rhone-nord.htm AND http://www.syrahetcompagnie.com/PBHotNews.asp?PBMInit=1
If you remove it from every page (and keep it only on the start page), you should see a whole lot of indexing in the following days.
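To audit which canonical URL each page declares, a quick script can help. Here is a minimal sketch using only Python's standard library; the page source is hard-coded as an example, while in practice you would fetch each URL of the site:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collects the href of every <link rel="canonical"> tag in a page."""
    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonicals.append(attrs.get("href"))

# Example page source; a site-wide canonical like this tells Google
# that every page is just a duplicate of Default.asp.
html = '<head><link rel="canonical" href="http://www.syrahetcompagnie.com/Default.asp" /></head>'
finder = CanonicalFinder()
finder.feed(html)
print(finder.canonicals)
# → ['http://www.syrahetcompagnie.com/Default.asp']
```

If every page of the site prints the same home-page URL, only the home page will be kept in the index.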
Best regards
Rasmus
-
You are correct. I've just found this page:
http://www.robotstxt.org/robotstxt.html
It says:
User-agent: *
Disallow:
This allows all robots access to all pages. So that was my mistake. I am truly sorry for the confusion.
I will have a look at it later to see if I can find a good explanation...
-
Hi Rasmus,
User-agent: *
Disallow:
means that all robots can crawl the entire site.
User-agent: *
Disallow: /
blocks all robots from the entire site.
User-agent: WebCrawler
Disallow: /
blocks the WebCrawler robot, but other robots can enter.
The User-agent line of a robots.txt record tells which robots the rules apply to, and * means all of them. The Disallow lines that follow point to specific directories on the server, e.g. Disallow: /admin/
So I think this is not a robots.txt issue - please confirm.
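The difference between an empty Disallow and Disallow: / can be verified with Python's standard urllib.robotparser. A minimal sketch, with an example URL from the site in question:

```python
from urllib.robotparser import RobotFileParser

def allowed(robots_txt: str, agent: str, url: str) -> bool:
    """Parse a robots.txt body and ask whether `agent` may fetch `url`."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch(agent, url)

url = "http://www.syrahetcompagnie.com/vins-vallee-du-rhone-nord.htm"

# Empty Disallow: nothing is blocked, so everything may be crawled.
print(allowed("User-agent: *\nDisallow:", "Googlebot", url))    # → True

# Disallow: / blocks the whole site.
print(allowed("User-agent: *\nDisallow: /", "Googlebot", url))  # → False
```

Running a site's actual robots.txt through a parser like this removes any guesswork about what the directives mean.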
-
Hi again,
Do you use Google Webmaster tools?
In Webmaster Tools you can see how many URLs on your site have been restricted by the robots.txt file. Perhaps that could give you a clue.
I would recommend that you take a look at Webmaster Tools. All in all there is a lot of good information in there for optimizing your site.
Best regards
Rasmus
-
Thanks for your answer.
OK, I will edit the file, but I am not convinced this is causing my problem, since it has always been written that way.
Take care
-
Actually your robots.txt is NOT ok. It says:
Sitemap: http://www.syrahetcompagnie.com/Sitemap.asp?AccID=27018&LangID=0
User-agent: *
Disallow:
Which means that all pages are to be disallowed. You should have:
User-agent: *
Allow: /
If you change that, it should fix it!