Problem with indexing
-
Hello, sorry, I'm French and my English is not necessarily correct.
I have a problem with indexing in Google.
Only the home page is indexed: http://bit.ly/yKP4nD.
I have been searching for the cause for several days, but I do not understand why.
I looked at:
-
The robots.txt file is ok
-
The sitemap, although it is generated in ASP, validates with Google
-
No spam, no hidden text
-
I made a reconsideration request via Google Webmaster Tools and the site has no penalties
-
We do not use noindex anywhere
So I'm stuck and I'd like your opinion.
Thank you very much
A.
-
-
Hello Rasmus,
I think it's OK now.
Indexing is better: http://bit.ly/yKP4nD
Thank you so much.
Take care
A.
-
Hi,
Very interesting, good idea!
I think you're right.
I will let you know.
Best regards
A.
-
Ah!
I've found it!
You have a canonical link on each page?
<link rel="canonical" href="http://www.syrahetcompagnie.com/Default.asp" />
This is not good, because the same tag appears on both http://www.syrahetcompagnie.com/vins-vallee-du-rhone-nord.htm AND http://www.syrahetcompagnie.com/PBHotNews.asp?PBMInit=1, which tells Google that every page is just a copy of the home page.
If you remove that tag (and keep it only on the start page), you should see a whole lot more indexing over the following days.
Best regards
Rasmus
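The misconfiguration Rasmus spotted can be checked programmatically: fetch each page and verify that its rel="canonical" points at the page itself, not at Default.asp. A minimal sketch using Python's standard-library HTMLParser (the page URL and HTML snippet below are illustrative examples, not the site's live markup):

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Records the href of the first <link rel="canonical"> tag seen."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        # Also fires for self-closing tags such as <link ... />
        if tag == "link" and self.canonical is None:
            attributes = dict(attrs)
            if attributes.get("rel") == "canonical":
                self.canonical = attributes.get("href")

def canonical_of(html):
    """Return the canonical URL declared in an HTML document, or None."""
    parser = CanonicalFinder()
    parser.feed(html)
    return parser.canonical

# Hypothetical snapshot of a subpage whose canonical wrongly points at the home page
page_url = "http://www.syrahetcompagnie.com/vins-vallee-du-rhone-nord.htm"
html = '<head><link rel="canonical" href="http://www.syrahetcompagnie.com/Default.asp" /></head>'

print(canonical_of(html))               # → http://www.syrahetcompagnie.com/Default.asp
print(canonical_of(html) == page_url)   # → False: the page claims to be a copy of Default.asp
```

On a correctly configured page, `canonical_of` would return the page's own URL (or None if no canonical tag is needed).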
-
You are correct. I've just found this page:
http://www.robotstxt.org/robotstxt.html
It says:
User-agent: *
Disallow:
Allows all robots access to all pages. So that was my mistake. I am truly sorry for the confusion.
I will have a look at it later to see if I can find a good explanation...
-
Hi Rasmus,
User-agent: *
Disallow:
means that all robots can crawl the entire site.
User-agent: *
Disallow: /
blocks all robots from entering.
User-agent: WebCrawler
Disallow: /
blocks the WebCrawler robot, but other robots can enter.
The first line of each robots.txt block tells which robots the rules apply to, and * means all of them. The following lines point to specific directories on the server, e.g. Disallow: /admin/
So I think this is not a robots.txt issue - please confirm.
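The three cases above can be verified with Python's standard-library robots.txt parser. A quick sketch (the test URL is hypothetical, used only to exercise the rules):

```python
from urllib.robotparser import RobotFileParser

def allowed(robots_txt, agent, url="http://example.com/page.htm"):
    """Return True if `agent` may fetch `url` under the given robots.txt text."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch(agent, url)

# An empty Disallow: every robot may crawl everything
print(allowed("User-agent: *\nDisallow:", "Googlebot"))              # → True

# Disallow: / blocks all robots from the whole site
print(allowed("User-agent: *\nDisallow: /", "Googlebot"))            # → False

# A group naming only WebCrawler blocks that robot; others may enter
print(allowed("User-agent: WebCrawler\nDisallow: /", "WebCrawler"))  # → False
print(allowed("User-agent: WebCrawler\nDisallow: /", "Googlebot"))   # → True
```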
-
Hi again,
Do you use Google Webmaster tools?
In Webmaster Tools you can see how many URLs on your site have been restricted by the robots.txt file. Perhaps that could give you a clue.
I would recommend that you take a look at Webmaster Tools. All in all, there is a lot of good information in there for optimizing your site.
Best regards
Rasmus
-
Thanks for your answer.
OK, I will edit the file, but I am not convinced that this is causing my problem, because it has been written that way all along.
Take care
-
Actually your robots.txt is NOT ok. It says:
Sitemap: http://www.syrahetcompagnie.com/Sitemap.asp?AccID=27018&LangID=0
User-agent: *
Disallow:
Which means that all pages are to be disallowed. You should have:
User-agent: *
Allow: /
If you change that, it should fix it!