Problem with indexing
-
Hello, sorry, I'm French and my English is not necessarily correct.
I have a problem with indexing in Google.
Only the home page is indexed: http://bit.ly/yKP4nD.
I have been looking into this for several days but I do not understand why.
I have checked the following:
- The robots.txt file is OK
- The sitemap, although it is generated in ASP, validates with Google
- No spam, no hidden text
- I submitted a reconsideration request via Google Webmaster Tools and the site has no penalties
- We do not have any noindex tags
So I'm stuck and I'd like your opinion.
Thank you very much.
A.
-
Hello Rasmus,
I think it's OK now.
Indexing is better: http://bit.ly/yKP4nD
Thank you so much.
Take care
A.
-
Hi,
Very interesting, good idea!
I think you're right.
I will let you know.
Best regards
A.
-
Ah!
I've found it!
You have a canonical link on each page:
<link rel="canonical" href="http://www.syrahetcompagnie.com/Default.asp" />
This is not good, because the same tag appears on http://www.syrahetcompagnie.com/vins-vallee-du-rhone-nord.htm AND on http://www.syrahetcompagnie.com/PBHotNews.asp?PBMInit=1
If you remove that tag from every page except the start page, you should see a lot more pages indexed over the following days.
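Alternatively, each page could carry a self-referencing canonical instead - a sketch for the Rhône page, assuming that page should be indexed in its own right:
<link rel="canonical" href="http://www.syrahetcompagnie.com/vins-vallee-du-rhone-nord.htm" />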
Best regards
Rasmus
-
You are correct. I've just found this page:
http://www.robotstxt.org/robotstxt.html
It says:
User-agent: *
Disallow:
Allows all robots to all pages. So that was my mistake; I am truly sorry for the confusion.
I will have a look at it later to see if I can find a good explanation...
-
Hi Rasmus,
User-agent: *
Disallow:
means that all robots can crawl the site.
User-agent: *
Disallow: /
blocks all robots from the site.
User-agent: WebCrawler
Disallow: /
blocks the WebCrawler robot, but all others can still enter.
The first line of a robots.txt record tells which robots it applies to, and * means all of them. The second and following lines point to specific directories on the server, e.g. Disallow: /admin/ - see the example below.
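For instance, a file that lets every robot crawl the site but keeps them all out of one directory would look like this (the /admin/ path is just an illustration):
User-agent: *
Disallow: /admin/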
So I think this is not a robots.txt issue - please confirm.
-
Hi again,
Do you use Google Webmaster Tools?
In Webmaster Tools you can see how many URLs on your site have been restricted by the robots.txt file. Perhaps that could give you a clue.
I would recommend that you take a look at Webmaster Tools; all in all there is a lot of good information in there for optimizing your site.
Best regards
Rasmus
-
Thanks for your answer.
OK, I will edit the file, but I am not convinced this is what is causing my problem, because it was already written that way.
Take care
-
Actually your robots.txt is NOT ok. It says:
Sitemap: http://www.syrahetcompagnie.com/Sitemap.asp?AccID=27018&LangID=0
User-agent: *
Disallow:
which means that all pages are to be disallowed. You should have:
User-agent: *
Allow: /
If you change that, it should fix it!
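Put together, the whole file would then read like this (a sketch that keeps your existing Sitemap line):
Sitemap: http://www.syrahetcompagnie.com/Sitemap.asp?AccID=27018&LangID=0
User-agent: *
Allow: /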