Issue with site not being properly found in Google
-
We have a website [domain name removed] that is not being properly found in Google. When we run it through Screaming Frog, it indicates that there is a problem with the robots.txt file.
However, I am unsure exactly what this problem is, and why this site is no longer properly being found.
Any help here on how to resolve this would be appreciated!
-
Note: We've edited and removed select links and images in this thread as requested by the OP for privacy.
-
Hi Thomas,
Thanks for all your help here. You've been fantastic!
We have had an issue generating a sitemap for our website using our usual sitemap creation tools. Could you explain why this is?
-
Moderator's Note: Attached images, along with select links in this thread have been edited and/or removed for privacy at the request of the OP.
-
I noticed your robots.txt is fixed, but based on the screenshots below I would recommend two things to get your site back into the index faster: fetching your site as Googlebot, and adding your XML sitemap to Webmaster Tools.
Please do not forget to add all four versions of your website to Webmaster Tools if they have not been added already. By that I mean adding every variation of your URL to Google Webmaster Tools — http and https, each with and without www — and then setting the preferred (canonical) version. Choose the one with www.
Here is a reference from Google:
https://support.google.com/webmasters/answer/34592?hl=en&ref_topic=4564315
I would do two things. First, I would add your sitemap to your robots.txt file, because if you're going to use search tools it will help you. You should set up your robots.txt just like this:
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://www.website.com/sitemap_index.xml
You can reference:
https://yoast.com/ultimate-guide-robots-txt/
The Allow directive
While not in the original "specification", there was talk of an Allow directive very early on. Most search engines seem to understand it, and it allows for simple, very readable directives like this:
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
The only other way of achieving the same result without an Allow directive would have been to specifically disallow every single file in the wp-admin folder.
This matters because you don't want your login page showing up in Google.
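To see why the shorter Disallow doesn't override the longer Allow, here is a minimal sketch (not any official parser — the rule list and function are hypothetical) of the longest-match rule Google uses to resolve Allow/Disallow conflicts:

```python
def is_allowed(path, rules):
    """Decide whether a path may be crawled using Google's longest-match
    rule: the most specific (longest) matching prefix wins; with no
    matching rule at all, crawling is allowed by default."""
    verdict, longest = "allow", ""
    for kind, prefix in rules:
        if path.startswith(prefix) and len(prefix) > len(longest):
            verdict, longest = kind, prefix
    return verdict == "allow"

# The two directives from the robots.txt above.
rules = [
    ("disallow", "/wp-admin/"),
    ("allow", "/wp-admin/admin-ajax.php"),
]

print(is_allowed("/wp-admin/admin-ajax.php", rules))  # True: Allow is the longer match
print(is_allowed("/wp-admin/options.php", rules))     # False: only Disallow matches
print(is_allowed("/blog/post-1/", rules))             # True: no rule matches
```

So admin-ajax.php stays reachable for crawlers even though the rest of wp-admin is blocked.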
Second, I would go into Webmaster Tools / Search Console and fetch as Googlebot.
Ask Google to re-crawl your URLs
If you’ve recently made changes to a URL on your site, you can update your web page in Google Search with the _Submit to Index_ function of the Fetch as Google tool. This function allows you to ask Google to crawl and index your URL.
See
http://searchengineland.com/how-to-use-fetch-as-googlebot-like-seo-samurai-214292
https://support.google.com/webmasters/answer/6066468?hl=en
Ask Google to crawl and index your URL
- Click Submit to Index, shown next to the status of a recent, successful fetch in the Fetches Table.
- Select **Crawl only this URL** to submit one individual URL to Google for re-crawling. You can submit up to 500 individual URLs this way within a 30-day period.
- Select **Crawl this URL and its direct links** to submit the URL as well as all the other pages that URL links to for re-crawling. You can submit up to 10 requests of this kind within a 30-day period.
- Click Submit to let Google know that your request is ready to be processed.
Adding your XML sitemap (https://www.website.com/sitemap_index.xml) to Google Webmaster Tools will help Google determine that you are back online, and you should not see any real fallout from this. Submitting a complete XML sitemap also gets a lot of images into Google Images.
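If your usual sitemap tools are failing (as mentioned earlier in the thread), a minimal sitemap can also be built by hand. Here is a sketch using only Python's standard library — the URLs are placeholders, not your real pages:

```python
import xml.etree.ElementTree as ET

# Namespace required by the sitemaps.org protocol.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Build a minimal sitemap.xml document from a list of page URLs."""
    urlset = ET.Element("urlset", xmlns=NS)
    for page in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap([
    "https://www.website.com/",
    "https://www.website.com/about/",
])
print(sitemap)
```

Save the output as sitemap.xml at the site root, then submit that URL in Webmaster Tools as described above.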
I hope this helps,
Tom
-
Hi, it seems your robots.txt file is blocking Google and all other bots that crawl the web and obey robots.txt (basically, the good ones). So if you would like your site to be seen and indexed by Google and other search engines, you need to remove the forward slash "/" shown here in your robots.txt file:
Block all web crawlers from all content
User-agent: *
Disallow: /
You can see it here: https://www.website.com/robots.txt
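You can confirm what that slash does with Python's built-in robots.txt parser — a quick sketch, with a placeholder URL standing in for your site:

```python
from urllib import robotparser

def allowed(robots_txt, url, agent="Googlebot"):
    """Parse a robots.txt body and check whether a crawler may fetch a URL."""
    rp = robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch(agent, url)

# The current rule: "Disallow: /" blocks every compliant crawler from everything.
print(allowed("User-agent: *\nDisallow: /", "https://www.website.com/page/"))  # False

# With the slash removed, the Disallow rule is empty and everything is allowed.
print(allowed("User-agent: *\nDisallow:", "https://www.website.com/page/"))    # True
```

The one-character difference is the whole problem: a bare slash matches every path on the site.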
Please read https://moz.com/learn/seo/robotstxt
-
You can use this tool to generate the file: http://tools.seobook.com/robots-txt/generator/
It looks like you're using WordPress, so whether you edit the file directly on Apache or use a plugin like Yoast SEO, you can set it to this (I added your XML sitemap, https://www.website.com/sitemap_index.xml):
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://www.website.com/sitemap_index.xml
You can use tools like this to analyze and fix robots.txt, and you can always view any site's file by adding /robots.txt after the .com (or other TLD).
I hope that helps,
Tom