Google Search Console Block
-
I'm new to SEO.
My client's site was completed using Yoast Premium, and I then used Google Search Console to initiate the crawl.
I initially set up an http:// property and all seemed good. Then I removed that property in Search Console, created an https:// one, and ran the render. It now appears Google has put a block in place and applied its own robots.txt file, which has basically rendered the site useless.
Feedback most appreciated.
-
What is interesting is that all the individual pages display correctly in the browser except the "home" page.
-
No problem, good luck! Moz has plenty of great resources to help you along the way. Be sure to check out the Beginner's Guide to SEO.
-
OK, looks like I have work to do, so I will focus on these things now...
I was trying to create a rather flat layout since there are only a few pages; however, I do have a "Services" page, so I will put internal links between the home page and Services and incorporate that page into the process (a quick example of what I mean is below).
I believe it could be a wise investment for me at this stage to step back, get Yoast further involved, and do a "Gold Review" on the site... this should fill in the gaps and raise my SEO knowledge.
Really appreciate the feedback...
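If I understand correctly, the internal links just need to be plain anchor tags in the page HTML so crawlers can follow them, something like this (the path here is just a placeholder for wherever the Services page actually lives):

    <a href="/services/">Our Services</a>

rather than a button or script-driven navigation that crawlers can't see.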
-
Responses to the first 3 questions:
- HTTPS is in place, but a redirect is not in place to push HTTP to HTTPS (see the redirect sketch after this list)
- OK, good. Keep all Search Console profiles intact; they're a good way to identify problems specifically as they relate to HTTP and HTTPS indexing (you don't want both to show)
- This search: site:albertaautosales.com. As you can see when you click that link, you've only got a few URLs indexed, including 2 for the homepage, with and without HTTPS.
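To close that HTTP-to-HTTPS gap, something like this in the site's .htaccess should work, assuming the site runs on Apache with mod_rewrite enabled (the Nginx/IIS equivalents differ):

    # Send every HTTP request to the HTTPS version with a 301
    RewriteEngine On
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [L,R=301]

The 301 tells Google the HTTP URLs have permanently moved, which consolidates the duplicate homepage listings mentioned above.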
Now that I have the domain, I see a few problems.
- You have no internal linking; Screaming Frog will not go beyond the homepage. Upon further inspection, the only internal link I saw on the homepage was to a dead URL.
- Google isn't creating a robots.txt file for you; there's just nothing for them to crawl, as a result of my previous point (see the robots.txt notes after this list).
- I cannot view your source code, and if I can't see it, chances are Google can't either.
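On the robots.txt point: Google only reads that file, it never writes one for you; whatever is served comes from your host or a plugin like Yoast. You can check exactly what crawlers see by fetching it directly:

    curl -s https://albertaautosales.com/robots.txt

A site-wide block looks like this:

    User-agent: *
    Disallow: /

while an open file looks more like this (the Sitemap line is just an example; the actual sitemap URL depends on your Yoast configuration):

    User-agent: *
    Disallow:

    Sitemap: https://albertaautosales.com/sitemap_index.xml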
If this currently live version of the site is a placeholder for development, I'd recommend putting the old site back out there and working on the new site in a development environment.
-
Hi Logan,
Thanks for the reply...
The site is https://albertaautosales.com
-
Yes, HTTPS has been set up correctly and is active with no issues on all pages.
-
Yes, I realize now that I could have left the HTTP profile in place. It actually had a complete status and was ranking my keyword phrases (I also set up a campaign in Moz). I did activate it again; however, it now shows blank pages even though the status is complete.
-
I'm not sure I get your question 3. Prior to removing the HTTP profile and setting up the HTTPS one, the site was fine and the Google ranking process was occurring...
I have created a help ticket for Google under Search Console, but I have no idea how prompt they are at responding. The site is simply down, just showing some images. From what I can see, Google blocked it by applying a very restrictive robots.txt file... but I'm not sure, as I am new to this.
Appreciate it.
-
Hi David,
I've got a few questions before I can provide any advice.
- Is the site using HTTPS everywhere?
- Why shut down the HTTP Search Console profile? You should always have all four versions of your domain set up in Search Console: http/https and www/non-www (the four variations are listed after these questions).
- Have you done a site:domain.com search in Google to verify indexation?
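For a site like yours, that means four separate properties (sticking with the domain.com placeholder):

    http://domain.com
    http://www.domain.com
    https://domain.com
    https://www.domain.com

Each one is verified separately in Search Console, which lets you spot indexing that's happening on the wrong variation.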
Related Questions
-
What technical website changes can SEOs confidently ignore? Google's perspective!
Hi community members, I am looking after SEO at our company, and there are lots of changes happening to our website, especially technical changes. It's hard for me to review every deployment of the website, like a change of server location, etc. We generally agree that every change to the website must be flagged to SEO, so we can understand any ranking fluctuation and how search engines receive the change. I just wonder which technical deployments of a website I could confidently ignore, to save time and give the technical team a go-ahead without interrupting them or making them wait for my approval. Thanks
Web Design | vtmoz
-
Why Is Google Showing My Images Upside Down in the Index?
Hi, My client has PDFs of their catalog on the site, which Google is indexing. However, it seems that Google is taking an image from the catalog and showing it upside down in the image index/search results. The images are not upside down on the site. Has anyone heard of this happening before, or does anyone know a way to fix it? Thanks
Web Design | AliMac26
-
NO Meta description pulling through in SERP with react website - Requesting Indexing & Submitting to Google with no luck
Hi there, A year ago I launched a website using React, which has caused Google to not read my meta descriptions. I've submitted the sitemap, and there was no change in the SERP. Then I tried "Fetch and Render" and requested indexing for the homepage, which did work; however, I have over 300 pages and I can't do that for every one. I have requested a fetch, render, and index for "this url and linked pages," and while Google's cache has updated, the SERP listing has not. I looked in the Index Coverage report in the new GSC and it says the URLs are valid and indexable, and yet there's still no meta description. I realize that Google doesn't have to index all pages, and that Google may not always use your meta description, but I want to make sure I do my due diligence in making the website crawlable. My main questions are: 1) If Google didn't reindex ANYTHING when I submitted the sitemap, what might be wrong with my sitemap? 2) Is submitting each URL manually bad, and if so, why? 3) Am I simply jumping the gun, since it's only been a week since I requested indexing for the main URL and all the linked URLs? Any other suggestions?
Web Design | DigitalMarketingSEO
-
Fetch data for users with ajax but show it without ajax for Google
Hi, We have a thematic footer which shows links to similar pages relevant to the search criteria on a page. We want to fetch those footer links through Ajax when users search on the site, but serve the links without Ajax when Google fetches those pages. We want to do this to improve our page load time. The link content and count will be exactly the same whether Google or a user fetches the page. Will this be treated as negative by Google? Can it have any negative effect on our rankings or traffic? Regards,
Web Design | vivekrathore
-
Fixing Render Blocking Javascript and CSS in the Above-the-fold content
We don't have a responsive design site yet, and our mobile site is built through Dudamobile. I know it's not the best, but I'm trying to do whatever we can until we get around to redesigning it. Is there anything I can do about the following PageSpeed Insights errors, or are they just a function of using Dudamobile?

Eliminate render-blocking JavaScript and CSS in above-the-fold content: Your page has 3 blocking script resources and 5 blocking CSS resources. This causes a delay in rendering your page. None of the above-the-fold content on your page could be rendered without waiting for the following resources to load. Try to defer or asynchronously load blocking resources, or inline the critical portions of those resources directly in the HTML.

Remove render-blocking JavaScript:
http://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js
http://mobile.dudamobile.com/…ckage.min.js?version=2015-04-02T13:36:04
http://mobile.dudamobile.com/…pts/blogs.js?version=2015-04-02T13:36:04

Optimize CSS delivery of the following:
http://fonts.googleapis.com/…:400|Great+Vibes|Signika:400,300,600,700
http://mobile.dudamobile.com/…ont-pack.css?version=2015-04-02T13:36:04
http://mobile.dudamobile.com/…kage.min.css?version=2015-04-02T13:36:04
http://irp-cdn.multiscreensite.com/kempruge/files/kempruge_0.min.css?v=6
http://irp-cdn.multiscreensite.com/…mpruge/files/kempruge_home_0.min.css?v=6

Thanks for any tips, Ruben
Web Design | KempRugeLawGroup
-
Given the latest Google update, should I rewrite my Flash site or try to present an alternative HTML/CSS site?
I have a site that was created using Flash. The reasoning at the time was that I didn't care if the site ranked or not (it's a portfolio site). Now I would like to drive traffic to the site from search engines. Given the Penguin update, should I rewrite my Flash site in HTML/CSS, or present an alternative site to bots and browsers that don't support Flash? My concern is that by presenting an alternative site to bots and non-Flash-supporting browsers, the search engines will potentially see this as cloaking. Thoughts and advice would be much appreciated.
Web Design | mj775
-
How long does Google take to re-cache a site?
Specifically, I just redesigned my site. I'm reading Danny Dover's book and learned about checking the cached version of the site to see what Google is REALLY seeing... which evidently is my old site. Obviously, my site isn't going to make any real progress with SEO as long as the cached version is out of date. It says it last checked the site on 5/5, and I launched the new site on 5/9. Obviously, Google doesn't do these things immediately, but does anyone have any idea how long it should take before Google starts to show me some love?
Web Design | damon1212
-
The primary search keywords for our news release network have dropped like a rock in Google... we are not sure why.
Hi,

On April 11th, a month after the Farmer update was released for U.S. users of Google, the primary keywords for ALL our sites dropped significantly in Google. I have some ideas why, but I wanted to get some second opinions as well.

First off, I researched whether Google did anything on April 11th... they did. They rolled out the Farmer update internationally, but that does not explain why our ranks did not drop in March for U.S. Google users... unless they rolled out the update based on the country where the domain is registered, which in our case is Canada.

The primary news release site is www.hotelnewsresource.com, but we have many running on the same server, e.g. www.restaurantnewsresource.com, www.travelindustrywire.com, and many more. We were number 1, or had top ranks, for terms like "Hotel News", "Hotel Industry", "Hotel Financing", "Hotel Jobs", "Hotels for Sale", etc., and now, for most of these, we have dropped in a big way. It seems that Google has issued a penalty for every internal page we link to.

There are a couple of obvious issues with the current template we use: too many links, which we intend to change ASAP, though it has never been a problem before. The domain hotelnewsresource.com is 10 years old and still holds a PageRank of 6.

Secondly, the way our news system works, it's possible to access an article from any domain in the network. E.g., I can read an article that was assigned to www.hotelnewsresource.com on www.restaurantnewsresource.com... we don't post links to the irrelevant domain, but it does sometimes get indexed. So, we are going to implement the Google source meta tag option.

The bottom line is that I think we put too much faith in the maturity of the domain... thinking that might protect us... not the case, and it's now a big mess. Any insight you can offer would be greatly appreciated. Do you think it was Farmer, or possibly something else?

Thanks, Jarrett
Web Design | jarrett.mackay