Google Search Console Block
-
I'm new to SEO.
My client's site was completed using Yoast Premium, and then I used Google Search Console to initiate the crawl.
I initially set up an http:// property and all seemed good. Then I removed that property in Search Console, created an https:// property, and ran the render, and it appears Google has put a block in place with its own robots.txt file, which has basically rendered the site useless.
Feedback most appreciated.
-
What is interesting is that all the individual pages display correctly in the browser except the "home" page.
-
No problem, good luck! Moz has plenty of great resources to help you along the way. Be sure to check out the Beginner's Guide to SEO.
-
OK, looks like I have work to do, so I'll focus on these things now...
I was trying to create a rather flat layout since there are only a few pages; however, I do have a "services" page, so I will add internal links between the home page and the services page and incorporate that page into the process.
I believe it could be a wise investment at this stage to step back, get Yoast further involved, and do a "Gold Review" on the site... this should fill in the gaps and raise my SEO knowledge.
Really appreciate the feedback...
-
Responses to the first 3 questions:
- HTTPS is in place, but there is no redirect pushing HTTP to HTTPS (a quick way to check this is sketched just below these three points)
- OK good, keep all Search Console profiles intact; they're a good way to identify problems specifically as they relate to HTTP and HTTPS indexing (you don't want both to show)
- This search: site:albertaautosales.com. As you can see when you click that link, you've only got a few URLs indexed, 2 of them for the homepage, with and without HTTPS.
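If it helps, here's a rough way to verify that first point about the redirect. This is only a minimal sketch in Python using the requests library, with your live domain plugged in; a healthy setup answers with a 301 pointing at the https:// version:

```python
# Minimal sketch: check whether plain HTTP requests are 301-redirected to HTTPS.
# Assumes Python 3 with the requests library installed (pip install requests).
import requests

for url in ("http://albertaautosales.com/", "http://www.albertaautosales.com/"):
    resp = requests.get(url, allow_redirects=False, timeout=10)
    print(url, "->", resp.status_code, resp.headers.get("Location"))
    # 301 plus a Location header pointing at the https:// URL is what you want;
    # anything else means the HTTP-to-HTTPS redirect still isn't in place.
```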
Now that I have the domain, I see a few problems.
- You have no internal linking; Screaming Frog will not go beyond the homepage. Upon further inspection, the only internal link I saw on the homepage was to a dead URL.
- Google isn't creating a robots.txt file for you; there's just nothing for them to crawl, as a result of my previous point.
- I cannot view your source code, and if I can't see it, chances are Google can't either (see the sketch at the end of this reply).
If the currently live version of the site is a placeholder for development, I'd recommend putting the old site back up and working on the new site in a development environment.
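For what it's worth, here's a similar rough sketch (Python again, requests plus the standard library) that checks two of the points above: what your robots.txt actually returns, and how many internal links exist in the raw homepage HTML that a crawler receives:

```python
# Minimal sketch: fetch robots.txt and count internal links in the homepage source.
# Assumes Python 3 with the requests library installed (pip install requests).
import re
import requests

base = "https://albertaautosales.com"

robots = requests.get(base + "/robots.txt", timeout=10)
print("robots.txt status:", robots.status_code)
print(robots.text if robots.text.strip() else "(empty or missing)")

home = requests.get(base + "/", timeout=10)
# A crawler can only follow links present in this raw source, so navigation
# injected by JavaScript after the page loads will not be counted here.
hrefs = re.findall(r'href=["\'](.*?)["\']', home.text, flags=re.I)
internal = [h for h in hrefs if h.startswith("/") or base in h]
print(len(internal), "internal link(s) found in the homepage source")
```

If the robots.txt request comes back 404, that confirms Google hasn't placed one there, and if the internal link count is near zero, that's exactly the crawl dead end Screaming Frog is hitting.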
-
Hi Logan,
Thanks for the reply...
The site is https://albertaautosales.com
-
Yes, HTTPS has been set up correctly and is active with no issues on all pages.
-
Yes, I realize now that I could have left the HTTP profile. It actually had a complete status and was ranking my keyword phrases (I also set up a campaign in Moz). I did activate it again; however, it now shows blank pages even though the status is complete.
-
Not sure I follow your question 3. Prior to removing the HTTP profile and setting up the HTTPS one, the site was fine and the Google ranking process was occurring...
I have created a help ticket with Google under Search Console, but I have no idea how promptly they respond. The site is effectively down, just showing some images. From what I can see, Google blocked it by applying a very restrictive robots.txt file... but I'm not sure, as I am new to this.
Appreciate it.
-
Hi David,
I've got a few questions before I can provide any advice.
- Is the site using HTTPS everywhere?
- Why shut down the HTTP Search Console profile? You should always have all four versions of your domain set up in Search Console: http/https and www/non-www.
- Have you done a site:domain.com search in Google to verify indexation?