Your search - site:domain.com - did not match any documents.
-
I've recently started work on a new client's website and done some preliminary work on on-page optimisation, and there is still plenty of work to be done and issues to resolve. They are ranking OK on Bing, but they are not getting any ranking on Google at all (except paid) - I tried the site:domain.com search and it comes up with no results... so this confirms that something is going on with the Google search ranking!
Can anyone shed light on what can cause this or why this would happen?
My next step is to look at their Webmaster Tools (I haven't had access yet), but if anyone has any tips on how to resolve this or where to look, that would be great!
Thanks!
-
Thanks again for your help! I will give those ideas a go.
I hope to get to the bottom of it, if for nothing else than to learn more!
Cheers.
-
Hey,
It depends on the penalty, if any.
If there are no manual actions listed in Webmaster Tools, that's a hint it isn't a manual penalty. However, it could still be an algorithmic penalty.
If the penalty (again, if any) applies to the whole site, then once you change the site's content and make sure the entire site (backlinks included) complies with Google's quality guidelines, the penalty should be revoked.
If the issue is simply that Google can't access the site, find out why and fix it ASAP, and you should be ranking again in no time (use Fetch as Googlebot first to confirm whether or not that is the problem).
To sum up, you should run an extensive analysis of links, content and server response errors, find the cause of the "penalty", then work on fixing it to start ranking. Once you do, you can continue with the other SEO/design tasks.
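For a quick, scriptable version of that accessibility check, here is a rough sketch (assuming Python with the requests library; example.com is a placeholder for the real domain). It only mimics Googlebot's user agent - some servers vary behaviour by IP, so Fetch as Googlebot in Webmaster Tools remains the authoritative test:

```python
# Rough sketch: see how the server responds to a Googlebot-style request.
# "https://www.example.com/" is a placeholder -- swap in the real domain.
import requests

GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def check_as_googlebot(url):
    # Follow redirects so we can see the final status and where we end up.
    resp = requests.get(url, headers={"User-Agent": GOOGLEBOT_UA},
                        timeout=15, allow_redirects=True)
    print("Requested:", url)
    print("Final URL:", resp.url)
    print("Status:   ", resp.status_code)
    # A long redirect chain or a 4xx/5xx here is a strong hint that
    # Googlebot may be struggling to reach the page as well.
    for hop in resp.history:
        print("Redirect hop:", hop.status_code, hop.url)

check_as_googlebot("https://www.example.com/")
```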
As I said before, opening a thread in Google's Webmaster Help forums could be of much help.
All the best!
-
Thanks again for all your helpful suggestions. Here's an update on this...
Access to GWT, analytics and some more Moz tracking has revealed some server connectivity and crawl errors on the site. So I'm thinking the bots are having trouble accessing the site and it's being penalised (or simply dropped) as a result... Bing is still fine, strangely!
At this stage there is a hold on resolving this, as we are also in the process of developing a new site for this client - so the plan now is to focus on getting the new site live, and hopefully the crawl errors etc. will be flushed out.
One last question - is a Google penalty linked to the domain or to the site/files? So if we launch a new site on the same domain, but on a new server (host) and with new files, do you think this will clear any penalties?
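In case it helps anyone checking the same thing outside GWT, here's a rough sketch (Python, placeholder sitemap URL) that pulls URLs from the XML sitemap and records any error responses, to get an independent picture of the server/crawl errors:

```python
# Rough sketch: pull URLs from the XML sitemap and record error responses.
# SITEMAP_URL is a placeholder -- adjust for the real site.
import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def urls_from_sitemap(sitemap_url):
    root = ET.fromstring(requests.get(sitemap_url, timeout=15).content)
    return [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]

def find_crawl_errors(urls):
    errors = []
    for url in urls:
        try:
            status = requests.head(url, timeout=10,
                                   allow_redirects=True).status_code
        except requests.RequestException as exc:
            errors.append((url, f"connection error: {exc}"))
            continue
        if status >= 400:
            errors.append((url, status))
    return errors

for url, problem in find_crawl_errors(urls_from_sitemap(SITEMAP_URL)):
    print(problem, url)
```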
Thanks again.
-
If none of the pages are indexed, then yes, I would assume a penalty. One of the more common reasons a site gets penalized is improper linking, either inbound or outbound in nature.
If you do not yet have access to webmaster tools, there are still steps you can take. This is something you are going to have to do anyway, once you figure out what the penalty was for.
First place to start: links.
There are a wide variety of backlink tools out there. Here are a few you can try:
http://raventools.com/marketing-tools/link-manager/
http://moz.com/researchtools/ose
https://ahrefs.com/
Start looking for the spammy or paid links. How can you tell? Simple. If a link comes from a domain like rankmehighingoogle.com or something like that, chances are it's a bad or paid backlink. That example is a silly domain name, but you will see some like that come up. If you are unsure of a link's quality, manually visit the site to see what it's all about. If the home page has a 0 or a ? for PageRank, chances are the linking site got hit with a penalty and you should disavow that linking domain.
Another way to test is to search for the linking domain in Google. If you search for a web directory or linking domain specifically by name and it is nowhere to be found, Google most likely hammered it for some practice it was using.
Since you don't have access to GWT yet, this would be a good way to see what is going on with the site. You stated that you just started doing the optimization for the client, so you most likely haven't had time to research the domain's history yet. Once you have access to GWT you will be taking a look at links anyway, so while you are waiting for access, be proactive.
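To speed up that review, here's a minimal sketch (Python; the column layout and keyword list are just illustrative assumptions, since link exports from OSE/Ahrefs vary) that flags suspicious-looking linking domains from a backlink CSV and writes them out in Google's disavow file format. Always review every flagged domain by hand before submitting anything:

```python
# Rough sketch: scan a backlink export (CSV with the linking URL in the
# first column -- adjust for your tool's export format) and flag domains
# whose names look like paid/spam link networks.
import csv
from urllib.parse import urlparse

SPAM_HINTS = ["rank", "seo-link", "paid-link", "directory", "pagerank"]

def suspicious_domains(csv_path):
    flagged = set()
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.reader(f):
            if not row:
                continue
            domain = urlparse(row[0]).netloc.lower()
            if any(hint in domain for hint in SPAM_HINTS):
                flagged.add(domain)
    return sorted(flagged)

def write_disavow(domains, out_path="disavow.txt"):
    # Google's disavow file is plain text: "#" lines are comments,
    # whole domains are listed as "domain:example.com".
    with open(out_path, "w", encoding="utf-8") as f:
        f.write("# Domains reviewed manually and judged spammy/paid\n")
        for d in domains:
            f.write(f"domain:{d}\n")

write_disavow(suspicious_domains("backlinks.csv"))
```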
-
Thank you Devanur. I will look into this.
-
Hi,
I 100% agree with FedeEinhorn.
roofrackworld.com.au seems to have been penalized somewhere between November and December 2013.
Go here: http://www.barracuda-digital.co.uk/panguin-tool/
Grant the tool access to your Google Analytics account. Select the date range from September 2013 to date, and look for any traffic drop that coincides with a Google algorithm update.
Please post your observations here so that we can take it from there.
Best regards,
Devanur Rafi
-
Glad someone else thinks it is weird!
Thank you for your help and suggestions... I will get access to webmaster tools and see what I can find.
-
Holy... this IS weird.
I checked the robots.txt and there's nothing blocking indexing, and the robots meta tags are present with INDEX.
You clearly need urgent access to Webmaster Tools. It seems like a penalty for pure spam or something like that, as not a single page is indexed even though other sites are linking to it.
What would I do? Before doing any further on-site SEO, get that resolved. Go to Webmaster Tools and check for any manual actions, messages, etc. Try Fetch as Googlebot. Then go to Google's Webmaster forums and ask; usually someone from Google jumps in.
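For anyone who wants to repeat the robots check programmatically, here's a rough sketch (Python, placeholder domain). The meta-tag check is a crude regex; a real audit would use an HTML parser and also inspect X-Robots-Tag headers, and note that robotparser can report "disallowed" if the server blocks the request outright:

```python
# Rough sketch: confirm robots.txt isn't blocking Googlebot and that the
# homepage doesn't carry a noindex meta tag. SITE is a placeholder.
import re
import requests
from urllib import robotparser

SITE = "https://www.example.com"

def googlebot_allowed(site):
    rp = robotparser.RobotFileParser()
    rp.set_url(site + "/robots.txt")
    rp.read()
    return rp.can_fetch("Googlebot", site + "/")

def meta_robots(site):
    html = requests.get(site + "/", timeout=15).text
    # Crude check for <meta name="robots" content="...">.
    match = re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)["\']',
        html, re.IGNORECASE)
    return match.group(1) if match else "(no robots meta tag)"

print("Googlebot allowed by robots.txt:", googlebot_allowed(SITE))
print("Meta robots on homepage:", meta_robots(SITE))
```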
-
I was looking for more general advice on this issue initially, to see if others had encountered this problem. But I'm happy to share the domain if it helps... with the disclaimer, as I mentioned above, that there is clearly much more work to be done to get a good rank - but this issue seems to be bigger than on-site optimisation...
Thanks
-
Care to share the real domain?