Website disappeared from Google organic keyword searches.
-
We have an auto repair company as a client (www.autorepairauroratilden.com) whose website had, for the better part of a year, ruled the first page of Google's organic search results. Their website, blog posts, Facebook, and Twitter all came up on page one for their target keyword searches. On May 13th, it all came to a screeching halt. The website is nowhere to be found for any of their keywords (example: "brake repair Aurora"). There are a couple of blog posts on page 2, but it's nothing like it was prior to May 13th.
On May 12th we published 5 branded websites for this client – Chrysler, Ford, Honda, Jeep, and Toyota, all on separate URLs. All the page titles, meta keywords, and descriptions were branded specifically to each individual website.
Since the beginning of June we've taken down the 5 branded websites and gone back through the keywords on the auto repair website. The site was last crawled on June 11th. We still do not have any page 1 placement, or for that matter any placement at all; I checked 10 pages deep.
We have a second auto repair client that has been running their website, as well as their own 5 branded websites, a couple of months longer than this client, and we've had no problems with any of their websites or their keyword search results.
How do we fix this?
-
Thanks Brett - we'll do that
-
Run a detailed Linkdetective scan and start contacting each blog.
-
Is there anything that can be done to flush those links out?
-
Those are the types of links Penguin is going after.
If this is the case, then I would just focus on building higher-quality links and on the local markets.
Ahrefs.com has some good insights: http://ahrefs.com/site-explorer/overview/subdomains/www.autorepairauroratilden.com
And I would use http://www.linkdetective.com/ to really analyze your links.
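To speed up that review, here is a rough Python sketch of the kind of triage you can run on a backlink export from Ahrefs or Link Detective. The CSV file name, column names, and thresholds are assumptions, so adjust them to match whatever your export actually contains.

import csv
from collections import Counter

# Anchors that exactly match money keywords are the pattern most commonly
# associated with Penguin; list the client's target phrases here (assumed examples).
EXACT_MATCH_ANCHORS = {"brake repair aurora", "auto repair aurora"}

anchor_counts = Counter()
links_to_review = []

# Assumed columns: "anchor", "referring_url", "domain_rating".
with open("backlinks_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        anchor = (row.get("anchor") or "").strip().lower()
        anchor_counts[anchor] += 1
        rating = float(row.get("domain_rating") or 0)
        # Exact-match commercial anchors from weak domains go on the outreach list.
        if anchor in EXACT_MATCH_ANCHORS and rating < 10:
            links_to_review.append(row.get("referring_url", ""))

print("Most common anchor texts:")
for anchor, count in anchor_counts.most_common(15):
    print(f"  {count:>4}  {anchor or '(empty)'}")

print(f"\n{len(links_to_review)} links to contact about removal:")
for url in links_to_review:
    print("  " + url)

A heavily skewed anchor-text distribution (almost everything pointing at two or three commercial phrases) is usually the first thing worth reviewing.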
-
Brent,
Thank you very much for your insight and time. I ran the Site Explorer link tool for this site and I'm seeing links where other bloggers are just re-posting blog articles we've written on behalf of our client and leaving the links intact, e.g. http://www.konuyeri.com/2011/12/Auto-Repair-in-Aurora-Colorado/. I'm not sure why they do it or what benefit these links provide them, but I can certainly see that they could be a negative factor when Penguin is out there looking for spammy links. Your thoughts?
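Scraped re-posts like that are usually easy to confirm programmatically. Below is a rough sketch (not a definitive test) that fetches each referring page and measures how much of one of your original articles it contains; the blog URL is a hypothetical placeholder and the 60% overlap threshold is an assumption.

import re
import urllib.request

def words(text):
    # Crude tokenizer: lowercase words of four letters or more.
    return set(re.findall(r"[a-z]{4,}", text.lower()))

def fetch(url):
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    with urllib.request.urlopen(req, timeout=15) as resp:
        return resp.read().decode("utf-8", errors="ignore")

# Hypothetical URL for one of the client's original blog posts.
original_words = words(fetch("http://www.autorepairauroratilden.com/blog/sample-post/"))

referring_pages = [
    "http://www.konuyeri.com/2011/12/Auto-Repair-in-Aurora-Colorado/",
    # ...the rest of the referring URLs from your Ahrefs/Link Detective export
]

for url in referring_pages:
    try:
        overlap = len(words(fetch(url)) & original_words) / max(len(original_words), 1)
    except Exception as exc:
        print(f"SKIP {url} ({exc})")
        continue
    verdict = "LIKELY SCRAPE" if overlap > 0.6 else "ok"
    print(f"{overlap:.0%}  {verdict}  {url}")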
-
Did you receive a message or a warning from Google in Google Webmaster Tools?
-
Looks like you have a lot of bad inbound links from irrelevant websites.
I would start by examining all of your inbound links, along with the PA/DA (Page Authority/Domain Authority) and topical relevance of the sites they come from.
-
There's no straight answer, but it looks like you were hit by Penguin.
Are there any problems in Google Webmaster Tools? How about Bing and Yahoo?
Related Questions
-
[Organization schema] Which Facebook page should be put in "sameAs" if our organization has separate Facebook pages for different countries?
We operate in several countries and have this kind of domain structure:
Technical SEO | Telsenome
example.com/us
example.com/gb
example.com/au
For our schemas we've planned to add an Organization schema on our top domain and let all pages point to it. This introduces a problem: we have a separate Facebook page for every country. Should we put one Facebook page in the "sameAs" array? Or all of our Facebook pages? Or should we skip it altogether?
Only one Facebook page:
{
"@type": "Organization",
"@id": "https://example.com/org/#organization",
"name": "Org name",
"url": "https://example.com/org/",
"sameAs": [
"https://www.linkedin.com/company/xxx",
"https://www.facebook.com/xxx_us"
]
}
All Facebook pages:
{
"@type": "Organization",
"@id": "https://example.com/org/#organization",
"name": "Org name",
"url": "https://example.com/org/",
"sameAs": [
"https://www.linkedin.com/company/xxx",
"https://www.facebook.com/xxx_us",
"https://www.facebook.com/xxx_gb",
"https://www.facebook.com/xxx_au"
]
}
Bonus question: this springs from the thought that we should only have one Organization schema. Or can we have multiple sub-organizations?
Should we reinstate the old website?
In a nutshell, we had a great site that performed well and grew month on month, but it perhaps looked a bit dated. A decision was taken to build a new site and the job was given to a PR agency, for some reason. All the titles, H1 tags, page content and URL structure were changed, and now the site has dropped 50% of its organic traffic. I've been tasked with trying to rebuild rankings, but so far it's not going well. A snapshot of the old website still exists and I'm very tempted to have it reinstated in the hope that our traffic will recover. What are your thoughts?
Technical SEO | etienneb
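Before reinstating the old site wholesale, it may be worth checking whether a clean set of 301s between the old and new URL structures would recover most of the lost equity. Here is a rough sketch of one way to pair old and new URLs by slug similarity and print Apache-style redirect lines; the input file names are assumptions, and any low-confidence match should be reviewed by hand.

from difflib import SequenceMatcher
from urllib.parse import urlparse

def slug_words(url):
    # Turn /some/old-page.html into "some old page.html" for fuzzy matching.
    return urlparse(url).path.strip("/").replace("-", " ").replace("/", " ")

# Assumed inputs: one URL per line, e.g. exported from the old and new sitemaps.
old_urls = [line.strip() for line in open("old_site_urls.txt") if line.strip()]
new_urls = [line.strip() for line in open("new_site_urls.txt") if line.strip()]

for old in old_urls:
    best = max(new_urls, key=lambda new: SequenceMatcher(None, slug_words(old), slug_words(new)).ratio())
    score = SequenceMatcher(None, slug_words(old), slug_words(best)).ratio()
    note = "" if score > 0.6 else "   # low confidence, check by hand"
    print(f"Redirect 301 {urlparse(old).path} {best}{note}")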
Some results disappear after a while
Hi everybody, I have a strange issue with my search results in Google. I've noticed that some of my results disappear after a while. There are no notifications or messages in Webmaster Tools; my results just get lost. For example, my URL (zomorodgasht.com) will hold a good position for a keyword for some days (page 2 or 3), but after a while it is gone for good; there are no results on any page any more. Can you help me work out what the problem with Google is? I'm very stressed about these results. PS 1: My Google Webmaster Tools account is totally clean and everything is fine, and Search Analytics still shows the missing keywords positioned on page 2 or 3, but I can't find them there, so Google seems to be showing the wrong analytics results; some of my results are totally gone. PS 2: All of my URLs are correctly submitted and showing in Google; there are no removed pages or results. It's just that some keywords are gone after a while. Best regards
Technical SEO | ZomorodGasht
Why do HTML entities get crawled as content keywords in Google Search Console?
My Google Search Console shows HTML terms such as div, class, img, src, gif and align as content keywords. Why does Google crawl HTML markup as keywords? Because of this, I could be losing traffic for my on-page content keywords. Please let me know how to solve this. Thanks, Jenifer
Technical SEO | Jenifer30
Google Search Console Site Map Anomalies (HTTP vs HTTPS)
Hi
I've just done my usual Monday morning review of clients' Google Search Console (previously Webmaster Tools) dashboards and was disturbed to see that for one client the Sitemaps section is reporting 95 pages submitted yet only 2 indexed (when I last looked, last week, it was reporting an expected level of indexed pages). It says the sitemap was submitted on the 10th of March and processed yesterday. However, the 'Index Status' section shows a graph of growing indexed pages up to and including yesterday, when they numbered 112 (so it looks like all pages are indexed after all). Also, the 'Crawl Stats' section shows 186 pages crawled on the 26th.
Then it lists sub-sitemaps, all of which are non-HTTPS (http), which seems very strange since the site is HTTPS and has been for a few months now, and the main sitemap index URL is HTTPS: https://www.domain.com/sitemap_index.xml
The sub-sitemaps are:
http://www.domain.com/marketing-sitemap.xml
http://www.domain.com/page-sitemap.xml
http://www.domain.com/post-sitemap.xml
There are no 'Sitemap Errors' reported, but there are 'Index Error' warnings for the above post-sitemap, copied below:
"When we tested a sample of the URLs from your Sitemap, we found that some of the URLs were unreachable. Please check your webserver for possible misconfiguration, as these errors may be caused by a server error (such as a 5xx error) or a network error between Googlebot and your server. All reachable URLs will still be submitted."
Also, for the sitemap URLs below, it reports: "Some URLs listed in this Sitemap have a high response time. This may indicate a problem with your server or with the content of the page."
http://domain.com/en/post-sitemap.xml
https://www.domain.com/page-sitemap.xml
https://www.domain.com/post-sitemap.xml
I take it from all of the above that the HTTPS sitemap is mainly fine and that, despite the 0 indexed pages reported in the GSC Sitemaps section, the pages are in fact indexed as per the main 'Index Status' graph, and that somehow some HTTP sitemap elements have been accidentally attached to the main HTTPS sitemap and are causing these problems.
What's the best way forward to clean up this mess? Resubmitting the HTTPS sitemap sounds like the right option, but seeing as the master URL indexed is an HTTPS URL I can't see it making any difference until the HTTP elements are deleted/removed. But how do you do that, or even check that that's what's needed? Or should Google just sort this out eventually?
I also see that the graph in 'Crawl > Sitemaps > Web Pages' shows a consistent blue line of submitted pages, but the red line of indexed pages drops to 0 for 3-5 days every 5 days or so. So fully indexed pages are reported for 5-day stretches, then zero for a few days, then indexed for another 5 days, and so on!?
Many thanks
Dan
Technical SEO | Dan-Lawrence
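One quick way to see exactly what the live sitemap index is handing to Google, and whether each sub-sitemap it lists is HTTPS and actually reachable, is a small script along these lines. This is only a sketch; the index URL mirrors the placeholder domain used in the question.

import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_INDEX = "https://www.domain.com/sitemap_index.xml"  # placeholder domain
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def fetch(url):
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    return urllib.request.urlopen(req, timeout=15)

root = ET.fromstring(fetch(SITEMAP_INDEX).read())
for loc in root.findall(".//sm:loc", NS):
    url = loc.text.strip()
    scheme = "https OK" if url.startswith("https://") else "NON-HTTPS, fix at the source"
    try:
        status = fetch(url).status
    except Exception as exc:
        status = f"error: {exc}"
    print(f"{url}\n    scheme: {scheme}   response: {status}")

If the index itself references http:// sub-sitemaps, the fix usually lives in the CMS or sitemap plugin's site URL settings rather than in anything you can change inside Search Console.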
How do we keep Google from treating us as if we are a recipe site rather than a product website?
We sell food products that, of course, can be used in recipes. As a convenience to our customers we have made a large database of recipes available. We have far more recipes than products. My concern is that Google may start viewing us as a recipe website rather than a food product website. My initial thought was to put the recipes on a subdomain (recipe.domain.com), but that seems silly given that you aren't really leaving our website and the layout of the website doesn't change with the subdomain. Currently our URL structure is:
domain.com/products/product-name.html
domain.com/recipes/recipe-name.html
We do rank well for our products in general searches, but I want to be sure that our recipe setup isn't detrimental.
Technical SEO | bearpaw
Dramatic Decrease in Google Organic Traffic Indicates a Penalty But None Found
So we've been having some difficulty with one of our websites since we split it in half and moved one section of content to a new domain with a new name at the end of May. http://www.dialtosave.co.uk/mobile/ was moved to http://www.somobile.co.uk. In the following 6 weeks, Google organic traffic has fallen to minuscule levels, which seems to indicate a more serious issue than just low rankings. Initially, when the site was moved, the 301s transferred the authority very quickly and the new website's pages ranked well. Now some of them simply won't rank at all unless you include the name of the website, "somobile". Here is one of the current rankings that indicates an issue:
Technical SEO | purpleindigo
"somobile" - 1
"somobile mobile phones" - not in top 50 These are some of the terms we used to rank in the top 10 on Google UK, and still do on Bing UK, but don't rank in the top 50 on Google UK now:
samsung galaxy ace
apple iphone 5 deals
samsung tocco icon
Our Webmaster Central account says that only 30% of the pages in our sitemap are in the index. It seems like a penalty has been imposed, but our reconsideration request (just submitted because it seemed like a sensible next step) came back saying there were no manual actions taken. Can you see what might be causing the problem for us? I would have thought it was the new domain (with fewer direct links and less brand credibility), or content issues, but I would have expected that to just reduce the rankings by a few pages rather than hide the pages altogether.
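Given that the 301s passed authority at first and rankings fell away afterwards, it may be worth re-verifying that every old /mobile/ URL still returns a single, direct 301 to its somobile.co.uk equivalent (no chains, no 302s, no redirects to the homepage). Here is a hedged sketch for spot-checking the first redirect hop; the sample path is a placeholder for the real list of old URLs.

import http.client
from urllib.parse import urlparse

OLD_URLS = [
    "http://www.dialtosave.co.uk/mobile/",
    # ...add the rest of the old mobile-section URLs here
]

def first_hop(url):
    """Return (status code, Location header) without following the redirect."""
    parts = urlparse(url)
    conn_cls = http.client.HTTPSConnection if parts.scheme == "https" else http.client.HTTPConnection
    conn = conn_cls(parts.netloc, timeout=15)
    conn.request("HEAD", parts.path or "/")
    resp = conn.getresponse()
    return resp.status, resp.getheader("Location")

for url in OLD_URLS:
    try:
        status, location = first_hop(url)
    except Exception as exc:
        print(f"{url} -> error: {exc}")
        continue
    ok = status == 301 and location and "somobile.co.uk" in location
    print(f"{'OK   ' if ok else 'CHECK'} {url} -> {status} {location or '(no Location header)'}")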