Is there a way to check if your site has a Google penalty?
-
Is there a way to find out if your site has an over-optimization penalty?
-
Start by checking the number of indexed URLs. This can be done with the **site:yourdomainname.com** command in a Google search window. If no URLs are indexed, there is a high probability of a penalty, especially if your site used to be indexed.
Also search for your exact company name and domain name. If you no longer rank for these terms where you previously ranked well for your own brand name, a penalty is likely.
Lastly, check Google Webmaster Tools for any manual action messages. Hope this helps.
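Beyond spot-checking the `site:` operator by hand, you can compare your sitemap against an export of your indexed pages. Here's a minimal sketch, assuming you have your XML sitemap saved locally and have exported your indexed URLs from Webmaster Tools as a one-column CSV (file names and export format are assumptions, not part of any official API):

```python
import csv
import xml.etree.ElementTree as ET

def sitemap_urls(sitemap_path):
    """Extract <loc> URLs from a standard XML sitemap file."""
    tree = ET.parse(sitemap_path)
    tag = "{http://www.sitemaps.org/schemas/sitemap/0.9}loc"
    return {loc.text.strip() for loc in tree.iter(tag)}

def indexed_urls(csv_path):
    """Read URLs from a one-column CSV export of indexed pages."""
    with open(csv_path, newline="") as f:
        return {row[0].strip() for row in csv.reader(f) if row}

def coverage_report(sitemap_path, csv_path):
    """Compare what you expect to be indexed against what actually is."""
    expected = sitemap_urls(sitemap_path)
    indexed = indexed_urls(csv_path)
    return {
        "expected": len(expected),
        "indexed": len(expected & indexed),
        "missing": sorted(expected - indexed),
    }
```

If `missing` suddenly covers most of your sitemap where it previously didn't, that's the same deindexation signal the `site:` check surfaces, just quantified.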
Related Questions
-
How to set up a 301 redirect from high-authority sites to your own website?
How do you 301-redirect from high-authority sites to your own website? If anyone knows, please tell me; such gigs are being sold on Fiverr.
White Hat / Black Hat SEO | jefjaa
Guest blogging penalty
We would like to publish a guest post on our blog that links to the author's website, and in return receive a link from their blog to our website. Does this put us at risk under Google's stance on "guest blogging"? We also have a natural link exchange with our partners: website-to-website links from a partners page.
White Hat / Black Hat SEO | vtmoz
Ecommerce sites we own have similar products. Is this OK?
Hello. In one of our niches, we have a big site with all products, plus a couple of smaller sites covering sub-niches of the same niche. The product descriptions are different, with different product names. So we've got one big site and two smaller sub-sites in niches that overlap with the big site. Each will have a separate blog with completely different content; only the big site has a blog right now, but the smaller ones will eventually get their own unique blogs, so there aren't really duplicate content issues. Is this OK in Google's eyes, now and in the future? What can we do to make sure we're OK? Thank you.
White Hat / Black Hat SEO | BobGW
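One concrete way to back up the "descriptions are different" claim is to measure it. This is a rough sketch using `difflib` to flag product description pairs that are suspiciously similar across your sites; the 0.8 threshold and the input shape (a name-to-description dict) are arbitrary assumptions for illustration:

```python
from difflib import SequenceMatcher
from itertools import combinations

def similarity(a, b):
    """Rough text-similarity ratio in [0, 1]; 1.0 means identical."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def flag_near_duplicates(descriptions, threshold=0.8):
    """Return (name, name, score) pairs whose descriptions look too similar.

    `descriptions` maps a product or page name to its description text.
    """
    flagged = []
    for (n1, t1), (n2, t2) in combinations(descriptions.items(), 2):
        score = similarity(t1, t2)
        if score >= threshold:
            flagged.append((n1, n2, round(score, 2)))
    return flagged
```

`SequenceMatcher` is crude compared to the shingling search engines use, but if two descriptions score above ~0.8 here, a human would likely call them duplicates too, and those are the ones worth rewriting first.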
What could go wrong? SEO on the mobile site is different than on the desktop site.
We have a desktop site whose SEO has been steadily improved over the past year. Since the mobile site is separate, the business decided not to spend the time keeping it updated and simply turned it off, so any mobile user who finds a link to us in a search engine lands on a desktop site that is not responsive. Now that we're hearing Google is going to start incorporating mobile-friendliness into rankings, the business wants to turn the mobile site back on while we spend months making the desktop site responsive. The mobile site has essentially no SEO: the title tag is uniform across the site, and so on. How much will it hurt us to turn on that SEO-horrid mobile site? And how much will it hurt us not to turn it on?
White Hat / Black Hat SEO | CFSSEO
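If the separate mobile site does come back on, the usual advice for separate mobile URLs is to annotate both versions: the desktop page declares its mobile twin with `rel="alternate"`, and the mobile page points back with `rel="canonical"`, so the two aren't treated as competing duplicates. Here's a small sketch that parses pages and checks for those annotations; the `m.example.com` URLs are placeholders:

```python
from html.parser import HTMLParser

class LinkAnnotationChecker(HTMLParser):
    """Collect <link rel="alternate"/"canonical"> annotations from a page."""
    def __init__(self):
        super().__init__()
        self.annotations = {}

    def handle_starttag(self, tag, attrs):
        if tag == "link":
            a = dict(attrs)
            if a.get("rel") in ("alternate", "canonical"):
                self.annotations[a["rel"]] = a.get("href")

# The desktop page should point to its mobile twin...
desktop_html = """<head>
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://m.example.com/page">
</head>"""

# ...and the mobile page should point back with rel=canonical.
mobile_html = """<head>
<link rel="canonical" href="http://www.example.com/page">
</head>"""
```

Running the checker over a sample of page pairs from both sites is a quick way to audit whether the annotations are present and reciprocal before flipping the mobile site back on.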
Google penalty for having mediocre sites while working on one good site?
I have a list of websites that are not spam; they're OK sites. I just need to work on the content again, because it might not be 100% useful for users. They are not bad sites with spammy content; I simply want to rewrite some of the content to make genuinely great websites. The goal is great content that earns natural links and provides a good user experience. I have 40 sites, all travel sites for different destinations around the world, plus other sites I haven't worked on in some time. Here are some examples:
www.simplyparis.org
www.simplymadrid.org
www.simplyrome.org
Again, these are not spam sites, just not as useful as they could become. I want to work on a few sites first to see how it goes. Will the sites I'm working on be penalized if my other sites have average content that isn't as good? I want to create content that's good for link bait. 🙂
White Hat / Black Hat SEO | sandyallain
How can I tell if my site was penalized by the most recent Penguin update?
Hey all, I want to find out whether my website was penalized by the most recent Penguin update. We have several hundred websites built, and at the bottom of each one there's a credit along the lines of "Website by," "Web Design by," or "Hosting by" that links back to our homepage. Could this be penalizing us, since these links have similar anchor text and sit on sites that have nothing to do with our services? Thanks, Ryan
White Hat / Black Hat SEO | MonsterWeb28
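Penguin concerns like this usually start with looking at your anchor-text distribution: if one exact phrase dominates hundreds of sitewide footer links, that's the pattern to worry about. A minimal sketch, assuming you've exported your backlinks from a tool of your choice as a CSV with an `anchor` column (the column name varies by tool and is an assumption here):

```python
import csv
from collections import Counter

def anchor_distribution(csv_path):
    """Summarize anchor-text frequency from a backlink export CSV.

    Assumes a header row and an 'anchor' column; returns
    (anchor, count, percent) tuples, most common first.
    """
    with open(csv_path, newline="") as f:
        counts = Counter(
            row["anchor"].strip().lower() for row in csv.DictReader(f)
        )
    total = sum(counts.values())
    return [(a, c, round(100 * c / total, 1)) for a, c in counts.most_common()]
```

If a single commercial anchor accounts for a large share of the profile, the common remediation for design-credit footer links is to vary the anchor or add `rel="nofollow"` to them.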
Why does Google still display such bad search results?
When I search for the keyword "Backlink คือ" on Google.co.th, I see domains that spam the keyword and have terrible content (spun text that can't be understood), such as:
อํานาจเจริญ.dmc.tv/?p=19
ฉะเชิงเทรา.dmc.tv/?p=28
พังงา.dmc.tv/?tag=backlink
หนองคาย.dmc.tv/?p=97
ขอนแก่น.dmc.tv/?tag=backlink
ชัยนาท.dmc.tv/?p=70
ตราด.dmc.tv/?tag=backlink
and so on, as you can see in the search results. My questions are: 1. How do I report this network to Google? 2. Why did this network rank in the top 10 for three weeks before its rankings dropped? 3. Why does a Facebook page rank in these Google search results? Please help me understand.
White Hat / Black Hat SEO | taradmkt
Massive drop in Google traffic after upping page count 8-fold.
I run a book recommendation site, Flashlight Worthy. It's a collection of original, topical book lists: "The Best Books for Healthy (Vegetarian) Babies," "Keystone Mysteries: The Best Mystery Books Set in Pennsylvania," or "5 Books That Helped Me Discover and Love My Italian Heritage." It's been online for 4+ years. Historically, it's been made up of a single home page, ~50 "category" pages, and ~425 "book list" pages. (Both of those numbers started out much smaller and grew over time, but the count has been around 425 for the last year or so as I've focused my time elsewhere.)
On Friday, June 15 we made a pretty big change to the site: we added a page for every author who has a book that appears on a list. This took the number of pages in our sitemap from ~500 to 4,149 overnight. If an author has more than one book on the site, the page shows every one of their books, such as this page: http://www.flashlightworthybooks.com/books-by/Roald-Dahl/2805. But the vast majority of these author pages list just one book, such as this page: http://www.flashlightworthybooks.com/books-by/Barbara-Kilarski/2116.
Obviously we did this as an SEO play: our content was getting ~1,000 search entries a day for such a wide variety of queries that we figured we may as well create pages that would make natural landing pages for an even broader array of queries. And it was working. Five days after launch, the new pages had ~100 searches a day coming in from Google. (OK, it peaked at 100 and settled to a steady 60 or so within a few days, but still. Then it trailed off over the last week, dropping lower every day, as if Google realized it was repurposed content from elsewhere on our site.)
Here's the problem: for the last several years the site received ~30,000 search entries a month, a little more than 1,000 a day on weekdays and a little lighter on weekends. This ebbed and flowed a bit as Google tweaked things (Panda, for example), as we garnered fresh inbound links, and as the GoodReads behemoth stole some traffic, but by and large, traffic was VERY stable.
Then, on Saturday, exactly 3 weeks after we added all these pages, the bottom fell out of our search traffic. Instead of ~1,000 entries a day, we had ~300 on Saturday and Sunday, and it looks like we'll have a similar amount today. And I know this isn't just some Analytics reporting problem, as Chartbeat is showing the same drop. As search is ~80% of my traffic, I'm VERY eager to solve this problem. So:
1. Do you think the drop is related to upping my page count 8-fold overnight?
2. Do you think I'd climb right back into Google's good graces if I removed all the pages at once? Or just all the pages that list only one author (the vast majority)?
3. Have you ever heard of a situation like this, where Google "punishes" a site for creating new pages out of existing content? Really, it's useful content, and these pages are better "answers" for a lot of queries. When someone searches for "Nora Ephron books," it's better that they land on a page of ours that pulls together the four books we have than on a page that happens to have just one of her books among 5 or 6 by other authors.
What else? Thanks so much; help is very appreciated. Peter
White Hat / Black Hat SEO | petestein1
Flashlight Worthy Book Recommendations
Recommending books so good, they'll keep you up past your bedtime. 😉
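One middle path between "remove all the pages" and "remove none" is to keep the thin single-book author pages live for users but exclude them from the index until they have enough content to stand on their own. Here's a rough sketch of that triage; the data shape (an author-to-books mapping) and the 2-book cutoff are assumptions for illustration:

```python
def pages_to_noindex(author_books, min_books=2):
    """Authors whose pages list fewer than `min_books` books are candidates
    for noindexing, so thin, repurposed pages don't dilute the site."""
    return sorted(a for a, books in author_books.items() if len(books) < min_books)

def robots_meta(author, thin_authors):
    """Robots meta tag to emit on an author page.

    'noindex, follow' keeps the page out of the index while still letting
    crawlers follow its links to the book-list pages.
    """
    if author in thin_authors:
        return '<meta name="robots" content="noindex, follow">'
    return '<meta name="robots" content="index, follow">'
```

As authors accumulate more books on the site, their pages graduate out of the noindex set automatically, which avoids the shock of deleting thousands of URLs at once.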