How to find out if I have been penalized?
-
I launched a new website at the beginning of January this year and saw traffic from Google to the website slowly grow until the 20th of March, when suddenly there were no more visitors from the Google search engine. The only traffic left is from Google Images, social networks, and other search engines. Without visitors from Google search, our overall traffic is down by ~66%.
I can no longer easily find our website in Google's search results using terms for which we usually ranked quite well. Nevertheless, the website is still indexed, as I can find it using the "site:" search query. In Google Webmaster Tools there are no messages, and we have only done a bit of link building on website and blog directories (nothing excessive, and nothing paid either).
Is there any way to find out if Google penalized my website? I suspect it has... and what would be the best thing to do right now?
The website is hellasholiday (dot) com
Thanks in advance for your ideas and suggestions
-
I am not a fan of CMSs. I realize there are pros and cons, but when you try to do too much and be all things to all people, you tend to end up with a lot of compromises.
There is one other reason I don't like to use robots.txt: I remember Matt Cutts saying that it can be a spam signal, because Google cannot see what you are hiding. It won't get you flagged by itself, but combined with other signals it can. If I remember correctly, he was talking about hiding malware in scripts blocked by robots.txt.
If you are interested, the best CMS for SEO I have found is Orchard CMS, but even that has some silly errors (it puts more than one H1 tag on a page). It is still the best solution I have looked at, and it is more customizable via code.
-
After reading your post and all the linked articles you recommended, I understand the issue and have adapted the robots.txt accordingly, basically leaving only a single Disallow for the WordPress plugins. I hope this will help, but I suppose I will see over the next few days...
Now, regarding WordPress, I would suggest they update their documentation, as it is really misleading. I also think they should add the necessary noindex meta tags natively in WordPress, rather than requiring a plugin for that, but that is another story.
-
WordPress does many things that are not recommended, and blocking by robots.txt is one of them; what they are suggesting is an extreme measure to work around the software's problems. There are better ways to solve duplicate content without giving away your link juice.
Read the section "WordPress Robots.txt blocking Search results and Feeds" on this page: http://yoast.com/example-robots-txt-wordpress/
Plugins like Yoast, and WordPress itself, do not produce very good results. I have crawled many WordPress sites, and they all have the same old problems, many caused by the Yoast plugin.
What Google is referring to in that link is keeping pages of little value out of its index; that is for Google's advantage, not yours.
It's quite simple: if you block a page, the links pointing to that page waste their link juice. If you don't block it, or at least use a meta tag that allows following, you get the link juice back.
See this article, where Dr. Pete calls it an extreme measure; search the page for "robots.txt" and you will see many comments making the same point: http://www.seomoz.org/blog/duplicate-content-in-a-post-panda-world
See Dr. Pete's comments here: http://www.seomoz.org/blog/serious-robotstxt-misuse-high-impact-solutions
-
I thought there would be no use in Google indexing and caching small icons, logos, and cached resized images that have no meaningful names. So I have now removed the Disallow rules for these, but for the WordPress blog I want to keep the Disallow rules recommended by WordPress itself for SEO purposes, as documented here: http://codex.wordpress.org/Search_Engine_Optimization_for_WordPress#Robots.txt_Optimization (assuming they know what they are talking about).
Anyhow, I don't have the feeling this is really the reason my website no longer shows up in Google's search results...
-
The question should be: why block them?
It's like cutting off your hand because you have a splinter.
If duplicate content is a problem, then you can (in order of preference) fix it, use a canonical tag, or use a noindex,follow meta tag, but not robots.txt.
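For reference, the two tag-based options look roughly like this (the URL is made up; each tag goes in the head of the duplicate page):

```html
<!-- Option 1: point the duplicate at the preferred version of the page -->
<link rel="canonical" href="http://www.example.com/preferred-page/" />

<!-- Option 2: keep the page out of the index, but still let
     crawlers follow its outgoing links and pass link juice -->
<meta name="robots" content="noindex,follow" />
```

Either way, the page stays crawlable, which is exactly what robots.txt blocking prevents.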
-
Many thanks Alan for your answer!
Regarding the robots.txt: basically, I just want to block/disallow some cached images and small icons/pictures on the website, as well as some parts of the associated WordPress blog, which is also hosted on the same website. For the blog I am disallowing the admin pages, feeds, comments, trackbacks, content theme files, etc. Here is the complete list, just in case:
Disallow: /wp-admin
Disallow: /wp-includes
Disallow: /wp-content/plugins
Disallow: /wp-content/cache
Disallow: /wp-content/themes
Disallow: /trackback
Disallow: /feed
Disallow: /comments
Disallow: /category/*/*
Disallow: /*/trackback
Disallow: /*/feed
Disallow: /*/comments
Disallow: /?
Disallow: /*?
So maybe I should change my question to "what URLs should I disallow for a WordPress blog?"
Also, where can I see all the pages that are blocked by my robots.txt file?
-
You can ask Google for reconsideration through Webmaster Tools. But since you have no warnings and you are still in the index, I doubt you have been flagged manually; you may have been hit algorithmically, though.
I noticed that you have blocked hundreds of pages with robots.txt. This has led to thousands of links pointing to pages that are not indexed, which means those links are pouring their link juice away into nowhere.
You should not use robots.txt to block pages that are linked to; it's a waste of valuable link juice.
If you must keep the pages out of the index, use a meta noindex,follow tag; this way you will get most of the link juice back through the pages' outlinks.
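If you want to check for yourself which URLs a robots.txt file blocks, Python's standard urllib.robotparser can do it. This is a quick sketch with a trimmed-down rule set and made-up paths, not your real site; also note the standard parser only does simple prefix matching, so it ignores the * and $ wildcard patterns Google supports:

```python
# Sketch: check which URLs a set of robots.txt rules blocks,
# using Python's standard urllib.robotparser.
from urllib.robotparser import RobotFileParser

# Illustrative rules (prefix rules only; stdlib ignores wildcards).
rules = """\
User-agent: *
Disallow: /wp-admin
Disallow: /wp-includes
Disallow: /feed
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# can_fetch() reports whether a crawler honouring these rules
# may request each path.
for path in ["/wp-admin/options.php", "/feed", "/a-normal-post/"]:
    status = "allowed" if parser.can_fetch("*", path) else "blocked"
    print(path, status)
```

Google Webmaster Tools also has a blocked-URLs report where, if I remember right, you can test individual URLs against your live robots.txt.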