Is a reconsideration request required?
-
I have a smaller site that has a Google penalty; Webmaster Tools says there might be a doorway page. I'm wondering: once the site is fixed, is it necessary to file a reconsideration request, or, if the site is no longer in violation, will it eventually be included again on its own, without a reconsideration request?
-
Hehehe, I loved it: "Google, hey, I want humans to come."
Very good.
-
A reconsideration request is fine, as long as you're sure that nothing else is wrong with your site. You're telling Google, "Hey, I want humans to come look at this site and look for anything they may find wrong with it, not just the stuff that I think I found." So as long as you think everything else is good, it should be fine. It's when you've only cleaned up half of the paid links coming into your site, and still have a few pages of hidden text lying around, that it's not the best idea.
-
I think a reconsideration request should be filed so that you get a follow-up response from Google. It will not hurt you, and it can help.
-
The reconsideration request is just to speed things up. Google will eventually crawl back over your site, but since you know about and have access to this shortcut, there really isn't any reason not to take it.
Related Questions
-
XML sitemap and rel alternate hreflang requirements for Google Shopping
Our company implemented Google Shopping for our site for multiple countries, currencies, and languages. Every combination of language and country is accessible via a URL path, and this applies to all site pages, not just the pages with products for sale. I was not part of the project. We support 18 languages and 14 shop countries. When the project was finished, we had a total of 240 language/country combinations listed in our rel alternate hreflang tags for every page, 240 language/country combinations in our XML sitemap for each page, and unique canonicals for every one of these pages. My concern is duplicate content. I can also see that odd language/country URL combinations (like a country paired with a language spoken by a very low percentage of people in that country) are being crawled, indexed, and appearing in SERPs. This uses up my crawl budget on pages I don't care about, but I don't think it is wise to disallow URLs in robots.txt that we are simultaneously listing in the XML sitemap. Is it true that Google Shopping requires an XML sitemap and rel alternate hreflang tags for every language/country combination?
Technical SEO | | awilliams_kingston
-
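For context, rel alternate hreflang annotations are emitted once per supported language/country combination on each page, so with 240 combinations every page carries a block like the following, only much longer (the example.com URLs here are illustrative placeholders, not the site in question):

```html
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/en-gb/some-page/" />
<link rel="alternate" hreflang="de-de" href="https://www.example.com/de-de/some-page/" />
<link rel="alternate" hreflang="fr-be" href="https://www.example.com/fr-be/some-page/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/some-page/" />
```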
Blocking Google from telemetry requests
At Magnet.me we track the items people are viewing in order to optimize our recommendations. As such, we fire POST requests back to our backends every few seconds, once enough user-initiated actions have happened (think scrolling, for example). To stop bots from distorting our statistics, we ignore their values server-side. Based on some internal logging, we can see that Googlebot is also performing these POST requests during its JavaScript crawling; in a 7-day period, that amounts to around 800k POST requests. As we are ignoring that data anyhow, and it is quite a number, we are considering reducing this for bots. We have several questions about this:
1. Do these requests count towards crawl budget?
2. If they do, and we want to prevent this from happening: what would be the preferred option, preventing the request in the frontend code, or blocking the request with a robots.txt line? The second question arises because an in-app block for the request could lead to different behaviour for users and bots, and Google might penalize that as cloaking, while the frontend approach is also slightly less convenient from a development perspective, as the logic is spread throughout the application. I'm aware one should not cloak or make pages appear differently to search engine crawlers. However, these requests do not change the page's behaviour at all; they purely send some anonymous data so we can improve future recommendations.
Technical SEO | | rogier_slag
-
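If robots.txt turns out to be the preferred route, a minimal sketch could look like the following (the /api/telemetry path is a hypothetical placeholder; substitute whatever endpoint the POST requests actually hit):

```
User-agent: Googlebot
Disallow: /api/telemetry
```

Since robots.txt rules match the URL being fetched rather than the HTTP method, this should stop Googlebot's renderer from firing the POSTs while leaving the page itself, and what human visitors see, unchanged.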
Google rejected my reconsideration request for an unnatural-link manual action, and listed one blog article twice as an example?
Hi Moz Community, On April 22 my site received a manual action in Google Webmaster Tools telling me it was caused by unnatural links. After a deep cleanup of all the sitewide links, which I think are the major problem with my external links, I submitted a reconsideration request on May 4. Google rejected the reconsideration request on May 29, and listed one blog article twice as an example, which is quite weird to me. Is it normal for Google to list one URL twice as an example in the feedback? I don't quite see the reason for that. Does anybody have any idea about this? It is really quite frustrating. And to be honest, I don't see many problems with the article Google listed either. Yes, it's all about our product and it has 3 do-follow links to our site, but it contains no words such as sponsor, advertisement, or rewards... and the blog itself is quite healthy as well. The post also gets rather high engagement, with organic comments and shares. How did Google flag that? I don't think it's possible that Google goes through all our site's links one by one... Hope you guys can help me with this. Thanks in advance! Ben
Technical SEO | | Ben_fotor
-
Reconsideration Request a Success!
Hi all, Well, I've finally been able to get the penalty removed, judging by this email: "Dear site owner or webmaster of xxx, We received a request from a site owner to reconsider xxx for compliance with Google's Webmaster Guidelines. Previously the webspam team had taken manual action on your site because we believed it violated our quality guidelines. After reviewing your reconsideration request, we have revoked this manual action. It may take some time before our indexing and ranking systems are updated to reflect the new status of your site. Of course, there may be other issues with your site that could affect its ranking without a manual action by the webspam team. Google's computers determine the order of our search results using a series of formulas known as algorithms. We make hundreds of changes to our search algorithms each year, and we employ more than 200 different signals when ranking pages. As our algorithms change and as the web (including your site) changes, some fluctuation in ranking can happen as we make updates to present the best results to our users. If your site continues to have trouble in our search results, please see this article for help with diagnosing the issue. Thank you for helping us to maintain the quality of our search results. Sincerely, Google Search Quality Team"
This was after a reconsideration request was sent prior to the disavow tool being released. In addition, I disavowed all the links I was unsuccessful in removing, without contacting Google, and let the original reconsideration request run its course. I am making this post just to let everyone know that the hard work pays off, and Google just wants to be sure you are doing your best to remove the links. As 'Ryan Kent' always emphasizes, you must be really diligent and honest when trying to remove links. You also need to keep documentation: I anchored contact pages and email addresses, and recorded 1st, 2nd, 3rd, and even 4th attempt dates.
Now that the disavow tool is out, I believe that if you make a good-faith effort to remove the links, and it is well documented, you can use the disavow tool after multiple attempts; making sure the disavowed links correlate with the spreadsheet sent to Google is very important in a reconsideration request. Good luck! Also, I received the message from WMT, and I'm wondering: does anyone know how long 'some time' is before the site is reindexed? So far our organic traffic is still about the same as before, so I would like to hear what others' experiences are after a successful reconsideration. Feel free to ask any questions!
Technical SEO | | William.Lau
-
Site Recovered from hack, should I submit a reinclusion request?
Hello, The site I'm referring to is http://www.pokeronamac.com; it was hacked via something called the "WordPress Pharma Hack": http://theblawblog.wordpress.com/2012/06/21/restoring-a-pharma-hacked-wordpress-site-wp-3-4/ We restored it as far as I can tell, but if anyone can confirm this by doing a site: search and not getting redirected, it would be appreciated. You will see that some search results still show up as spam, but when I click on them, they 404. I want to know if I should submit a reinclusion request; I wasn't notified by WMT of malware, so I want to know the SOP here. Thanks Zach
Technical SEO | | Zachary_Russell
-
Ambiguous Response to Google Reconsideration Request
Hello, On 9/11/12, we submitted a reconsideration request to Google for http://macpokeronline.com; at the time we had penalties from both Penguin and a manual action. We have since worked on cleaning up our link profile, and got this response from Google: "We received a request from a site owner to reconsider how we index the following site: http://www.macpokeronline.com/. We've now reviewed your site. When we review a site, we check to see if it's in violation of our Webmaster Guidelines. If we don't find any problems, we'll reconsider our indexing of your site. If your site still doesn't appear in our search results, check our Help Center for steps you can take." I honestly don't know how to take this; we always showed up #1 when doing a site: search, so that advice is kind of irrelevant to us in this case. Is this them accepting our request? Thanks Zach
Technical SEO | | Zachary_Russell
-
Keyword Variants - Low conversion rates - Is a site redesign required?
Hi Guys, I'm Chris and I'm a noob to SEO. Thanks for taking the time to read this and for offering your support! It is greatly appreciated! I work for a small family-run company called Custom Designed Cables, and we manufacture bespoke cables and bespoke retractable cables. I posted a couple of days ago about how to move forward with our site. Our website has been targeting specific keywords and their variants across the relevant pages, to somewhat reasonable effect. Since signing up to SEOmoz, I have been trying to narrow down the keywords that we target per page, and I have also tried to improve the content on the pages with the help of the on-page optimisation tool. I have a few questions that I would like somebody to help me with, please, if it isn't too much trouble.
Firstly, for our keywords, there are a lot of variants. For example, we braid cables. Here are some of the variants of this keyword: cable braid, cable braiding, cable screen, cable screening, cable shielding, etc. The way I have gone in the past has been to try to target all of those keywords on the relevant page. Writing content has been the issue, as whilst trying to write good, informative material, squeezing in all of the keywords has been a problem and has more than likely hurt the readability of the page. From what I have learnt since I signed up, this is not good practice, and therefore I have tried to narrow it down somewhat, yet I don't really want to lose potential customers finding us by only targeting one or two of the keywords. This is a similar situation for most of our keywords on the majority of our pages. What is the best way to approach this? If I were to write a page per keyword, I don't believe it would look very good, as we could end up with over 100 pages, with say 5 or 6 of them talking about the same subject, which then leads on to the problem of writing good content. It would be difficult to write good content for pretty much the same thing across such a wide number of pages. If this is the solution, though, then I would happily tackle it. What would be the best way to move forward?
The next issue is that since I have been modifying the pages with the help of SEOmoz, the number of enquiries we receive has fallen off a cliff. We used to average around 50 enquiries a month from the website, and since I modified the site, I'd say we've had probably 20 since the end of February. The funny thing is, though, we have also averaged around 200 more hits per month since the changes. So the hits have increased, yet the enquiries have died. I was wondering if anybody was willing to take a look at our site, http://www.customdesignedcable.co.uk, to give me some general feedback as to why this may be happening, and your overall opinions of the site with regard to the layout, look, and general feel, because I am beginning to wonder whether the design is what could be causing us to convert so few of our new-found visitors. If anyone could also provide me with actionable feedback on our keyword targeting for our pages, that would also really, really help! I am currently considering redesigning the site in its entirety, and I am interested in your opinions on whether this would be a good way to go. Any help that you can give me would be greatly appreciated, guys! Thanks again! Chris (Just another website/SEO noob desperately attempting to avoid the sack)
Technical SEO | | Chris_CDC
-
On-Site Sitemaps - Guidance Required
Hi, I am looking for good examples of on-site sitemaps. We already submit our XML sitemap regularly through GWMT, but I now wonder whether we still need an on-site sitemap, as we have about 30 static pages and 300+ WordPress blog posts, which in a sense makes it a spammy page, as it has too many links and a higher-than-average keyword density. The reason I am looking for good examples is that I want to create a basic on-site sitemap that aids navigation but is styled to look OK as well. The solution I have in mind:
mydomain.com/link-example-one.php
mydomain.com/link-example-two.php
mydomain.com/liink-example-ten.php
mydomain.com/blog then links to my 300 WP blog posts, broken down into chunks navigated using breadcrumbs. Will Google crawl this OK, or should I stick to the current format listing ALL posts on one page? Thanks
Technical SEO | | tdsnet
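If it helps to visualise the idea, a bare-bones HTML sitemap along the lines described above might look like this (the URLs and headings are placeholders taken from the question, not a recommended structure):

```html
<nav>
  <h2>Pages</h2>
  <ul>
    <li><a href="/link-example-one.php">Link example one</a></li>
    <li><a href="/link-example-two.php">Link example two</a></li>
  </ul>
  <h2>Blog</h2>
  <ul>
    <li><a href="/blog/">Blog index, broken into chunks navigated with breadcrumbs</a></li>
  </ul>
</nav>
```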