What is the average response time for a Reconsideration request?
-
I know Google says 'several' weeks, but I'm wondering whether anybody has experience with a Reconsideration request: did you get any kind of reply, and what was your general experience?
Thanks
-
It took 2 weeks for us a few months ago, but we were simply told that the site did not have a penalty.
We made some changes internally, built some better links, and we were back up the rankings.
-
Hey Barry
Having a bit of insight into your problem from our email discussion, I think you will find that making the changes will be enough and your site will just pop out of the filter when the problems are resolved.
I may be wrong; it's a one-way flow of information with Google on this, so definitely make the request, but expect anything up to seven weeks. Even post-Panda, when I guess they were getting hammered, one site I helped got a response in around 4 weeks, so... 3 is a good bet.
Cheers
Marcus
-
I recently filed a reconsideration request and got a response within a week. It was a standard form letter and appeared only in WMT (not in an email to me). In that particular case my reconsideration request was not approved. After I filed another request a few days later, it took about 3 weeks to receive the next response.
-
Actually, I will correct my original response.
On Mar 22, I filed a reconsideration request. The site involved had received a manual penalty from Google which had previously been confirmed by Google in writing.
On April 2nd, I received a response from Google via WMT titled "We've processed your reconsideration request".
The response stated "We've now reviewed your site. When we review a site, we check to see if it is in violation of our Webmaster Guidelines. If we don't find any problems, we'll reconsider our indexing of your site."
Immediately upon receipt of that message, I was able to find the site had been added to Google's index.
-
Was that an actual written response?
-
I have only filed one reconsideration request this year. It was a 3-week response time.
Related Questions
-
Google Indexing Request - Typical Time to Complete?
In Google Search Console, when you request the (re)indexing of a fetched page, what's the average amount of time it takes to re-index? Does it vary much from site to site, or are manual re-index requests put in a queue and served on a first-come, first-served basis regardless of site characteristics like domain/page authority?
Intermediate & Advanced SEO | SEO1805
-
For a responsive site, what should be the lowest screen resolution for desktop?
Hello guys, can you please share in detail the screen resolutions I have to define for my responsive site for desktop, tablet & mobile? Your inputs are very valuable to me. Thanks! Micey
Intermediate & Advanced SEO | micey
-
Content Publishing Volume/Timing
I am working with a company that has a bi-monthly print magazine that has several years' worth of back issues. We're working on building a digital platform, and the majority of articles from the print mag - tips, how-tos, reviews, recipes, interviews, etc - will be published online. Much of the content is not date-sensitive except for the occasional news article. Some content is semi-date-sensitive, such as articles focusing on seasonality (e.g. winter activities vs. summer activities). My concern is whether, once we prepare to go live, we should ensure that ALL historical content is published at once, and if so, whether back-dates should be applied to each content piece (even if dating isn't relevant), or whether we should have a strategy in place in terms of creating a publishing schedule and releasing content over time - albeit content that is older but isn't necessarily time-sensitive (e.g. a drink recipe). Going forward, all newly-created content will be published around the print issue release. Are there pitfalls I should avoid in terms of pushing out so much back content at once?
Intermediate & Advanced SEO | andrewkissel
-
Do Third-Party Subdomain Slow Load Times Affect Our SERPs?
We have a third-party subdomain that is not hosted on our server (a smugmug.com gallery). It periodically has slow load times. The question is: does anybody know whether search engines, or more specifically Google, would see that subdomain as part of our site? I want to gather insight into whether or not this might affect our results. Thank you!
Intermediate & Advanced SEO | leslieevarts
-
High-resolution (retina) images vs. load time
I have an ecommerce website with a product slider containing 3 images. Currently, I serve them at native size when viewed in a desktop browser (374x374). I would like to serve them at retina image quality (748px). However, how will this affect my ranking due to load time? Does Google take image load times into account even though these load asynchronously? Also, as it's a slider, only the first image needs to load initially. Do the other images contribute at all to the page load time?
Intermediate & Advanced SEO | deelo555
-
Pitfalls when implementing the “VARY User-Agent” server response
We serve up different desktop/mobile optimized HTML on the same URL, based on a visitor's device type. While Google continues to recommend the HTTP Vary: User-Agent header for mobile-specific versions of the page (http://www.youtube.com/watch?v=va6qtaiZRHg), we're also aware of issues raised around CDN caching: http://searchengineland.com/mobile-site-configuration-the-varies-header-for-enterprise-seo-163004 / http://searchenginewatch.com/article/2249533/How-Googles-Mobile-Best-Practices-Can-Slow-Your-Site-Down / http://orcaman.blogspot.com/2013/08/cdn-caching-problems-vary-user-agent.html As this is primarily for Google's benefit, it's been proposed that we only return the Vary: User-Agent header when a Google user agent is detected (Googlebot/MobileBot/AdBot). So here's the thing: as the server header response is not "content" per se, I think this could be an okay solution, though I wanted to throw it out there to the esteemed Moz community and get some additional feedback. Do you guys see any issues/problems with implementing this solution? Cheers! linklater
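For reference, a minimal sketch of the conditional approach described above, assuming an Apache setup with mod_setenvif and mod_headers enabled; the module usage and user-agent pattern here are illustrative, not from the original poster:
# Flag requests from Google crawlers (illustrative pattern; adjust as needed)
SetEnvIfNoCase User-Agent "Googlebot|AdsBot-Google|Googlebot-Mobile" IS_GOOGLE_UA
# Send the Vary: User-Agent header only on those flagged requests
Header append Vary "User-Agent" env=IS_GOOGLE_UA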
Intermediate & Advanced SEO | linklater
-
I'm having an extremely hard time with SERPs despite my best efforts. Can someone help?
My site is www.drupalgeeks.org. Our traffic is going up but our rankings are not. We simply don't rank for any of our targeted keywords. I have covered nearly every white-hat SEO strategy possible. Our site has a great social presence (Facebook, Twitter, LinkedIn, Pinterest), we write blogs regularly, and we even guest blog. We have a YouTube channel and an RSS feed. We've cleaned up page speed times, set 301 redirects, and checked for duplicate content. We use Bing and Google webmaster tools and have submitted a sitemap. We are indexed, and webmaster tools see our keywords as relevant in our content. We have a robots.txt file configured properly. The only thing I can think of is that our services pages also display (as a truncated summary) on our homepage. Could this be considered duplicate content, and is this causing a problem? Is there anything else we can do? Or are we missing something vital? We thank you in advance for your help! Candice
Intermediate & Advanced SEO | candylotus
-
Having a hard time with duplicate page content
I'm having a hard time redirecting website.com/ to website.com. The crawl report shows both versions as duplicate content. Here is my htaccess:
RewriteEngine On
RewriteBase /
#Rewrite bare to www
RewriteCond %{HTTP_HOST} ^mywebsite.com
RewriteRule ^(([^/]+/)*)index.php$ http://www.mywebsite.com/$1 [R=301,L]
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME}.php -f
RewriteRule ^(.*)$ $1.php [NC,L]
RewriteCond %{HTTP_HOST} !^.localhost$ [NC]
RewriteRule ^(.+)/$ http://%{HTTP_HOST}$1 [R=301,L]
I added the last 2 lines after seeing a Q&A here, but I don't think it has helped.
Intermediate & Advanced SEO | cgman
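For reference, the last rule above appears to omit the slash between %{HTTP_HOST} and $1, so the redirect target would lose its leading slash. A commonly used .htaccess pattern for stripping trailing slashes looks roughly like the sketch below, assuming Apache mod_rewrite; this is illustrative only, not the poster's final fix.
# Skip real directories so mod_dir doesn't re-append the slash and cause a redirect loop
RewriteCond %{REQUEST_FILENAME} !-d
# Redirect /path/ to /path (note the explicit leading slash in the target)
RewriteRule ^(.+)/$ /$1 [R=301,L]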