Reconsideration request failed - New website?
-
I am looking at a website with Moz PA 34. The website belongs to a shop in Manhattan. A simple shop, a simple man, not one who does tricks.
The reconsideration request has failed twice! That has never happened to me before. Google ignored some of the domains in the two disavow files we submitted. All of these domains are asking for money to remove links that, as far as I know, we never even bought.
My Question
Can I create a brand new domain/website and transfer the PA juice WITHOUT the bad links?
-
Great answer Sha. I will post the outcome when changes occur.
-
Sorry Marie,
Should have included you in my comments, but could not see that you had commented since I was dealing with a "disappearing answer" catastrophe by composing elsewhere and pasting in.
Why is it only the long ones that do that?
Sha
-
Hi Elchanan,
Well Eyepaq is batting a thousand today!
Eyepaq is quite correct. The only way to "transfer" the bulk of the link equity is to redirect the domain which would inevitably result in a transfer of the manual action as well. In fact, it is worse than that. In recent times a number of domains have been dealt a manual action by association without a redirect even being in place. These manual actions have been applied because the Webspam team believes that the sites are related and are part of a larger scale manipulative effort, or indeed an effort to get out from under a penalty. Matt Cutts talked about this at SMX West, stating that people should not be able to "move down the road" to avoid a manual action.
If you genuinely needed to rebuild the site anyway, even before the manual action, and have not put your life savings and years of effort into building a brand, there could be a business case for starting again with a brand new domain. BUT remember, you will be starting with nothing - even less than you will have if you successfully clean up the existing domain. This has to be a careful business decision, and any new site would need to be completely unique and without any connection to the penalized site. Personally, starting over would be my last resort unless the site was fundamentally broken (and the domain name was a poor choice) to start with.
There are generally five broad reasons why a reconsideration request may fail:
1. Insufficient data - Links in the backlink profile may not have been surfaced in the data-gathering stage. Incomplete data is common and is best remedied by using as many data sources as possible and, in some cases, by pulling multiple samples over a period of days or weeks (a sketch of combining multiple exports follows the note below).
(Remember that the links returned by the Webspam team when a reconsideration request fails are "examples". They are intended to point you toward other links in the backlink profile which follow the same patterns or use the same unnatural linking tactics.)
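As a purely illustrative aside, here is a minimal sketch of that data-gathering step - combining backlink exports from several sources into one deduplicated list of linking domains. The file names, and the assumption that each CSV has the linking URL in its first column, are hypothetical; adjust to match what your tools actually export.

```python
import csv
from urllib.parse import urlparse

# Hypothetical exports from different backlink sources; names are placeholders.
export_files = ["search_console_links.csv", "moz_links.csv", "other_source_links.csv"]

domains = set()
for path in export_files:
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.reader(f):
            # Assumes the linking URL is in the first column; skips headers/blanks.
            if not row or not row[0].startswith("http"):
                continue
            host = urlparse(row[0]).hostname
            if host:
                domains.add(host.lower().removeprefix("www."))

# One combined, deduplicated list of linking domains to analyze.
for domain in sorted(domains):
    print(domain)
```

However you gather it, the goal is the same: one complete picture of the backlink profile before the analysis starts.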
2. Mistakes in Analysis - If links have been misclassified as natural and are kept, the reconsideration request will fail. Sometimes this happens because people rely solely on algorithmic analysis tools to determine which links to keep or remove, and the results are not 100% accurate. I would always argue that a real human should be the primary tool when doing analysis, because I believe there is no room for mistakes in a job that your livelihood depends upon!
Sometimes human analysis can go wrong too - most often because people forget that this is about "unnatural" linking. That means links that were created rather than earned.
Another mistake that people make at this point is to try to just remove the worst of the unnatural links to preserve some of the benefits that were gained from unnatural linking. Omitting unnatural links from the cleanup effort because you think they are not so bad is a big mistake for two reasons:
- It will mean leaving unnatural links in the backlink profile - on their own they could cause the manual action to be upheld, but even in the rare instance that this might be allowed to scrape past on reconsideration, retaining them leaves the site vulnerable to the Penguin algorithm
- It immediately shows the Webspam team reviewer that the site owner's manipulative mindset has not changed. Making a case for reconsideration requires that they are able to trust the site owner will never employ those kinds of tactics again.
3. Incomplete or ineffective Disavow submissions - As mentioned above, it is always best to disavow at the domain level to ensure that any links you are unaware of do not remain in play and sabotage your efforts. The only exceptions to this rule are unnatural links on high value domains you might reasonably love to have "natural" links from. In these extremely rare cases you should disavow the specific URLs to ensure that any natural links are preserved and any natural links you might earn from that domain in the future will be accorded their rightful value.
Also - a red light went on for me when I saw "Google ignored some domains in the two disavow files we submitted". This causes me to wonder whether you have uploaded two completely separate files to the Disavow Links Tool? If so, then this could be the problem. The Disavow Links Tool submission is an overwrite, not an update. This means that you need to combine any existing disavow list with the new list before uploading. If you don't do this then you are effectively re-avowing all of the domains or links that were in the existing file.
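To make the domain-versus-URL point concrete, a combined disavow file might look like this (all domains below are placeholders; the `domain:` prefix and `#` comments are the format the Disavow Links Tool accepts):

```text
# Unnatural links - domain owners requested payment for removal
domain:spammy-directory-example.com
domain:paid-links-example.net

# High-value domain we would love natural links from in future,
# so only the specific unnatural URLs are disavowed
http://respected-site-example.org/sponsored-post-1
http://respected-site-example.org/sponsored-post-2
```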
If you need to update an existing disavow file with a new list, you can use this free tool to make it easy. Once you have created a free account you can upload your existing list, then upload any new list in the future to create an updated disavow file. When you upload a new list the tool will combine the data, remove duplicates and add date notations so that you can keep track of when domains were added. The tool also ensures that your new disavow file is within the 2MB file size limit and generates it in the correct text format, ready for submission.
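If you would rather roll your own, here is a rough sketch of what such a merge involves (combine the lists, drop duplicates, date-stamp the additions, and check the size limit). The file names are placeholders and it assumes one disavow entry per line:

```python
from datetime import date

def merge_disavow(existing_path, new_path, output_path):
    """Combine an existing disavow file with a new list, skipping
    duplicates and noting the date new entries were added."""
    with open(existing_path, encoding="utf-8") as f:
        existing = [line.rstrip("\n") for line in f]
    seen = {line.strip() for line in existing
            if line.strip() and not line.startswith("#")}

    added = []
    with open(new_path, encoding="utf-8") as f:
        for line in f:
            entry = line.strip()
            if entry and not entry.startswith("#") and entry not in seen:
                seen.add(entry)
                added.append(entry)

    merged = existing
    if added:
        merged += [f"# Added {date.today().isoformat()}"] + added

    data = "\n".join(merged) + "\n"
    # Google's disavow upload has a 2MB file size limit.
    assert len(data.encode("utf-8")) <= 2 * 1024 * 1024, "File exceeds 2MB limit"
    with open(output_path, "w", encoding="utf-8") as f:
        f.write(data)

merge_disavow("existing_disavow.txt", "new_links.txt", "disavow_updated.txt")
```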
4. Insufficient effort in the cleanup - Sometimes this is actually just that there is insufficient evidence provided that the work has been done. The most common mistakes here:
- Omitting domains completely from the cleanup effort because a WhoIs email address is not available
- Including domains that do not have a WhoIs email address, but not bothering to look any further to find a method of contact. If there are email addresses or contact forms available on the site, a "good faith effort" will require that you have used them to attempt to contact the domain owner.
- Being unhelpful when requesting that links are removed. The more that can be done to help the domain owner easily locate and remove the links, the better the success rate for the entire link removal campaign.
The Webspam team needs to see that a "significant proportion" of the links have been removed. The better the cleanup rate, the smoother the path to getting a manual action revoked.
- The Bullying approach. Link removal requests should always be written with three things in mind: a) you are asking someone to do you a favor; b) threats or demands are unlikely to make someone feel that they want to be helpful; c) the Brand is at stake here - whatever impression is created by the request will reflect heavily on the Brand. When people get this incredibly wrong, the flow-on results can be catastrophic.
- Not recording and providing evidence of link removal efforts for domains that have not been successfully cleaned up. Keep good records. Provide evidence where domain owners have refused to help or requested payment. Provide evidence where on-site forms do not function. Make it easy for the Webspam team to make an assessment by providing good documentation (a simple record-keeping sketch follows below).
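Even a plain, dated CSV log of every outreach attempt covers most of this. A minimal sketch - the column choices are just a suggestion:

```python
import csv
from datetime import date

# Append one row per removal-request attempt; columns are a suggestion only.
with open("link_removal_log.csv", "a", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow([
        date.today().isoformat(),        # when contact was attempted
        "spammy-directory-example.com",  # linking domain
        "contact form",                  # method used (email, form, WhoIs, etc.)
        "no response",                   # outcome (removed, refused, asked for payment...)
        "screenshot_0412.png",           # evidence file, if any
    ])
```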
5. Not making a case for reconsideration - Site owners need to demonstrate that they understand where they went wrong and will not repeat the same mistakes. In addition to this they need to convince the Webspam team that they have made a "good faith effort" to remove the links. Also, if there are links that are known to be natural but may look suspicious, address them. Give a reasonable explanation as to why links have been retained (as long as there IS a reasonable explanation). You can use this checklist to make sure you have covered the most important things in your reconsideration request.
This Slide Deck provides an overview that might be helpful.
Any or all of these things can be playing a part in a failed reconsideration effort. It is not uncommon for it to take multiple attempts to have a penalty revoked, but the more of these potential problems we can eliminate by following best practice from start to finish, the more predictable the results.
Best of luck with resolving the manual action and getting things back on track.
Hope that helps,
Sha
-
Whether or not you should start over is a decision that probably can't be made in a forum post as there are many factors to consider. But, I would say that failing a reconsideration request is not on its own enough to make me want to start over.
Did Google give you example links that were already in your disavow? If so, did you disavow at the domain or URL level? Often if you've disavowed at the URL level you'll be missing links. If the link truly was disavowed, then you've probably got other similar ones in your profile that need to be removed/disavowed.
Did you make extensive effort to get links removed? That's vital when you have a manual action.
"Can I create a brand new domain/website and transfer the PA juice WITHOUT the bad links?"
No. What makes up the PA is the equity from the links. If you start a new domain and 301 the old to the new, you'll pass ALL of the link signals, good and bad. There are ways to start over and redirect users from your old site to the new without passing the penalty, but it will be like totally starting afresh.
Another factor to consider is that if you start new you'll need new content as well. If you just put the old content on a new URL, Google will usually recognize that this is the same site and canonicalize the two, which essentially still points the unnatural links at the new site.
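As an aside, it's easy to verify exactly what an old URL returns and where it chains to. A quick sketch using Python's requests library (example.com stands in for the old domain):

```python
import requests

# Follow redirects from the old homepage and print the chain of status codes.
response = requests.get("http://example.com/", allow_redirects=True, timeout=10)
for hop in response.history:
    print(hop.status_code, hop.url)        # e.g. 301 http://example.com/
print(response.status_code, response.url)  # final destination
```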
I've yet to see a site that could not get its unnatural links penalty lifted....and I've seen some REALLY badly spammed sites. But, it's not uncommon for it to take several attempts in order to succeed.
-
Hi,
No, transferring (and by that you probably mean redirecting) the old authority to the new site will also transfer the manual action.
Just make sure the disavow file is correct - use domains, not individual links (the disavow file is "sensitive" to duplicates: www vs non-www, http vs https, trailing slash vs no trailing slash, etc.).
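One simple way to sidestep those duplicate variants is to reduce every unnatural link to a domain-level entry before building the file. A minimal sketch, with placeholder links:

```python
from urllib.parse import urlparse

# Raw links that are really all the same host, just different variants.
links = [
    "http://www.example-spam-site.com/page/",
    "https://example-spam-site.com/page",
    "http://example-spam-site.com/other",
]

entries = set()
for link in links:
    host = urlparse(link).hostname or ""
    host = host.lower().removeprefix("www.")
    if host:
        entries.add(f"domain:{host}")

print("\n".join(sorted(entries)))  # -> domain:example-spam-site.com
```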
Cheers.