Changing URL to recover from an algorithmic penalty
-
Hello,
If I believe a website was hit algorithmically, I would like to buy a new domain name and publish all the content from the first website there. I will take the first site down, so the new site would be the only one with this content.
Will Google see that it's the same content as a previously penalized website and penalize the new domain name even though it has zero links pointing to it?
Regards.
-
Marie is correct - this is unlikely to work unless you are VERY careful not to let Google know that the websites are connected, as it is partial to transferring penalties from one site to another when you try to get rid of a penalty by starting a new, identical website. Simply redirecting a penalised site is a trick that used to work back in 2009 and 2010 (you don't mention redirection, but it's worth noting that this used to work, so if you see it mentioned online it's probably old information).
Even if you do not redirect the old site, Google may still recognise that the content is identical to a website it previously penalised, especially if all the new site's registration information, hosting, template, etc. is the same as the old site.
That's not certain - you may get away with this if there are no ties between the old, penalised site and the new one, but using identical content is a big giveaway.
Assuming that your penalty was link-related, the safest approach is to remove the old site's content, wait until Google recrawls the old site with the content gone (so the content is completely out of the index), take the site down, and re-publish on the new domain. That said, Google's ability to remember what it has seen before could still result in the scenario Marie describes.
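Before re-publishing on the new domain, it's worth confirming that each old URL now answers with a status that tells Google the page is gone for good. A minimal sketch with stdlib-only Python; the domain in the example is hypothetical, and the HTTP-status classification reflects the general guidance that 410 ("Gone") is a stronger removal signal than 404:

```python
import urllib.request
import urllib.error

def removal_signal(status):
    """Classify an HTTP status code from the old site.
    410 is the strongest 'gone for good' signal; 404 also works,
    though Google may retry the URL for longer before dropping it."""
    if status == 410:
        return "gone"
    if status == 404:
        return "not-found"
    return "still-live"  # 200, 301, etc.: content may still be indexed

def check_old_urls(urls):
    """Fetch each old URL and report its removal signal.
    Run this repeatedly until every URL reports 'gone' or 'not-found'
    and Google's cache shows the content removed."""
    report = {}
    for url in urls:
        try:
            with urllib.request.urlopen(url) as resp:
                report[url] = removal_signal(resp.status)
        except urllib.error.HTTPError as e:
            report[url] = removal_signal(e.code)
    return report

# Example (hypothetical domain):
# check_old_urls(["http://old-penalised-site.example/page-1"])
```

Only once every old URL reports a removal signal, and a `site:` search shows the pages out of the index, would the republish step begin.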
-
This is probably not going to work well for you.
If you've been affected adversely by the Panda algorithm, then Panda is all about the content on your site. If you take the same site and move it to a new domain then the same issues are there and the same demotion is going to happen. You might rank well for a month or so and then when Panda refreshes you'd be back where you started.
If Penguin is the issue, then the problem is with links. If you move to a new domain you're starting fresh with no links. However, if the content is all the same then Google will usually apply an invisible canonical and pass all of the links from the old site to the new site. You'll see something in WMT that says "via this intermediate link" when this happens. As such, the demotion that came along with having the bad links will affect the new site the next time Penguin refreshes.
-
You may not know that even if you move your website to a completely new domain without redirecting the old domain to the new one, Google may still pass along the penalty.
If you have a website with a penalty, I strongly advise you to find out exactly what it is first. If you simply move the site to a new domain, as you describe, even without using site migration tools or setting up the usual redirects, Google may, and in most cases will, figure out it was you and pass along the unwanted penalty.
-
Hi
Before you do anything, you need to find out specifically why you have been penalised.
Related Questions
-
URL disappeared from the search results
Hey folks, a URL on my website that had been steadily climbing in the search results has suddenly disappeared completely, and I'm absolutely stuck - no idea what the reason might be. It was ranked #11 for the targeted keyword, then it slowly dropped to #14 and #17, after which it vanished entirely, not only for the targeted keyword but also for the exact name of the product. I looked in Search Console and found no particular errors or messages from Google. The only cause I can come up with is that many URLs are canonically linked to the URL in question, but I don't think this is the case. Does anyone have a suggestion as to why the URL has completely vanished from the search results? Thank you a lot. The URL: http://chemometec.com/cell-counters/cell-counter-nc-200-nucleocounter/ Targeted keyword: 'cell counter'
White Hat / Black Hat SEO | Chemometec
-
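One quick way to test the "many URLs are canonically linked to this URL" theory is to download those pages and read their rel="canonical" tags: if several pages with different content all declare the dropped URL as canonical, Google may have folded them together unexpectedly. A stdlib-only sketch (fetching the HTML is left out; feed it pages you've already downloaded):

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collect every rel="canonical" href declared in a page's HTML."""
    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        # rel may carry multiple tokens, so check tokens, not the raw string.
        if tag == "link" and "canonical" in a.get("rel", "").lower().split():
            self.canonicals.append(a.get("href"))

def find_canonicals(html):
    parser = CanonicalFinder()
    parser.feed(html)
    return parser.canonicals
```

Running `find_canonicals` over each suspect page and counting how many point at the vanished URL would confirm or rule out the canonical theory before digging further.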
Do search bots understand SEF and non-SEF URLs as the same ones?
I've just realized that I have almost always coded my websites using non-SEF URLs for internal linking. It's very convenient, as I'm sure that whatever the final URL ends up being, the link will always work, e.g. website.com/component1/id=1. Before releasing the website, I use extensions to make the URLs user-friendly according to the chosen strategy, e.g. website.com/component1/id=1 -> website.com/article1.html. But I wonder whether Google considers both URLs the same page, or just treats one as a 301 redirect to the other. What do you think is the best approach?
White Hat / Black Hat SEO | AymanH
-
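Google treats the non-SEF and SEF addresses as two separate pages unless the server consolidates them, ideally with a permanent redirect from website.com/component1/id=1 to website.com/article1.html (or a rel="canonical" on the non-SEF version). A hedged sketch of how one might verify the redirect is actually permanent, using only stdlib Python and the hypothetical URLs from the question:

```python
def redirect_verdict(status, location, expected_sef):
    """Classify the response a non-SEF URL gives.
    Only a permanent redirect (301/308) to the SEF URL tells Google
    the two addresses are the same page."""
    if status in (301, 308):
        return "consolidated" if location == expected_sef else "redirects-elsewhere"
    if status in (302, 303, 307):
        return "temporary-redirect"  # Google may keep the old URL indexed
    if status == 200:
        return "duplicate-risk"      # both URLs serve content independently
    return "unreachable"

# Gathering (status, Location) without following the redirect hop:
# import urllib.request, urllib.error
# class NoRedirect(urllib.request.HTTPRedirectHandler):
#     def redirect_request(self, req, fp, code, msg, headers, newurl):
#         return None  # stop urllib from following the redirect
# opener = urllib.request.build_opener(NoRedirect)
# try:
#     opener.open("http://website.com/component1/id=1")
# except urllib.error.HTTPError as e:
#     print(redirect_verdict(e.code, e.headers.get("Location"),
#                            "http://website.com/article1.html"))
```

If the verdict is "duplicate-risk", both URLs are live and competing; that is the case worth fixing before launch.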
Recovering from an algorithmic bodyslam
Hi there. We inherited a client who didn't receive a manual penalty, but holy cow do they have a good-sized algorithmic penalty on their site. Here is what we have done since taking on the client.

The client arrived with a bad backlink profile and an algorithmic penalty. We knew this, but underestimated the effort involved in removing it. We researched great forum posts like http://moz.com/community/q/google-penguin-2-0-how-to-recover and http://moz.com/community/q/penguin-2-1-how-to-recover, and great blog posts like http://moz.com/ugc/what-a-penguin-recovery-looks-like, http://moz.com/ugc/recovery-from-google-penguin-tips-from-the-trenches and http://moz.com/ugc/a-theory-for-preventing-recovering-from-a-google-penguin-penalty. Outside of Moz, we researched a lot as well. We concluded that we needed to do three major things: remove all of the bad backlinks, create good content on the site, and fix any unnatural on-page SEO tactics (keyword stuffing, etc.). Here is how we tackled it step by step.

Step 1: We contacted over 100 of the bad backlink sites. Many of them wanted a fee for removing the backlinks; they were from sites that were literally like "freeseobacklinks.org". Crazy bad ones. We only got a few removed; the rest either ignored us or wanted money. Hence our rounds of disavow. Our SEO manager included only 50 domains in the first disavow. She was extremely thorough, followed the guidelines to a T, and submitted it. We actually fell further in the rankings afterward, even though I didn't think that was possible. With nothing to lose besides lots of time and budget, we went through thousands of links and manually compiled an extensive spreadsheet for our next round. Again, there was little to no response from site owners, so we went ahead with nearly 300 domains in the disavow. By this time the site was in the abyss, so it couldn't hurt any more. We kept all of the great links, of which there were surprisingly a fair number.

Step 2: Our SEO manager and our content writer began writing for the website. Our graphic designer created an awesome infographic, and a good SlideShare too. We've been putting 3-4 articles/posts on the site monthly; the typical word count is 750+.

Step 3: We did a full site analysis and removed all unnatural location-based keywords. There wasn't a ton of unnatural on-page SEO going on; the bulk of the damage must have come from the bad backlinks.

Summary: We have been doing all of this for at least 6 months. All of the pages hit by the penalty are just gone: nowhere to be found on Google unless you search with the site: operator or for that exact page. We seem to be making zero headway. I'm not sure what else we can do. We even optimized for conversions, longer time on site, and page speed. We've confirmed that there is no manual penalty. I'm starting to feel as if the site has been permanently deemed bad or something, and I don't want to keep wasting our writer's and manager's time on this one. Any ideas on next steps? Can anyone restore my confidence in this site? Thanks for the long read and any response. Have a great day!

White Hat / Black Hat SEO | Boogily
-
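For disavow rounds like those described in Step 1, the file Google's disavow tool accepts is plain text: one `domain:` line per whole domain, a bare URL for a single page, and `#` for comment lines. A small sketch that turns a spreadsheet's domain column into that format (the function name and the example domains are hypothetical):

```python
def build_disavow(domains, urls=(), note=""):
    """Render Google's disavow-file format: '#' comment lines,
    'domain:example.com' lines for whole domains, bare URLs for
    individual pages."""
    lines = []
    if note:
        lines.append("# " + note)
    # De-duplicate and sort so successive rounds diff cleanly.
    for d in sorted(set(domains)):
        lines.append("domain:" + d)
    for u in urls:
        lines.append(u)
    return "\n".join(lines) + "\n"

# Example (hypothetical spam domains):
# with open("disavow.txt", "w") as f:
#     f.write(build_disavow(
#         ["freeseobacklinks.org", "spamlinks.example"],
#         note="Round 2: contacted owners, no response"))
```

Keeping each round's file generated from the same spreadsheet also makes it easy to verify that the nearly-300-domain round fully contains the earlier 50-domain round, since an upload replaces the previous file rather than adding to it.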
Looking for a Way to Standardize Content for Thousands of Pages w/o Getting Duplicate Content Penalties
Hi all, I'll premise this by saying that we like to engage in as much white-hat SEO as possible. I'm certainly not asking for any shady advice, but we have a lot of local pages to optimize :). We are an IT and management training course provider with 34 locations across the US, and each location offers the same courses. Each location has its own page on our website. However, in order to really hone the local SEO game by course topic area and city, we are creating dynamic custom pages that list our course offerings and dates for each individual topic and city. Right now these pages are dynamic and being crawled and ranking well within Google. We ran a very small-scale test in our Washington, DC and New York areas with our SharePoint course offerings and it was a great success: we are ranking well on "sharepoint training in new york/dc" etc. for two custom pages.

With 34 locations across the states and 21 course topic areas, that's well over 700 pages of content to maintain, a lot more than the two we tested. Our engineers have offered to create a standard title tag, meta description, h1, h2, etc., with some varying components. From our engineer specifically: "Regarding pages with the specific topic areas, do you have a specific format for the meta description and the custom paragraph? Since these are dynamic pages, it would work better and be a lot easier to maintain if we could standardize a format that all the pages would use for the meta and paragraph. For example, we could make the paragraph: 'Our [Topic Area] training is easy to find in the [City, State] area.' As a note, other content such as directions and course dates will always vary from city to city, so content won't be the same everywhere, just slightly the same. It works better this way because HTFU is actually a single page, and we just pass the venue code to the page to dynamically build it based on that code. So they aren't technically individual pages, although they seem like that on the web. If we don't standardize the text, then someone will have to maintain custom text for all active venue codes for all cities for all topics; that could be over a thousand records to maintain, depending on what you want customized. Another option is to have several standardized paragraphs, such as 'Our [Topic Area] training is easy to find in the [City, State] area.' or 'Find your [Topic Area] training course in [City, State] with ease.', each followed by other content specific to the location, and randomize what is displayed. The key is to have a standardized format so additional work doesn't have to be done to maintain custom formats/text for individual pages."

So, Mozzers, my question to you all is: can we standardize with slight variations specific to each location and topic area without getting dinged for spam or duplicate content? Often I ask myself, "If Matt Cutts were standing here, would he approve?" For this I'm leaning towards "yes," but I always need a gut check. Sorry for the long message; hopefully someone can help. Thank you! Pedram

White Hat / Black Hat SEO | CSawatzky
-
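The "several standardized paragraphs, randomized" idea the engineer describes can be sketched in a few lines. One caveat worth building in: pick the variant deterministically from the venue code rather than with a true random call, so a given page renders the same paragraph on every crawl instead of looking like it shuffles content. The template strings below are the ones quoted in the question; the venue code and helper name are hypothetical:

```python
import zlib

TEMPLATES = [
    "Our {topic} training is easy to find in the {city}, {state} area.",
    "Find your {topic} training course in {city}, {state} with ease.",
]

def intro_paragraph(venue_code, topic, city, state):
    """Pick a template deterministically per (venue, topic), so the
    same dynamic page always renders the same paragraph across crawls.
    crc32 is used instead of hash(), which is salted per process."""
    key = f"{venue_code}:{topic}".encode()
    idx = zlib.crc32(key) % len(TEMPLATES)
    return TEMPLATES[idx].format(topic=topic, city=city, state=state)

# Example (hypothetical venue code):
# intro_paragraph("NYC01", "SharePoint", "New York", "NY")
```

With 34 locations and 21 topics, even two or three templates plus genuinely local content (directions, course dates) spreads the repetition thin enough that no two pages read identically.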
Moving content to a clean URL
Greetings. My site was seriously punished in the recent Penguin update. I foolishly had some bad, outsourced spammy links built, and I am now paying for it 😞 I am now thinking it best to start fresh on a new URL, but I am wondering if I can use the content from the flagged site on the new URL. Would this be flagged as duplicate content, even if I took the old site down? Your help is greatly appreciated. Silas
White Hat / Black Hat SEO | Silasrose
-
Preparing for Penguin: Remove, Disavow, or change to branded
For someone who has 80 root domains pointing to their domain, 10 of which are sitewide backlinks from 10 PR4+ sites, all paid for and all with the same main-keyword anchor text: should I advise him to remove the links, disavow the links, disavow then remove, or just change the 10 sitewide links to branded anchor text? Another option is to keep just one link (preferably editorial) from each site. The only reason not to pull them off right away is that the client could not sustain his business with a drop in sales; these are by far his strongest 10 root domains. Eventually, when he has enough good backlinks, these are all coming off. There was a huge drop in sales for this site last fall, but it recovered almost completely after we fixed keyword stuffing and added e-commerce content. We're looking to keep his sales and also prepare for this year's updates.
White Hat / Black Hat SEO | BobGW
-
EXPERT CHALLENGE: What link building strategies do YOU think will work after the latest 3/29/2012 Google algorithm change?
For all SEO thought leaders: what link building strategies do YOU think will work after the latest 3/29/2012 Google algorithm change? My hope is that the responses on this thread will ultimately benefit all members of the community and give recognition to the true thought leaders in the SEO space. That said, my challenge is a two-part question:

1. With the 80/20 rule in mind, and in light of recent algorithm changes, what would YOU focus most of your SEO budget on if you had to choose? Let's assume you're in a competitive market (i.e. positions #1-5 on page 1 are held by competitors with 20,000+ backlinks, ranging from AC Rank 7 to 1). How would you split your total monthly SEO budget as a general rule? E.g. 60% link building / 10% on-site SEO / 10% social media / 20% content creation? I realize there are many "it depends" factors, but please humor us anyway.

2. Link building appears to have become harder and harder as Google releases more and more algorithm changes. The only truly white-hat ways of proactively generating links (that I know of) are creating high-quality content that adds value for customers (e.g. infographics, videos, etc.), guest blogging, and press releases. The con of these tactics is that you are waiting for others to find and pick up your content, which can take a VERY long time, so ROI is difficult to measure and justify to clients or C-level management. That being said, how are YOU allocating your link building budget? Are all of the following proactive link building tactics a waste of time now? I've heard it couldn't hurt to still do some of these, but what are your thoughts, and what is or isn't working for you?

A. Using spun articles edited by US-based writers for guest blog content
B. 301 redirects
C. Social bookmarking
D. Signature links from blog commenting
E. Directory submissions
F. Video submissions
G. Article directory submissions
H. Press release directory submissions
I. Forum profile submissions
J. Forum signature links
K. RSS feed submissions
L. Link wheels
M. Building links (using ScrapeBox, SEnukeX, etc.) to pages linked to your money site
N. Links from privately owned networks (I spoke to an SEO company that claims to have over 4,000 unique domains which it uses to boost rankings for its clients)
O. Buying contextual text links

All expert opinions are welcomed and appreciated 🙂

White Hat / Black Hat SEO | seoeric
-
White Hat / Black Hat SEO | | seoeric2 -
Anchor text penalty doesn't work?!
How do you think the anchor-text penalty actually works? Keyword domains obviously can't over-optimize for their main keyword (for example, notebook.com for the keyword "notebook"), yet a lot of non-keyword domains do optimize, especially in the beginning, for their main keyword to get a good ranking in Google (and it always works). Is there a particular point (number of links), while optimizing for one keyword, after which I'm going to get a penalty?
White Hat / Black Hat SEO | TheLastSeo