How much content is it safe to change?
-
I have read that it is unsafe to change more than 20% of your site’s content in any update.
The rationale is that "Changing too much at once can flag your site within the Google algorithm as having something suspicious going on."
Is this true? Has anyone had direct experience of this or something similar?
-
I would have absolutely no concerns over changing your content, as long as you are confident you are changing it for the better.
-
I definitely don't think there's any truth to your site possibly being penalized merely for changing a certain percentage of the site.
However:
- In general, it's always good to change your site in slow increments wherever possible.
- This allows search engines to adjust gradually to your changes in URL structure, content, or site design. Your site wouldn't be "penalized" for it, but since search engines have to recrawl your site and adjust to your changes, making sure you don't shock the system by changing too much at once is certainly never a bad idea.
-
Sorry, I should have been clearer: it's when the text changes as part of a redesign.
Still, it seems odd to me.
-
Haven't heard anything like that before. There are plenty of sites that change more than 20% of a page's content often (news sites, forums, search pages) and aren't penalized.
The only thing you should worry about is that if you change more than 20% of the content, Google will recrawl and analyze the text and adjust its relevance for keyword searches accordingly. For example, if a certain paragraph is bringing you a lot of search traffic and you remove it when you update the page, your rankings will drop because that keyword is no longer on the page.
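Since the thread keeps coming back to a "20% of the content" threshold, it can help to actually measure how much of a page's text a redesign changes rather than guess. This is a minimal sketch (not anything Google publishes) using a word-level diff as a rough proxy for percent change:

```python
from difflib import SequenceMatcher

def percent_changed(old_text: str, new_text: str) -> float:
    """Rough estimate of how much of a page's visible text changed
    between two versions, using a word-level similarity ratio."""
    ratio = SequenceMatcher(None, old_text.split(), new_text.split()).ratio()
    return (1 - ratio) * 100

old = "Our widgets are durable and affordable. Free shipping on all orders."
new = "Our widgets are durable and affordable. Next-day delivery available."
print(f"{percent_changed(old, new):.0f}% of the text changed")
# prints "40% of the text changed"
```

Running this against the old and new copy of each page before you publish a redesign gives you a concrete number to reason about, instead of an impression of "a lot changed."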