Will we get de-indexed for changing some content and tags frequently? How big is the risk in 2017?
-
Hi all,
We are making changes to some paragraphs and tags on our website, incorporating our main keywords. I'm wondering whether this could get us de-indexed from Google. We recently dropped in rankings after adding some new content, so I'm worried that further changes to the content could be even riskier. There are many reasons a website can be de-indexed from Google, but we don't employ any black-hat techniques. Our website has built a good reputation, with thousands of visits from direct traffic and organic search. Still, I'm curious: what are the chances of getting de-indexed, given the current trends at Google?
Thanks
-
Google's goal is to have the most relevant information possible, so improving and updating your content won't get you deindexed—quite the opposite.
When you say you dropped in rankings when you added new content, what do you mean? The pages you changed dropped down? Or dropped out of Google's index?
It is hard to say why that happened without seeing what changed, but if you're confident your changes made the pages better for the queries they were ranking for, they will likely recover.
In the meantime, make sure you didn't change something unintentionally, such as the page's index/noindex status. (It seems obvious, but these things can slip through!)
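A quick way to double-check that is to fetch the page's HTML and look for a noindex directive in the meta robots tag. Here is a minimal sketch using only Python's standard library; the sample page at the bottom is made up for illustration:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tags on a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            self.directives.append((attrs.get("content") or "").lower())

def is_noindexed(html):
    """Return True if any meta robots tag on the page contains 'noindex'."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in d for d in parser.directives)

# Made-up sample page for illustration:
page = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
print(is_noindexed(page))  # True
```

Note that a noindex can also arrive as an `X-Robots-Tag` HTTP response header rather than a meta tag, so it's worth checking the response headers too.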
-
Hi
You will not be de-indexed from Google for updating content, so don't worry about that! Updating content is good for your SEO, as long as each revision makes the content better than it was.
Related Questions
-
What to do with internal spam URLs Google indexed?
I have been in SEO for years but have never run into this problem. A client's web page was hacked, and hundreds of links were posted; these links have been indexed by Google. They are not in comments but are normal external URLs (see picture). What is the best way to remove them: use Google's disavow tool, or just redirect them to some page? The site is new, but it ranks well on Google and has a domain authority of 24. I suspect these spam URLs actually improved rankings too 🙂 What would be the best strategy to solve this? Thanks.
White Hat / Black Hat SEO | AndrisZigurs
-
Does Google crawl and index dynamic pages?
I've linked a (static) category page from my homepage and linked a (dynamic) product page from the category page. I crawled my website starting from the homepage URL using Screaming Frog with Googlebot 2.1 as the user agent. Based on the results, it can crawl the product page even though it is dynamic. Here's a sample product page, which is dynamic (we use product IDs instead of keyword-rich URLs for consistency): http://domain.com/AB1234567. Here's a sample category page: http://domain.com/city/area. My full question: does the Screaming Frog result mean Google will properly crawl and index the product pages even though they are dynamic?
White Hat / Black Hat SEO | esiow2013
-
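One way to sanity-check what the question above describes is to extract the links from the category page's HTML and confirm the product URL is discoverable, which is essentially what a non-rendering crawler does. A minimal sketch, with made-up HTML standing in for the real category page:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects absolute URLs from every <a href> on a page."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(urljoin(self.base_url, href))

# Made-up category page HTML; the product link uses an ID-based (dynamic) URL.
category_html = '<html><body><a href="/AB1234567">Listing</a></body></html>'
extractor = LinkExtractor("http://domain.com/city/area")
extractor.feed(category_html)
print(extractor.links)  # ['http://domain.com/AB1234567']
```

The usual gotcha with dynamic pages is not the URL shape but rendering: if the product links are only injected by JavaScript, a plain HTML parse like this (or a non-rendering crawl) will miss them.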
Is one of my competitors trying to get my site penalized?
Hi guys, I have been ranking #2 for a popular search term for several months now, and today I noticed a drop to #5, so I went to check my backlink profile. I'm seeing thousands of nofollow, exact-match-keyword backlinks, all from spammy-looking websites. I looked at some of the links and they do point to me, but I didn't generate them, and I have never paid anyone externally to build links for me. What is the best course of action here: the link disavow tool?
White Hat / Black Hat SEO | davegill
-
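If disavowing turns out to be the route taken, the file Google expects is plain text with one entry per line: a `domain:` prefix to disavow a whole domain, a bare URL for a single page, and `#` lines as comments. A small helper to assemble one (the domains below are invented examples):

```python
def build_disavow_file(domains, urls=()):
    """Build the text of a Google disavow file: one entry per line,
    'domain:' prefix for whole domains, bare URLs for single pages."""
    lines = ["# Spammy backlinks we did not create"]
    lines += [f"domain:{d}" for d in domains]
    lines += list(urls)
    return "\n".join(lines) + "\n"

text = build_disavow_file(
    ["spammy-example-1.com", "spammy-example-2.net"],
    ["http://spammy-example-3.org/page.html"],
)
print(text)
```

The resulting text file is what gets uploaded via Search Console's disavow tool; the helper itself is just a convenience sketch.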
Noindexing Thin Content Pages: Good or Bad?
If you have a massive number of pages with very thin content (such as pagination pages) and you noindex them, once they are removed from Google's index (and if these pages aren't viewable to the user and/or don't get any traffic), is it smart to remove them completely (404), or is there a valid reason to keep them? If you noindex them, should you keep all the URLs in the sitemap so that Google will recrawl them and notice the noindex tag? If you noindex them and then remove them from the sitemap, can Google still recrawl the pages and recognize the noindex tag on its own?
White Hat / Black Hat SEO | WebServiceConsulting.com
-
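On the sitemap part of the question above: temporarily keeping the noindexed URLs in the sitemap is a common way to encourage a recrawl so the noindex tag is actually seen. A minimal sketch that builds sitemap XML with the standard library (the URLs are placeholders):

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build sitemap XML for the given URLs (including pages that are
    temporarily kept listed so their new noindex tag gets recrawled)."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
        loc.text = url
    return ET.tostring(urlset, encoding="unicode")

# Placeholder URLs: the paginated page is the one carrying the noindex.
xml = build_sitemap(["http://example.com/", "http://example.com/page/2"])
print(xml)
```

Once Search Console shows the pages are out of the index, the entries can be dropped from the sitemap (or the pages 404ed, if they serve no users).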
Am I getting backlink benefits from sites I design and host?
I own and host over 300 domains for as many businesses. They all link back to my site from every page, but SEOmoz shows only a hundred, and so do other SEO tools. Why is that?
White Hat / Black Hat SEO | nooptee
-
Rel Noindex Nofollow tag vs meta noindex nofollow
Hi Mozzers, here's something I was pondering this morning, and I would love to hear your opinions on it.

We had an issue on our client's website at the beginning of the year. I tried to work around it using wildcards in my robots.txt, but because different search engines treat wildcards differently, it didn't work out well; only some search engines understood what I was trying to do.

So here goes: we have a ?filter parameter on a large number of URLs, pushed from the database. We use filters on the site so users can find what they are looking for much more easily, which results in database-driven ?filter URLs (those ugly URLs we all hate so much).

What we are looking to do is put noindex, nofollow on all the internal links pointing to the ?filter parameter URLs. However, my SEO sense tells me the noindex, nofollow should instead go in the meta robots tag of the individual ?filter URLs, not on the internal links pointing to them. Am I right in thinking this way? (The reason we want to put it on the internal links for now is that the development company says they don't have control over the metadata of these database-driven parameter URLs.)

If I am not mistaken, noindex/nofollow on the internal links could be seen as PageRank sculpting, whereas an on-page meta robots noindex, nofollow is more of a directive, like your robots.txt. Has anyone tested this before, or does anyone have more knowledge on the finer points of noindex/nofollow?

PS: Canonical tags aren't doable at this point either, because we are still cleaning up the parameter URLs, and roughly 70% of them don't yet have an SEO-friendly URL to canonicalize to.

Would love to hear your thoughts on this. Thanks, Chris Captivate.
White Hat / Black Hat SEO | DROIDSTERS
-
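One workaround worth noting for the situation above: when the page templates can't be touched, the directive can be sent as an `X-Robots-Tag` HTTP response header instead of a meta tag, decided at the server or middleware layer. A hedged sketch of just the decision logic in Python; the parameter name `filter` matches the question, and the actual middleware hookup is left out:

```python
from urllib.parse import urlparse, parse_qs

def robots_headers_for(url):
    """Return extra response headers: noindex, nofollow for any URL
    carrying the ?filter parameter, nothing for everything else."""
    params = parse_qs(urlparse(url).query)
    if "filter" in params:
        return {"X-Robots-Tag": "noindex, nofollow"}
    return {}

print(robots_headers_for("http://example.com/shoes?filter=red"))
# {'X-Robots-Tag': 'noindex, nofollow'}
print(robots_headers_for("http://example.com/shoes"))
# {}
```

Because the header rides on the HTTP response rather than the HTML, it works even for pages whose metadata the development company can't modify, and Google treats it the same as an on-page meta robots tag.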
Using Programmatic Content
My company has been approached a number of times by computer-generated content providers (like Narrative Science and Comtex), who supply computer-generated content to a number of big-name websites. Does anyone have experience working with companies like this? We were burned by the first Panda update because we were using boilerplate forms for content.
White Hat / Black Hat SEO | SuperMikeLewis
-
Hidden H1 tag - permissible?
Until now I have been building websites either from scratch or with a template. Recently I decided to learn Adobe Dreamweaver. At the end of the first "Building a Website using Dreamweaver" lesson, the author notes the site is done but an H1 tag is missing. The instructor advises: "The page doesn't have a top-level heading (h1). The design uses the banner image instead. This looks fine in a browser, but search engines and screen readers expect pages to be organized with a proper hierarchy of headings: at the top of the page, ..." The instructor then walks readers step by step through creating an H1 tag and using absolute positioning of -500px top so that the tag is not visible. My initial thought was that the instructor was completely wrong to offer this advice, and that readers following these instructions would be banned from search engines. I had planned to contact the author and suggest the instructions be modified, but before doing so I wanted to request a bit of feedback. The banner image's text in this example is "Check Magazine: Fashion and Lifestyle", and the H1 tag that is created and positioned off-screen uses that exact same text. In an old blog comment, Matt Cutts shared: "If you're straight-out using CSS to hide text, don't be surprised if that is called spam. I'm not saying that mouseovers or DHTML text or have-a-logo-but-also-have-text is spam; I answered that last one at a conference when I said 'imagine how it would look to a visitor, a competitor, or someone checking out a spam report. If you show your company's name and it's Expo Markers instead of an Expo Markers logo, you should be fine. If the text you decide to show is "Expo Markers cheap online discount buy online Expo Markers sale …" then I would be more cautious, because that can look bad.'" I would like to get some Mozzer feedback on this topic. Do you view this technique as white hat, black hat, or grey hat?
White Hat / Black Hat SEO | RyanKent