Will we get de-indexed for changing some content and tags frequently? What is the risk in 2017?
-
Hi all,
We are making changes to some paragraphs and tags on our website, working in our main keywords. I'm wondering whether this could get us de-indexed from Google. We recently dropped in rankings after adding some new content, so I'm worried that further changes, like editing the content again, could be even riskier. There are many reasons a website gets de-indexed from Google, but we don't employ any such black-hat techniques. Our website has a good reputation, with thousands of visits from direct traffic and organic search. Still, I'm curious: what are the chances of getting de-indexed under Google's current practices?
Thanks
-
Google's goal is to have the most relevant information possible, so improving and updating your content won't get you deindexed—quite the opposite.
When you say you dropped in rankings when you added new content, what do you mean? The pages you changed dropped down? Or dropped out of Google's index?
It is hard to say why that happened without seeing what changed, but if you are sure your changes made the pages better for the queries they were ranking for, they will likely recover.
In the meantime, be sure you didn't change something you didn't intend to change like the index/noindex status of the page. (It seems obvious, but sometimes these things can just slip through!)
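If you want to sanity-check the index/noindex status of changed pages at scale, you can parse each page's HTML for a robots meta tag. A minimal sketch using only Python's standard library (the function names and sample markup are illustrative, not from any particular tool):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the directives from any <meta name="robots"> tags."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.directives.extend(
                d.strip().lower() for d in attrs.get("content", "").split(",")
            )

def is_noindexed(html: str) -> bool:
    """True if the page's robots meta tag contains noindex (or none)."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return "noindex" in parser.directives or "none" in parser.directives

page = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
print(is_noindexed(page))  # True
```

Run something like this over the URLs you edited and you will spot an accidental noindex long before it shows up as a ranking drop.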
-
Hi
You will not be de-indexed from Google for updating content, so don't worry about that! Updating content is good for your SEO, as long as you keep making the content better than it was.
Related Questions
-
My Website is getting too many DMCA Hits
My website has been getting too many DMCA hits since last December, and my rankings dropped. I would like to know: would getting a new domain be advisable? Would it be good to redirect the website that is getting DMCA hits to the new domain? And is it advisable to build links to the new domain directly, or would the redirect pass link juice to it (the old site has some spammy links, though)?
White Hat / Black Hat SEO | emmycircle
-
Canonical Tag on Each Page With Same Page URL: Is It Harmful for SEO or Not?
Hi. I have an e-commerce project, and every page carries a canonical tag pointing to its own URL (a self-referencing canonical on the original page; there is no duplicate page). The URLs of my website look like this: "https://www.website.com/products/produt1"
White Hat / Black Hat SEO | HuptechWebseo
and the site has a canonical tag like this on each page: <link rel="canonical" href="https://www.website.com/products/produt1" /> This occurs on every product page as well as every other page of my website. Now, my question is: is this harmful for SEO, and should I remove these tags from all pages? Or is there any benefit to using a canonical tag that points to the same (original) URL?
-
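On the canonical question above: a self-referencing canonical is generally considered harmless, and often recommended. If you want to verify programmatically that each page's canonical really points at itself, here is a hedged sketch using Python's standard library (the URL and markup are illustrative):

```python
from html.parser import HTMLParser
from urllib.parse import urlsplit

class CanonicalParser(HTMLParser):
    """Records the href of the first <link rel="canonical"> tag."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if (tag == "link" and attrs.get("rel", "").lower() == "canonical"
                and self.canonical is None):
            self.canonical = attrs.get("href")

def is_self_referencing(page_url: str, html: str) -> bool:
    """True if the page's canonical tag points back at the page itself."""
    parser = CanonicalParser()
    parser.feed(html)
    if parser.canonical is None:
        return False
    # Compare scheme, host, and path; ignore trailing-slash-only differences.
    a, b = urlsplit(page_url), urlsplit(parser.canonical)
    return (a.scheme, a.netloc, a.path.rstrip("/")) == (b.scheme, b.netloc, b.path.rstrip("/"))

html = '<head><link rel="canonical" href="https://www.website.com/products/produt1"></head>'
print(is_self_referencing("https://www.website.com/products/produt1", html))  # True
```

A crawl that runs this check across the site will also catch the genuinely harmful case: a canonical that points at the wrong page.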
How do I get more video reviews on a new site?
Hi, I'm working on a site that has video reviews of various places. The videos cover general information and experiences that most people possess, and production-wise they are selfies. These videos are then transcribed and, voila... searchable content. My problem is this: how do I get large numbers of people to go to the trouble of making a 2-minute selfie? I thought about HARO, since one could work in a plug for something, but they have a site-traffic threshold that this new site hasn't reached. Any and all ideas on how to efficiently generate this content for a new site with very little traffic would be appreciated. Thanks!
White Hat / Black Hat SEO | 94501
-
What tools do you use to find scraped content?
This hasn’t been an issue for our company so far, but I like to be proactive. What tools do you use to find sites that may have scraped your content? Looking forward to your suggestions. Vic
White Hat / Black Hat SEO | VicMarcusNWI
-
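Beyond dedicated services like Copyscape, a rough DIY check for the scraped-content question above is to compare your text against a suspect page using word shingles. A minimal sketch (function names are my own, not any tool's API):

```python
def shingles(text: str, k: int = 5) -> set:
    """Set of k-word shingles from lowercased, whitespace-split text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(max(len(words) - k + 1, 0))}

def jaccard_similarity(a: str, b: str, k: int = 5) -> float:
    """Jaccard similarity of the two texts' shingle sets (1.0 = identical wording)."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

original = "what tools do you use to find sites that may have scraped your content"
suspect = "what tools do you use to find sites that may have scraped your content"
print(jaccard_similarity(original, suspect))  # 1.0
```

In practice you would search Google for a few exact sentences from your pages, fetch the candidate URLs, and flag anything with a high similarity score.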
What EMD Meta Title should we use and what about getting links to the same C-Block IP?
Situation: Recently I encountered two problems with both internal and external SEO for my company websites.
White Hat / Black Hat SEO | TT_Vakantiehuizen
This Dutch company has four websites on one server: three closely related EMD (Exact Match Domain) websites and one overarching website (holiday home rental websites).

Vakantiehuizen-Verhuur.nl (overarching)
Vakantiehuizen-Frankrijk.nl (EMD)
Vakantiehuizen-Italie.nl (EMD)
Vakantiehuizen-Spanje.nl (EMD)

Question 1:
What would be a preferable meta title for the EMD websites (homepage/subpages)? Keep in mind that the domains are EMDs. The homepage will target the most important keywords and should not compete with the subpages.

Options for the homepage:
1. Vakantiehuizen Frankrijk | Alle vakantiehuizen in Frankrijk op een rij!
2. Vakantiehuizen Frankrijk | Vakantiehuizen-Frankrijk.nl onderdeel van Vakantiehuizen-Verhuur.nl
3. Suggestions?

Options for the subpages:
1. Vakantiehuis Normandie | Vakantiehuizen Frankrijk
2. Vakantiehuis Normandie | Vakantiehuizen-Frankrijk.nl
3. Suggestions?

And concerning the keywords at the beginning: is it wise to use both plural and singular terms in the meta title? For example:
Hotel New York. Best hotels in New York | Company Name

Question 2: Many SEOs state that getting (too many) links from the same C-block IP is bad practice and should be avoided. Does this also apply if one website links out to different websites on the same C-block IP? That is, websites A, B and C (on the same server) linking to website D (on a different server) could be seen as spam, but is it the same when website D links to websites A, B and C?
-
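On the C-block part of Question 2: a "C-block" here just means the first three octets of an IPv4 address, so sites on the same server almost always share one. A small sketch of how a link auditor might group linking domains by C-block (the domains and IPs below are made-up documentation addresses, not real data):

```python
from collections import defaultdict
from ipaddress import IPv4Address

def c_block(ip: str) -> str:
    """First three octets of an IPv4 address (the so-called C-block)."""
    octets = str(IPv4Address(ip)).split(".")
    return ".".join(octets[:3])

def group_by_c_block(domain_ips: dict) -> dict:
    """Group domains that resolve to the same C-block."""
    groups = defaultdict(list)
    for domain, ip in domain_ips.items():
        groups[c_block(ip)].append(domain)
    return dict(groups)

sites = {
    "vakantiehuizen-verhuur.nl": "203.0.113.10",
    "vakantiehuizen-frankrijk.nl": "203.0.113.11",
    "vakantiehuizen-italie.nl": "203.0.113.12",
    "example-partner.nl": "198.51.100.7",
}
print(group_by_c_block(sites))
```

If most of a site's inbound links collapse into one or two C-blocks, that is the pattern SEOs worry about; outbound links from one server to many different C-blocks are far less of a concern.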
Looking for a Way to Standardize Content for Thousands of Pages w/o Getting Duplicate Content Penalties
Hi All, I'll preface this by saying that we like to engage in as much white-hat SEO as possible. I'm certainly not asking for any shady advice, but we have a lot of local pages to optimize :).

So, we are an IT and management training course provider. We have 34 locations across the US, and each of our 34 locations offers the same courses. Each location has its own page on our website. However, in order to really hone the local SEO game by course topic area and city, we are creating dynamic custom pages that list our course offerings/dates for each individual topic and city. Right now, our pages are dynamic and are being crawled and ranking well within Google. We ran a very small-scale test of this in our Washington DC and New York areas with our SharePoint course offerings, and it was a great success: we are ranking well on "sharepoint training in new york/dc" etc. for two custom pages.

So, with 34 locations across the states and 21 course topic areas, that's well over 700 pages of content to maintain, a LOT more than just the two we tested. Our engineers have offered to create a standard title tag, meta description, h1, h2, etc., but with some varying components. This is from our engineer specifically:

"Regarding pages with the specific topic areas, do you have a specific format for the Meta Description and the Custom Paragraph? Since these are dynamic pages, it would work better and be a lot easier to maintain if we could standardize a format that all the pages would use for the Meta and Paragraph. For example, if we made the Paragraph: 'Our [Topic Area] training is easy to find in the [City, State] area.' As a note, other content such as directions and course dates will always vary from city to city, so content won't be the same everywhere, just slightly the same. It works better this way because HTFU is actually a single page, and we are just passing the venue code to the page to dynamically build the page based on that venue code. So they aren't technically individual pages, although they seem like that on the web. If we don't standardize the text, then someone will have to maintain custom text for all active venue codes for all cities for all topics. So you could be talking about over a thousand records to maintain, depending on what you want customized. Another option is to have several standardized paragraphs, such as:

'Our [Topic Area] training is easy to find in the [City, State] area.' Followed by other content specific to the location.
'Find your [Topic Area] training course in [City, State] with ease.' Followed by other content specific to the location.

Then we could randomize what is displayed. The key is to have a standardized format so additional work doesn't have to be done to maintain custom formats/text for individual pages."

So, Mozzers, my question to you all is: can we standardize with slight variations specific to each location and topic area without getting dinged for spam or duplicate content? Often I ask myself, "If Matt Cutts were standing here, would he approve?" For this, I am leaning towards "yes," but I always need a gut check. Sorry for the long message. Hopefully someone can help. Thank you! Pedram
White Hat / Black Hat SEO | CSawatzky
-
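One practical way to implement the engineer's "several standardized paragraphs" idea while keeping each page stable between crawls is to pick the template deterministically per location rather than truly at random. A hedged sketch (templates adapted from the question; the helper name and third template are hypothetical):

```python
import zlib

TEMPLATES = [
    "Our {topic} training is easy to find in the {city}, {state} area.",
    "Find your {topic} training course in {city}, {state} with ease.",
    "Looking for {topic} training? Our {city}, {state} center runs regular courses.",
]

def intro_paragraph(topic: str, city: str, state: str) -> str:
    """Pick a template deterministically per (topic, city, state) so the
    page text does not change on every render or crawl."""
    key = f"{topic}|{city}|{state}".encode("utf-8")
    index = zlib.crc32(key) % len(TEMPLATES)
    return TEMPLATES[index].format(topic=topic, city=city, state=state)

print(intro_paragraph("SharePoint", "New York", "NY"))
```

Using a checksum of the location key (rather than `random.choice`) means a given page always renders the same paragraph, which avoids the crawl-to-crawl content churn that true randomization would cause.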
Rel Noindex Nofollow tag vs meta noindex nofollow
Hi Mozzers, I have something I was pondering this morning and would love to hear your opinion on it.

We had a bit of an issue on our client's website at the beginning of the year. I tried to work around it using wildcards in my robots.txt, but because different search engines treat wildcards differently it didn't work out so well, and only some search engines understood what I was trying to do.

So here goes: a large number of URLs on the website carry a ?filter parameter pushed from the database. We use filters on the site so users can find what they are looking for much more easily, which results in database-driven ?filter URLs (those ugly URLs we all hate so much). What we are looking to do is implement noindex nofollow on all the internal links pointing to the ?filter parameter URLs. However, my SEO sense tells me the noindex nofollow should rather go in the meta robots of the individual ?filter parameter URLs themselves, instead of on all the internal links pointing to them. Am I right in thinking this way? (The reason we want to put it on the internal links for now is that the development company says they don't have control over the metadata of these database-driven parameter URLs.)

If I am not mistaken, noindex nofollow on the internal links could be seen as PageRank sculpting, whereas on-page meta robots noindex nofollow is more of a command, like your robots.txt. Has anyone tested this before, or does anyone have more knowledge on the finer details of noindex nofollow?

PS: canonical tags are also not doable at this point because we are still cleaning out all the parameter URLs, so roughly 70% of the URLs don't yet have an SEO-friendly URL to be canonicalized to.

Would love to hear your thoughts on this. Thanks, Chris Captivate.
White Hat / Black Hat SEO | DROIDSTERS
-
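For what it's worth, the instinct in the question above matches how the directives work: noindex belongs on the parameter pages themselves (via meta robots, or an X-Robots-Tag response header if the dev team can't touch the page templates), while nofollow on internal links mostly just discards link equity. A sketch of the per-URL decision, assuming the parameter is literally named "filter" as in the question:

```python
from urllib.parse import urlsplit, parse_qs

def robots_directives(url: str) -> str:
    """Directives a URL should carry, served via a meta robots tag or an
    X-Robots-Tag header: noindex the ?filter pages but keep them followable."""
    params = parse_qs(urlsplit(url).query)
    if "filter" in params:
        return "noindex, follow"
    return "index, follow"

print(robots_directives("https://example.com/shoes?filter=red"))  # noindex, follow
print(robots_directives("https://example.com/shoes"))             # index, follow
```

Note the "noindex, follow" choice: the filtered pages drop out of the index, but the links on them still pass equity, which sidesteps the PageRank-sculpting concern entirely.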
Indexing search results
One of our competitors indexes all searches performed by users on their site. They automatically create new pages/new URLs based on those search terms. Is this a black-hat technique? Do search engines specifically forbid this?
White Hat / Black Hat SEO | AEM131
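Google's webmaster guidelines have long advised blocking internal search result pages from crawling, so auto-generating indexable pages from user searches is risky territory. A quick sketch of how such URLs are typically blocked, checked with Python's standard-library robots.txt parser (the paths and robots.txt rule are illustrative):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rule blocking internal search result URLs.
rules = [
    "User-agent: *",
    "Disallow: /search",
]

parser = RobotFileParser()
parser.parse(rules)
print(parser.can_fetch("*", "https://example.com/search?q=red+shoes"))  # False
print(parser.can_fetch("*", "https://example.com/products/shoes"))      # True
```

If the competitor's auto-generated search pages add no unique value, they fit Google's definition of auto-generated/doorway content and can be reported or, at minimum, are a pattern not worth copying.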