Are they going to deindex everyone?
-
Looking at the over-optimisation list, it pretty much seems to me that these are things everyone is doing, so are we suddenly going to find the best results actually de-indexed?
Maybe Google will slowly shut off indexing stage by stage so everyone changes.
What are your thoughts?
-
Alan, I like most of your responses. A fat thumbs up for you! I won't risk getting my clients' websites penalised. Imagine me losing all that income? I had a talk with my client: I could get him to the top by doing a few shady things (spam), OR we could take the long, hard road. We took the long, hard road, based on his decision to retire in the next two years. We can't afford to get Google-smacked.
@Gareth, not everyone spams or over-optimizes. I definitely don't over-optimize; it lowers conversions because the content becomes unreadable. ACTUALLY, my sites are ranking higher now, so I am glad this is happening.
-
I would be inclined to agree with the gentlemen who have answered above me. I do not know for sure what Google will do, as I believe almost no one does, including even some who work at Google. However, I would imagine Google is more apt to tackle the people who are manipulating their rankings through keyword stuffing, link farms and so on.

The fact that someone can be penalized for having a website that is too highly optimized I find disturbing; however, I really doubt, and hope, that it only affects people who have practiced what Google has told us are black/grey-hat tactics. I consider myself an ethical businessman, and like the gentleman above me said, I would never use any of the black/grey-hat techniques to improve my own or my clients' rankings; it just doesn't make sense to do that.

If Google were to delist everyone who has done anything to optimize their on-page SEO, they would devalue the very results that everyday people count on. To put it plainly, if they did that, Google would hurt themselves, because ordinary people would not get what they are looking for when they search. I hope this is of some help.
-
At the end of the day, Google is trying to catch poor websites that are ranking because they stuffed title tags, crammed keywords into filler content in a footer, and used other poor SEO tactics. If you optimized your site properly, with keyword research applied in a logical manner, I do not expect you to see an impact. The only impact you might see is the devaluing of links from poor websites to yours, and a trickle-down effect from that.
-
What list is this?
From all I have heard, it is spam that is getting de-indexed. I would not say everyone is spamming; I certainly do not risk my clients' sites with spam.
Related Questions
-
Home Page Deindexed overnight?
Hi, hope you guys can help. I run an e-commerce site, https://alloywheels.com. Last night our home page (and a few other pages, but not all) was de-indexed by Google. The site has been ranking (UK) in P1 for the "alloy wheels" keyword for years and on the whole has been running very successfully. However, recently I have noticed some fluctuation on the "alloy wheels" keyword, dropping to P3, then P5, then back to P3, but this morning I noticed we were not even ranking on the first page. When I check inside Search Console there are no messages or warnings, but the "/" page was de-indexed. A few other key pages were also de-indexed. I have requested reindexing and they have come back, P7 for the home page for "alloy wheels". The only thing I had changed: I realised yesterday there was no robots.txt on the site, and web.dev was recommending I add one, so I did. It was just an allow-all:

User-agent: *
Disallow:

Sitemap: https://alloywheels.com/sitemap.xml

I ran tests on the robots.txt before it was uploaded and it all came back green. I have removed the robots.txt for now. Has anybody seen anything like this before? With the recent ranking fluctuation, I am not sure whether it is to do with that, the robots.txt, or something different altogether. Thanks in advance, James
Technical SEO | JamesDolden
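Worth noting that robots.txt controls crawling rather than indexing, and a one-character difference separates "allow all" from "block everything". One way to sanity-check a file offline before uploading is Python's standard urllib.robotparser; a sketch using an allow-all file like the one described (not what James actually ran):

```python
import urllib.robotparser

# The "allow all" file from the question: an empty Disallow permits everything.
ALLOW_ALL = """\
User-agent: *
Disallow:

Sitemap: https://alloywheels.com/sitemap.xml
"""

# For contrast: a one-character slip that blocks the entire site.
BLOCK_ALL = """\
User-agent: *
Disallow: /
"""

def googlebot_can_fetch(robots_txt: str, url: str) -> bool:
    """Parse a robots.txt body offline and check whether Googlebot may fetch url."""
    parser = urllib.robotparser.RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch("Googlebot", url)

print(googlebot_can_fetch(ALLOW_ALL, "https://alloywheels.com/"))  # True
print(googlebot_can_fetch(BLOCK_ALL, "https://alloywheels.com/"))  # False
```

If the file really was the allow-all version, it would not by itself explain the de-indexing, which points back at the ranking fluctuation as a separate issue.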
Stuck trying to deindex pages from Google
Hi there, we had developers put a lot of spammy markup in one of our websites. We tried many ways to deindex the pages by fixing them and requesting recrawls. However, some of the URLs that had this spammy markup were incorrect URLs that redirected to the right version (e.g. the same URL with or without a trailing /), so now all the regular URLs are updated and clean; the redirected URLs, though, can't be found in crawls, so they weren't updated and the spam was never removed. They still show up in the SERP. I tried deindexing those spammed pages by making them no-index in the robots.txt file. This seemed to be working for about a week, and now they have shown up again in the SERP. Can you help us get rid of these spammy URLs?
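One caveat on the approach described above: Google does not act on noindex directives placed in robots.txt (it formally dropped support for that unofficial rule in 2019), which would explain the pages reappearing. The supported mechanisms are a robots meta tag or an X-Robots-Tag response header, and the URL must remain crawlable so Google can actually see the directive. A sketch:

```html
<!-- Add to the <head> of each page to be dropped from the index.
     The URL must NOT be disallowed in robots.txt, or Google will
     never recrawl it and never see this tag. -->
<meta name="robots" content="noindex">
```

For URLs that only 301 to a canonical version there is no page to put a tag on; letting the redirect stand and using the URL removal tool in Search Console is usually the quicker route for those.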
Technical SEO | Ruchy
-
Google Deindexing Site, but Reindexing 301 Redirected Version
A bit of a strange one, a client's .com site has recently been losing rankings on a daily basis, but traffic has barely budged. After some investigation, I found that the .co.uk domain (which has been 301 redirected for some years) has recently been indexed by Google. According to Ahrefs the .co.uk domain started gaining some rankings in early September, which has increased daily. All of these rankings are effectively being stolen from the .com site (but due to the 301 redirect, the site loses no traffic), so as one keyword disappears from the .com's ranking, it reappears on the .co.uk's ranking report. Even searching for the brand name now brings up the .co.uk version of the domain whereas less than a week ago the brand name brought up the .com domain. The redirects are all working fine. There's no instance of any URLs on the site or in the sitemaps leading to the .co.uk domain. The .co.uk domain does not have any backlinks except for a single results page on ask.com. The site hasn't recently had any design or development done, the last changes being made in June. Has anyone encountered this before? I'm not entirely sure how or why Google would start indexing 301'd URLs after several years of not indexing these.
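One thing worth verifying in a situation like this is that the 301 really is issued at the server edge for every scheme and host variant of the .co.uk domain. A minimal sketch of a domain-level redirect, assuming an nginx front end and placeholder domains:

```nginx
# Hypothetical catch-all server block for the legacy domain: every request
# to the .co.uk hosts gets a permanent redirect to the same path on .com.
server {
    listen 80;
    listen 443 ssl;
    server_name example.co.uk www.example.co.uk;  # placeholder domains
    # ssl_certificate / ssl_certificate_key for the .co.uk hosts go here
    return 301 https://www.example.com$request_uri;
}
```

A redirected URL can still surface in the index temporarily; rel=canonical tags on the .com pages give Google a second consolidation signal in case the .co.uk hosts ever served content directly during a misconfiguration.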
Technical SEO | lyuda55
-
Where did the "Location" go, on Google SERP?
In order to emulate different locations, I've always done a Google query, then used the "Location" button under "Search Tools" at the top of the SERP to define my preferred location. It seems to have disappeared in the past few days? Anyone know where it went, or if it's gone forever? Thanks!
Technical SEO | measurableROI
-
How do I deindex url parameters
Google indexed a bunch of our URL parameters. I'm worried about duplicate content. I used the URL parameter tool in Webmaster Tools to set it so future parameters don't get indexed. What can I do to remove the ones that have already been indexed? For example, site.com/products and site.com/products?campaign=email have both been indexed as separate pages even though they are the same page. If I use a noindex, I'm worried about de-indexing the product page. What can I do to deindex just the URL parameter version? Thank you!
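Alongside the parameter tool, a rel=canonical on the parameter variants is the usual consolidation fix: both URLs serve the same page, so have that page emit a canonical pointing at the clean URL. A hedged sketch of a server-side helper (the parameter names are examples, not a definitive list):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical set of tracking parameters to strip; adjust to your site.
TRACKING_PARAMS = {"campaign", "utm_source", "utm_medium", "utm_campaign"}

def canonical_url(url: str) -> str:
    """Return the URL with known tracking parameters removed, suitable for
    a <link rel="canonical"> tag so parameter variants consolidate."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k not in TRACKING_PARAMS]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(canonical_url("https://site.com/products?campaign=email"))
# https://site.com/products
```

With the canonical in place, the indexed parameter versions consolidate into the clean URL as Google recrawls them, without the risk of noindexing the product page itself.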
Technical SEO | BT2009
-
Google has deindexed 40% of my site because it's having problems crawling it
Hi, last week I got my fifth email saying "Google can't access your site". The first one I got in early November. Since then my site has gone from almost 80k pages indexed to fewer than 45k, and the number keeps dropping even though we post about 100 new articles daily (it's an online newspaper). The site I'm talking about is http://www.gazetaexpress.com/ We have to deal with DDoS attacks most of the time, so our server guy has implemented a firewall to protect the site from these attacks. We suspect that it's the firewall that is blocking Google's bots from crawling and indexing our site. But then things get more interesting: some parts of the site are being crawled regularly and others not at all. If the firewall were stopping Google's bots from crawling the site, why are some parts of the site being crawled with no problems and others not? In the screenshot attached to this post you will see how Google Webmaster Tools is reporting these errors. In this link, it says that if the "Error" status happens again you should contact Google Webmaster support, because something is preventing Google from fetching the site. I used the feedback form in Google Webmasters to report this error about two months ago but haven't heard from them. Did I use the wrong form to contact them? If yes, how can I reach them and tell them about my problem? If you need more details, feel free to ask. I will appreciate any help. Thank you in advance.
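If the firewall is the suspect, one common fix is to whitelist genuine Googlebot by verifying crawler IPs the way Google documents: reverse DNS should resolve to a googlebot.com or google.com hostname, and a forward lookup of that hostname should return the same IP. A sketch (the resolvers are injectable so the logic can be tested without network access):

```python
import socket

def is_real_googlebot(ip, reverse_dns=socket.gethostbyaddr,
                      forward_dns=socket.gethostbyname):
    """Return True if `ip` passes Google's documented two-step check:
    reverse DNS into googlebot.com/google.com, then a confirming
    forward lookup back to the same IP."""
    try:
        host = reverse_dns(ip)[0]
    except OSError:
        return False
    if not host.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        return forward_dns(host) == ip
    except OSError:
        return False
```

A firewall rule built on this check can rate-limit or block everything else while never touching verified Googlebot traffic, which would also explain partial crawling if the current rules only trigger on some URL patterns.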
Technical SEO | Bajram.Kurtishaj
-
How best to go about creating an application?
Hi there, I work within the travel sector, and I've had an idea for an embeddable application, which would be of use to my company, but also to lots of other companies (our competitors) and general websites in our niche. The idea is that we'd get (and pay for) the application built, then allow other parties to embed it into their sites with a snippet of our code, so we get a link back from them. There are obviously some technical issues here. The app will be built with JavaScript (we can't use PHP on our web server, it's a long story!) and I'd want a way to stop others swiping the code and using it without the link to us. Is this going to be possible? Also, what's going to be the best way to get the link from them? If a competitor used it, they are less likely to do so with our company name plastered all over it, so it would need to be subtle, or an image link, or something. Not sure. Anyone done this sort of thing before? Thanks
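On the embed mechanics, the common pattern is a copy-paste snippet that loads the script from your server, so you keep control of the code and can update it centrally, with the attribution link living in the static snippet itself. A sketch with placeholder names and domains:

```html
<!-- Hypothetical embed snippet partners paste into their page. The script
     is served from your domain; the credit link below is the backlink. -->
<div id="travel-widget">
  <a href="https://www.example-travel.com/">Powered by Example Travel</a>
</div>
<script async src="https://www.example-travel.com/widget/embed.js"></script>
```

You can't truly stop someone copying client-side JavaScript, but if the widget pulls its data from your API you can check the Origin/Referer on those requests and refuse unapproved domains, which protects the part worth protecting. Keeping the credit link in the static snippet, rather than having the script inject it, also makes it more likely crawlers see the link.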
Technical SEO | neilpagecruise
-
Is having "rel=canonical" on the same page it is pointing to going to hurt search?
I like the rel=canonical tag, and I've seen Matt Cutts' posts on Google about it. For the site I'm working on, it's a great workaround because we often have two identical or nearly identical versions of pages: one for patients, one for doctors. The problem is this: the way our content management system is set up, certain pages are linked up in a number of places, and when we publish, two different versions of the page are created with the same content. Because they are both made from the same content templates, if I put in the rel=canonical tag, both pages get it.

So, if I have http://www.myhospital.com/patient-condition.asp and http://www.myhospital.com/professional-condition.asp, both produced from the same template with the same content, and I'm trying to point search at http://www.myhospital.com/patient-condition.asp, the tag appears identically on both pages. Also, we have various forms, and we like to know where on the site people are coming from when they use those forms. To the bots it looks like there are 600 versions of particular pages, so again, rel=canonical is great. However, because it's actually all the same page, just a link with a variable tacked on (http://www.myhospital.com/makeanappointment.asp?id=211), the rel=canonical tag will appear on "all" of them. Any insight is most appreciated! Thanks! Brett
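The fix usually has to happen in the template logic: compute the canonical per requested URL rather than per template, so both versions point at the patient page and the form URLs drop their tracking variable. A hypothetical sketch of that mapping, using the URLs from the question (the real implementation would live in the CMS template):

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_for(url: str) -> str:
    """Map any page variant to its canonical URL: professional pages point
    at the patient version, and the query string (e.g. ?id=211 tracking
    variables) is dropped."""
    parts = urlsplit(url)
    path = parts.path.replace("professional-", "patient-")
    return urlunsplit((parts.scheme, parts.netloc, path, "", ""))

tag = '<link rel="canonical" href="%s">' % canonical_for(
    "http://www.myhospital.com/professional-condition.asp")
print(tag)
# <link rel="canonical" href="http://www.myhospital.com/patient-condition.asp">
```

Because the canonical is derived from the requested URL, the same template emits different tags on the two versions, and the 600 tracking variants of the appointment page all consolidate while your analytics still see the original query string.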
Technical SEO | brett_hss