Are they going to deindex everyone?
-
Looking at the over-optimisation list, it seems to cover things that pretty much everyone is doing. Are we suddenly going to find the best results actually de-indexed?
Maybe Google will shut off indexing slowly, stage by stage, so everyone has a chance to change.
What are your thoughts?
-
Alan, I like most of your responses. A fat thumbs-up for you! I don't risk getting my clients' websites penalized. Imagine me losing all that income! I had a talk with my client: I could get him to the top by doing a few shady things (spam), OR I could take the long, hard road. We took the long, hard road, based on his decision to retire in the next two years. We can't afford to get Google-smacked.
@Gareth, not everyone spams or over-optimizes. I definitely don't over-optimize; it lowers conversions because the content becomes unreadable. ACTUALLY, my sites are ranking higher now, so I am glad this is happening.
-
I would be inclined to agree with the gentlemen who have answered above me. I do not know for sure what Google will do, as I believe almost no one does, including even some who work at Google. However, I would imagine Google is more apt to tackle the people who manipulate their rankings through keyword stuffing, link farms, and so on. I do find it disturbing that someone could be penalized for having a website that is too highly optimized; however, I really doubt that will happen, and I hope it affects only the people who have practiced what Google has told us are black/grey-hat tactics. Like the gentleman above me, I consider myself an ethical businessman; I would never use any black/grey-hat techniques to improve my own or my clients' rankings, because it just doesn't make sense to do that. I would imagine that if Google were to delist everyone who has done something to make sure their on-page SEO is optimized, they would devalue the very results that everyday people count on. To put it plainly, if they did that, Google would hurt itself, because ordinary people would not get what they are looking for when they search. I hope this is of some help.
-
At the end of the day, Google is trying to catch poor websites that are ranking because they stuffed title tags, crammed keywords into filler content in a footer, or used other poor SEO tactics. If you optimized your site properly, with keyword research applied in a logical manner, I do not expect you to see an impact. The only impact you could see is the devaluing of links from poor websites to yours, with a trickle-down effect.
-
What list is this?
From all I have heard, it is spam that is getting de-indexed. I would not say everyone is spamming; I certainly do not risk my clients' sites with spam.
Related Questions
-
Where am I going wrong?
OK, so I've been trying for some time to get our site ranking. Then the other day these guys popped up from nowhere: https://www.toastsupport.co.uk. I can only see domain age as my issue at the moment, but maybe I am missing something? I mean, I don't have any backlinks just yet, but they don't seem to have any MozRank, or anything for that matter: https://wordpresswebsitemanagement.co.uk
Technical SEO | | Gavlar0 -
Top 10 keywords are still going strong but the rest just got smashed!
Hi SEOmoz and its users! I've been trying to find my answer online, but after three weeks of reading blog posts I'm going to try this. 🙂 My website was ranking really well on 10 important keywords but not so well on the long tail: between positions 11 and 50 on maybe 30 other, less important keywords.
Technical SEO | | Drillo
So I began doing some work (I'm a newbie), and this is what I did:
1. Changed the top navigation structure to get 4 other pages into it, with keywords as anchor text (used a dropdown).
2. Wrote plenty of text that was a good fit for each page. (The text is OK and not too spammy-looking.)
3. Added three links, with keywords as anchor text, to these pages from high-quality sites. I added them from my own site, which is on the same server, same IP. 😉 I know, not looking so good.
4. Changed the URL structure on a couple of pages to get a keyword into it (with a correct 301).
5. Changed to better titles and headings on the pages, with keywords in both, but not the same ones. The result:
1. My 10 most important keywords began ranking even better; I now rank no. 1 on 9 out of 10.
2. Almost all the other pages went from ranking around 15-50 to beyond 50. It has now been 4 weeks since I made most of the changes and 3 weeks since those pages dropped beyond 50. So now I'm thinking about what to do:
1. Should I clean up my text and titles so they don't look over-optimized?
2. Should I remove the links from my own pages? (My link profile in general is actually pretty good.)
3. Or should I just wait, because changing more would just indicate to Google that something fishy is going on? 😉 In the beginning I hoped that Google had killed my rankings just because of the big changes, but now, after 3 weeks, I'm more sceptical and think I've been hit by an over-optimization filter. According to Webmaster Tools I have not received a manual penalty. Please help me; I would really appreciate ideas from people here with more experience.0 -
Time to deindexing: WMT Request vs. Server not found
Google indexed some subdomains (13!) that were never supposed to exist, but that apparently returned a 200 code when Google somehow crawled them. I can get these subdomains to return a "server not found" error by turning off wildcard subdomains at my DNS. I've been told that these subdomains will be de-indexed just from this "server not found" error. I was going to use Webmaster Tools and verify each subdomain, but I'm on an economy GoDaddy server, and apparently subdomains just get forwarded to a directory, so subdomain.domain.com gets redirected to domain.com/subdomain. I'm not even sure, this being the case, whether I can get WMT to recognize and remove these subdomains like that. Should I fret about this, or will the "server not found" message get Google to remove these soon enough?
Technical SEO | | erin_soc0 -
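A quick way to confirm the wildcard-DNS change took effect is to check that the stray hostnames no longer resolve; once Googlebot hits a DNS error on recrawl, the subdomains should drop out on their own, and the WMT removal tool only speeds that up. A minimal sketch, assuming Python and hypothetical hostnames in place of the real 13 subdomains:

```python
import socket

# Hypothetical hostnames; substitute the actual stray subdomains.
subdomains = ["oops1.example.com", "oops2.example.com"]

for host in subdomains:
    try:
        socket.gethostbyname(host)  # succeeds only if DNS still resolves it
        print(host, "still resolves: wildcard DNS may still be active")
    except socket.gaierror:
        print(host, "no longer resolves: Googlebot will see a DNS error here")
```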
How do I find which pages are being deindexed on a large site?
Is there an easy way or any way to get a list of all deindexed pages? Thanks for reading!
Technical SEO | | DA20130 -
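There is no direct "deindexed pages" report, but one workaround is to diff the URLs you publish against the URLs Google still reports as indexed. A minimal sketch, assuming Python, a standard sitemap.xml, and a hypothetical indexed.txt file (one indexed URL per line, e.g. collected from site: queries or an index-status export):

```python
import xml.etree.ElementTree as ET

# sitemap.xml is your published sitemap; indexed.txt is a hypothetical
# one-URL-per-line list of pages Google currently shows as indexed.
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

sitemap_urls = {
    loc.text.strip()
    for loc in ET.parse("sitemap.xml").getroot().iter(NS + "loc")
}
indexed_urls = {line.strip() for line in open("indexed.txt") if line.strip()}

# Anything published but not indexed is a candidate deindexed page.
for url in sorted(sitemap_urls - indexed_urls):
    print(url)
```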
Is Go Daddy a bad domain?
I heard today that GoDaddy is not the best hosting provider for websites... that sites hosted there aren't crawled well by search engines. Is this true? What is the best host?
Technical SEO | | CapitolShine0 -
Lots of Domains Going Nowhere - Point to a Real Domain?
I have hundreds of domains that I've purchased over the years that aren't going anywhere except GoDaddy's Cash Parking system, which returns very little revenue, if any. I wonder if it would make more sense to just point these domains at the actual e-commerce sites that I own. If so, how best to point them so that SEO credit is given properly? Most of these domains don't have anything to do with the e-commerce stores, so I'm not sure it would help. Furthermore, if I were to purchase new domains more relevant to our e-commerce sites' keywords, how best to set them up so we can generate traffic on them and point it over to the actual domains? Many thanks.
Technical SEO | | findachristianjob0 -
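If the goal is simply to stop parking them, a whole-domain 301 is the usual mechanism, though a redirect from an unrelated parked domain passes little or no SEO credit and mainly captures type-in traffic. A minimal sketch, assuming an Apache server you control and hypothetical domain names:

```apache
# Hypothetical domains: parked-example.com is one of the parked names,
# my-store-example.com is the e-commerce site it should point to.
<VirtualHost *:80>
    ServerName parked-example.com
    ServerAlias www.parked-example.com
    # Send every request on this domain to the store with a 301.
    Redirect 301 / https://www.my-store-example.com/
</VirtualHost>
```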
I have a site that has both http:// and https:// versions indexed, e.g. https://www.homepage.com/ and http://www.homepage.com/. How do I de-index the https:// versions without losing the link juice that is going to the https://homepage.com/ pages?
I can't 301 https:// to http://, since there are some form pages that need to be https://. The site has 20,000+ pages, so individually 301ing each page would be a nightmare. Any suggestions would be greatly appreciated.
Technical SEO | | fthead90 -
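One pattern that fits this case is a blanket https-to-http 301 that carves out the pages that must stay secure, so none of the 20,000 URLs need individual rules. A minimal .htaccess sketch, assuming Apache with mod_rewrite and a hypothetical /forms/ path for the secure pages:

```apache
RewriteEngine On
# Only act on requests that arrived over SSL...
RewriteCond %{HTTPS} on
# ...and are not one of the pages that must remain secure.
RewriteCond %{REQUEST_URI} !^/forms/
# 301 everything else to its http:// twin, keeping path and query string.
RewriteRule ^(.*)$ http://www.homepage.com/$1 [R=301,L]
```

Because these are 301s, the link juice pointing at the https:// URLs consolidates onto the http:// versions; a rel=canonical from each https:// page to its http:// twin is a gentler alternative if a redirect is ever unworkable.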
Is having "rel=canonical" on the same page it is pointing to going to hurt search?
I like the rel=canonical tag, and I've seen Matt Cutts's posts from Google about it. For the site I'm working on it's a great workaround, because we often have two identical or nearly identical versions of pages: one for patients, one for doctors. The problem is this: the way our content management system is set up, certain pages are linked up in a number of places, and when we publish, two different versions of the page are created with the same content. Because they are both built from the same content template, if I put in the rel=canonical tag, both pages get it. So if I have http://www.myhospital.com/patient-condition.asp and http://www.myhospital.com/professional-condition.asp, both produced from the same template with the same content, and I'm trying to point search at http://www.myhospital.com/patient-condition.asp, the tag appears on both pages alike. Also, we have various forms, and we like to know where on the site people are coming from when they use those forms. To the bots it looks like there are 600 versions of particular pages, so again, rel=canonical is great. However, because it's actually all the same page, just a link with a variable tacked on (http://www.myhospital.com/makeanappointment.asp?id=211), the rel=canonical tag will appear on "all" of them. Any insight is most appreciated! Thanks! Brett
Technical SEO | | brett_hss0
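For what it's worth, Google has said a self-referencing canonical does no harm, so the tag appearing on the page it points to is fine. The thing to avoid in a shared-template setup is a canonical that points at whichever page happens to be rendering; it should name the preferred variant explicitly. A sketch using the URLs from the question:

```html
<!-- Emitted by the shared template on BOTH the patient and professional
     pages: the href names the preferred variant, not the rendering page. -->
<link rel="canonical" href="http://www.myhospital.com/patient-condition.asp" />
```

The same idea handles the form URLs: a canonical on makeanappointment.asp pointing at the bare URL (without ?id=) keeps the tracking parameter for analytics while collapsing the hundreds of variants in the index.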