Are they going to deindex everyone?
-
Looking at the over-optimisation list, it seems to me that pretty much everyone is doing these things, so are we suddenly going to find the best results de-indexed?
Maybe Google will shut off indexing stage by stage so everyone changes.
What are your thoughts?
-
Alan, I like most of your responses. A fat thumbs up for you! I don't risk getting my clients' websites penalized. Imagine me losing all that income! I had a talk with my client: I could get him to the top by doing a few shady things (spam), OR we could take the long, hard road. We took the long, hard road, based on his decision to retire in the next two years. We can't afford to get Google-smacked.
@Gareth, not everyone spams or over-optimizes. I definitely don't over-optimize; it lowers conversions because the content becomes unreadable. ACTUALLY, my sites are ranking higher now, so I am glad this is happening.
-
I would be inclined to agree with the gentlemen who have answered above me. I do not know for sure what Google will do; I believe almost no one does, including even some who work at Google. However, I would imagine Google is more apt to tackle the people who manipulate their rankings through keyword stuffing, link farms, and so on.

The fact that someone can be penalized for having a website that is too highly optimized I find disturbing; however, I really doubt, and hope, that it only applies to people who have practiced what Google has told us are black/grey-hat tactics. I consider myself an ethical businessman; like the gentleman above me said, I would never use any black/grey-hat techniques to improve my own or my clients' rankings. It just doesn't make sense to do that.

I would imagine that if they were to delist everyone who has done something to make sure their on-page SEO is optimized, they would devalue the very results that everyday people count on from Google. To put it plainly, if they did that, Google would hurt themselves, because ordinary people would not get what they are looking for when they search Google. I hope this is of some help.
-
At the end of the day, Google is trying to catch poor websites that are ranking because they stuffed title tags, crammed keywords into filler content in a footer, and used other poor SEO tactics. If you optimized your site properly, with keyword research and in a logical manner, I do not expect you to see an impact. The only impact you might see is the devaluing of links from poor websites to yours, with a trickle-down effect from that.
-
What list is this?
From all I have heard, it is spam that is getting de-indexed. I would not say everyone is spamming; I certainly do not risk my clients' sites with spam.
Related Questions
-
Recently rebuilt our site and changed domain. Now I want to go back to the old domain - is it a bad idea?
About a year ago I rebuilt our website and changed our domain name. We rent villas in Tuscany; we used to be 'invitationtotuscany.com'. Then I started doing the same in Provence and in the Italian Lakes, so I had further sites called invitationtoprovence.com and invitationtotheitalianlakes.com. But maintaining them was awkward and I wanted to have one site, so I put them all onto invitationto.com and did 301s from the old domains and sites. Now I've dropped off organic search results, and I've also realised that invitationto.com is far less clear as a business address. My inclination is to go back to invitationtotuscany.com (Tuscany is still 80% of our business) and have the other areas in there too, optimised for Provence etc. I'm being told it's a really bad idea to change domain, 301 the old one, and then revert to the original domain. But I'm suffering anyway, so I wonder if I shouldn't just bite the bullet. A lot of my old good backlinks still point to invitationtotuscany.com (BBC, Sunday Times, etc.) and the DA is 33 against 22 on the new one. All help gratefully received! : )
Technical SEO | | DanWrightson0 -
Is this going to be seen by Google as duplicate content?
Hi All, Thanks in advance for any help you can offer with this. I have been conducting a bit of analysis of our server access log to see what Googlebot is doing, where it is going, etc. Now, firstly, I am not an SEO but have an interest. What I am seeing a lot of is that we have URLs with an extension that sets the currency displayed on the products, so that we can run AdWords campaigns in other countries. These show as follows: feedurl=AUD, feedurl=USD, feedurl=EUR, etc. What I can see is that Googlebot is hitting a URL such as /some_product, then /someproduct?feedurl=USD, then /someproduct?feedurl=EUR, and then /someproduct?feedurl=AUD, all one after another. Now, this is the same product page, just with the price shown slightly differently on each. Would this count as a duplicate content issue? Should I disavow feedurl? Any assistance you can offer would be greatly appreciated. Thanks, Tim
Technical SEO | | timsilver0 -
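For reference: the usual remedy for parameter-only duplicates like this is a rel=canonical tag pointing at the parameter-free URL (disavow is for backlinks, not URL parameters). A minimal sketch, assuming feedurl is the only presentation-only parameter involved, of how the canonical URL could be derived:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Query parameters that only change presentation (here, the displayed
# currency), not the underlying content. "feedurl" comes from the
# question above; extend the set as needed.
PRESENTATION_PARAMS = {"feedurl"}

def canonical_url(url: str) -> str:
    """Return the URL with presentation-only query parameters removed.

    The result is what you might emit in a rel=canonical link tag, so
    that /someproduct?feedurl=USD and /someproduct?feedurl=EUR both
    point search engines at the same canonical page.
    """
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k not in PRESENTATION_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), parts.fragment))

print(canonical_url("https://example.com/someproduct?feedurl=USD"))
# https://example.com/someproduct
```

The example.com URLs are placeholders; real parameters unrelated to presentation (pagination, etc.) survive the stripping.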
Is a micro site the way to go?
Hello, a client has asked us today to quote how much it would cost to build them a micro site. A Google employee has told them that because their current URL doesn't include .co.uk or .com (it is simply brandname.word), it will be harder for them to get their website to rank. My understanding is that micro sites aren't a good solution for any problem, as Google doesn't like them. Would it be better for them to buy a .co.uk URL (they are a UK company) and then redirect it to their current website, or is there a better solution? Many thanks
Technical SEO | | mblsolutions0 -
Top 10 keywords are still going strong but the rest just got smashed!
Hi SEOmoz and its users, I've been trying to find my answer online, but now, after three weeks of reading blog posts, I'm going to try this. 🙂 My website was ranking really well on 10 important keywords but not so well on the long tail, between 11 and 50 on maybe 30 other, less important keywords.
So I began doing some work (I'm a newbie), and this is what I did:
1. Changed the top navigation structure to get 4 other pages (with keywords as links) into it (used a dropdown).
2. Wrote plenty of text that was a good fit for the page. (The text is OK and not too spammy-looking.)
3. Added three links from high-quality sites with keywords as links to these pages. I added them from my own site, which is on the same server and same IP. 😉 I know, not looking so good.
4. Changed the URL structure on a couple of pages to get a keyword into it (did a correct 301).
5. Changed to better titles and headings on the page, with keywords in both but not the same ones.
The result:
1. My 10 most important keywords began ranking even better. I rank no. 1 on 9 out of 10.
2. Almost all the other pages went from ranking ~15-50 to beyond 50. It has now been 4 weeks since I did most of the changes and 3 weeks since all those pages were hit.
So now I'm thinking about what to do:
1. Should I clean up my text and titles so they don't look too over-optimized?
2. Should I remove the links from my own pages? (My link profile in general is actually pretty good.)
3. Or should I just wait? Because changing more will just indicate to Google that something fishy is going on 😉
In the beginning I hoped that Google killed my rankings just because of the big changes, but now, after 3 weeks, I'm more sceptical and think I've been hit by an over-optimization filter. According to Webmaster Tools, I have not received a manual penalty. Please help me; I would really appreciate ideas from people here with more experience.
Technical SEO | | Drillo0 -
Deindexed site - is it best to start over?
A potential client's website has been deindexed from Google. We'd be completely redesigning his site with all new content. Would it be best to purchase a new URL and redirect the old deindexed site to the new one, or try to stick with the old domain?
Technical SEO | | WillWatrous0 -
Why is Google not deindexing pages with the meta noindex tag?
On our website www.keystonepetplace.com we added the meta noindex tag to category pages that were created by the sorting function. Google no longer seems to be adding more of these pages to the index, but the pages that were already added are still in the index when I check via site:keystonepetplace.com Here is an example page: http://www.keystonepetplace.com/dog/dog-food?limit=50 How long should it take for these pages to disappear from the index?
Technical SEO | | JGar-2203710 -
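One thing worth checking while waiting: that the noindex tag is actually present in the HTML being served (and not stripped by a cache or template). A quick sketch using Python's stdlib parser, fed a hypothetical page string; note that pages already in the index typically only drop out after Googlebot recrawls each one and sees the tag, which can take weeks for deep category pages:

```python
from html.parser import HTMLParser

class NoindexFinder(HTMLParser):
    """Detect a <meta name="robots" content="...noindex..."> tag."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        a = dict(attrs)
        name = (a.get("name") or "").lower()
        content = (a.get("content") or "").lower()
        if name == "robots" and "noindex" in content:
            self.noindex = True

def has_noindex(html: str) -> bool:
    """Return True if the page source carries a robots noindex meta tag."""
    finder = NoindexFinder()
    finder.feed(html)
    return finder.noindex

page = '<html><head><meta name="robots" content="noindex, follow"></head><body></body></html>'
print(has_noindex(page))  # True
```

In practice you would fetch each sorted-category URL and run it through a check like this before concluding that Google is ignoring the tag.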
Pros & Cons of deindexing a site prior to launch of a new site on the same domain.
If you were launching a new website to completely replace an older existing site on the same domain, would there be any value in temporarily deindexing the old site prior to launching the new one? Both have roughly 3,000 pages; the new site will launch on the same domain but with a completely new URL structure, much better optimized for the web. Many high-ranking pages will be redirected with 301s to the corresponding new pages. I believe the hypothesis is that this would eliminate a mix of old and new pages sharing space in the SERPs, and that the crawlers would be more likely to index more of the new site initially. I don't believe this is a great strategy; on the other hand, I see some merit to the arguments for it.
Technical SEO | | medtouch0 -
Is having "rel=canonical" on the same page it is pointing to going to hurt search?
I like the rel=canonical tag, and I've seen Matt Cutts' posts on Google about it. For the site I'm working on, it's a great workaround because we often have two identical or nearly identical versions of pages: one for patients, one for doctors. The problem is this: the way our content management system is set up, certain pages are linked up in a number of places, and when we publish, two different versions of the page are created with the same content. Because they are both produced from the same content templates, if I put in the rel=canonical tag, both pages get it. So, if I have http://www.myhospital.com/patient-condition.asp and http://www.myhospital.com/professional-condition.asp, both produced from the same template with the same content, and I'm trying to point search at http://www.myhospital.com/patient-condition.asp, the tag appears on both pages the same way. Also, we have various forms, and we like to know where people are coming from on the site when they use those forms. To the bots, it looks like there are 600 versions of particular pages, so again, rel=canonical is great. However, because it's actually all the same page, just a link with a variable tacked on (http://www.myhospital.com/makeanappointment.asp?id=211), the rel=canonical tag will appear on "all" of them. Any insight is most appreciated! Thanks! Brett
Technical SEO | | brett_hss0
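On the title question itself: a canonical tag that points at the page it sits on is harmless; a self-referencing canonical is treated as normal. The trap described above is both templates emitting the identical tag regardless of which variant is rendering. A hedged sketch of template logic that derives the canonical target from the current page's own path (paths are taken from the question; the helper name and prefix convention are made up for illustration):

```python
def canonical_path(current_path: str) -> str:
    """Map the professional variant of a condition page back to the
    patient (canonical) variant; leave every other path untouched.

    The template would emit a rel=canonical link tag using the path
    this returns, so both variants point search at the patient page.
    """
    prefix = "/professional-"
    if current_path.startswith(prefix):
        return "/patient-" + current_path[len(prefix):]
    return current_path

for p in ("/patient-condition.asp", "/professional-condition.asp"):
    print(p, "->", canonical_path(p))
# /patient-condition.asp -> /patient-condition.asp
# /professional-condition.asp -> /patient-condition.asp
```

Whether a CMS template can branch on the current URL like this depends on the CMS; the point is only that the canonical target should be computed per-page rather than hard-coded once in a shared template.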