Are they going to deindex everyone?
-
Looking at the over-optimisation list, it pretty much seems like something everyone is doing. So are we suddenly going to find the best results actually de-indexed?
Maybe Google will slowly shut off indexing stage by stage so everyone changes.
What are your thoughts?
-
Alan, I like most of your responses. A fat thumbs up for you! I don't risk getting my clients' websites penalised. Imagine me losing all that income? I had a talk with my client: I could get him to the top by doing a few shady things (spam), OR I could take the long hard road. We took the long hard road, based on his decision to retire in the next two years. We can't afford to get Google-smacked.
@Gareth, not everyone spams or over-optimizes. I definitely don't over-optimize; it lowers conversions because the content becomes unreadable. ACTUALLY, my sites are ranking higher now, so I am glad this is happening.
-
I would be inclined to agree with the gentlemen who have answered above me. I do not know for sure what Google will do, as I believe almost no one does, including even some who work at Google. However, I would imagine Google is more apt to tackle the people who manipulate their rankings through keyword stuffing, link farms, and so on. I find it disturbing that someone could be penalized for having a website that is too highly optimized, but I doubt, and hope, that it will only affect people who have practiced what Google has told us are black/grey-hat tactics. I consider myself an ethical businessman; like the gentleman above me said, I would never use any black/grey-hat techniques to improve my own or my clients' rankings. It just doesn't make sense to do that. I would imagine that if Google were to delist everyone who has done something to optimize their on-page SEO, they would devalue the very results that everyday people count on. To put it plainly, Google would hurt themselves, because ordinary people would not get what they are looking for when they search. I hope this is of some help.
-
At the end of the day, Google is trying to catch poor websites that are ranking because they stuffed title tags, crammed keywords into filler content in a footer, or used other poor SEO tactics. If you optimized your site properly, with keyword research applied in a logical manner, I do not expect you to see an impact. The only impact you might see is the devaluing of links from poor websites to yours, and a trickle-down effect from that.
-
What list is this?
From all I have heard, it is spam that is getting de-indexed. I would not say everyone is spamming; I certainly do not risk my clients' sites with spam.
Related Questions
-
Recently rebuilt our site and changed domain. Now I want to go back to the old domain - is it a bad idea?
About a year ago I rebuilt our website and changed our domain name. We rent villas in Tuscany; we used to be 'invitationtotuscany.com'. Then I started doing the same in Provence and in the Italian lakes, so I had further sites called invitationtoprovence.com and invitationtotheitalianlakes.com. But maintaining them was awkward and I wanted to have one site, so I put them all onto invitationto.com and did 301s from the old domains and sites. Now I've dropped off organic search results, and I've also realised that invitationto.com is far less clear as a business address. My inclination is to go back to invitationtotuscany.com - Tuscany is still 80% of our business - and have the other areas in there too, optimised for SEO for Provence etc. I'm being told it's a really bad idea to change domain, 301 the old one, and then revert to the original domain. But I'm suffering anyway, so I wonder if I shouldn't just bite the bullet. A lot of my old good backlinks still point to invitationtotuscany.com (BBC, Sunday Times, etc.) and the DA is 33 against 22 on the new one. All help gratefully received! : )
Technical SEO | DanWrightson -
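For what it's worth, the mechanics of pointing invitationto.com back at the old domain are simple; the hard part is the ranking risk being weighed in the question. A hypothetical sketch, assuming the site runs on Apache with mod_rewrite enabled (the domain names come from the question; everything else here is an assumption):

```apache
# Hypothetical .htaccess on invitationto.com: permanently redirect every
# URL to the same path on the original domain. 301 tells Google the move
# is permanent so link equity should follow.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?invitationto\.com$ [NC]
RewriteRule ^(.*)$ https://www.invitationtotuscany.com/$1 [R=301,L]
```

Path-preserving redirects like this (rather than sending everything to the homepage) matter because the old BBC and Sunday Times backlinks point at specific pages.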
Is it possible to deindex old URLs that contain duplicate content?
Our client is a recruitment agency, and their website used to contain a substantial amount of duplicate content, as many of the listed job descriptions were repeated and recycled. As a result, their rankings rarely progress beyond page 2 on Google. Although they have started using more unique content for each listing, it appears that old job listing pages are still indexed, so our assumption is that Google is holding down the rankings due to the amount of duplicate content present (one tool reported a score of 43% duplicate content across the website). Looking at other recruitment websites, it appears that they block the actual job listings via the robots.txt file. Would blocking the job listing pages from being indexed, either by robots.txt or by a noindex tag, reduce the negative impact of the duplicate content, but also remove any link juice coming to those pages? In addition, expired job listing URLs stay live, which is likely increasing the overall duplicate content. Would it be worth removing these pages and setting up 404s, given that any links to these pages would be lost? If these pages are removed, is it possible to permanently deindex these URLs? Any help is greatly appreciated!
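One detail worth spelling out here: robots.txt and noindex behave differently. A robots.txt Disallow stops Google from crawling a page, but it does not remove URLs that are already indexed, and it also prevents Google from ever seeing a noindex tag on that page. To deindex the old listings, the usual approach is a meta robots tag while leaving the pages crawlable; a minimal sketch (the tag itself is standard; where to place it depends on the client's templates):

```html
<!-- Placed in the <head> of each expired or duplicate job-listing page.
     Google may still crawl the page and follow its links, but will drop
     the URL from the index. Do NOT also block these URLs in robots.txt,
     or the crawler will never see this tag. -->
<meta name="robots" content="noindex, follow">
```

Once the pages have dropped out of the index, blocking them in robots.txt (as the competitors do) becomes safe if you want to save crawl budget.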
Technical SEO | ClickHub-Harry -
Google has deindexed 40% of my site because it's having problems crawling it
Hi, last week I got my fifth email saying 'Google can't access your site'. The first one I got in early November. Since then my site has gone from almost 80k pages indexed to less than 45k, and the number keeps dropping even though we post about 100 new articles daily (it's an online newspaper). The site I'm talking about is http://www.gazetaexpress.com/ We have to deal with DDoS attacks most of the time, so our server guy has implemented a firewall to protect the site from these attacks. We suspect that it's the firewall that is blocking Google's bots from crawling and indexing our site. But then things get more interesting: some parts of the site are being crawled regularly and others not at all. If the firewall were stopping Google's bots from crawling the site, why are some parts of the site being crawled with no problems while others aren't? In the screenshot attached to this post you will see how Google Webmasters is reporting these errors. In this link, it says that if the 'Error' status happens again you should contact Google Webmaster support, because something is preventing Google from fetching the site. I used the feedback form in Google Webmasters to report this error about two months ago but haven't heard from them. Did I use the wrong form to contact them? If yes, how can I reach them and tell them about my problem? If you need more details, feel free to ask. I will appreciate any help. Thank you in advance.
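One way to test the firewall theory above is to check the server's access logs for requests that identify as Googlebot but received an error response. A minimal sketch, assuming the common Apache/nginx "combined" log format (the parsing here is an illustration, not a definitive implementation; note also that the user-agent string can be spoofed, so a serious check should confirm the client IP with a reverse DNS lookup resolving to googlebot.com):

```python
def blocked_googlebot_hits(log_lines):
    """Return (status, request) pairs where a Googlebot-identified
    request got an error status, e.g. a firewall answering 403."""
    hits = []
    for line in log_lines:
        # Combined log format fields are delimited by double quotes:
        # parts[1] = request line, parts[2] = ' status bytes ',
        # parts[5] = user-agent string
        parts = line.split('"')
        if len(parts) < 6:
            continue
        status = int(parts[2].split()[0])
        agent = parts[5]
        if "Googlebot" in agent and status >= 400:
            hits.append((status, parts[1]))
    return hits
```

If this turns up 403s or 5xx responses for Googlebot on some URL paths but not others, that would explain why only parts of the site are dropping out of the index.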
Technical SEO | Bajram.Kurtishaj -
Top 10 keywords are still going strong but the rest just got smashed!
Hi SEOMOZ and its users, I've been trying to find my answer online, but after three weeks of reading blog posts I'm going to try this. 🙂 My website was ranking really well on 10 important keywords but not so well on the long tail: between 11 and 50 on maybe 30 different, less important keywords. So I began doing some work (I'm a newbie); this is what I did:
1. Changed the top navigation structure to get 4 other pages (with keywords as links) into it (used a dropdown).
2. Wrote plenty of text that was a good fit for the page. (The text is OK and not too spammy looking.)
3. Added three links from high-quality sites, with keywords as links, to these pages. I added them from my own site that is on the same server, same IP. 😉 I know, not looking so good.
4. Changed the URL structure on a couple of pages to get a keyword in it (did a correct 301).
5. Changed to better titles and headings on the pages, with keywords in both, but not the same ones.
The result:
1. On my 10 most important keywords I began ranking even better. I rank no. 1 on 9 out of 10.
2. Almost all the other pages went from ranking ~15-50 to beyond 50. It has now been 4 weeks since I did most of the changes, and 3 weeks since all those pages dropped past 50. So now I'm thinking about what to do:
1. Should I clean up my text and titles so they don't look too over-optimized?
2. Should I remove the links from my own pages? (My link profile in general is actually pretty good.)
3. Or should I just wait? Because changing more will just indicate to Google that something fishy is going on 😉
In the beginning I hoped that Google killed my rankings just because of the big changes. But now, after 3 weeks, I'm more sceptical and think I've been hit by an over-optimization filter. According to Webmaster Tools I've not been hit by a manual penalty. Please help me; I would really appreciate all ideas from people here with more experience.
Technical SEO | Drillo -
Server 500: website deindexed?
Hi mozzers, since August 22nd a site (not one I manage) has had a server 500 error, and all of its pages got deindexed. This is obviously a server issue, but why did it get deindexed? Is it because the server issue has been going on for a while? The pages I checked load correctly, so I am a bit confused here! The site's Webmaster Tools account shows 1500 server errors! Can someone tell me what is going on and how to fix it? Thanks
Technical SEO | Ideas-Money-Art -
How do I find which pages are being deindexed on a large site?
Is there an easy way or any way to get a list of all deindexed pages? Thanks for reading!
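One low-tech approach to the question above: diff the URLs you expect to be indexed (your XML sitemap) against the URLs that actually appear indexed (e.g. exported from Webmaster Tools, or collected page by page from a site: query). A minimal sketch, assuming both lists have already been gathered as plain URL lists:

```python
def find_deindexed(sitemap_urls, indexed_urls):
    """Return sitemap URLs that do not appear in the indexed list:
    these are the candidate deindexed pages to investigate."""
    # Normalise trailing slashes so /page and /page/ compare equal
    norm = lambda u: u.rstrip("/")
    indexed = {norm(u) for u in indexed_urls}
    return sorted(u for u in sitemap_urls if norm(u) not in indexed)
```

On a large site, run this per site section (one sitemap file at a time) so the candidate lists stay small enough to check by hand.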
Technical SEO | DA2013 -
Old massive site: should I nofollow all outgoing links?
The company I work for is in the process of rebuilding our entire website profile. Our biggest site, FrenchQuarter.com, ranks pretty well for our main term, "French Quarter Hotels", and we use that to drive business directly to our hotel businesses. The site is very old and one I inherited; rebuilding it won't be a priority for another 9 months or so. This site acts as a bit of a directory for the city. There are links everywhere, and it's probably passing link juice to a lot of businesses scot-free. In the meantime, would it benefit or hurt us if I went through and nofollowed most of the links? Would nofollowing links help frenchquarter.com rank any better than it does? And could I then direct some of that link juice directly at our hotel websites to boost those as well? My goal is to get our hotel websites to rank on the first page. We get exposure in the locations pack for 2 of our 5 hotels, but placing below that is impossible with all the competition from the OTAs (Expedia, bookit.com, etc.). It seems near impossible no matter what my backlink profile looks like. Thanks for any feedback, Cyril
Technical SEO | Nola504 -
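For reference on the question above, a nofollow is just a rel attribute on the anchor tag; the target page stays reachable and indexable, but the link is not supposed to pass equity. A hypothetical example (the URL is made up):

```html
<!-- A normal link: passes link equity to the target business -->
<a href="http://example-hotel.com/">A partner business</a>

<!-- The same link nofollowed: visitors can still click it,
     but no link equity is passed -->
<a href="http://example-hotel.com/" rel="nofollow">A partner business</a>
```

Note that nofollowing outbound links does not redirect that equity to your own hotel sites; equity flows through the followed links that remain, such as internal links and any links you keep followed on purpose.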
How do I fix duplicate content/title errors going to a memberlist.php page?
I have over 6,000 duplicate title and duplicate content errors going to this link: http://community.mautofied.com/memberlist.php?mode=viewprofile&u=100299 How do I fix this?
Technical SEO | mautofied
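For the member-profile duplicates above, the usual first step is to stop crawlers from fetching memberlist.php at all, since a path Disallow also covers every ?mode=viewprofile&u=... variation of it. A minimal robots.txt sketch (one caveat: Disallow stops crawling but does not by itself remove URLs already in the index; a noindex tag on those pages, or a URL removal request in Webmaster Tools, handles that):

```text
User-agent: *
# Blocks /memberlist.php and every query-string variation of it,
# e.g. /memberlist.php?mode=viewprofile&u=100299
Disallow: /memberlist.php
```

This is the same approach many forum platforms recommend for profile pages, which are thin and near-identical by nature.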