Over-optimization penalty on the way
-
Matt Cutts has just announced that Google is bringing in a penalty for over-optimized sites, to try to reward good content.
-
It's part of their larger effort to update their algorithm based on semantic search - http://searchengineland.com/wsj-says-big-google-search-changes-coming-reality-check-time-115227
Next gen of search will arrive within a year.
-
I'm happy to hear news like that. I hope it will clean up the SERPs a little bit.
-
I see it as good news; I am not one for overdoing it.
I believe in avoiding mistakes rather than trying to game the search engines.
Good content and clean, easy-to-crawl, fast-loading pages.
-
Thanks. I was reading about that.
If I were the boss at Google, I would put my engineers on:
- ranking small specialty sites with superior content above the huge authority sites with skimpy content
- killing scraper and spinner sites that use the content of others to make money
Related Questions
-
We have a site with a lot of international traffic; can we split the site somehow?
Hello, we have a series of sites, and one in particular has around 75,000 monthly users (20%) from the USA, but we don't currently offer them anything, as our site is aimed at the UK market. The site is a .com, and though we own the .co.uk, the .com is the primary domain. We have had a lot of success moving other sites to use the .co.uk as the primary domain for UK traffic. However, in this case we want to keep both the UK traffic and the US traffic, and if we split it into two sites, only one can win, right? What could we do? It would be cool to have a US version of our site, but without affecting traffic too much. On the other sites, we simply did 301 redirects from each .com page to the corresponding .co.uk page. Any ideas?
White Hat / Black Hat SEO | AllAboutGroup
-
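For the scenario above, one widely used option is to keep both domains live and mark up each pair of equivalent pages with hreflang annotations, so Google can serve the right version to each market without the two sites competing head-on. A minimal sketch, with purely illustrative URLs:

```html
<!-- In the <head> of the UK page; the US page carries the same set of annotations -->
<link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/widgets/" />
<link rel="alternate" hreflang="en-us" href="https://www.example.com/widgets/" />
<!-- Fallback for visitors who match neither locale -->
<link rel="alternate" hreflang="x-default" href="https://www.example.com/widgets/" />
```

Note that each page must reference itself as well as its alternates, and the annotations must be reciprocal across both sites, or Google ignores them.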
Google's Related Searches - Optimizing Possible?
Does anyone know how Google determines which suggestions show up at the bottom of SERPs? I've been working with a client to boost his local ranking, but every time we do a branded search for his business, his competitors keep popping up in the "Searches related to ______" section.
White Hat / Black Hat SEO | mtwelves
-
80% of traffic lost overnight, Google penalty?
Hi all.
I have a website called Hemjakt (http://www.hemjakt.se/), a search engine for real estate, currently only available on the Swedish market. The application crawls real-estate websites and collects all listings in a single searchable application. The site has been live for a few months and had seen steady growth since release, increasing by 20% weekly up to ~900 visitors per day. Three days ago, overnight, I lost 80% of my traffic. Instead of 900 visitors per day I'm at ~100, and when I search for long, specific queries such as "Åsgatan 15, Villa 12 rum i Alsike, Knivsta" (address, house type, rooms, area, city), I'm now only found on the fifth page. I suspect I have become the subject of a Google penalty. How do I get out of this mess?
Just like all search engines and similar applications, I do crawl other websites and scrape their content. My content is ~90% unique from the source material, and I add user value by giving visitors the ability to compare houses, a ton more data for comparing pricing and history, and extra functionality the source sites do not offer. My analytics data show good user engagement. Here is one example of a source page and the corresponding page on my site:
Source: http://www.hemnet.se/bostad/villa-12rum-alsike-knivsta-kommun-asgatan-15-6200964
My site: http://www.hemjakt.se/bostad/55860-asgatan-15/
So: How do I actually confirm that this is the reason I lost my traffic? When I search for my branded query, I still get results, and I'm still indexed by Google. And if I am penalized: I'm not attempting anything black hat, and I really believe the app gives a lot of value to users. What tweaks or changes to the application would you suggest, so I can continue running the service in a way that Google is fine with?
White Hat / Black Hat SEO | Hemjakt
-
Local Map Pack: What's the best way to handle twin cities?
Google is increasingly cracking down on bad local results. However, in many regions of the US there are twin cities, or cities that sit right next to each other, like Minneapolis-Saint Paul or Kansas City. According to Google's guidelines, your business should only be listed in the city in which it is physically located. However, we've noticed that results just outside the local map pack will still rank, especially for businesses that service the home. For example, say you have an ACME Plumbing in Saint Paul, MN. If you perform a search for "Plumbing Minneapolis" you typically see local Minneapolis plumbers, then Saint Paul outliers. Usually the outliers are in the next city over or just outside the Google map centroid. Are there any successful strategies for helping these "Saint Paul outliers" compete with local Minneapolis results, or will the results always lag behind for the sake of perceived accuracy? We're having to compete against local competitors that are using some very black-hat techniques to rank multiple sites locally (in the map results). They rank multiple sites for the same company, under different company names and UPS Store addresses. It's pretty obvious, especially when you see a UPS Store in the Street View of the address! We're not looking to bend the rules, but rather to compete safely. Can anything be done in this service-based scenario?
White Hat / Black Hat SEO | AaronHenry
-
Content optimized for old keywords and Google updates
Hi, we've got some old content, about 50 pages' worth in an e-commerce site, that is optimized for keywords that aren't the subject of the page; these keywords occur about 8 times (2 keywords per page) in the old content. We are going through these 50 pages and changing the title, H1, and meta description tags to match the exact subject of the page, so that we will rise in the rankings again; the updates have been lowering our rankings. Do we need to completely rewrite the content for these 50 pages, or can we just sprinkle in any needed additions of the one keyword that is the subject of the page? The reason I'm asking is that our rankings keep dropping, and these 50 pages seem to be part of the problem. We're in the process of updating these 50 pages. Thanks.
White Hat / Black Hat SEO | BobGW
-
Can I 301 redirect a website that does not have a manual penalty but was definitely affected by Google?
OK, I have a website (website A) which has been running since 2008 and did very nicely in search results until January of this year, when it dropped significantly, losing about two thirds of its visitors. Then in May it basically lost the rest. I was pulling my hair out for months trying to figure out why; I "think" it was something to do with links and anchor text. I got rid of the old SEO company and got a new SEO company. They have done link analysis, are trying to remove lots of links, have disavowed about 500 domains, and put in a reconsideration request, and got a reply saying there is no manual penalty. So the new SEO company says all they can do is carry on removing links and wait for Penguin to update, and hopefully that will fix it. This will take as long as it takes Penguin to update again, and obviously I cannot wait indefinitely, so they have advised I start a new website (website B), which is a complete duplicate of website A.
Now, as we do not know what's wrong with website A (we think it's links, and will get them removed), my SEO company says we can't do a 301 redirect, as we would just cause whatever is wrong to pass over to website B. So we need to create a blank page for every single page on website A, saying we have moved, with a nofollow link to the corresponding new page on website B.
Personally I think the above will look terrible and not be a very user-friendly experience, but my SEO company says it is the only way to do it. Before I do it, I just wanted to check with some experts here whether this is right. Please advise if 301 redirects are NOT the correct way to do this. Thanks,
James
White Hat / Black Hat SEO | isntworkdull
-
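For anyone weighing the advice in the question above: mechanically, a site-wide 301 from website A to website B is a one-rule job. A sketch for Apache's mod_rewrite in website A's .htaccess, with placeholder domains (whether to do it at all is the open question here, since a 301 is generally understood to pass along problems as well as equity):

```apacheconf
# .htaccess on website A -- permanently redirect every URL to the same path on website B
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?website-a\.example$ [NC]
RewriteRule ^(.*)$ https://website-b.example/$1 [R=301,L]
```

The `R=301` flag makes the redirect permanent and `L` stops further rule processing, so every old URL maps one-to-one to its new home.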
Removing Unnatural Link Penalties
As soon as I began working in my current position at my current company, I noticed my predecessor's tendency to buy link packages from black-hat companies. I knew we were being penalized, and had to prove to him that we needed to halt those campaigns immediately and try our darndest to remove all the poison links from the internet. I did convince him, and began the process. 57% of our backlinks were tied to the same anchor phrase, with 836 domains linking with the same phrase to the same page. Today, 643 of those links remain, so I have hit a large number of them, but not nearly enough. Now I am getting messages from Google announcing that our site has been hit with an unnatural-link penalty. I haven't really seen the results of this yet in the keywords I am trying to rank for, but I fear it will hurt very soon, and I know that I could be doing better in the meantime. I really don't know what to do next. I've tried the whole "contact the webmasters" technique, and maybe 1 in 100 agreed to remove our links; they all want money or don't respond. Do I really need to use this Disavow tool? I hear mixed things about it. Anybody with experience here like to share their stories? Thanks for the moral support!
White Hat / Black Hat SEO | jesse-landry
-
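On the Disavow tool question above: the tool takes a plain UTF-8 text file uploaded through Search Console, with one entry per line. A minimal example of the format, with made-up domains:

```text
# Lines starting with "#" are comments.
# Disavow every link from an entire domain:
domain:paid-links.example
# Disavow one specific URL only:
http://spam-blog.example/our-anchor-text-page/
```

The `domain:` form is usually preferred for clearly bad sites, since disavowing individual URLs misses any other pages on the same domain that link in.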
Abused SEO unintentionally, now need a way out
Hello, I have been in contact with an SMO (social media optimizer) to optimize my site for search engines and social media sites. My site had been doing great for the last 4 years, but suddenly it started dropping in the rankings, so I came and joined SEOmoz Pro to find a way out. I was advised to categorize content in the form of subdomains; well, that took a huge toll on my rankings. Thanks to suggestions here, I have 301ed them to subdirectories. Now another huge question arises: I found out that my SMO guy was buying artificial votes, or whatever you call them, on Twitter, Facebook, and G+. The Twitter and Facebook ones are understandable, but I am starting to think that these votes on G+ might have affected my site's ranking. Here is a sample URL: http://www.designzzz.com/cutest-puppy-pictures-pet-photography-tips/ If you scroll down you will see 56 Google +1s. Now the big question: I have been creating genuine content, but now that I am stuck in this situation, how do I get out of it? Changing URLs would be bad for readers. Will a 301 fix it, or is there another method? Thanks in advance.
White Hat / Black Hat SEO | wickedsunny1