Over-optimization penalty on the way
-
Matt Cutts has just announced that Google is bringing in a penalty for over-optimized sites, to try to reward good content.
-
It's part of their larger effort to update the algorithm around semantic search - http://searchengineland.com/wsj-says-big-google-search-changes-coming-reality-check-time-115227
The next generation of search is expected to arrive within a year.
-
I'm happy to hear news like that. Hopefully it will clean up the SERPs a little bit.
-
I see it as good news; I'm not one for overdoing it.
I believe in not making any mistakes rather than trying to game the search engines.
Good content and clean, easy-to-crawl, fast-loading pages.
-
Thanks. I was reading about that.
If I were the boss at Google, I would put my engineers on...
- ranking small specialty sites with superior content above the huge authority sites with skimpy content
- killing scraper and spinner sites that are using the content of others to make money
Related Questions
-
Moz spam score 16 for some pages - Never a manual penalty: Disavow needed?
Hi community, We have some top-hierarchy pages with a Moz spam score of 16, caused by backlinks that themselves have very high spam scores. I read that we could ignore this as long as we aren't buying links and have never received a manual penalty. Still, we want to try disavowing certain domains to see if it helps (a minimal sketch of the disavow file format follows below). Either way, we aren't going to lose any link equity by rejecting these low-quality backlinks. Can we proceed? Thanks
White Hat / Black Hat SEO | vtmoz
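For reference on the disavow question above, here is a minimal sketch of what a disavow file might look like - the domains and URL are placeholders, not real spam sources. The file is plain text with one entry per line and is uploaded through Google Search Console's disavow links tool.

```
# Domains identified as spammy in the link audit (placeholder names)
domain:spammy-links-example.com
domain:link-farm-example.net

# A single spammy URL rather than the whole domain
http://blog-example.org/spammy-comment-page.html
```

Note that each upload replaces the previous file entirely, so keep one master list and re-upload it whenever you add domains.
-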
Besides description and design optimization, is there any other main factor that we can influence to get better App Store rankings?
Hi there! I do love SEO and the cracking web search engine, but when it comes to other search engines like YouTube and the App Store, it's an unknown field for me. So, I'm diving into App Store Optimization (ASO). This is my question: besides the text and the design in the description of the app, is there any other factor that we can manipulate or influence (such as link building, social media, or alien magic, hehe)? Thanks a lot! GR.
White Hat / Black Hat SEO | Gaston Riera
-
Homepage not ranking for branded searches after Google penalty removal
Hi all, A site I work on was hit with a manual action penalty some time ago for spammy links built by a former SEO agency. It was a partial match penalty, so it only affected some pages - most likely the homepage. We carried out a lot of work cleaning up links and disavowed the suspicious links we couldn't get removed. Again, most of these pointed to the homepage. The disavow file was uploaded to Google last Friday and our penalty was lifted this Tuesday. Since uploading the disavow file, our homepage does not show up at all for branded searches. I've carried out the obvious checks - robots.txt, making sure we're not accidentally noindexing the page or doing anything funky with canonicals, etc. - and it's all good (a quick script for spot-checking those signals is sketched below). Have any of you had a similar experience? I'm thinking Google simply needs time to catch up after all the links we've disavowed and sitting tight is the best option, but I could do with some reassurance! Any past experiences or advice on what I might be missing would be great. Thanks in advance, Brendan.
White Hat / Black Hat SEO | Brendan-Jackson
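Picking up on the checks mentioned in the question above, here is a minimal sketch (Python standard library only, not an official Moz or Google tool) of how those indexability signals - robots.txt, the meta robots tag, and the canonical tag - could be spot-checked for a homepage. The URL is a placeholder; swap in the real site.

```python
# Minimal indexability spot-check: robots.txt, meta robots, canonical.
# HOMEPAGE is a placeholder URL, not a real site.
import urllib.robotparser
import urllib.request
from html.parser import HTMLParser

HOMEPAGE = "https://www.example.com/"

class HeadChecker(HTMLParser):
    """Collects the meta robots and canonical values from a page."""
    def __init__(self):
        super().__init__()
        self.meta_robots = None
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            self.meta_robots = attrs.get("content")
        if tag == "link" and (attrs.get("rel") or "").lower() == "canonical":
            self.canonical = attrs.get("href")

# 1. Is the homepage blocked by robots.txt?
rp = urllib.robotparser.RobotFileParser()
rp.set_url(HOMEPAGE + "robots.txt")
rp.read()
print("Allowed by robots.txt:", rp.can_fetch("Googlebot", HOMEPAGE))

# 2. Does the page carry a noindex, or a canonical pointing elsewhere?
html = urllib.request.urlopen(HOMEPAGE).read().decode("utf-8", errors="ignore")
checker = HeadChecker()
checker.feed(html)
print("Meta robots:", checker.meta_robots)  # should not contain "noindex"
print("Canonical:", checker.canonical)      # should point at the homepage itself
```

If all of these come back clean, the "sit tight and give Google time to reprocess the disavow" reading in the question is probably the right one.
-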
Local Map Pack: What's the best way to handle twin cities?
Google is increasingly cracking down on bad local results. However, in many regions of the US there are twin cities or cities that sit right next to each other, like Minneapolis-Saint Paul or Kansas City. According to Google's guidelines, your business should only be listed in the city in which it is physically located. However, we've noticed that results just outside of the local map pack will still rank, especially for businesses that service the home. For example, let's say you have ACME Plumbing in Saint Paul, MN. If you perform a search for "Plumbing Minneapolis" you typically see local Minneapolis plumbers first, then the Saint Paul outliers. Usually the outliers are in the next city over or just outside of the Google map centroid. Are there any successful strategies to increase the rank of these "Saint Paul outliers" so they compete with local Minneapolis results, or are the results always going to lag behind in favor of perceived accuracy? We're having to compete against some local competitors that are using some very black-hat techniques to rank multiple sites locally (in the map results). They rank multiple sites for the same company, under different company names and UPS store addresses. It's pretty obvious, especially when you see a UPS store in the Street View of the address! We're not looking to bend the rules, but rather to compete safely. Can anything be done in this service-based scenario?
White Hat / Black Hat SEO | AaronHenry
-
Content optimized for old keywords and Google updates
Hi, We've got some old content, about 50 pages worth in an ecommerce site, that is optimized for keywords that aren't the subject of the page - these keywords occur about 8 times each (2 keywords per page) in the old content. We are going through these 50 pages and changing the title, H1, and meta description tags to match the exact subject of the page, so that we will increase in rankings again - the updates have been lowering our rankings. Do we need to completely rewrite the content for these 50 pages, or can we just sprinkle in any needed additions of the one keyword that is the subject of the page? The reason I'm asking is that our rankings keep dropping and these 50 pages seem to be part of the problem. We're in the process of updating these 50 pages now (a rough sketch of how that title/H1/description check could be scripted is below). Thanks.
White Hat / Black Hat SEO | BobGW
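Following on from the question above, a rough sketch (assumed URLs and keywords, Python standard library, not a Moz feature) of how to audit whether each page's title, first H1, and meta description actually contain the keyword the page is supposed to target:

```python
# Audit title / H1 / meta description against each page's target keyword.
# The URLs and keywords below are hypothetical placeholders.
import urllib.request
from html.parser import HTMLParser

PAGES = {
    "https://www.example.com/blue-widgets": "blue widgets",
    "https://www.example.com/red-widgets": "red widgets",
}

class PageElements(HTMLParser):
    """Collects the <title>, first <h1>, and meta description of a page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.h1 = ""
        self.description = ""
        self._current = None
        self._h1_done = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title" or (tag == "h1" and not self._h1_done):
            self._current = tag
        if tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.description = attrs.get("content") or ""

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None
        if tag == "h1":
            self._h1_done = True

    def handle_data(self, data):
        if self._current == "title":
            self.title += data
        elif self._current == "h1":
            self.h1 += data

for url, keyword in PAGES.items():
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")
    page = PageElements()
    page.feed(html)
    for label, text in (("title", page.title), ("H1", page.h1),
                        ("meta description", page.description)):
        status = "contains" if keyword.lower() in text.lower() else "is MISSING"
        print(f"{url}: {label} {status} '{keyword}'")
```

This only checks that the keyword is present in each element; whether the surrounding copy needs a full rewrite is a separate judgement call.
-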
Attracting custom from 3 cities - Is this the best way to optimize?
Hi, I'm working for a client that draws custom from 3 nearby cities. I was thinking of creating a new page for 2 of the cities, reachable from within the website and not simply doorway pages. Each new page would include (1) general info, (2) info relevant to the city in question, where relevant to the client - perhaps well-known customers already coming from that city - and (3) transport from the city, i.e. directions. Is it OK to do this, or could Google see it as manipulative, given that the business is not geographically located in all 3 cities? (In actual fact the business is in just one location: within the official borders of one city, in another city for some administrative services, and 40 miles away from the third.) Thanks in advance, Luke
White Hat / Black Hat SEO | McTaggart
-
Confusing penalties
Dear Mozzers, I've been working on a friend's website that is fighting for pretty competitive keywords (+90,000 gms) and has been relying almost exclusively on $1800/mo of comment spam to rank on the first page. Now that I've taken over SEO, my first priorities were to:
- eliminate duplicate content
- improve site structure
- optimize internal links
- build legitimate do-follows
- add some keyword density
- fix titles and H tags
Essentially just the basics, right? But since cancelling the comment spam, rankings for their primary keyword have consistently dropped over the last 3 months. I'm using the same strategies that I've used successfully on at least 6 similar websites. At the moment their homepage is still almost entirely duplicate content - which is obviously a huge problem, but it seems a little odd that they could have been held up exclusively by that comment spam for so long, doesn't it? Even stranger, their authority and trust scores are now higher than any of the competition. Needless to say, my friends are getting pretty antsy and I'm starting to second-guess myself. Do you think I should continue to push them to improve content, eliminate penalties, and build legitimate links - or should I give in and suggest buying links as a short-term solution? Advice is really appreciated!
White Hat / Black Hat SEO | brevityworks