Over-optimization penalty on the way
-
Matt Cutts has just announced that Google is bringing in a penalty for over-optimized sites, to try to reward good content.
-
It's part of their larger effort to update the algorithm around semantic search: http://searchengineland.com/wsj-says-big-google-search-changes-coming-reality-check-time-115227
The next generation of search is expected to arrive within a year.
-
I'm happy to hear news like that. Hopefully it will clean up the SERPs a little bit.
-
I see it as good news; I am not one for overdoing it.
I believe in not making any mistakes rather than trying to game the search engines.
Good content, and clean, easy-to-crawl, fast-loading pages.
-
Thanks. I was reading about that.
If I were the boss at Google, I would put my engineers on:
- ranking small specialty sites with superior content above huge authority sites with skimpy content
- killing scraper and spinner sites that use the content of others to make money
Related Questions
-
We have a site with a lot of international traffic; can we split the site in some way?
Hello, We have a series of sites and one, in particular, has around 75,000 (20%) monthly users from the USA, but we don't currently offer them anything as our site is aimed at the UK market. The site is a .com, and though we own the .co.uk, the .com is the primary domain. We have had a lot of success moving other sites to have the .co.uk as the primary domain for UK traffic. However, in this case, we want to keep both the UK traffic and the US traffic, and if we split it into two sites, only one can win, right? What could we do? It would be cool to have a US version of our site, but without affecting traffic too much. On the other sites, we simply did 301 redirects from the .com pages to the corresponding .co.uk pages. Any ideas? (A rough hreflang sketch follows below.)
White Hat / Black Hat SEO | AllAboutGroup0
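One option here (a sketch, not a definitive answer) is to run both versions and connect equivalent pages with hreflang alternates, so Google serves each audience the right domain instead of picking a single winner. The domain names and the example path below are hypothetical placeholders, not the poster's actual site.

```python
# Minimal sketch, assuming a UK site on the .co.uk and a US-facing site on the
# .com. Both versions of an equivalent page would carry the same set of tags.

UK_DOMAIN = "https://www.example.co.uk"
US_DOMAIN = "https://www.example.com"

def hreflang_tags(path: str) -> list[str]:
    """Return the <link> tags an equivalent page pair should share."""
    return [
        f'<link rel="alternate" hreflang="en-gb" href="{UK_DOMAIN}{path}" />',
        f'<link rel="alternate" hreflang="en-us" href="{US_DOMAIN}{path}" />',
        # x-default marks the version to show users outside both regions.
        f'<link rel="alternate" hreflang="x-default" href="{US_DOMAIN}{path}" />',
    ]

if __name__ == "__main__":
    for tag in hreflang_tags("/holidays/new-york/"):
        print(tag)
```

With annotations like these, the .co.uk and .com pages are treated as regional alternates rather than competitors, which is the usual way around the "only one can win" problem.
-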
Is the Google Penguin penalty automated or manual?
Hi, I have seen that some of our competitors are missing from the top SERPs and seem to be penalised according to this penalty checker: http://pixelgroove.com/serp/sandbox_checker/. Is this the right tool to check for a penalty, or are there better ones available? Are these penalties the result of the recent Penguin update? If so, is this an automated or a manual penalty from Google? I don't think all of these sites used black-hat techniques and got penalised; the new Penguin update might have flagged their backlinks, causing this penalty. We have also dropped over the last two weeks. What's the solution for this, and how effective is a link audit? (A rough link-audit heuristic is sketched below.) Thanks, Satish
White Hat / Black Hat SEO | vtmoz0
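No third-party checker can say definitively whether a drop is algorithmic or manual; a manual penalty normally comes with a notification in Google Webmaster Tools, while an algorithmic hit like Penguin does not. As for what a link audit looks like in practice, below is a rough sketch of one common heuristic: flagging referring domains whose anchor text is dominated by exact-match commercial phrases. The CSV columns, the phrase list, and the 50% threshold are all assumptions for illustration, not a real audit standard.

```python
import csv
from collections import defaultdict

# Example "money" phrases and cut-off; both are placeholders.
MONEY_ANCHORS = {"cheap flights", "buy shoes online", "best seo services"}
THRESHOLD = 0.5  # flag a domain if more than half its anchors are exact-match money terms

def flag_risky_domains(backlinks_csv: str) -> list[str]:
    """Return referring domains whose anchor profile looks manipulated."""
    totals = defaultdict(int)
    exact = defaultdict(int)
    with open(backlinks_csv, newline="", encoding="utf-8") as fh:
        for row in csv.DictReader(fh):  # assumes columns: domain, anchor
            domain = row["domain"].lower()
            anchor = row["anchor"].strip().lower()
            totals[domain] += 1
            if anchor in MONEY_ANCHORS:
                exact[domain] += 1
    return [d for d in totals if exact[d] / totals[d] > THRESHOLD]

if __name__ == "__main__":
    print(flag_risky_domains("backlinks.csv"))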
Besides fixing technical errors, what's the best way to increase organic traffic to a movie review website?
I have a friend's website, ShowBizJunkies, that they work very hard at improving and providing great content for. I moved the website to a more modern theme and increased speed (WP Engine, maxed out with CDN, caching, image optimization, etc.), but now I'm struggling with how to suggest further improving the SEO structure or building backlinks. I know that trying to rank for terms like "movie reviews" and many similar ones is ridiculously difficult and requires tons of high-quality backlinks. What is my lowest-hanging fruit here, any suggestions? My current plan is:
1. Fix technical errors (a quick status-code check is sketched below)
2. Create more evergreen content
3. Work on the timing of article releases for better Google News coverage
4. More social sharing: Tumblr, Reddit, Facebook Groups, G+ Communities, etc.
5. Build backlinks via outreach to TV-show-specific sites, movie fan sites, and actor fan sites (interviews)
White Hat / Black Hat SEO | JustinMurray1
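For item 1 of the plan above, a quick first pass is simply checking what every known URL returns. The sketch below uses only the standard library; the URL list is a placeholder, and anything that does not come back as a 200 is worth a closer look.

```python
import urllib.request
import urllib.error

# Placeholder URLs; in practice this list would come from a sitemap or crawl.
URLS = [
    "https://www.example.com/",
    "https://www.example.com/reviews/",
    "https://www.example.com/old-page/",
]

def check(url: str) -> str:
    """Issue a HEAD request and report the status for this URL."""
    req = urllib.request.Request(url, method="HEAD",
                                 headers={"User-Agent": "site-audit-sketch"})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return f"{resp.status} {url}"
    except urllib.error.HTTPError as err:   # 404s, 410s, 500s, etc.
        return f"{err.code} {url}"
    except urllib.error.URLError as err:    # DNS or connection problems
        return f"ERR {url} ({err.reason})"

if __name__ == "__main__":
    for u in URLS:
        print(check(u))
```
-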
Resubmitting disavow file after penalty removal
Hi, We had a manual penalty for links removed about a year ago. The disavow file we submitted was pretty extensive, and we took the machete approach, as recommended by Matt Cutts. Recently we took another look over the file and are of the firm conviction that some of the domains are entirely legitimate and their links are not manipulated. We would like to resubmit the disavow file excluding these domains so Google picks up the links again. Does anyone have experience of this, and if so, what were the results? (A small sketch of pruning the file is included below.) Thanks
White Hat / Black Hat SEO | halloranc0
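The mechanical part of this is straightforward: produce a new disavow file with the now-trusted domains removed and upload it again in Webmaster Tools. Whether and how quickly Google starts counting those links again is the open question. The sketch below only handles the file edit; the domains in KEEP_AGAIN are hypothetical.

```python
# Hypothetical domains the site owner now trusts and wants to stop disavowing.
KEEP_AGAIN = {"trusted-blog.example", "industry-news.example"}

def prune_disavow(src: str, dst: str) -> None:
    """Copy a disavow file, dropping entries whose host is in KEEP_AGAIN."""
    with open(src, encoding="utf-8") as infile, open(dst, "w", encoding="utf-8") as outfile:
        for line in infile:
            stripped = line.strip()
            # Keep comments and blank lines untouched.
            if not stripped or stripped.startswith("#"):
                outfile.write(line)
                continue
            # Entries are either "domain:example.com" or a full URL.
            target = stripped.removeprefix("domain:")
            host = target.split("/")[2] if "://" in target else target
            if host.lower() not in KEEP_AGAIN:
                outfile.write(line)

if __name__ == "__main__":
    prune_disavow("disavow.txt", "disavow_updated.txt")
```
-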
Dynamic Content Boxes: how to use them without getting a duplicate content penalty?
Hi everybody, I am starting a project with a travel website which has some standard category pages like Last Minute, Offers, Destinations, Vacations, and Fly + Hotel. Every category contains a lot of destinations with corresponding landing pages, such as Last Minute New York, Last Minute Paris, Offers New York, Offers Paris, etc. My question is: to simplify my job, I am thinking about writing some dynamic content boxes for Last Minute, Offers and the other categories, changing only the destination city (Rome, Paris, New York, etc.) repeated X times in X different combinations inside the content box. This would greatly simplify the content writing for the main generic landing pages of each category, but I'm worried about getting penalized for duplicate content. Do you think my solution could work? If not, what is your suggestion? Is there a rule for categorizing content as duplicate (for example, a certain number of identical words in a row)? (A small overlap-check sketch is included below.) Thanks in advance for your help! A.
White Hat / Black Hat SEO | OptimizedGroup0
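Google has never published a rule like "N identical words in a row", so there is no exact threshold to aim for. One way to sanity-check how similar the generated boxes end up is a simple shingle-overlap score between two of them, sketched below; the template text and the 3-word shingle size are assumptions for illustration only.

```python
def shingles(text: str, n: int = 3) -> set[tuple[str, ...]]:
    """Break text into overlapping n-word shingles."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a: str, b: str) -> float:
    """Jaccard overlap between the shingle sets of two texts (0.0 to 1.0)."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

# Hypothetical content-box template; only the city changes between pages.
TEMPLATE = ("Book a last minute trip to {city} and save on flights and hotels. "
            "Our {city} deals are updated every day.")

if __name__ == "__main__":
    paris = TEMPLATE.format(city="Paris")
    rome = TEMPLATE.format(city="Rome")
    print(f"overlap: {jaccard(paris, rome):.2f}")
```

If the score stays close to 1.0 across destinations, the boxes differ only by the city name, which is the pattern most likely to be treated as thin or duplicated, so it is worth varying sentence structure and adding destination-specific details rather than just swapping the city.
-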
Can anyone recommend a Google-friendly way of utilising a large number of individual yet similar domains related to one main site?
I have a client who has one main service website, on which they have local landing pages for some of the areas in which they operate. They have since purchased 20 or so domains (and are in the process of acquiring more) whose names are all localised versions of the service they offer. Rather than redirecting these to the main site, they wish to operate them all separately, with the goal of ranking for the specific localised terms related to each of the domains. One option would be to create microsites (hosted on individual C-class IPs, etc.) with unique, location-specific content on each of the domains. Another suggestion would be to park the domains and have them pointing at the individual local landing pages on the main site, so the domains would just be a window through which to view the pages that have already been created. The client is aware of the recent EMD update, which could affect the above. Of course, we would wish to go with the most Google-friendly option, so I was wondering if anyone could offer some advice about how best to handle this? Many thanks in advance!
White Hat / Black Hat SEO | AndrewAkesson0
-
Ways to find private, non-indexed forums in a niche
I was wondering if there are ways to find non-indexed content in private forums/discussion boards. Is there a scalable 'footprint' that suggests a forum has a private section?
White Hat / Black Hat SEO | ilyaelbert0
-
"Unnatural Linking" Warning/Penalty - Anyone's company help with overcoming this?
I have a few sites where I didn't manage the quality of my vendors, and now I'm staring at some GWT warnings for unnatural linking. I'm assuming a penalty is coming down the pipe, and unfortunately these aren't my sites, so I'm looking to get on the ball with unwinding anything we can as soon as possible. Does anyone's company have experience with this, or could you pass along a reference to another company that successfully dealt with these issues? A few items that come to mind include solid and speedy processes for removing offending links and properly handling the reconsideration request.
White Hat / Black Hat SEO | b2bmarketer0