Google Penalizes Too Much SEO
-
I just read an interesting article about a new Google penalty that will roll out in the coming weeks or months: Google is making changes to its algorithm.
The penalty will target websites that are over-optimized or over-SEO'ed. What do you think about this? Is this a good thing or a bad thing for us as SEO marketers?
Here's the link: SEL.com/to-much-seo
I'm really curious to hear your points of view.
Regards,
Jarno
-
I'm not saying that a keyword-matched domain should receive a penalty; I'm saying it should be more neutral ground when it comes to ranking factors.
If your site happens to have a keyword-matched domain but offers great content and value to visitors, then it should rightly rank higher for relevant queries - and if they changed the weighting on matched domains, it should (in theory) weed out the thinner-value sites.
Matt Cutts did use the phrase 'level playing field' in that interview - not that I personally believe this could ever be achieved with an algorithm.
-
Daniel,
Though I can see your thinking here, I do not agree with you. One of our main websites focuses on farm camping (in Dutch: 'kamperen bij de boer') and we have a domain name to match. So our website,
www.kamperen-bij-de-boer.com, is built for users, but it does use the exact keywords people are searching for. If Google changes the algorithm to penalize websites that use keyword-"stuffed" domains, then my website, built for users, would be demoted. That's not fair, is it?
I do agree with you in some cases, however. There are a lot of websites that keyword-stuff their domain or use hyphens to set their domain apart from competitors'. So in some cases I totally agree with you, but there are still websites whose domain name simply is their main keyword. What would happen to them?
-
Speaking from a user's perspective, that's the factor I would most like to see changed myself.
So many times there is an exact-match keyword domain with poor content ranking high on the first page, seemingly on the weight of that factor alone.
If the changes are along those lines, it will make SEO 'easier' - unless you were using such techniques yourself.
My other guess is that they're going to improve their 'best guesses' for those pages without semantically correct HTML, etc. That would 'even the playing field' but would still favour optimised content.
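To make the 'semantically correct HTML' idea concrete: a purely hypothetical crawler-side heuristic (this is an illustrative sketch, not anything Google has published) might compare how often a page uses structural tags like h1 or article versus generic div/span containers:

```python
from html.parser import HTMLParser

# Tags that convey document structure, as opposed to generic containers.
SEMANTIC_TAGS = {"h1", "h2", "h3", "article", "nav", "main", "header", "footer"}

class SemanticTagCounter(HTMLParser):
    """Counts semantic vs. generic container tags in an HTML document."""
    def __init__(self):
        super().__init__()
        self.semantic = 0
        self.generic = 0

    def handle_starttag(self, tag, attrs):
        if tag in SEMANTIC_TAGS:
            self.semantic += 1
        elif tag in ("div", "span"):
            self.generic += 1

def semantic_ratio(html: str) -> float:
    """Share of structural tags among structural + generic container tags."""
    parser = SemanticTagCounter()
    parser.feed(html)
    total = parser.semantic + parser.generic
    return parser.semantic / total if total else 0.0

print(semantic_ratio("<h1>Title</h1><article><p>Body</p></article>"))  # 1.0
print(semantic_ratio("<div><span>Title</span><div>Body</div></div>"))  # 0.0
```

A page built entirely from styled divs gives the crawler nothing structural to work with, which is exactly where 'best guesses' would have to kick in.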
-
Agreed.
-
I'd like to see a lot less weight put on keyword-matched domains, for a start.
-
Aran,
For example, it could be the number of times the keyword appears on a page. I might think I'm using it naturally, but Google might decide otherwise.
In the comments on the Search Engine Land article, some of the feedback brings up external links to your site: by buying links pointing at your competitors, you might get them flagged as harmful websites. I don't think Google is stupid, but it could be a factor.
Of course, it goes without saying that the only way we are going to find out what it entails is by waiting for it and then testing. I agree with your white-hat theory: keep using white hat and you should always be fine.
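On the keyword-frequency point above: the metric SEOs usually watch here is keyword density. A minimal sketch of how you might measure it yourself before testing (the function name and sample text are my own, purely for illustration):

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Occurrences of the keyword phrase as a percentage of total words."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    phrase = keyword.lower().split()
    n = len(phrase)
    # Count every position where the full phrase matches word-for-word.
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == phrase)
    return 100.0 * hits * n / len(words)

sample = "Farm camping is fun. Book farm camping holidays at our farm camping site."
print(round(keyword_density(sample, "farm camping"), 1))  # 46.2
```

A density that high would look unnatural to any reader; what threshold Google might pick, if any, is exactly the kind of thing only testing will reveal.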
-
Hi Jarno,
You say "there are some factors that SEOs use that can be an issue" - what do you imagine these could be?
Cheers
Aran
-
Aran,
I totally agree with you on the points you made, but since Google is releasing more and more information about wanting to level the SEO playing field, I was very curious what other SEOs think about this.
I can imagine Google will make a point of a lot of things, but there are some factors that SEOs use that could be an issue. That's what I imagine, anyway.
Thanks for the reply.
-
Don't quote me on this, but Google has been heading this way for quite some time, hasn't it?
Remember keyword stuffing, content farms, and dodgy link practices?
As long as we build websites that
- Function correctly
- Have good unique content
- Are easy to use
We are on the right track and have little to worry about.
I'll carry on with my current strategy of putting the effort into content and doing my damnedest to get people to notice it and perhaps give me a link, +1, tweet, like or bookmark.