Why is a poorly optimized URL ranked first on Google?
-
Hi there, I've been working in SEO for more than five years, and I'm always telling clients about the 200+ factors that influence rankings. But sometimes I come across URLs or websites that haven't optimized their pages or built links and still rank first.
This is the case with the keyword "Escorts en Tenerife" on google.es. If you search for that keyword on google.es you'll find this URL: escortislacanarias.com... (I don't want to give them a link).
My question is: why the heck is this URL ranking first on Google for that keyword when the URL isn't optimized, the page content isn't optimized, and it doesn't have many (or any valuable) incoming links?
Run an on-page grader on that URL for that keyword and it gets an F! So there seems to be no correlation between better optimization and good rankings.
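For what it's worth, the kind of check those on-page graders run can be sketched in a few lines. This is a deliberately crude illustration, not any real tool's scoring model; the URL, title, and body text below are invented placeholders:

```python
# A crude on-page relevance check, roughly the kind of thing an
# on-page grader scores. Real graders weigh many more factors;
# the scoring here is invented purely for illustration.
import re

def on_page_score(url: str, title: str, body: str, keyword: str) -> int:
    kw = keyword.lower()
    score = 0
    # Keyword in the URL slug (hyphenated form)?
    if kw.replace(" ", "-") in url.lower():
        score += 1
    # Keyword in the <title>?
    if kw in title.lower():
        score += 1
    # Keyword repeated in the body copy?
    mentions = len(re.findall(re.escape(kw), body.lower()))
    if mentions >= 3:
        score += 1
    return score  # 0-3; a page scoring 0 is the "F" grade territory

print(on_page_score(
    "https://example.com/escorts-en-tenerife",
    "Escorts en Tenerife | Example",
    "Escorts en Tenerife... more escorts en Tenerife text... escorts en tenerife.",
    "escorts en tenerife",
))  # prints 3
```

A page that fails all three checks scores 0, yet, as this thread shows, can still outrank pages that score perfectly.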
-
Thanks again for the effort. I like your answers; they are very helpful. Although in this case our target isn't English-speaking people, just Spanish people from Tenerife or visiting Tenerife on vacation.
-
_Google knows that Tenerife is a geographic location in the Canary Islands. This website has the word Tenerife on it many times. It has pages with Tenerife in the title, in the URL._
Egol is just explaining why their page shows up for 'Tenerife' searches.
I have no idea if you're targeting tourists, but I expect you are. That means you're effectively targeting two types of customer: those that plan ahead and those that plan 'on the fly'.
So if I wanted an escort for my trip to Tenerife (LOL, no, I don't... happily married lady!) I might search for one before I leave the UK to get it all organised ahead of time. So I would search 'Escorts in Tenerife' (and maybe localise even further to find one close to where I was staying)...
Or, I may be lonely on one night when I get to Tenerife, and do a search to find an escort from Tenerife.
These two scenarios might show different results - the person searching from Tenerife would get localised results, whereas the person searching from the UK is relying on 'Tenerife' being included on the website to appear in the SERPs.
I hope this makes sense!
I think EGOL has a lot of great information to share, and I've often found his/her responses to questions in this forum to be very useful and informative. That last comment was a little harsh, but the point is: _if you feel the current situation is hard to overcome, then imagine how bad it would be if they did know what they were doing!_ That's how I read it, anyway.

Perhaps it's just a case of what my colleague calls 'the impersonal interaction impertinence imperative' - sounds like something from The Hitchhiker's Guide, doesn't it? But it basically means that written, impersonal communications can sometimes be misinterpreted. I often send stuff that sounds harsher than it's meant to be because I forget that sarcasm and suchlike are not easy to interpret in written form!
I'm glad you found my answer helpful
-
Agreed, the end user doesn't care about SEO. But if the user experience is horrendous and the site is difficult to navigate, it must have a high bounce rate.
There's one site I'm not going to mention, but the server it runs on is blacklisted for email spam, and I know one SEO company reported it to Google for spam. It's keyword stuffed, has a poor user experience, looks like it's from 1999, and has spammy links. Yet it still ranks above sites that follow Google's advice: high-quality content, good user experience, not too many links on the page, nice and quick to load. It's the absolute opposite of everything Google recommends.
-
Hi, and thanks for the comment. It helps. But tell me, what are the great insights you're talking about in EGOL's answer? Did I miss anything? Was it profound for you? I really can't see anything good in EGOL's answer, just a bit of arrogance instead.
That kind of answer doesn't help. Yours, on the other hand, is very useful. Thank you, ameliavargo! I really did like and very much appreciate your third paragraph most of all!
-
I completely agree with you; there seems to be no correlation. I don't mean to say all these metrics don't work or aren't useful, I'm just saying there are sites up there in top positions that still don't deserve it according to Moz tools, for example. Thanks for your comment.
-
1.- I did give them a mention, and I knew it would be good for them, but I did it because I wanted Moz people to understand the precise case. Perhaps I was wrong to do that; if so, sorry.
2.- I didn't say it was hard to beat. I'm only asking why that URL is in the top position when, according to all the metrics, it doesn't deserve it. There are 3 or 4 competitors that have more and better incoming links, better page optimization, better content, etc.
About user experience: have you visited the site? Did you really see anything on that page that you liked for the user?
What would you say is the main reason they're on top? "This website has the word Tenerife on it many times. It has pages with Tenerife in the title, in the URL." Is that your conclusion? Thanks in advance.
-
As always, EGOL has some great insights and I agree with them...
Onsite usage metrics are also used in the algo. I'm not entirely sure exactly what Google uses, but I imagine it's something like bounce rate (though they've said they don't use bounce rate... but they lie too, so they might!), time on site, pages per visit - that kind of thing. The stuff that tells you whether people are engaging with your website.
They also use organic CTR (though again, I don't think this has ever been explicitly stated by them) - if your CTR is high then they will reward you (though a high CTR with bad usage metrics would not on its own help you - it would rather hinder, I think).
This is why pi$$ poor sites often end up in high positions - people like them. It doesn't always make sense, but if the site you mention provides something people want, visitors aren't going to CARE whether it's been 'SEO'd' or not - and really, Google doesn't either. If visitor behaviour indicates that people are getting value from that site, Google is not going to move it from its high position - until you provide something that people like more and, importantly, onsite behaviour indicates the same.
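To make the usage metrics mentioned above concrete, here's a rough sketch of how you might compute them from your own analytics. The log format, numbers, and the bounce definition (a one-page visit) are invented for illustration; nobody outside Google knows which signals, if any, feed the algo:

```python
# Toy engagement metrics computed from an invented visit log.
# Each record is one visit: pages viewed and seconds on site.
visits = [
    {"pages": 1, "seconds": 5},    # classic bounce: one page, gone in seconds
    {"pages": 4, "seconds": 180},  # engaged visitor
    {"pages": 2, "seconds": 60},
    {"pages": 1, "seconds": 8},
]

# Bounce rate: share of one-page visits.
bounce_rate = sum(1 for v in visits if v["pages"] == 1) / len(visits)
# Pages per visit and average time on site.
pages_per_visit = sum(v["pages"] for v in visits) / len(visits)
avg_time_on_site = sum(v["seconds"] for v in visits) / len(visits)

print(f"bounce rate: {bounce_rate:.0%}")             # 50%
print(f"pages/visit: {pages_per_visit:.2f}")         # 2.00
print(f"avg time on site: {avg_time_on_site:.1f}s")  # 63.2s
```

A site with a low bounce rate and high pages per visit looks "liked" by this kind of measure, regardless of how well it's been SEO'd.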
Hope this helps
-
Google knows that Tenerife is a geographic location in the Canary Islands. This website has the word Tenerife on it many times. It has pages with Tenerife in the title, in the URL.
Even though you did not give them a link you gave them a mention... and in the same sentence you associated it with the keyword that you are searching for. All of that helps them.
Google thinks beyond the page.
If you think this website is hard to beat, wait until they get an SEO who knows what to do.
-
I don't think Google has really sorted the algos out; there seems to be absolutely no correlation. There are sites on my targeted keyword list which no way should be on the 1st page of Google, yet they're up at the top, even against competitors with higher domain authority, higher page authority, a better link profile, and better site speed. Sites which are keyword stuffed also rank in the top 3 positions.
It would be great to see what other moz'ers are experiencing