Why is a poorly optimized URL ranked first on Google?
-
Hi there, I've been working in SEO for more than five years, and I'm always telling clients about the 200+ factors that influence rankings, but sometimes I come across URLs or websites that haven't optimized their pages or built links and still appear first.
This is the case for the keyword "Escorts en Tenerife" on google.es. If you search for that keyword on google.es you'll find this URL: escortislacanarias.com... (I don't want to give them a link).
My question is: why the heck is this URL ranking first on Google for that keyword if the URL isn't optimized, the page content isn't optimized, and it doesn't have many (or valuable) incoming links?
Run an on-page grader on that URL for that keyword and it gets an F! So there seems to be no correlation between better optimization and good rankings.
-
Thanks again for the effort. I like your answers; they are very helpful. Although in this case our target audience isn't English-speaking people, just Spanish people who are from Tenerife or there on vacation.
-
_Google knows that Tenerife is a geographic location in the Canary Islands. This website has the word Tenerife on it many times. It has pages with Tenerife in the title, in the URL._
EGOL is just explaining why their page shows up for 'Tenerife' searches.
I have no idea if you're targeting tourists, but I expect you are. This means you're effectively targeting two types of customer - those that plan ahead and those that plan 'on the fly'.
So if I wanted an escort for my trip to Tenerife (LOL, no, I don't... happily married lady!) I might search for one before I leave the UK to get it all organised ahead of time. So I would search 'Escorts in Tenerife' (and maybe localise even further to find one close to where I was staying)...
Or, I may be lonely one night when I get to Tenerife, and do a search to find an escort in Tenerife.
These two scenarios might show different results - the person searching from Tenerife would get localised results, whereas the person searching from the UK is relying on 'Tenerife' being included on the website to appear in the SERPs.
I hope this makes sense!
I think EGOL has a lot of great information to share, and I've often found his/her responses to questions in this forum to be very useful and informative. That last comment was a little harsh, but the point is: _if you feel the current situation is hard to overcome, imagine how bad it would be if they did know what they were doing!_ That's how I read it, anyway. Perhaps it's just a case of what my colleague calls 'the impersonal interaction impertinence imperative' - sounds like something from The Hitchhiker's Guide, doesn't it! But it basically means that written, impersonal communications can sometimes be misinterpreted. I often send stuff that sounds harsher than it's meant to be, because I forget that sarcasm and suchlike are not easy to interpret in written form!
I'm glad you found my answer helpful
-
Agreed, the end user doesn't care about SEO. But if the site is horrendously difficult to navigate, it must have a high bounce rate.
There's one site I'm not going to mention whose server is blacklisted for email spam - I know one SEO company reported them to Google for spam. They keyword-stuff, the user experience is poor (the site looks like it's from 1999), and they have spammy links. Yet they still rank above sites that follow Google's advice: high-quality content, good user experience, not too many links on the page, nice and quick to load. It's the absolute opposite of everything Google recommends.
-
Hi, and thanks for the comment. It helps. But tell me, what are the great insights from EGOL that you're talking about? Did I miss something? Was that answer profound for you? I really can't see anything good in EGOL's answer - just a bit of arrogance.
That kind of answer doesn't help. Yours, on the other hand, is very useful - thank you, ameliavargo! I really liked, and very much appreciated, your third paragraph most of all!
-
I completely agree with you; there seems to be no correlation. I don't mean to say all these metrics don't work or aren't useful. I'm just saying there are sites up there in top positions that still don't deserve it according to Moz tools, for example. Thanks for your comment.
-
1.- I did give them a mention, and I knew it would be good for them, but I did it because I wanted Moz people to understand the precise case. Perhaps I was wrong to do that - if so, sorry.
2.- I didn't say it was hard to beat. I'm only asking why that URL is in the top position when, according to all the metrics, it doesn't deserve it. There are 3 or 4 competitors with more and better incoming links, better page optimization, better content, etc.
About user experience: have you visited the site? Did you really see anything on that page that you liked, from a user's point of view?
What would you say is the main reason they're on top? "This website has the word Tenerife on it many times. It has pages with Tenerife in the title, in the URL." So that is your conclusion? Thanks in advance.
-
As always, EGOL has some great insights and I agree with them...
Onsite usage metrics are also used in the algo. I'm not entirely sure exactly what Google uses, but I imagine it's something like bounce rate (though perhaps not bounce rate itself, as they've said they don't use it... but they've been known to fib, so they might!), time on site, pages per visit - that kind of thing. The stuff that tells you whether people are engaging with your website.
They also use organic CTR (though again, I don't think this has ever been explicitly stated by them) - if your CTR is high, they will reward you (though a high CTR combined with bad usage metrics would not help on its own - it would rather hinder, I think).
This is why pi$$-poor sites often end up in high positions - people like them. It doesn't always make sense, but if the site you mention provides something people want, they aren't going to CARE whether it's been 'SEO'd' or not - and really, Google doesn't either. If visitor behaviour indicates that people are getting value from that site, Google is not going to move it from its high position - not until you provide something that people like more and, importantly, onsite behaviour indicates the same.
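(To make that concrete, here's a toy sketch of how engagement signals like these might be combined into one score. Google's actual algorithm is unpublished, so the signal names, thresholds, and weights here are all invented for illustration.)

```python
# Toy "engagement score" - purely illustrative, NOT Google's algorithm.
# All weights and normalisation caps below are invented assumptions.

def engagement_score(ctr, avg_time_on_site_s, pages_per_visit,
                     w_ctr=0.5, w_time=0.3, w_pages=0.2):
    """Combine normalised engagement signals into a single 0-1 score."""
    ctr_norm = min(ctr / 0.3, 1.0)                  # treat 30% CTR as "excellent"
    time_norm = min(avg_time_on_site_s / 180, 1.0)  # treat 3 minutes as strong dwell time
    pages_norm = min(pages_per_visit / 5, 1.0)      # treat 5 pages/visit as strong engagement
    return w_ctr * ctr_norm + w_time * time_norm + w_pages * pages_norm

# A high CTR with poor on-site behaviour scores lower than a moderate CTR
# with strong engagement - mirroring the point above about CTR on its own.
clickbait = engagement_score(ctr=0.25, avg_time_on_site_s=15, pages_per_visit=1.1)
sticky = engagement_score(ctr=0.12, avg_time_on_site_s=200, pages_per_visit=4.8)
```

Under these made-up weights, the "sticky" site outscores the "clickbait" one even though its CTR is half as high - which is the shape of behaviour described above, whatever Google's real weighting looks like.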
Hope this helps
-
Google knows that Tenerife is a geographic location in the Canary Islands. This website has the word Tenerife on it many times. It has pages with Tenerife in the title, in the URL.
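(For illustration only, those on-page occurrences are easy to check for any keyword and page. The URL, title, and body text in this sketch are made up - it is not the real site.)

```python
from urllib.parse import urlparse

def keyword_signals(keyword, url, title, body):
    """Crude, case-insensitive on-page checks for a keyword."""
    kw = keyword.lower()
    return {
        # URLs usually hyphenate multi-word keywords, so check the slug form
        "in_url": kw.replace(" ", "-") in urlparse(url).path.lower(),
        "in_title": kw in title.lower(),
        "body_count": body.lower().count(kw),
    }

# Hypothetical example page:
signals = keyword_signals(
    "tenerife",
    "https://example.com/escorts-en-tenerife/",
    "Escorts en Tenerife | Example",
    "Find escorts in Tenerife. Tenerife is in the Canary Islands.",
)
# signals -> {"in_url": True, "in_title": True, "body_count": 2}
```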
Even though you did not give them a link you gave them a mention... and in the same sentence you associated it with the keyword that you are searching for. All of that helps them.
Google thinks beyond the page.
If you think this website is hard to beat, wait until they get an SEO who knows what to do.
-
I don't think Google has really sorted the algorithms out; there seems to be absolutely no correlation. There are sites on my targeted keyword list that have no business being on the first page of Google, yet there they are at the top - despite competitors having higher Domain Authority and Page Authority, a better link profile, and better site speed. Sites that are keyword-stuffed also rank in the top 3 positions.
It would be great to hear what other Mozzers are experiencing.
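(If you want to put the "no correlation" feeling to the test on your own keyword list, Spearman's rank correlation between a metric and SERP position is one quick sanity check. The Domain Authority figures below are invented sample data.)

```python
def rank(values):
    """1-based average ranks; ties get the mean of their positions."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(xs, ys):
    """Spearman's rho: Pearson correlation computed on the ranks."""
    rx, ry = rank(xs), rank(ys)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Invented data: Domain Authority of the top 5 results vs. their positions.
da = [12, 55, 48, 60, 30]   # the result in position 1 has DA 12, and so on
positions = [1, 2, 3, 4, 5]
rho = spearman(da, positions)  # only a weak positive relationship here
```

If DA really drove rankings you'd expect rho strongly negative (higher DA, better position); a rho near zero on real data would support the "no correlation" impression.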