Why is a poorly optimized URL ranked first on Google?
-
Hi there, I've been working in SEO for more than five years and I'm always telling clients about the 200+ factors that influence rankings, but sometimes I come across URLs or websites that haven't optimized their pages or built links and still appear first.
This is the case for the keyword "Escorts en Tenerife" on google.es. If you search for that keyword on google.es you'll find this URL: escortislacanarias.com... (I don't want to give them a link).
My question is: why the heck is this URL ranking first on Google for that keyword if the URL isn't optimized, the page content isn't optimized, and it doesn't have many (or valuable) incoming links?
Run an on-page grader against that URL for that keyword and it gets an F!!! So there is no correlation between better optimization and good rankings.
-
Thanks again for the effort. I like your answers; they are very helpful. Although in this case our target isn't English-speaking people - just Spanish people from Tenerife, or people visiting Tenerife on vacation.
-
_Google knows that Tenerife is a geographic location in the Canary Islands. This website has the word Tenerife on it many times. It has pages with Tenerife in the title, in the URL. _
Egol is just explaining why their page shows up for 'Tenerife' searches.
I have no idea if you're targeting tourists, but I expect you are. This means you're effectively targeting two types of customer - those that plan ahead and those that plan 'on the fly'.
So if I wanted an escort for my trip to Tenerife (LOL, no, I don't... happily married lady!) I might search for one before I leave the UK to get it all organised ahead of time. So I would search 'Escorts in Tenerife' (and maybe localise even further to find one close to where I was staying)...
Or, I may be lonely on one night when I get to Tenerife, and do a search to find an escort from Tenerife.
These two scenarios might show different results - the person searching from Tenerife would get localised results, whereas the person searching from the UK is relying on 'Tenerife' being included on the website to appear in the SERPs.
I hope this makes sense!
I think EGOL has a lot of great information to share, and I've often found his/her responses to questions in this forum to be very useful and informative. That last comment was a little harsh, but the point is: _if you feel the current situation is hard to overcome, then imagine how bad it would be if they did know what they were doing!_ That's how I read it, anyway. Perhaps it's just a case of what my colleague calls 'the impersonal interaction impertinence imperative' - sounds like something from The Hitchhiker's Guide, doesn't it! It basically means that written, impersonal communications can sometimes be misinterpreted. I often send stuff that sounds harsher than it's meant to be because I forget that sarcasm and suchlike are not easy to interpret in written form!
I'm glad you found my answer helpful
-
Agreed, the end user doesn't care about SEO. But if the user experience is horrendous and the site is difficult to navigate, it must have a high bounce rate.
There's one site I'm not going to name, but the server it's running on is blacklisted for email spam, and I know one SEO company reported them to Google for spam. They're keyword stuffed, the user experience is poor (it looks like it's from 1999), and they have spammy links. Still, they sit above sites that follow Google's advice: high-quality content, good user experience, not too many links on the page, nice and quick to load. It's the absolute opposite of everything Google recommends.
-
Hi, and thanks for the comment. It helps. But tell me, what are the great insights you're talking about from EGOL? Did I miss anything? Was his/their answer profound for you? I really can't see any good in EGOL's answer - just a bit of arrogance instead.
That kind of answer doesn't help. Yours, on the other hand, is very useful - thank you, ameliavargo! I really liked and very much appreciated your third paragraph most of all!
-
I completely agree with you; there seems to be no correlation. I don't mean to say all these metrics don't work or aren't useful, I'm just saying there are sites up there in top positions that still don't deserve it according to Moz tools, for example. Thanks for your comment.
-
1. I did give them a mention, and I knew it would be good for them, but I did it because I wanted Moz people to understand the precise case. Perhaps I was wrong to do that; if so, sorry.
2. I didn't say it was hard to beat. I'm only asking why that URL is in the top position when, according to all the metrics, it doesn't deserve it. There are 3 or 4 competitors that have more and better incoming links, better page optimization, better content, etc.
About user experience: have you visited the site? Did you really see anything on that page that you liked from a user's point of view?
What would you say is the main reason they're on top? "This website has the word Tenerife on it many times. It has pages with Tenerife in the title, in the URL." Is that your conclusion? Thanks in advance.
-
As always, EGOL has some great insights and I agree with them...
Onsite usage metrics are also used in the algo. I'm not entirely sure exactly what Google uses, but I imagine it's something like bounce rate (though not bounce rate itself, as they've said they don't use it... but they lie too, so they might!), time on site, pages per visit - that kind of thing. The stuff that tells you whether people are engaging with your website.
They also use organic CTR (though again, I don't think this is ever explicitly stated by them) - if your CTR is high then they will reward you (though a high CTR with bad usage metrics would not on its own help you - rather hinder I think).
This is why pi$$ poor sites often end up in high positions - people like them. Doesn't always make sense, but if the site you mention provides something people want, they aren't going to CARE whether it's been 'SEO'd' or not - and really Google doesn't either. If visitor behaviour indicates that people are getting value from visiting that site then Google is not going to move it from its high position - until you provide something that people like more and importantly onsite behaviour indicates the same.
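To make "that kind of thing" concrete, here's a toy sketch of how engagement metrics like bounce rate, pages per visit, and time on site could be computed from session data. The session numbers are invented for illustration - nobody outside Google knows which signals they actually use or how they weight them:

```python
from statistics import mean

# Hypothetical session log: (pages_viewed, seconds_on_site) per visit.
sessions = [(1, 5), (4, 180), (1, 8), (6, 420), (3, 95)]

def engagement(sessions):
    """Naive engagement metrics of the kind discussed above."""
    # A "bounce" here is a single-page visit.
    bounces = sum(1 for pages, _ in sessions if pages == 1)
    return {
        "bounce_rate": bounces / len(sessions),
        "pages_per_visit": mean(pages for pages, _ in sessions),
        "avg_time_on_site": mean(secs for _, secs in sessions),
    }

print(engagement(sessions))
```

A site with a high bounce rate and low time on site is telling Google that visitors aren't finding what they wanted - regardless of how well "SEO'd" the page is.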
Hope this helps
-
Google knows that Tenerife is a geographic location in the Canary Islands. This website has the word Tenerife on it many times. It has pages with Tenerife in the title, in the URL.
Even though you did not give them a link you gave them a mention... and in the same sentence you associated it with the keyword that you are searching for. All of that helps them.
Google thinks beyond the page.
If you think this website is hard to beat, wait until they get an SEO who knows what to do.
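The point about the keyword appearing in the URL, the title, and the body is easy to check mechanically. Here's a minimal sketch using only the Python standard library; the URL and HTML are made up for illustration:

```python
import re
from html.parser import HTMLParser

class PageText(HTMLParser):
    """Collect the <title> and the visible body text of a page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self._in_title = False
        self.text_parts = []

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data
        else:
            self.text_parts.append(data)

def keyword_signals(url, html, keyword):
    """Count case-insensitive occurrences of keyword in URL, title, and body."""
    parser = PageText()
    parser.feed(html)
    pattern = re.compile(re.escape(keyword), re.IGNORECASE)
    return {
        "in_url": len(pattern.findall(url)),
        "in_title": len(pattern.findall(parser.title)),
        "in_body": len(pattern.findall(" ".join(parser.text_parts))),
    }

page = ("<html><head><title>Escorts en Tenerife</title></head>"
        "<body><h1>Tenerife</h1><p>Visit Tenerife.</p></body></html>")
print(keyword_signals("https://example.com/escorts-tenerife", page, "tenerife"))
```

Run against a real page, counts like these show why a site that mentions the location everywhere can outrank "better optimized" competitors that don't.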
-
I don't think Google has really sorted the algos out; there seems to be absolutely no correlation. There are sites on my targeted keyword list that have no business being on the first page of Google, yet they're up at the top - even against sites with higher domain authority and page authority, a better link profile, and faster site speed. Sites that are keyword stuffed also rank in the top 3 positions.
It would be great to see what other moz'ers are experiencing