Why is a poorly optimized URL ranked first on Google?
-
Hi there, I've been working in SEO for more than five years and I'm always telling clients about the 200+ factors that influence rankings, but sometimes I come across URLs or websites that haven't optimized their pages or built links and still appear first.
This is the case for the keyword "Escorts en Tenerife" in google.es. If you search for that keyword in google.es you'll find this URL: escortislacanarias.com... (I don't want to give them a link).
My question is: why the heck is this URL ranking first on Google for that keyword if the URL isn't optimized, the page content isn't optimized, and it doesn't have many (or any valuable) incoming links?
Run an on-page grader on that URL for that keyword and it gets an F! So there seems to be no correlation between better optimization and good rankings.
-
Thanks again for the effort. I like your answers; they are very helpful. Although in this case our target isn't English-speaking people, just Spanish people from Tenerife or visiting for vacations.
-
_Google knows that Tenerife is a geographic location in the Canary Islands. This website has the word Tenerife on it many times. It has pages with Tenerife in the title, in the URL._
Egol is just explaining why their page shows up for 'Tenerife' searches.
I have no idea if you're targeting tourists, but I expect you are. This means you're effectively targeting two types of customer: those that plan ahead and those that plan 'on the fly'.
So if I wanted an escort for my trip to Tenerife (LOL, no, I don't... happily married lady!) I might search for one before I leave the UK to get it all organised ahead of time. So I would search 'Escorts in Tenerife' (and maybe localise even further to find one close to where I was staying)...
Or, I may be lonely one night when I get to Tenerife, and do a search to find an escort in Tenerife.
These two scenarios might show different results - the person searching from Tenerife would get localised results, whereas the person searching from the UK is relying on 'Tenerife' being included on the website to appear in the SERPs.
I hope this makes sense!
I think Egol has a lot of great information to share, and I've often found his/her responses to questions in this forum to be very useful and informative. That last comment was a little harsh, but the point is: _if you feel the current situation is hard to overcome, then imagine how bad it would be if they did know what they were doing!_ That's how I read it, anyway. Perhaps it's just a case of what my colleague calls 'the impersonal interaction impertinence imperative' - sounds like something from the Hitchhiker's Guide, doesn't it! But it basically means that written, impersonal communications can sometimes be misinterpreted. I often send stuff that sounds harsher than it's meant to be because I forget that sarcasm and suchlike are not easy to interpret in written form!
I'm glad you found my answer helpful
-
Agreed, the end user doesn't care about SEO. But if the user experience is horrendous and the site is difficult to navigate, it must have a high bounce rate.
One site I'm not going to mention runs on a server that is blacklisted for email spam; I know of one SEO company that reported them to Google for spam. They're keyword-stuffed, have a poor user experience that looks like it's from 1999, and have spammy links. Still, they sit above sites that actually follow Google's advice: high-quality content, good user experience, not too many links on the page, nice and quick to load. It's the absolute opposite of everything Google recommends.
-
Hi, and thanks for the comment. It helps. But tell me, what are the great insights you're talking about in EGOL's answer? Did I miss anything? Was his/their answer profound for you? I really can't see any good in EGOL's answer, just a bit of arrogance instead.
That kind of answer doesn't help. Yours, on the other hand, is very useful, thank you ameliavargo! I really did like and very much appreciate your third paragraph most of all!
-
I completely agree with you; there seems to be no correlation. I don't mean to say all these metrics don't work or aren't useful, I'm just saying there are sites up there in top positions that still don't deserve it according to Moz tools, for example. Thanks for your comment.
-
1.- I did give them a mention and I knew it would be good for them, but I did it because I wanted Moz people to understand the precise case. Perhaps I was wrong to do that; if so, sorry.
2.- I didn't say it was hard to beat. I'm only asking why that URL is in the top position when, according to all the metrics, it doesn't deserve it. There are 3 or 4 competitors that have more and better incoming links, better page optimization, better content, etc.
About user experience: have you visited the site? Did you really see anything on that page that you liked from a user's perspective?
What would you say is the main reason for them to be on top? "This website has the word Tenerife on it many times. It has pages with Tenerife in the title, in the URL." So is that your conclusion? Thanks in advance.
-
As always, EGOL has some great insights and I agree with them...
Onsite usage metrics are also used in the algo. I'm not entirely sure exactly what Google uses, but I imagine it's something like bounce rate (though not bounce rate exactly, as they've said they don't use it... but they lie too, so they might!), time on site, pages per visit - that kind of thing. The stuff that tells you whether people are engaging with your website.
They also use organic CTR (though again, I don't think this has ever been explicitly stated by them) - if your CTR is high, they will reward you (though a high CTR with bad usage metrics would not on its own help you - rather hinder it, I think).
This is why pi$$ poor sites often end up in high positions - people like them. Doesn't always make sense, but if the site you mention provides something people want, they aren't going to CARE whether it's been 'SEO'd' or not - and really Google doesn't either. If visitor behaviour indicates that people are getting value from visiting that site then Google is not going to move it from its high position - until you provide something that people like more and importantly onsite behaviour indicates the same.
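As a toy illustration of the kind of engagement signals discussed above, here's a minimal sketch. To be clear, nobody outside Google knows its actual formula; the metric definitions and numbers below are purely illustrative assumptions, not anything Google has confirmed it uses.

```python
# Hypothetical engagement-signal calculations on made-up data.
# These definitions mirror common analytics conventions, NOT Google's algo.

def ctr(impressions, clicks):
    """Organic click-through rate: clicks divided by impressions."""
    return clicks / impressions if impressions else 0.0

def bounce_rate(sessions):
    """Share of sessions that viewed only one page before leaving."""
    if not sessions:
        return 0.0
    bounces = sum(1 for s in sessions if s["pages"] == 1)
    return bounces / len(sessions)

# Toy visit data for one landing page
sessions = [
    {"pages": 1, "seconds": 5},    # bounced almost immediately
    {"pages": 4, "seconds": 180},  # engaged visit
    {"pages": 3, "seconds": 95},   # engaged visit
]

print(ctr(impressions=1000, clicks=87))   # → 0.087
print(round(bounce_rate(sessions), 2))    # → 0.33
```

The point of the thread stands either way: a site can grade an F on-page and still look great on signals like these, because real visitors are finding what they want.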
Hope this helps
-
Google knows that Tenerife is a geographic location in the Canary Islands. This website has the word Tenerife on it many times. It has pages with Tenerife in the title, in the URL.
Even though you did not give them a link you gave them a mention... and in the same sentence you associated it with the keyword that you are searching for. All of that helps them.
Google thinks beyond the page.
If you think this website is hard to beat, wait until they get an SEO who knows what to do.
-
I don't think Google has really sorted the algos out; there seems to be absolutely no correlation. There are sites on my targeted keyword list which have no business being on the first page of Google, yet there they are at the top, even against competitors with higher Domain Authority, higher Page Authority, better link profiles and better site speed. Sites which are keyword-stuffed also rank in the top 3 positions.
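For anyone who wants to sanity-check the "no correlation" observation on their own keyword list, here's a rough sketch using Spearman rank correlation between SERP position and a metric like Domain Authority. The data below is made up for illustration; you'd swap in your own SERP scrape and Moz exports.

```python
# Hypothetical sketch: does a metric (e.g. Domain Authority) correlate
# with SERP position? Values near -1 mean higher DA tends to rank better;
# values near 0 support the "no correlation" impression.

def rankdata(xs):
    """Assign ranks (1 = smallest value); tied values share an averaged rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average rank across the tie group
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman rank-correlation coefficient between two samples."""
    rx, ry = rankdata(x), rankdata(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

positions = [1, 2, 3, 4, 5]              # SERP positions for one keyword
domain_authority = [22, 61, 48, 70, 35]  # made-up DA of each ranking page

print(round(spearman(positions, domain_authority), 2))  # → 0.3
```

A result like 0.3 on this toy data would mean DA has essentially no useful relationship with rank for that query, which matches what's being described here. With a real keyword list you'd want many queries, not one, before drawing conclusions.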
It would be great to see what other Mozzers are experiencing.