Sites still rank that don't seem like they should. Why?
-
So you've been MOZing and SEOing for years, and we're all convinced of the 10x factor when it comes to content and ranking for certain search terms... right? So what do you do when some older sites that don't even produce content dominate the first page for a very important search term? They're home pages with very little content, and they've all clearly dabbled in pre-Panda SEO. Surely people are still seeing this and wondering why?
-
Hi Radi!
It looks like you're a Pro member, so one thing I'd suggest is using the Keyword Difficulty tool to run a Full SERP Report on the keywords you're concerned about. It can give you a better sense of what's going on.
Cyrus made a great video on the report here.
I hope that helps!
-
Hi Radi,
You're not alone; it's the question every SEO asks: why them? There's no simple answer. Have you compared all of your metrics with your competitors'? I'd highly recommend the Rand Fishkin video below. I hope it answers everything.
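To make that "compare your metrics" step concrete, here's a minimal sketch; the file name and CSV columns are hypothetical, standing in for an export from your rank-tracking or link-research tool:

```python
import csv

# Hypothetical export: one row per URL ranking for the keyword, with
# columns url, domain_authority, page_authority, linking_root_domains.
with open("serp_metrics.csv", newline="") as f:
    rows = list(csv.DictReader(f))

MY_URL = "http://www.example.com/"  # placeholder for your own page

# Sort the SERP by linking root domains to see where you stand.
for row in sorted(rows, key=lambda r: int(r["linking_root_domains"]), reverse=True):
    marker = "  <-- you" if row["url"] == MY_URL else ""
    print(f"{row['url']}: DA {row['domain_authority']}, "
          f"PA {row['page_authority']}, "
          f"{row['linking_root_domains']} linking root domains{marker}")
```

More often than not, the "thin" home pages that dominate a SERP turn out to have far stronger link metrics than the content-rich challengers, which goes a long way toward answering the "why".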
-
Related Questions
-
Weird rankings on my website, can't figure it out
Hey guys,

One of my most popular pages for "Rust Hacks" used to be http://www.ilikecheats.com/01/rust-cheats-hacks-aimbot/. Now, when I search Google for site:ilikecheats.com rust hacks, the highest-ranking page is http://forum.ilikecheats.com/forums/221-Rust-Hacks-Rust-Cheats-Public-Forum.

What's weird is that the entire front end (the WordPress site) no longer ranks well on page 1 of Google, while the forums currently rank better. I did have a huge penalty from backlinks last year, but I cleared it. I got Yoast to do a site review and I'm cleaning everything up now, and I also cleared most of the bad links via the disavow tool.

Another example: when I search for "warz hacks", the forums show up in 4th place, but the main website doesn't show at all until around page 10. If I search site:ilikecheats.com warz hacks, links directly to the main site don't show until page 2.

So is this still a carried-over penalty, or is something else going on? I can't seem to figure it out; thanks in advance for looking. 😃 Any ideas what's going on and why the main pages no longer rank? http://www.ilikecheats.com
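As a side note for anyone reading, the disavow file mentioned above is plain text, one domain or URL per line, with # for comments; a minimal example using placeholder domains:

```text
# Disavow file example (placeholder domains for illustration)
# Drop every link from an entire domain:
domain:spammy-directory.example
# Drop links from a single page:
http://link-farm.example/pages/bad-links.html
```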
Intermediate & Advanced SEO | Draden67
-
Why isn't this site ranking?
I just took over a site and noticed it has no presence for any keywords... not even low rankings. Its backlink profile is not the best, but Webmaster Tools says there are no manual actions. The site is vonderhaar.com. Thoughts on the matter?
Intermediate & Advanced SEO | Atomicx
-
Google isn't seeing the content but it is still indexing the webpage
When I fetch my website page using GWT, this is what I receive:

HTTP/1.1 301 Moved Permanently
X-Pantheon-Styx-Hostname: styx1560bba9.chios.panth.io
server: nginx
content-type: text/html
location: https://www.inscopix.com/
x-pantheon-endpoint: 4ac0249e-9a7a-4fd6-81fc-a7170812c4d6
Cache-Control: public, max-age=86400
Content-Length: 0
Accept-Ranges: bytes
Date: Fri, 14 Mar 2014 16:29:38 GMT
X-Varnish: 2640682369 2640432361
Age: 326
Via: 1.1 varnish
Connection: keep-alive

What I used to get is this:

HTTP/1.1 200 OK
Date: Thu, 11 Apr 2013 16:00:24 GMT
Server: Apache/2.2.23 (Amazon)
X-Powered-By: PHP/5.3.18
Expires: Sun, 19 Nov 1978 05:00:00 GMT
Last-Modified: Thu, 11 Apr 2013 16:00:24 +0000
Cache-Control: no-cache, must-revalidate, post-check=0, pre-check=0
ETag: "1365696024"
Content-Language: en
Link: ; rel="canonical",; rel="shortlink"
X-Generator: Drupal 7 (http://drupal.org)
Connection: close
Transfer-Encoding: chunked
Content-Type: text/html; charset=utf-8

Previously the response carried the actual page markup, starting with Drupal's RDFa namespaces and the title:

<html xmlns:content="http://purl.org/rss/1.0/modules/content/"
xmlns:dc="http://purl.org/dc/terms/"
xmlns:foaf="http://xmlns.com/foaf/0.1/"
xmlns:og="http://ogp.me/ns#"
xmlns:rdfs="http://www.w3.org/2000/01/rdf-schema#"
xmlns:sioc="http://rdfs.org/sioc/ns#"
xmlns:sioct="http://rdfs.org/sioc/types#"
xmlns:skos="http://www.w3.org/2004/02/skos/core#"
xmlns:xsd="http://www.w3.org/2001/XMLSchema#">
<title>Inscopix | In vivo rodent brain imaging</title>
Intermediate & Advanced SEO | jacobfy
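A quick way to reproduce what a crawler sees here is to follow the redirect chain from the command line; a minimal sketch with curl (assuming the fetched URL is the old non-HTTPS address that returns the 301 above):

```sh
# -s silent, -I headers only, -L follow redirects: prints the headers
# for each hop, i.e. the 301 above (note its Content-Length: 0) and
# then whatever the Location target, https://www.inscopix.com/, returns.
curl -sIL http://www.inscopix.com/
```

The key detail in the question is that the 301 response body is empty (Content-Length: 0), so a fetch that stops at the first hop sees no content at all; the page markup only exists at the redirect target.
-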
Can't get auto-generated content de-indexed
Hello, and thanks in advance for any help you can offer me!

Customgia.com, a costume jewelry e-commerce site, has two types of product pages: public pages that are internally linked, and private pages that are only accessible by visiting the URL directly. Every item on Customgia is created with an online design tool. Users can register for a free account and save the designs they create, even if they don't purchase them. Before saving a design, the user is required to enter a product name and choose "public" or "private" for it. The page title and product description are auto-generated.

Since launching in October '11, the number of products grew and grew as more users designed jewelry items. Most users chose to show their designs publicly, so the number of products in the store swelled to nearly 3,000. I realized many of these designs were similar to each other and occasionally exact duplicates, so over the past 8 months I've made 2,300 of them "private": no longer accessible unless the designer logs into their account (though these pages can still be reached by direct link).

When I realized that Google had indexed nearly all 3,000 products, I entered URL removal requests in Webmaster Tools for the designs I had changed to "private". I started this about 4 months ago. At the time, I did not have NOINDEX meta tags on these product pages (obviously a mistake), so it appears most of them were never removed from the index, or if they were removed, they were added back once the 90 days were up.

Of the 716 products currently showing (the ones I want Google to know about), 466 have unique, informative descriptions written by humans. The remaining 250 have auto-generated descriptions that read coherently but are somewhat similar to one another. I don't think those 250 descriptions are the big problem right now, and those product pages can be hidden if necessary. The big problem is the roughly 2,000 product pages that are still in the Google index but shouldn't be.

The following Google query tells me roughly how many product pages are in the index: site:Customgia.com inurl:shop-for. Ideally it should return just over 716 results, but instead it returns 2,650. Most of the extra 1,900 or so product pages have bad product names and highly similar, auto-generated descriptions and page titles. I wish Google had never crawled them.

Last week, NOINDEX tags were added to all 1,900 "private" designs, so currently the only product pages that should be indexed are the 716 showing on the site. Unfortunately, over the past ten days the number of product pages in the Google index hasn't changed.

One solution I initially thought might work is to re-enter the removal requests, since with the NOINDEX tags in place these pages should now be removed permanently. But I can't determine which product pages need to be removed, because Google doesn't let me see that deep into the search results. The removal request history says "Expired" or "Removed", but these labels don't seem to correspond in any way to whether a page is currently indexed. Additionally, Google is unlikely to recrawl these "private" pages, because they are orphaned and no longer linked from any public pages of the site (and have no external links either).

Currently, Customgia.com averages 25 organic visits per month (branded and non-branded) and close to zero sales. Does anyone think de-indexing the entire site would be appropriate here? Start with a clean slate and then let Google re-crawl and index only the public pages: would that be easier than battling with Webmaster Tools for months on end?

Back in August, I posted a similar problem that was solved using NOINDEX tags (de-indexing a different set of pages on Customgia): http://moz.com/community/q/does-this-site-have-a-duplicate-content-issue#reply_176813

Thanks for reading through all this!
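For readers following along, this is the shape of the NOINDEX tag discussed above, with the header-based equivalent noted in a comment:

```html
<!-- In the <head> of every private design page: -->
<meta name="robots" content="noindex">

<!-- Equivalent HTTP header, for pages where editing the template is awkward:
     X-Robots-Tag: noindex -->
```

Either form only takes effect once Google recrawls a page, which is why orphaned pages with no internal or external links can take a long time to drop out of the index.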
Intermediate & Advanced SEO | rja214
-
What NAP format do I use if the USPS can't even find my client's address?
My client has a site already listed on Google+ Local under "5208 N 1st St". He has some other NAPs, e.g., YellowPages, under "5208 N First Street". The USPS finds neither of these, nor any variation I can possibly think of! Which is better? Do I just take the one that Google has accepted and make all the others match it as best I can? And doesn't it matter that the USPS doesn't even recognize the address? Or no? Local SEO wizards, thanks in advance for your guidance!
Intermediate & Advanced SEO | rayvensoft
-
I can't help but think something is wrong with my SEO
So we relaunched our site about a month ago, and ever since we've seen a dramatic drop in search results, probably due to some errors made while changing servers and permalink structure. But I can't help but think something else is at play here.

When we write something, I can check 24 hours later, and even if I copy the title verbatim, we don't always show up in the SERPs. In fact, I looked at a post today and the meta description Google shows is not the same as ours, but when I check the source code, it's right.

What shows up in Google: http://d.pr/i/jGJg
What's actually in the source code: http://d.pr/i/p4s8

Why is this happening? The website is The Tech Block.
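One quick way to confirm what the live page is serving, as opposed to what Google is displaying (a shell sketch; the URL is a placeholder for one of the affected posts):

```sh
# Print the meta description tag the live page serves, then compare
# it with the snippet showing in the SERP.
curl -s https://example.com/your-post/ | grep -io '<meta name="description"[^>]*>'
```

If the live tag is correct, the mismatch usually means Google is still showing an older cached version or is rewriting the snippet itself.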
Intermediate & Advanced SEO | ttb
-
What if you can't navigate naturally to your canonicalized URL?
Assume this situation for a second... Let's say you place a rel=canonical tag on a page and point it to the original/authentic URL. Now, let's say that original/authentic URL is also in your XML sitemap. So here's my question: since you can't actually navigate to that original/authentic URL (it still loads with a 200, it's just not linked to from within the site itself), does that create an issue for search engines?

One last consideration: the bots can still reach those pages via the canonical tag and the XML sitemap; it's just that a user couldn't reach those original/authentic pages through normal site navigation.

Thanks, Rodrigo
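To picture the setup, a small sketch of the two pieces described above (all URLs are placeholders):

```xml
<!-- On the navigable duplicate page (HTML head), pointing at the orphaned original: -->
<link rel="canonical" href="https://example.com/products/widget/">

<!-- In sitemap.xml, an entry for that same original URL: -->
<url>
  <loc>https://example.com/products/widget/</loc>
</url>
```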
Intermediate & Advanced SEO | AlgoFreaks
-
Does having many 302 redirects on a site hurt rankings?
I am working with an affiliate website that has many product listings, but the "Buy Now" button on each listing is an external (affiliate) link to the appropriate product page on the site where the product can actually be bought. Each of these external links passes through an internal redirect, which is implemented as a 302. Consequently, when SEOmoz crawls the site, it warns me that there are hundreds of 302 redirects on the site. Do you think this will hurt the site's potential to rank? Should I remove the redirects and leave them as direct links to the external site?
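For context, if the jump links are handled by Apache, the difference between the two behaviors is a single status code; a sketch assuming mod_alias, with a made-up jump path and merchant URL:

```apache
# .htaccess sketch (hypothetical paths)
# Current temporary redirect:
#   Redirect 302 /go/widget-pro https://merchant.example.com/widget-pro
# Permanent version:
Redirect 301 /go/widget-pro https://merchant.example.com/widget-pro
```

That said, affiliate jump links are often kept as 302s deliberately (or blocked via robots.txt) so crawlers don't treat them as permanent moves to another domain, so it's worth deciding what you want the redirect to signal before switching.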
Intermediate & Advanced SEO | AlexFusman