How can a site with two questionable inbound links outperform sites with 500-1000 links and good PR?
-
Our site performed at #1 for years, but in the last 6 months it has been pushed down to about the #5 spot. Some of the domains above us have only a handful of links, and they aren't from good sources. We don't have a Google penalty. We try to only have links from quality domains, yet we've been pushed down the SERPs. Any suggestions?
-
Hi Graham
A quick look at your anchor text tells me you have an over-optimization issue:
--> http://screencast.com/t/lwzNZgEu5X
The very high percentage of commercial keyword anchors linking to the site is why it used to rank well; that kind of profile no longer helps sites, and can even hurt them.
I have seen this many times now: sites with less authority but more "natural" anchor text perform better.
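To see how skewed a profile is, you can tally anchors from any backlink export. A minimal sketch (the sample anchors and the list of "commercial" terms are assumptions; feed in your own export and target keywords):

```python
from collections import Counter

def anchor_distribution(anchors, commercial_terms):
    """Return (percent_commercial, counts) for a list of anchor-text strings."""
    counts = Counter(a.strip().lower() for a in anchors if a.strip())
    total = sum(counts.values())
    commercial = sum(
        n for anchor, n in counts.items()
        if any(term in anchor for term in commercial_terms)
    )
    pct = 100.0 * commercial / total if total else 0.0
    return pct, counts

# Hypothetical anchors pulled from a link-tool export
anchors = ["criminal lawyer", "criminal lawyer", "mydefence.ca",
           "click here", "criminal defence lawyer"]
pct, counts = anchor_distribution(anchors, ["lawyer", "criminal"])
print(f"{pct:.0f}% commercial anchors")  # 60% commercial anchors
```

Anything where exact-match commercial anchors dominate branded and naked-URL anchors is the pattern described above.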
There are some options:
- You can request that these sites change the anchor text to your brand name or domain name
- You can disavow the links (NOT recommended unless done with a great deal of care)
- You can do step one AND work on building newer, more natural links
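For reference, if you do go the disavow route, the file Google Search Console accepts is plain text: one URL or `domain:` directive per line, with `#` marking comments. The domains below are placeholders, not real examples from this site:

```text
# Spammy directory, outreach attempted, no response
domain:spammy-directory.example
# Disavow a single bad page rather than the whole domain
http://low-quality-blog.example/links-page.html
```

Again, only upload something like this after careful review; disavowing good links can hurt more than the bad links themselves.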
When Google sees those commercial anchors, it knows some form of link building has been done, and it marks the site down; sites in that position then have to work hard to earn Google's trust again.
-
I could help you with this. PM me your email address.
-
I implemented all these suggestions, and it looks like Google has re-indexed most of our pages. We are still experiencing low rankings, but when I use Open Site Explorer it shows that we should be ranking better, as our overall rank is 53 (compared to 38, 31, 25, and 41).
I looked at our links, and we have about 75 links from domains with DA 40 or above.
Is Open Site Explorer flawed, or am I missing something? I appreciate your feedback so far. Some great suggestions.
-
Thanks! You could move to Canada
-
But how do you get those 10-20 quality links (natural ones, not bought)? That is the harder question!
-
Took a quick look; a few things:
A. You lack authoritative links. Your highest PA/DA link that is not internal is 20/40, and it's a directory site. Get high PA/DA links.
B. Consider adding a Privacy Policy and Terms and Conditions to meet Google's quality guidelines and as general good practice.
C. Your top competitor "mydefence.ca" is using SAPE links. Usually this is unsustainable for more than 6 months, but who knows how long he can rank with it.
D. Your other competitor simply has more authoritative links than you.
E. Add more content to your homepage: 400 words or more.
F. Reduce the usage of "law," "criminal," and "criminal law." The keyword density is too high.
G. Switch up your H1 so it doesn't match your title tag exactly; change the order of the words.
Make these changes and get some high PA/DA links. This analysis was done in just 5 minutes; give it one month and 10-20 quality links and you will rank. Damn, this is so much easier than in Los Angeles. :S
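A quick way to sanity-check point F is to count phrase occurrences per 100 words of page copy. A rough sketch (the sample copy is invented; paste in your actual homepage text, and note that substring matching means "criminal" also counts hits inside "criminal law"):

```python
import re

def keyword_density(text, phrases):
    """Rough density check: occurrences of each phrase per 100 words."""
    words = re.findall(r"[a-z']+", text.lower())
    total = len(words)
    joined = " ".join(words)  # normalized text for phrase matching
    return {
        phrase: (100.0 * joined.count(phrase.lower()) / total if total else 0.0)
        for phrase in phrases
    }

# Hypothetical page copy for illustration only
page_copy = ("Our criminal law firm handles criminal law cases. "
             "Talk to a criminal defence lawyer about criminal law today.")
print(keyword_density(page_copy, ["criminal law", "criminal"]))
```

There is no magic threshold, but if a two-word commercial phrase shows up several times per 100 words, it will read as stuffed to both users and Google.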