How can a site with two questionable inbound links outperform sites with 500-1,000 good-PR links?
-
For years our site ranked at #1, but in the last six months it has been pushed down to around the #5 spot. Some of the domains above us have only a handful of links, and those aren't from good sources. We don't have a Google penalty. We try to get links only from quality domains, yet we have been pushed down the SERPs. Any suggestions?
-
Hi Graham
A quick look at your anchor text tells me you have an over-optimization issue:
--> http://screencast.com/t/lwzNZgEu5X
The very high percentage of commercial-keyword anchors pointing at the site is why it used to rank well; that tactic no longer helps sites and can even hurt them.
I have seen this many times now, where sites with less authority but more "natural" anchor text perform better.
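If you want to put a number on it, a minimal sketch like the one below works against a CSV export of your backlinks. The filename, the "anchor_text" column name, and the brand/commercial term lists are assumptions on my part, so swap in whatever your export and keywords actually look like.

```python
import csv
from collections import Counter

# Assumed inputs - adjust to your own export and terms.
BACKLINK_EXPORT = "backlinks.csv"  # CSV export from your link tool, one row per inbound link
BRAND_TERMS = {"yourbrand", "yourbrand.com"}             # placeholder brand / domain variants
COMMERCIAL_TERMS = {"criminal lawyer", "defence lawyer"}  # placeholder money keywords

def classify(anchor: str) -> str:
    """Bucket an anchor as brand, commercial, or other."""
    text = anchor.strip().lower()
    if any(term in text for term in BRAND_TERMS):
        return "brand"
    if any(term in text for term in COMMERCIAL_TERMS):
        return "commercial"
    return "other"

counts = Counter()
with open(BACKLINK_EXPORT, newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        counts[classify(row.get("anchor_text", "") or "")] += 1

total = sum(counts.values()) or 1
for bucket, n in counts.most_common():
    print(f"{bucket:>10}: {n:5d} ({n / total:.0%})")
```

If the commercial bucket dwarfs the brand and "other" buckets, that is the over-optimization pattern I'm describing.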
There are some options:
- You can ask these sites to change the anchor text to your brand name or domain name.
- You can disavow the links (NOT recommended unless done with great care); the disavow file format is sketched just after this list.
- You can do option one AND work on building new, more natural links.
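For the disavow route, the file Search Console accepts is plain text with one entry per line: a `domain:` prefix disavows a whole domain, a bare URL disavows a single page, and `#` lines are comments. The domains below are made-up placeholders, not real recommendations:

```
# disavow.txt - placeholder examples only
# Disavow every link from an entire domain:
domain:spammy-directory.example
domain:paid-links.example
# Disavow one specific page:
http://low-quality-blog.example/some-post/
```

Again, go down this road carefully; disavowing good links can hurt more than the bad links do.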
When Google sees those commercial anchors, it knows some form of link building has been done, and that counts against the site; sites like that have to work hard to earn Google's trust again.
-
I could help you with this. PM me your email.
-
I implemented all of these suggestions, and it looks like Google has re-indexed most of our pages. We are still experiencing low rankings, but when I use Open Site Explorer it suggests we should be ranking better, as our overall rank is 53 (compared to 38, 31, 25, and 41).
I looked at our links, and we have about 75 links from domains with DA 40 or above.
Is Open Site Explorer flawed, or am I missing something? I appreciate your feedback so far; some great suggestions.
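(For anyone wanting to reproduce that DA 40+ count, a rough sketch against a link export looks like the one below; the filename and the "domain_authority" column name are assumptions, so adjust them to whatever the real export contains.)

```python
import csv

# Assumed export from Open Site Explorer / your link tool - adjust filename and column name.
LINK_EXPORT = "inbound_links.csv"
DA_THRESHOLD = 40

strong_links = 0
total_links = 0
with open(LINK_EXPORT, newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        total_links += 1
        try:
            if float(row.get("domain_authority", 0)) >= DA_THRESHOLD:
                strong_links += 1
        except ValueError:
            continue  # skip rows with a missing or non-numeric DA value

print(f"{strong_links} of {total_links} inbound links come from DA {DA_THRESHOLD}+ domains")
```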
-
Thanks! You could move to Canada.
-
But how to get those 10-20 quality links (natural ones, not bought)... that is a harder question!
-
Took a quick look; a few things:
A. You lack authoritative links. Your highest PA/DA link that is not internal is 20/40, and it's a directory site. Get high-PA/DA links.
B. Add a Privacy Policy and a Terms and Conditions page to meet Google's quality guidelines and general good practice.
C. Your top competitor "mydefence.ca" is using SAPE links. Usually this is unsustainable for more than 6 months, but who knows how long he can rank with it.
D. Your other competitor simply has more authoritative links than you.
E. Add more content to your homepage: 400 words or more.
F. Reduce how often "law," "criminal," and "criminal law" appear; the keyword density is too high (see the sketch after this list).
G. Switch up your H1 so it doesn't match your title tag exactly; change the order of the words.
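For F and G, here is a quick sketch you could use to sanity-check the page. It assumes Python with requests and beautifulsoup4 installed, and the URL and phrase list are placeholders to swap for the real ones.

```python
import re
from collections import Counter

import requests
from bs4 import BeautifulSoup

URL = "https://www.example.com/"              # placeholder: your homepage
PHRASES = ["law", "criminal", "criminal law"]  # phrases called out in point F

html = requests.get(URL, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

# Point G: does the H1 duplicate the <title> exactly?
title = soup.title.get_text(strip=True) if soup.title else ""
h1 = soup.find("h1")
h1_text = h1.get_text(strip=True) if h1 else ""
print(f"Title: {title!r}")
print(f"H1:    {h1_text!r}")
print("H1 duplicates title:", title.lower() == h1_text.lower())

# Point F: rough keyword density over the visible text.
text = soup.get_text(" ", strip=True).lower()
words = re.findall(r"[a-z']+", text)
total_words = len(words) or 1
for phrase in PHRASES:
    occurrences = len(re.findall(r"\b" + re.escape(phrase) + r"\b", text))
    print(f"{phrase!r}: {occurrences} occurrences, {occurrences / total_words:.1%} of {total_words} words")
```

There is no magic density number; the point is just to see whether those phrases stand out far more than they would in naturally written copy.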
Make these changes and get some high PA/DA links. This analysis was just done in 5 minutes. One month and 10-20 quality links and you will rank. Damn, this is so much easier than in Los Angeles. :S