How can a site with two questionable inbound links outperform sites with 500-1,000 links and good PR?
-
Our site performed at #1 for years, but in the last 6 months it has been pushed down to about the #5 spot. Some of the domains above us have only a handful of links, and those aren't from good sources. We don't have a Google penalty. We try to only get links from quality domains, yet we have been pushed down the SERPs. Any suggestions?
-
Hi Graham,
A quick look at your anchor text tells me you have an over-optimization issue:
--> http://screencast.com/t/lwzNZgEu5X
The very high percentage of commercial-keyword anchors linking to the site is why it used to rank well; that kind of profile no longer helps sites and can even hurt them.
I have seen this many times now: sites with less authority but more "natural" anchor text perform better.
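A rough way to quantify this kind of anchor-text audit is to bucket each backlink's anchor as commercial vs. branded/natural and look at the ratio. A minimal sketch, assuming hypothetical anchors exported from a link index; the 30% threshold is an illustrative rule of thumb, not anything Google publishes:

```python
from collections import Counter

# Hypothetical anchor texts pulled from a backlink export.
anchors = [
    "criminal lawyer toronto", "criminal lawyer toronto", "best criminal lawyer",
    "mydefence.ca", "Smith & Jones Law", "click here", "criminal lawyer toronto",
]

# Anchors we choose to treat as "commercial" (money keywords);
# everything else counts as branded/natural/noise.
commercial = {"criminal lawyer toronto", "best criminal lawyer"}

counts = Counter(anchors)
commercial_hits = sum(n for anchor, n in counts.items() if anchor in commercial)
ratio = commercial_hits / len(anchors)

print(f"commercial anchor ratio: {ratio:.0%}")
if ratio > 0.3:  # rough threshold; a real audit uses judgment, not a fixed cutoff
    print("warning: anchor profile looks over-optimized")
```

On this made-up data the ratio comes out well above the threshold, which is exactly the profile described above.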
There are some options:
- You can ask these sites to change the anchor text to your brand name or domain name
- You can disavow the links (NOT recommended unless done with a great deal of care)
- You can do step one AND work on building newer, more natural links
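If disavowing ever does become necessary, Google's documented disavow file is just a plain-text list: one `domain:` rule or one full URL per line, with `#` for comments. The domains below are placeholders, not real examples from this site:

```text
# Low-quality directory; owner never responded to a removal request
domain:spammy-directory.example

# Disavow a single URL instead of the whole domain
http://blog.example/post-with-paid-anchor.html
```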
When Google sees those commercial anchors, it knows some form of link building has been done and marks the site down; sites like that then have to work hard to earn Google's trust again.
-
I could help you with this. PM me your email.
-
I implemented all these suggestions, and it looks like Google has re-indexed most of our pages. We are still experiencing low rankings, but when I use Open Site Explorer it shows that we should be ranking better, as our overall rank is 53 (compared to 38, 31, 25, and 41 for the sites above us).
I looked at our links, and we have about 75 links from DA 40 or above.
Is Open Site Explorer flawed, or am I missing something? I appreciate your feedback so far; some great suggestions.
-
Thanks! You could move to Canada.
-
But how to get those 10-20 quality links (natural ones, not bought)... that is a harder question!
-
Took a quick look; a few things:
A. You lack authoritative links. Your highest PA/DA link that isn't internal is 20/40, and it's a directory site. Get high-PA/DA links.
B. Add a Privacy Policy and Terms and Conditions to meet Google's quality guidelines and general good practice.
C. Your top competitor "mydefence.ca" is using SAPE links. Usually this is unsustainable for more than 6 months, but who knows how long he can rank with it.
D. Your other competitor simply has more authoritative links than you.
E. Add more content to your homepage: 400 words or more.
F. Reduce the use of "law," "criminal," and "criminal law"; the keyword density is too high.
G. Switch up your H1 so it doesn't match your title exactly; change the order of the words.
Make these changes and get some high-PA/DA links. This analysis was done in just 5 minutes; one month and 10-20 quality links and you will rank. Damn, this is so much easier than in Los Angeles. :S
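Point F can be sanity-checked mechanically. A small sketch that computes single-term density over a page's visible text; the copy below is invented for illustration, and the "right" density is a judgment call rather than a fixed number:

```python
import re
from collections import Counter

# Hypothetical homepage copy; in practice, extract the visible text of the page.
text = """Our criminal law firm handles criminal defence cases.
Talk to a criminal lawyer about your criminal law matter today."""

# Lowercase and split into words, ignoring punctuation.
words = re.findall(r"[a-z]+", text.lower())
counts = Counter(words)

# Density of one term = occurrences / total word count.
density = counts["criminal"] / len(words)
print(f"'criminal' density: {density:.1%}")
```

Run this for each of the terms in point F before and after an edit to confirm the density actually dropped.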