So, useless link exchange pages still work?!
-
After three years out of SEO I thought things might have moved on, but apparently not. A bit of backlink research shows that all the top sites in my niche have tons of reciprocal links to barely relevant sites.
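(For anyone doing the same kind of backlink research: once you have each site's outbound link domains, spotting reciprocal pairs is straightforward. This is just a minimal sketch over in-memory data — the domain names are made up, and in practice you'd pull the link data from a crawler or a tool like Moz's Link Explorer.)

```python
def find_reciprocal_pairs(outbound_links):
    """Given {domain: set of domains it links to}, return the
    sorted list of pairs that link to each other."""
    pairs = set()
    for site, targets in outbound_links.items():
        for target in targets:
            # A reciprocal pair exists when the target links back.
            if target != site and site in outbound_links.get(target, set()):
                pairs.add(tuple(sorted((site, target))))
    return sorted(pairs)

# Hypothetical link data for illustration only.
profile = {
    "mysite.example": {"blog.example", "partner.example"},
    "partner.example": {"mysite.example"},
    "blog.example": {"news.example"},
}
print(find_reciprocal_pairs(profile))  # [('mysite.example', 'partner.example')]
```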
Do I really have to do this? I thought this approach was so out of date that it's not much better than keyword stuffing.
So, should I just forget my lofty principle of asking 'is this of any value to my users?' and just take the medicine?
-
Thanks for the replies. It's kind of what I feel - I can't really bring myself to start swapping links with 'Bulgarian dog widget' sites!
I had just been assuming that the value of a reciprocal link was pretty much nil if the two sites were of similar standing - IMHO that's how it should be. I link to sites that I like; that's the way it should work.
Having said all that, I do have some more targeted, less precious sites where I might set up a 'useful links' / 'resources' / 'our friends' page, or whatever euphemism is favoured these days. But I draw the line at some of the off-topic junk I see on my competitors' sites.
-
Thanks for the great answer. Our site has minimal links compared to our competition, but our links are quality, relevant links. I was always frustrated trying to get good rankings for our site, but as you said: _your content, and your SEO basics, and keep up your principles!_ Well, after a year of sticking to our principles we have seen a dramatic increase in traffic and rankings. It's paying off. Thank you, Moz community.
-
Hi Chris,
Things like that sure can be frustrating. Since Penguin and Panda, I have seen many users here question the actual benefit to them, as their industries are still riddled with black-hat SEO tricks.
But in some cases Google can't really penalize a site simply for having 30 or 40 of these crappy links when it has 300-400 quality links. The big issue for Google is that many of these website owners did only what the SEO industry told them to do, which was to get as many links as possible. I think Google did a pretty fair job of penalizing the biggest link rings and farms, but there are still many little guys out there operating. I would bet that, in time, many more sites are going to get penalized.
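(To make that ratio concrete: 30-40 crappy links out of a profile that also has 300-400 quality links is only around a tenth of the total. A toy sketch of that back-of-the-envelope check, using made-up numbers and a hypothetical "flagged domain" list — not any real Google signal:)

```python
def spam_share(links, flagged_domains):
    """Fraction of a link profile that comes from flagged domains."""
    if not links:
        return 0.0
    flagged = [link for link in links if link in flagged_domains]
    return len(flagged) / len(links)

# Hypothetical profile: 40 links from flagged domains, 360 from clean ones.
links = ["spam.example"] * 40 + ["clean.example"] * 360
share = spam_share(links, {"spam.example"})
print(f"{share:.0%} of links are from flagged domains")  # 10% of links are from flagged domains
```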
Remember your users, your content, and your SEO basics, and keep up your principles!
-
In my view, working on quality and aiming to earn links from the really big fish in the industry will eventually pay off.
Yes, there are still a high number of websites that rank well with those links, but the time will come when they regret it.
So don't worry: stick to your principles and just keep up the good work.
Greetings,
Istvan