Hey there, I'm working on search results in Dutch.
-
My biggest competitor, who's number 1 in Google for my main keywords, has almost nothing but links from link farms and blog comments. How is he ranked that high? Would it be a good idea to add a few of the best of these to my mix while I work on really good quality content?
-
As long as those guest posts are well placed on respected, topical sites, they could help. You just have to make sure your contributions are genuinely relevant to those sites' audiences.
-
Welcome to Dutch link building. Overall, link quality here is way lower than the average in other countries, and in some cases low-quality links still seem to be treated as high quality. We've still got some very popular link farms and directories that in any other country would go straight into your disavow file.
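For context, a disavow file is just a plain-text list you upload through Google Search Console's disavow tool. A minimal sketch, with placeholder domains, might look like this:

```text
# Link farms spotted in the backlink export
domain:spammy-linkfarm.example
domain:cheap-directory.example
# A single bad URL rather than a whole domain
http://some-blog.example/comments/page-12
```

Lines starting with `#` are comments; the `domain:` prefix disavows every link from that host, while a bare URL disavows only that specific page.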
-
Could be! It's just that when I use Moz to discover their best links, some high-quality 'link pages' come up. They do have a couple of other pretty good links, so that could be it, I guess. Would doing 2-3 guest posts be okay?
-
Thank you, I did miss that!
-
Hi Wouter,
I would say that you don't want to follow what they are doing. There might be other reasons why they are doing so well, but without seeing the site it's almost impossible to say why.
Perhaps their content is considered exceptionally good, perhaps they have a few non-spammy links that are really high quality, or perhaps they have earned more trust with Google.
Remember that Google will ignore a lot of the spammy links from directories and blog comments. Look past the links if you want to see what is going on, as there may be something else at work.
-Andy
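One practical way to "look past the links" is to take a backlink CSV export (from Moz Link Explorer, for example) and separate the likely link-farm noise from the handful of links that might actually be doing the work. A minimal sketch is below; the column names ("Source URL", "Spam Score", "Domain Authority") and the thresholds are assumptions, so adjust them to match whatever your export actually contains.

```python
import csv
import io

# Stand-in for a real backlink CSV export; the column names here
# ("Source URL", "Spam Score", "Domain Authority") are assumptions.
SAMPLE = """Source URL,Spam Score,Domain Authority
https://goodblog.example/post,2,45
https://linkfarm.example/dir,68,8
https://news.example/article,5,60
"""

def split_links(rows, spam_threshold=30, da_floor=20):
    """Split backlink rows into (likely_real, likely_spam) lists.

    A row counts as spam when its Spam Score is at or above the
    threshold, or its Domain Authority is below the floor.
    """
    real, spam = [], []
    for row in rows:
        spam_score = int(row.get("Spam Score", 0) or 0)
        authority = int(row.get("Domain Authority", 0) or 0)
        if spam_score >= spam_threshold or authority < da_floor:
            spam.append(row)
        else:
            real.append(row)
    return real, spam

rows = list(csv.DictReader(io.StringIO(SAMPLE)))
real, spam = split_links(rows)
print(f"{len(real)} links look legitimate, {len(spam)} look spammy")
for row in real:
    print(row["Source URL"], "DA:", row["Domain Authority"])
```

Running this against a competitor's full export and then reading only the "legitimate" bucket is usually a faster route to Andy's point: a site propped up by two or three genuinely strong links can look, at first glance, like it ranks on link-farm spam.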
-
Hello, my friend.
I have noticed the same thing happening to our website and our clients' websites. As you said, we see lots of bad/spammy links pointing to our competitors, and they rank high (though not always higher). I asked this question here: https://moz.com/community/q/spammy-backlinks-are-working
After reading all that, plus using common sense and a bit of hope for the intelligence of Google's updates, I just didn't have the guts to risk the rankings we've achieved so far and the reputation of the domain.
So, as the responses in that discussion say, if you're willing to see your website get messed up in case everything goes south, you're more than welcome. Just write a case study afterward for curious minds like me.