Hey there, I'm working on search results in Dutch.
-
My biggest competitor, who's number 1 in Google for my main keywords, has almost only links from link farms and blog comments. How is he ranked that high? Would it be a good idea to add a few of the best of these to my mix while I work on really high-quality content?
-
As long as those guest posts are well-placed on respected, topical sites, that could help. You just have to make sure your contributions to those sites are genuinely relevant to their audiences.
-
Welcome to Dutch link building. Overall, link quality here is way lower than the average in other countries, and in some cases low-quality links are still treated as high quality. We've still got some very popular link farms and directories that in any other country would go straight into your disavow file.
-
Could be! It's just that when I use Moz to discover their best links, some high-quality 'link pages' come up. They do have a couple of other pretty good links, so that could be it, I guess. Would doing 2-3 guest posts be okay?
-
Thank you, I did miss that!
-
Hi Wouter,
I would say that you don't want to be following what they are doing. There might be other reasons why they are doing so well, but without seeing the site, it is almost impossible to say why.
Perhaps their content is considered exceptionally good, perhaps they have a few non-spammy links that are really high quality, or perhaps they have earned more trust from Google.
Remember that Google will ignore a lot of the spammy links from directories and blog comments. Look past the links if you want to see what is going on, as there may be something else at play.
-Andy
-
Hello, my friend.
I have noticed the same thing happening to our website and our clients' websites. As you said, we see lots of bad/spammy links to our competitors, and they rank high (though not always higher). I asked this question here: https://moz.com/community/q/spammy-backlinks-are-working
After reading all that, applying some common sense, and holding out a bit of hope for the intelligence of Google's updates, I just didn't have the guts to risk the rankings we've achieved so far and the reputation of the domain.
So, as the responses in that discussion say, if you're willing to see your website get messed up in case everything goes south, you're more than welcome. Just write a case study/research piece afterwards for curious minds like me.
Related Questions
-
How do internal search results get indexed by Google?
Hi all,
Most of the URLs created by a website or web shop's internal search function shouldn't be indexed, since they create duplicate content and waste crawl budget. The standard approach is to 'noindex, follow' these pages, or sometimes to use robots.txt to disallow crawling of them.
The first question I have is how these pages would actually get indexed in the first place if you didn't use one of the options above. Crawlers follow links to index a website's pages. If a random visitor comes to your site and uses the search function, this creates a URL, but there are no links leading to that URL: it is not in a sitemap and it can't be reached by navigating the website. So how can search engines index URLs generated by an internal search function?
Second question: let's say somebody embeds a link on their website pointing to a URL on your website that was created by an internal search, and assume you used robots.txt to make sure these URLs weren't indexed. That means Google won't even crawl those pages. Is it possible that the link used on the other website will eventually show an empty page, since Google doesn't crawl it?
Thanks for your thoughts, guys.
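The robots.txt option discussed above can be sanity-checked locally before deploying it. A minimal sketch using Python's standard library — the `/search` path and `example.com` domain are illustrative stand-ins, not taken from the question:

```python
from urllib import robotparser

# A robots.txt rule blocking crawling of internal-search URLs.
# "Disallow: /search" matches any path starting with /search,
# including /search?q=... query-string variants.
ROBOTS_TXT = """\
User-agent: *
Disallow: /search
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Internal-search URL: blocked from crawling.
print(rp.can_fetch("*", "https://example.com/search?q=shoes"))   # False
# Normal product URL: crawlable.
print(rp.can_fetch("*", "https://example.com/products/shoes"))   # True
```

Note that disallowing crawling is not the same as preventing indexing: a disallowed URL can still be indexed (without content) if it is linked from elsewhere, which is exactly the scenario the second question describes.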
Recent changes to Google organic search?
We have a client's website that was on page 1 for 2 years and then fell off in September, while a new website with virtually no visitors, which had never shown in organic search before, shot to #1. I've never seen anything like it. Today, my client is back where they were two weeks ago, and the #1 listing I mentioned is not on page 1 at all; in fact, it's at the bottom of page 2! Having read about the Google organic changes made around July 3rd, 2018, it seems to us that even more emphasis is now on the domain name (all the results on page 1 have my client's keyword in their domain name) and that the importance of H1 tags and title tags has risen to trump many other factors. Can anyone shed some light on changes you may have seen in the past few months? Along with huge changes to AdWords and Ad Grants, Google seems all over the place (at least to us), and it is more challenging than ever. Thanks!!
I'm in Canada and building a website for the US...approach?
Hi there - we already have a Canadian website for the company, and we're building one for our American branch. From an SEO perspective, what is the best approach here? We have already purchased a .com domain, and the company is branded a little differently in the US than in Canada. How do I tell Google that this site is American and should be served primarily to an American audience? Should I be tagging duplicate content with rel=canonical (for similar pages like the About Us section, for instance), or does that not matter here? Hope you guys can help. Thanks!
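One concrete mechanism for the "tell Google this site is American" part is hreflang annotations linking each Canadian page to its US counterpart and vice versa. A minimal sketch of generating those tags — the domains and URLs are hypothetical, not the poster's:

```python
# Hypothetical URL pairs for one page. Each language/region version
# should list itself and every alternate version in its <head>.
ALTERNATES = {
    "en-ca": "https://example.ca/about",
    "en-us": "https://example.com/about",
}

def hreflang_tags(alternates):
    """Build the <link rel="alternate" hreflang> tags to place in the
    <head> of every version listed in `alternates`."""
    return [
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in sorted(alternates.items())
    ]

for tag in hreflang_tags(ALTERNATES):
    print(tag)
```

The full set, including the self-reference, goes on both the .ca and .com versions of the page; hreflang tells Google the pages are regional alternates rather than duplicates, which also addresses the rel=canonical question for near-identical sections.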
Why isn't the rel=canonical tag working?
My client and I have a problem: an ecommerce store with around 20 000 products has nearly 1 000 000 pages indexed (according to Search Console). I frequently get notified by messages saying "High number of URLs found" in Search Console. It lists a lot of sample URLs with filters and parameters that are indexed by Google, for example: https://www.gsport.no/barn-junior/tilbehor/hansker-votter/junior?stoerrelse-324=10-11-aar+10-aar+6-aar+12-aar+4-5-aar+8-9-aar&egenskaper-368=vindtett+vanntett&type-365=hansker&bruksomraade-367=fritid+alpint&dir=asc&order=name If you check the source code, there's a canonical tag telling the crawler to regard this exact page as another version of the page without all the parameters (everything after the "?"). Does this URL showing up in the Search Console message mean that the canonical isn't working properly? If so, what's wrong with it? Regards,
Sigurd
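One quick check is to extract the canonical from the page source and compare it against the parameterized URL with its query string stripped. A sketch using only Python's standard library — the HTML below is a made-up stand-in, not the real gsport.no source:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Grab the href of the first <link rel="canonical"> tag."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        if tag == "link" and self.canonical is None:
            a = dict(attrs)
            if a.get("rel") == "canonical":
                self.canonical = a.get("href")

page_url = ("https://www.gsport.no/barn-junior/tilbehor/hansker-votter/junior"
            "?stoerrelse-324=10-11-aar&dir=asc&order=name")
html = ('<head><link rel="canonical" '
        'href="https://www.gsport.no/barn-junior/tilbehor/hansker-votter/junior"/>'
        '</head>')

finder = CanonicalFinder()
finder.feed(html)
# True when the canonical points at the URL minus everything after "?"
print(finder.canonical == page_url.split("?")[0])
```

Worth keeping in mind: rel=canonical is a hint, not a directive, and the "High number of URLs found" message is about crawl discovery; a canonical does not stop Google from crawling the parameterized URLs, so the message can appear even when the tag is working.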
Establishing if links are 'nofollow'
Wonder if any of you guys can tell me whether there is any other way to tell Google links are nofollow besides the HTML (i.e. can you tell Google to nofollow every link in a subdomain, or something like that)? I'm trying to establish whether a couple of links on a very high-ranking site are passing me PageRank without asking them directly and looking silly! Within the source code for the page, they are NOT tagged as nofollow at present. Hope that all makes sense 😉
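On the "other than in the HTML" part: a page can also apply nofollow to all of its links at once via `<meta name="robots" content="nofollow">` or an `X-Robots-Tag: nofollow` HTTP response header, so those are worth checking alongside the individual anchors. For the per-link check, a quick audit sketch using Python's standard library — the sample HTML is made up for illustration:

```python
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    """Collect every <a href> together with whether it carries rel="nofollow"."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            a = dict(attrs)
            if "href" in a:
                # rel may hold several space-separated tokens, e.g. "noopener nofollow"
                rel = (a.get("rel") or "").lower().split()
                self.links.append((a["href"], "nofollow" in rel))

page = '<a href="/a" rel="nofollow">x</a><a href="/b">y</a>'
audit = LinkAudit()
audit.feed(page)
print(audit.links)  # [('/a', True), ('/b', False)]
```

Remember to also check the page's HTTP response headers and its robots meta tag, since either can override what the individual anchors suggest.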
Rich Snippets not appearing in Search Results
Hi Everyone, Just a few questions on rich snippets, please. We have now integrated microdata (data-vocabulary.org) on all our product pages, like http://www.homeshop18.com/samsung-galaxy-tab-2-310-tablet/computer-peripherals/ipads-tablets/product:30409470/cid:8937/, which I have tested in the rich snippets testing tool Google provides. Everything is working fine and rendering properly in the tool, but the snippets are not appearing in Google search results. Are we doing everything right, or is there an issue in our implementation? How long does it usually take for rich snippets to appear in Google's organic search results?
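Alongside Google's testing tool, it can help to confirm the markup actually exposes the item types and properties you expect. A rough sketch with Python's standard library — the snippet below is a made-up stand-in for the product page, and the property values are hypothetical:

```python
from html.parser import HTMLParser

class MicrodataCheck(HTMLParser):
    """List the itemtype declarations and itemprop names found in a page."""
    def __init__(self):
        super().__init__()
        self.itemtypes = []
        self.itemprops = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if "itemscope" in a:              # boolean attribute: present means scoped
            self.itemtypes.append(a.get("itemtype"))
        if "itemprop" in a:
            self.itemprops.append(a["itemprop"])

snippet = (
    '<div itemscope itemtype="http://data-vocabulary.org/Product">'
    '<span itemprop="name">Samsung Galaxy Tab 2</span>'
    '<span itemprop="price">19999</span>'
    '</div>'
)
check = MicrodataCheck()
check.feed(snippet)
print(check.itemtypes)  # ['http://data-vocabulary.org/Product']
print(check.itemprops)  # ['name', 'price']
```

If the expected properties show up both here and in Google's tool, the remaining variables are on Google's side: rich snippets are not guaranteed to display even for valid markup, and they typically take from days to weeks to appear after a re-crawl.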
Local results vs Normal results
Hi everyone, I am currently working on the website of a friend who owns a French spa treatment company. I have been working on it for the past 6 months, mostly on optimizing the page titles and on link building. So far the results are great in terms of normal results: if you type most of the keywords plus the city name, the website is very well positioned, if not top positioned. My only problem is that in the local results (Google Maps), nothing has improved at all. For most of the keywords where the website ranks 1st in normal results, it doesn't appear at all in local results. This is confusing: you would think Google considers the website relevant to the subject, judging by the normal results, yet it doesn't show it in a local context. The website is clearly located in the city (thanks to the page titles, and there's a Google Map on a specific page dedicated to its location). The company has a Google Places page and has had positive customer reviews on different trusted websites for more than a year now (the website is 2 years old). I have focused my link building on local websites (directories and specialized sites) for the past 2 months. The results kept improving in normal results, but still no improvement at all in local ones. As far as I know, there are no mistakes such as multiple addresses for the same business, etc. Everything seems to be done by the rules. I am not sure at all what more I can do. The competitors do not seem to be working on their SEO much, and in terms of links (according to the -pretty good- SEOmoz tools), they have up to 10 times fewer (good) links than us. Maybe you guys have some advice on how I can manage this situation? I'm kind of lost here 😞 Thanks a lot for your help, appreciate it. Cheers,
Raphael
How do Google Site Search pages rank
We have started using Google Site Search (via an XML feed from Google) to power our search engine. So we have a whole load of pages we could link to of the format /search?q=keyword, and we are considering doing away with our more traditional category listing pages (e.g. /biology, not powered by GSS), which account for much of our current natural-search landing pages. My question is: would Googlebot treat these search pages any differently? My fear is that it would somehow see them as duplicate search results and downgrade their links. However, since we are converting the XML from GSS into our own HTML format, it may not even be able to tell.