Will a blog post about a collection of useful tools and web resources for a specific niche be seen as negative by Google for having too many links?
-
SEO newbie here. I'm thinking about creating a blog post about a collection of useful tools and web resources for my specific niche. It'd have 300 links or more, but with comments, and it would be categorized nicely. It'd be a useful resource for my target audience to bookmark and share.
Will Google see this as a negative? If so, what's the best way to do such a blog post?
Thanks
-
All really good answers. Looks like a visual, graphical resource guide will be the way to both provide value and avoid having too many links on the page.
Thanks, guys.
-
Of course, the post provides the same value to the reader if you just LIST the URLs and do NOT make them real live links...
The value is still passed on, eh!
-
Yunyi,
Definitely going to go with grobro here. 300 links is a heck of a lot to place on any single post, and might be seen as spam. If possible, try to break them down into categories and create separate pages that are directly relevant to a specific topic or set of industry tools.
Not only will this make your information more accessible to your readers and help you avoid search penalties, but it also makes for a much more natural link-building opportunity in the future once you begin marketing your content to relevant sites for those powerful backlinks. It also means that you will be able to market your content to a wider audience, improving brand recognition and growing your potential client base.
At the end of the day, user experience is what drives major search engines, and as much as I love to read and get new information, I think any article with more than 25-30 links would be information overload and would start to become irrelevant if the author weren't careful.
If nothing else, create categories according to your needs (alphabetical, best to worst, newest to oldest, by manufacturer or provider, by domain) and place them on your site to break the information up. This will organize the information and make it clear to your users where they want to go (and which pages they wish to bookmark).
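The splitting above can be sketched in a few lines. This is a minimal illustration, assuming the links are kept as simple (title, url, category) tuples; the tool names, URLs, and the per-page limit are made up for the example, not taken from the question:

```python
from collections import defaultdict

def split_into_category_pages(links, per_page_limit=30):
    """Group (title, url, category) tuples into one page per category,
    splitting any oversized category into numbered sub-pages so no
    single page exceeds the chosen link limit."""
    by_category = defaultdict(list)
    for title, url, category in links:
        by_category[category].append((title, url))

    pages = {}
    for category, items in sorted(by_category.items()):
        for i in range(0, len(items), per_page_limit):
            chunk = items[i:i + per_page_limit]
            # Only add a "-part-N" suffix when a category overflows.
            suffix = "" if len(items) <= per_page_limit else f"-part-{i // per_page_limit + 1}"
            pages[f"{category}{suffix}"] = chunk
    return pages

links = [
    ("Keyword Tool A", "https://example.com/a", "keyword-research"),
    ("Rank Tracker B", "https://example.com/b", "rank-tracking"),
    ("Keyword Tool C", "https://example.com/c", "keyword-research"),
]
pages = split_into_category_pages(links)
```

Each key in `pages` then corresponds to one themed page on the site, which keeps every page well under the link counts discussed above.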
Hope this helps and best of luck!
Rob
-
Hello Yunyi,
In my personal opinion, 300 outbound links on a single page is far too many. There is no exact rule for this, but even for highly authoritative sites I would say the maximum is around 80 outbound links. It also depends on how relevant and valuable they are for your visitors. If there is no way around it, I personally would nofollow them.
But do you think anyone would read a post about 300 tools? I would break it down to the 15 best tools, for example; otherwise your readers could be overwhelmed.
This is just my opinion; I hope it helps at least a little bit.
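Mechanically, nofollowing a link just means adding `rel="nofollow"` to the anchor tag. A minimal sketch of templating that in Python (the function name and example tool data are illustrative, not from any particular CMS):

```python
def render_link(title, url, nofollow=True):
    """Render one anchor tag, optionally marked rel="nofollow" so it
    signals search engines not to pass link equity through it."""
    rel = ' rel="nofollow"' if nofollow else ""
    return f'<a href="{url}"{rel}>{title}</a>'

nofollowed = render_link("Example Tool", "https://example.com/tool")
followed = render_link("Example Tool", "https://example.com/tool", nofollow=False)
```

In WordPress and most other platforms the same attribute can be added by hand in the post's HTML view, so no templating code is strictly required.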
Related Questions
-
I am using the Wix website creator. Will Google be able to read the JavaScript?
I tried using some of the Moz tools like the "on page grader" and it was not able to read any of the writing on my webpage, because Wix uses JavaScript. Will this impact my rankings on Google compared to my competitors? The new Wix websites allow you to build a website in HTML. Should I switch to this? Thanks, Jonathan
Technical SEO | H1_Marketing_Solutions
-
Using the Google Remove URL Tool to remove https pages
I have found a way to get a list of 'some' of my 180,000+ garbage URLs, and I'm going through the tedious task of using the URL removal tool to put them in one at a time. Between that, my robots.txt file, and the URL Parameters tool, I'm hoping to see some change each week. I have noticed that when I put URLs starting with https:// into the removal tool, it adds the http:// main URL at the front. For example, I add to the removal tool:
https://www.mydomain.com/blah.html?search_garbage_url_addition
On the confirmation page, the URL actually shows as:
http://www.mydomain.com/https://www.mydomain.com/blah.html?search_garbage_url_addition
I don't want to accidentally remove my main URL or cause problems. Is this the right way this should look?
AND PART 2 OF MY QUESTION
If you see the search description in Google for a page you want removed that says the following in the SERP results, should I still go to the trouble of putting in the removal request?
www.domain.com/url.html?xsearch_...
A description for this result is not available because of this site's robots.txt – learn more.
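If the removal tool treats whatever you paste as a path relative to the verified http:// property (which is what the doubled-up confirmation URL suggests), one workaround is to submit only the path-plus-query portion of each URL. A sketch of extracting that with Python's standard library, using the example URL from the question; worth verifying on a throwaway URL before running through the whole list:

```python
from urllib.parse import urlsplit

def removal_path(url):
    """Return just the path and query string of a URL -- the part to
    paste if the tool prefixes the property's root automatically."""
    parts = urlsplit(url)
    return parts.path + ("?" + parts.query if parts.query else "")

path = removal_path("https://www.mydomain.com/blah.html?search_garbage_url_addition")
```

Run over the full 180,000-line list, this also deduplicates the scheme/host differences so http:// and https:// variants of the same page collapse to one removal request.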
Technical SEO | sparrowdog
-
Want to move site to WordPress and keep links without using redirects
I have an old clunky site that has been around for about 5-6 years. It is on the Homestead platform. I want to move the site to a Thesis theme 2.1 WordPress platform without losing my links. I would prefer not to do 301 redirects. With Thesis I can specify the URL for each page of the WordPress site; however, the WordPress site is hosted on HostGator as a subdomain of another site, and the other problem is that WordPress adds a trailing slash that is not present on the old site. I can, however, add .html to the URLs for pages on the WordPress site to conform to the URLs on the old HTML site. Will this work? Thanks, Paul. P.S. The URL for my old site is www.affordable-uncontested-divorce.com
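One way to sanity-check the .html approach before migrating is to compare old and new paths programmatically, since even a single added trailing slash makes them different URLs to search engines. A minimal sketch (the URLs below are placeholders, not the real site's pages); note this only checks path parity, so if the WordPress install lives on a different domain or subdomain, the host also has to match for existing links to keep working:

```python
from urllib.parse import urlsplit

def paths_match(old_url, new_url):
    """True only when both URLs resolve to exactly the same path --
    a trailing slash or a missing .html counts as a mismatch."""
    return urlsplit(old_url).path == urlsplit(new_url).path

same = paths_match("http://www.example.com/divorce-forms.html",
                   "http://www.example.com/divorce-forms.html")
slash_added = paths_match("http://www.example.com/divorce-forms.html",
                          "http://www.example.com/divorce-forms.html/")
```

Running this over a crawl of the old site against the planned WordPress slugs flags every page that would silently change address.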
Technical SEO | diogenes
-
Google Webmaster Tools: sitemap submitted vs. indexed vs. Index Status
I'm having an odd error I'm trying to diagnose. Our Index Status is growing and is now up to 1,115. However, when I look at Sitemaps, we have 763 submitted but only 134 indexed. The submitted and indexed counts were virtually the same, around 750, until 15 days ago, when the indexed count dipped dramatically. Additionally, when I look under HTML Improvements I only find 3 duplicate pages, and I ran Screaming Frog on the site and got similar results: few duplicates. Our actual content should be around 950 pages, counting all the category pages. What's going on here?
Technical SEO | K-WINTER
-
Links to articles for news sites in Google SERPs
I'm trying to figure out why, when I search for "international news" or "world news", for example, some sites in the SERPs have links to news articles while others don't. For "international news", the results for Fox News and The New York Times have links to articles, while CNN (the top result) only has sitelinks. I would appreciate any theories on why this happens. Thanks.
Technical SEO | seoFan21
-
Suggested crawl rate in Google Webmaster Tools?
Hey Moz peeps, got a general question: what is the suggested custom crawl rate in Google Webmaster Tools? Or is it better to "Let Google determine my crawl rate (recommended)"? If you guys have any good suggestions on this, and can say why, that would be very helpful. Thanks, guys!
Technical SEO | david305
-
Will Google display the "@" Symbol in a SERP Title?
In our page titles, we'd like to include the "@" symbol. Will Google display that symbol in the search results if we include it in the page's title?
Technical SEO | sftravel
-
Very well established blog, new posts now being indexed very late
I have an established blog. We update it on a daily basis. In the past, when I would publish a new post, it would get indexed within a minute or so. But for a month or so now, it's been taking hours, sometimes 10-12 hours, for new posts to get indexed. The only thing I have changed is robots.txt. This is the current robots file:
User-agent: *
Disallow: /cgi-bin
Disallow: /wp-admin
Disallow: /wp-includes
Disallow: /wp-content/plugins
Disallow: /wp-content/cache
Disallow: /wp-content/themes
Disallow: /wp-login.php
Disallow: /*wp-login.php*
Disallow: /trackback
Disallow: /feed
Disallow: /comments
Disallow: /author
Disallow: /category
Disallow: */trackback
Disallow: */feed
Disallow: */comments
Disallow: /login/
Disallow: /wget/
Disallow: /httpd/
Disallow: /*.php$
Disallow: /*?*
Disallow: /*.js$
Disallow: /*.inc$
Disallow: /*.css$
Disallow: /*.gz$
Disallow: /*.wmv$
Disallow: /*.cgi$
Disallow: /*.xhtml$
Disallow: /*?*
Disallow: /*?
Allow: /wp-content/uploads
User-agent: TechnoratiBot/8.1
Disallow:
# ia_archiver
User-agent: ia_archiver
Disallow: /
# disable duggmirror
User-agent: duggmirror
Disallow: /
# allow google image bot to search all images
User-agent: Googlebot-Image
Disallow: /wp-includes/
Allow: /*
# allow adsense bot on entire site
User-agent: Mediapartners-Google*
Disallow:
Allow: /*
Sitemap: http://www.domainname.com/sitemap.xml.gz
The site has tons of backlinks. Just wondering if something is wrong with the robots file or if it could be something else.
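One way to sanity-check which URLs a rule set actually blocks is Python's standard-library robots.txt parser. A caveat: it only does plain path-prefix matching and does not implement the * and $ wildcard extensions Google supports, so wildcard lines like `Disallow: /*?*` have to be tested in Google's own robots.txt tester instead. A sketch using a cut-down, wildcard-free subset of the rules in the question (domainname.com is the placeholder domain from the question):

```python
from urllib.robotparser import RobotFileParser

# Wildcard-free subset of the rules above; the stdlib parser
# matches plain path prefixes only.
rules = """User-agent: *
Disallow: /wp-admin
Disallow: /feed
Disallow: /category
Allow: /wp-content/uploads
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# A category archive is caught by the /category prefix rule...
category_blocked = not parser.can_fetch("Googlebot", "http://www.domainname.com/category/seo/")
# ...while an ordinary post URL matches no rule and stays crawlable.
post_allowed = parser.can_fetch("Googlebot", "http://www.domainname.com/my-new-post/")
```

Checking a few freshly published post URLs this way (plus the wildcard lines in Google's tester) would quickly show whether any rule, such as the query-string blocks, is accidentally matching the new posts.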
Technical SEO | rookie123