Partners and Customers logo listing and links
-
We have just created a program where we list the customers that use our software, along with links to their websites, on a new "Customers" page. We expect to have upwards of 100 logos with links back to their sites. I want to be sure this isn't bordering on gray or black hat link building. I think it is okay since they are actual users of our software, but there is still that slight doubt.
Along these same lines, would you recommend adding a nofollow or noindex tag?
Thanks for your help.
-
Yes, but we need to remember that the 100-link limit is very arbitrary. Some sites can have fewer than 100 links on a page while others can have thousands and still be crawled. It really comes down to the authority of the website.
-
Indeed. Please mark the question as answered if you are satisfied at this point.

-
Thank you everyone. One less thing for me to worry about in this crazy world of SEO.

-
I wouldn't worry about adding the links and logos. I'd be happy to link to all my customers (dofollow links) to show my appreciation.
-
I would at least nofollow all the links; 100 external links is getting up there. In the crawl diagnostics on SEOmoz, if a page has over 100 links it gets a warning letting you know that there are too many links on the page.
-
I would probably not include all 100 customers. Maybe a sampling of your customer base?
You could also nofollow the external links to any sites that look questionable, or noindex,nofollow the page itself. All are options.
In fact, you could just put all 100 links on the page if you want; I highly doubt you will incur any kind of penalty. Matt Cutts has dispelled the myth that you can only have 100 links on a page: you can have as many as you want, as long as it makes sense and doesn't look like you're spamming search engines with link farms. I probably wouldn't worry too much about it. The markup for both options is sketched below.
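A minimal sketch of that markup, assuming a plain HTML page; the customer domain and file paths are placeholders:

```html
<!-- Option 1: keep the logo link but mark it nofollow
     (customer-example.com and the paths are placeholders) -->
<a href="https://customer-example.com" rel="nofollow">
  <img src="/logos/customer-example.png" alt="Customer Example logo">
</a>

<!-- Option 2: leave the links followed but keep the whole
     Customers page out of the index; this goes in the <head> -->
<meta name="robots" content="noindex, nofollow">
```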
Related Questions
-
Sitewide nav linking from subdomain to main domain
I'm working on a site that was heavily impacted by the September core update. You can see in the attached image the overall downturn in organic traffic in 2019, with a larger hit in September bringing Google organic traffic down around 50%. There are many concerning incoming links, ranging from 50-100 obviously spammy porn-related websites to just plain old unnatural links. There was no effort to purchase any links, so it's unclear how these were created. There are also thousands of incoming external links (most without nofollow and with similar or identical anchor text) from yellowpages.com. I'm trying to get this fixed with them and have added it to the disavow in the meantime (see the sketch below). I'm focusing on internal links as well, with a more specific question: if I have a sitewide header on a blog located at blog.domain.com that links to various sections on domain.com without nofollow tags, is this a possible source of the traffic drops and algorithm impact? The header with these links is on every page of the blog on the previously mentioned subdomain. **More generally, any advice as to how to turn this around?** The website is in the travel vertical.
White Hat / Black Hat SEO | | ShawnW0 -
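For reference, a minimal sketch of the disavow file mentioned above (the domains are placeholders; the real file is uploaded through Google's Disavow Links tool):

```
# Disavow every link from an entire spammy domain:
domain:spammy-example.com
# Or disavow links from one specific page:
http://spammy-example.com/unnatural-links-page.html
```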
Is this campaign of spammy links to non-existent pages damaging my site?
My site is built in Wordpress. Somebody has built spammy pharma links to hundreds of non-existent pages. I don't know whether this was inspired by malice or an attempt to inject spammy content. Many of the non-existent pages have the suffix .pptx. These now all return 403s. Example: https://www.101holidays.co.uk/tazalis-10mg.pptx A smaller number of spammy links point to regular non-existent URLs (not ending in .pptx). These are given 302s by Wordpress to my homepage. I've disavowed all domains linking to these URLs. I have not had a manual action or seen a dramatic fall in Google rankings or traffic. The campaign of spammy links appears to be historical and not ongoing. Questions: 1. Do you think these links could be damaging search performance? If so, what can be done? Disavowing each linking domain would be a huge task. 2. Is 403 the best response? Would 404 be better? 3. Any other thoughts or suggestions? Thank you for taking the time to read and consider this question. Mark
White Hat / Black Hat SEO | | MarkHodson0 -
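On question 2 above, one hedged option, assuming an Apache server, is to answer the spammed .pptx URLs with 410 Gone rather than 403, since 410 signals to crawlers that the URL is deliberately dead. A minimal .htaccess sketch (the pattern is illustrative):

```apache
# Return 410 Gone for any URL ending in .pptx (mod_alias)
RedirectMatch gone "\.pptx$"
```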
Pinging Links
Interested to know if anybody still uses the strategy of pinging links to make sure they get indexed; there are a number of sites out there which offer it. Is it considered dangerous/spammy?
White Hat / Black Hat SEO | | seoman100 -
Is it Okay to Nofollow all External Links
So, we all "nofollow" most or all external links to hold back PageRank. Is that correct? As per Google, only untrusted and paid links need to be nofollowed. Is that still the standard advice for external links and nofollow now?
White Hat / Black Hat SEO | | vtmoz0 -
Dealing with links to your domain that the previous owner set up
Hey everyone, I rebranded my company at the end of last year from a name that was fairly unique but sounded like I cleaned headstones instead of building websites. I opted for a name that I liked and that reflected my heritage; however, it also seems to be quite common. Anyway, I registered the domain name as it was available, the previous owner's company having been wound up. It's only been in the last week or two that I've managed to get a website on that domain, and I've been tracking its progress through Moz, Google & Bing Webmaster tools. Both webmaster tools are reporting that my site triggers 404 errors for some specific links. However, I don't have, and have never used, those links. I think the previous owner might have created them before he went bust. My question is in two parts. The first part is how do I find out which websites are linking to me with these broken URLs, and the second is will these 404'ing links affect my SEO? Thanks!
White Hat / Black Hat SEO | | mickburkesnr0 -
Hiding content or links in responsive design
Hi, I found a lot of information about responsive design and SEO, mostly theories and no real experiments, and I'd like to find a clear answer if someone has tested this. Google says: "Sites that use responsive web design, i.e. sites that serve all devices on the same set of URLs, with each URL serving the same HTML to all devices and using just CSS to change how the page is rendered on the device" (https://developers.google.com/webmasters/smartphone-sites/details). For usability reasons you sometimes need to hide content or links completely (not accessible at all by the visitor) on your page at small resolutions (mobile) using CSS ("visibility:hidden" or "display:none"). Is this counted as hidden content that could penalize your site, or not? What do you guys do when you create responsive design websites? Thanks! GaB
White Hat / Black Hat SEO | | NurunMTL0 -
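For concreteness, a minimal sketch of the pattern being asked about; the `.sidebar-promo` class and the breakpoint are placeholders. The element stays in the HTML that Google crawls, and only its rendering changes per device:

```css
/* Hide an element at mobile resolutions; it remains in the HTML source */
@media (max-width: 767px) {
  .sidebar-promo {
    display: none;
  }
}
```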
Should I Do a Social Bookmarking Campaign and a Tier 2 Linking?
I don't see anything bad in manually creating links on different (about 50) social bookmarking services. Is this method labeled as white hat? I was also wondering if it would be fine to create tier 2 links (probably blog comments) to get the social bookmarking links indexed. Please share your thoughts on the topic.
White Hat / Black Hat SEO | | zorsto0 -
Deny visitors by referrer in .htaccess to clean up spammy links?
I want to lead off by saying that I do not recommend trying this. My gut tells me that this is a bad idea, but I want to start a conversation about why. Since Penguin a few weeks ago, one of the most common topics of conversation in almost every SEO/webmaster forum is "how to remove spammy links". As Ryan Kent pointed out, it is almost impossible to remove all of these links, as these webmasters and previous link builders rarely respond. This is particularly concerning given that he also points out that Google is very adamant that ALL of these links be removed. After a handful of sleepless nights and some research, I found out that you can block traffic from specific referring sites using your .htaccess file (see the sketch below). My thinking is that by blocking traffic from the domains with the spammy links, you could prevent Google from crawling from those sites to yours, thus indicating that you do not want to take credit for the link. I think there are two parts to the conversation... Would this work? Google would still see the link on the offending domain, but by blocking that domain are you preventing any strength or penalty associated with that domain from impacting your site? If for whatever reason this would not work, would a tweak in the algorithm by Google to allow this practice be beneficial to both Google and the SEO community? This would certainly save those of us tasked with cleaning up previous work by shoddy link builders a lot of time and allow us to focus on what Google wants in creating high quality sites. Thoughts?
White Hat / Black Hat SEO | | highlyrelevant0
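For discussion's sake, a minimal sketch of the referrer-blocking idea described above, assuming Apache with mod_setenvif (the domain is a placeholder). Worth noting: this blocks human visitors who click through from the spammy site, while Googlebot generally does not send a Referer header when crawling, so it would still fetch your pages and see the links.

```apache
# Flag requests whose Referer header matches the spammy domain (placeholder)
SetEnvIfNoCase Referer "spammy-example\.com" bad_referrer

# Deny flagged requests (classic Apache 2.2-style access control)
Order Allow,Deny
Allow from all
Deny from env=bad_referrer
```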