Can hidden backlinks ever be ok?
-
Hi all,
I'm very new to SEO and still learning a lot.
Is it considered a black hat tactic to wrap a link in a DIV tag, with display set to none (hidden div), and what can the repercussions be?
From what I've learned so far, this is a very unethical thing to be doing, and the site hosting these links can end up being removed from the Google/Bing/etc. indexes completely. Is this true?
The site hosting these links is a group/parent site for a brand, and each hidden link points to one of the child sites (similar sites, but different companies in different areas).
Thanks in advance!
-
Hi Ryan,
Thanks for the quick feedback.
This clears things up for me a bit.
Thanks,
Stephen
-
The separation between black hat and white hat tactics is generally a clear line. The simple question is, does the code exist for the benefit of your site's visitors or solely to manipulate search engines?
DIV tags are used to group content so that CSS rules can be applied to specific parts of a page. If you have a link contained in a DIV with its display set to none, that link will clearly never be seen by the site's visitors. It is apparent the link exists solely to manipulate search engine results, and it is therefore a black hat tactic.
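To make the mechanics concrete, here is a minimal sketch (Python, standard library only) of how a crawler-style audit could flag links nested inside inline-hidden elements. This is an illustration, not how Google actually works: real engines render pages and evaluate external stylesheets, computed styles, off-screen positioning, and more.

```python
from html.parser import HTMLParser

class HiddenLinkFinder(HTMLParser):
    """Flag <a href> tags nested inside elements hidden via inline CSS.

    Simplified illustration: it only inspects inline style attributes,
    not external stylesheets, classes, or computed styles.
    """

    def __init__(self):
        super().__init__()
        self.stack = []         # (tag, was_hidden) for each open element
        self.hidden_depth = 0   # how many hidden ancestors are currently open
        self.hidden_links = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        style = attrs.get("style", "").replace(" ", "").lower()
        hidden = "display:none" in style or "visibility:hidden" in style
        self.stack.append((tag, hidden))
        if hidden:
            self.hidden_depth += 1
        if tag == "a" and self.hidden_depth and attrs.get("href"):
            self.hidden_links.append(attrs["href"])

    def handle_endtag(self, tag):
        # Pop back to the matching open tag, tolerating simple mis-nesting.
        while self.stack:
            t, hidden = self.stack.pop()
            if hidden:
                self.hidden_depth -= 1
            if t == tag:
                break

finder = HiddenLinkFinder()
finder.feed('<div style="display: none">'
            '<a href="https://child-site.example/">Child site</a></div>'
            '<a href="/about">About us</a>')
print(finder.hidden_links)  # only the link inside the hidden div
```

The domain names are placeholders. The point of the sketch is how trivially detectable an inline `display: none` wrapper is; hiding the rule in an external stylesheet does not help, since engines render pages rather than just parse markup.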
When Google and other search engines discover black hat tactics being used on a site, they will take action. The action can be relatively minor, such as ignoring the link; mid-range, such as removing the page containing the link from the index; or, at the extreme end, removing the entire site from the index.
Each search engine has its own internal guidelines on how to handle these issues. Some issues are handled automatically via algorithms, while others are handled by manual review. There are no published standards on exactly which punishment will be handed out for a given violation. It is simply best to avoid anything black hat entirely.
Related Questions
-
Backlink quality vs quantity: Should I keep spammy backlinks?
Regarding backlinks, I'm wondering which is more advantageous for domain authority and Google reputation:
Option 1: More backlinks, including a lot of spammy links
Option 2: Fewer backlinks, but only reliable, non-spam links
I've researched this topic around the web a bit and understand that the answer is somewhere in the middle, but given my site's specific backlink volume, the answer might lean one way or the other. For context, my site has a spam score of 2%, and when I did a quick backlink audit, roughly 20% are ones I want to disavow. However, I don't want to eliminate so many backlinks that my DA goes down. As always, we are working to build quality backlinks, but I'm interested in whether eliminating 20% of backlinks will hurt my DA. Thank you!
Technical SEO | LianaLewis1
-
ADA, WCAG, Section 508 Accessibility and hidden text
I am working on fixing accessibility issues on a client's site, and they have contracted with a vendor who provides both tools to monitor the site and consulting to help us fix the issues that are found. When there are spatial relationships between elements on a page that would not be evident to someone listening via a screen reader, a strategy they recommended to us is to add text helpers that are not visible but are still read by screen readers. An example: Directions to our Fifth Avenue Store. I have seen this technique used on a major brand site, but I am concerned that their brand strength insulates them from a hidden text penalty far more than my client's brand would. Also, their implementation uses class names like "ada_hidden", which may help search engines understand the intent, or may not at all. I am looking for opinions regarding the use of this technique. Normally I wouldn't use it for risk of penalty, but here the intent is to improve the user experience of the pages. Anyone used similar techniques for ADA/WCAG, or solved the problem in a more SEO-friendly way? Thanks, Will
Technical SEO | WillW0
-
Can spiders crawl jQuery Fancy Box scripts
Hi Everyone - I'm not a technical person at all. I have some content that will be hidden until a user clicks "learn more," whereupon it will be displayed via a jQuery FancyBox script. The content behind the "learn more" JavaScript is important, and I need it to be crawled by search engine spiders. Does anyone know if there will be a problem with this script?
Technical SEO | Santaur0
-
What is the best way to remove and fight back against backlink spam?
Removing low-quality and spam backlinks: what is the most effective clean-up process?
Technical SEO | matti_wilson0
-
Can you do a 301 redirect without a hosting account?
Trying to retire domain1 and 301 it to domain2 - just don't want to get stuck having to pay the old hosting provider simply to serve a .htaccess file with the redirect rule.
Technical SEO | TitanDigital0
-
Why is there such a big discrepancy between OSE and GWT regarding the number of backlinks?
Hello, we have been doing some analysis of the backlink profiles for our sites and have been experiencing a massive discrepancy between the number of C-class linking domains reported in OSE and the information returned in Google Webmaster Tools. For a variety of sites, OSE is reporting fewer than 10 C-class linking domains while GWT shows more than 100 unique linking domains (we confirmed that the majority of these links are in different C classes). Is this simply a matter of the limited index size of OSE, or could there be another explanation? It is interesting that the links that do show up in OSE are nearly exclusively sites that we own. /T
Technical SEO | tomypro0
-
How can I prevent sh404SEF Anti-flood control from blocking SEOMoz?
I'm using sh404SEF on my Joomla 1.5 website. Last week, I activated the security functions of the tool, which include an anti-flood control feature. This morning when I looked at my new crawl statistics in SEOMoz, I noticed a significant drop in the number of webpages crawled, and I'm attributing that to the security configurations I made earlier in the week. I'm looking for a way to prevent this from happening so the next crawl is accurate. I was thinking of using sh404SEF's "UserAgent white list" feature. Does SEOMoz have a UserAgent string that I could try adding to my white list? Is this what you guys recommend as a solution to this problem?
Technical SEO | JBradySD0
-
On a dedicated server with multiple IP addresses, how can one address group be slow/time out and all other IP addresses OK?
We utilize a dedicated server to host roughly 60 sites. The server is with a company that utilizes a lady who drives race cars.... About 4 months ago, thanks to monitoring alerts, we realized we had a group of sites down and checked it out. All were on the same IP address, and the sites on the other IP addresses were still up and functioning well. When we contacted support, at first we were stonewalled, but eventually they said there was a problem, and it was resolved within about 2 hours. Up until recently we had no problems. As a part of our ongoing SEO, we check page load speed for our clients. A few days ago, a client whose site is hosted by the same company was running very slow (about 8 seconds to load without cache). We ran every check we could and could not find a reason on our end. The client called the host and was told they needed to be on some other type of server (with the host) at a fee increase of roughly $10 per month. Yesterday, we noticed one group of sites on our server was down and, again, it was one IP address with about 8 sites on it. On chat with support, they kept saying it was our ISP. (We speed tested on multiple computers and were 22 Mbps down and 9 Mbps up, ±2 Mbps.) We ran a trace on the IP address and it went through without a problem on three occasions over about ten minutes. After about 30 minutes the sites were back up. Here's the twist: we had a couple of people in the building who were on other ISPs try, and the sites came up and loaded on their machines. Does anyone have any idea as to what the issue is?
Technical SEO | RobertFisher0