Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Can hidden backlinks ever be ok?
-
Hi all,
I'm very new to SEO and still learning a lot.
Is it considered a black hat tactic to wrap a link in a DIV tag, with display set to none (hidden div), and what can the repercussions be?
From what I've learnt so far, this is a very unethical thing to be doing, and the site hosting these links can end up being removed from the Google/Bing/etc. indexes completely. Is this true?
The site hosting these links is a group/parent site for a brand, and each hidden link points to one of the child sites (similar sites, but different companies in different areas).
Thanks in advance!
-
Hi Ryan,
Thanks for the quick feedback.
This clears things up for me a bit. Thanks,
Stephen -
The separation between black hat and white hat tactics is generally a clear line. The simple question is: does the code exist for the benefit of your site's visitors, or solely to manipulate search engines?
DIV tags are used to group content so that CSS rules can be applied to it. If you have a link contained in a DIV whose display property is set to none, that link will never be seen by the site's visitors. It is apparent the link exists solely to manipulate search engine results, and it is therefore a black hat tactic.
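To make the pattern concrete, here is a minimal sketch (using only Python's standard library; the URLs and class name are hypothetical, purely for illustration) of how a crawler might flag links that sit inside a `display:none` container:

```python
from html.parser import HTMLParser

# HTML "void" elements that never get a closing tag
VOID_TAGS = {"br", "img", "hr", "meta", "link", "input",
             "area", "base", "col", "embed", "source", "track", "wbr"}

class HiddenLinkFinder(HTMLParser):
    """Collect href values of <a> tags nested inside display:none elements."""

    def __init__(self):
        super().__init__()
        self.stack = []          # one bool per open element: is it hidden?
        self.hidden_links = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        style = attrs.get("style", "").replace(" ", "").lower()
        hidden = "display:none" in style
        if tag == "a" and "href" in attrs and (hidden or any(self.stack)):
            self.hidden_links.append(attrs["href"])
        if tag not in VOID_TAGS:
            self.stack.append(hidden)

    def handle_endtag(self, tag):
        if tag not in VOID_TAGS and self.stack:
            self.stack.pop()

# The kind of markup described in the question (hypothetical URLs):
page = ('<div style="display:none"><a href="https://child-site.example">child</a></div>'
        '<a href="https://visible.example">visible link</a>')
finder = HiddenLinkFinder()
finder.feed(page)
print(finder.hidden_links)  # only the link inside the hidden DIV is reported
```

This only catches inline `style` attributes; a real crawler would also resolve stylesheets and computed styles, which is exactly why relying on the trick being "invisible" to detection is a bad bet.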
When Google and other search engines discover black hat tactics being used on a site, they will take action. The action can be relatively minor, such as ignoring the link; mid-range, such as removing the page containing the link from the index; or, at the extreme end, removing the entire site from the index.
Each search engine has its own internal guidelines on how to handle these issues. Some issues are handled automatically via algorithms, while others are handled by manual review. There are no published standards on exactly which punishments will be handed out for a given violation. It is simply best to completely avoid anything black hat.
Related Questions
-
What is the difference between 301 redirects and backlinks?
I have seen some 301 redirects on my site billsonline. Can anyone please explain the difference between backlinks and 301 redirects? I have read some articles where the writer stated that 301s are not good for a website.
Technical SEO | May 9, 2024, 5:14 PM | aliho0 -
What can I do to stop ranking for a keyword that has nothing to do with the companies website?
A website that we maintain keeps ranking for the keyword 'homeless shelter'. The company is UTILIS USA and they produce heavy duty shelters for military personnel. They have nothing to do with homeless shelters but continue to receive traffic concerning the phrase.
Technical SEO | May 20, 2014, 10:05 PM | ReviveMedia0 -
Can hotlinking images from multiple sites be bad for SEO?
Hi, There's a very similar question already being discussed here, but it deals with hotlinking from a single site that is owned by the same person. I'm interested whether hotlinking images from multiple sites can be bad for SEO. The issue is that one of our bloggers has been hotlinking all the images he uses, sometimes there are 3 or 4 images per blog from different domains. We know that hotlinking is frowned upon, but can it affect us in the SERPs? Thanks, James
Technical SEO | Mar 8, 2013, 12:41 PM | OptiBacUK0 -
Can too many pages hurt crawling and ranking?
Hi, I work for the local yellow pages in Belgium. Over the last months we introduced a successful technique to boost SEO traffic: we have created over 150k new pages, all targeting specific keywords and all containing unique content, with a site architecture that enables Google to find these pages through crawling, XML sitemaps, and so on. All signs (traffic, indexation of XML sitemaps, rankings, ...) are positive. So far so good. We are able to quickly build more unique pages, and I wonder how Google will react to this type of "large scale operation": can it hurt crawling and ranking if Google notices big volumes of (unique) content? Please advise.
Technical SEO | Jan 7, 2013, 4:16 PM | TruvoDirectories0 -
Can dynamically translated pages hurt a site?
Hi all, looking for some insight please. I have a site we have worked very hard on to get ranked well, and it is doing well in search. The site has about 1,000 pages and climbing, and about 50 of those are translated pages: static pages with unique URLs. I have had no problems with duplicate content or that sort of thing, and all pages were manually translated, so no translation issues. We have been looking at software that can dynamically translate the complete site into a handful of languages, let's say about 5. My problem is that these pages get produced dynamically, and I have concerns that Google will take issue with this, as well as with the huge sudden influx of new URLs; we could be looking at an increase of 5,000 new URLs, which usually triggers an alarm. My feeling is that it could risk the stability of the site we have worked so hard for, and that maybe we should just stick with the already-translated static pages. I am sure the process could be fine, but I fear a manual inspection and a slap on the wrist for having dynamically created content, or at least a review trigger period. These days it is hard to know what could get you in "trouble", and my gut says keep it simple, leave it as is, and don't shake it up. Am I being overly concerned? I would love to hear from others who have tried similar changes, and also from those who have not due to similar "fear". Thanks
Technical SEO | Oct 4, 2012, 1:36 PM | nomad-2023230 -
Can you mark up a page using Schema.org and Facebook Open Graph?
Is it possible to use both Schema.org and Facebook Open Graph for structured data markup? On the Google Webmaster Central blog, they say, "you should avoid mixing the formats together on the same web page, as this can confuse our parsers." Source - http://googlewebmastercentral.blogspot.com/2011/06/introducing-schemaorg-search-engines.html
Technical SEO | Jan 9, 2013, 6:36 PM | SAMarketing1 -
Thoughts about stub pages - 200 & noindex ok, or 404?
With large database/template-driven websites it is often possible to end up with a lot of pages with no content on them. What are the current thoughts on these pages with no content? Options: return a 200 header code with a noindex meta tag; return a 404 page and header code; something else? Thanks
Technical SEO | Jul 24, 2012, 7:47 PM | slingshot0 -
Is there such thing as a good text/code ratio? Can it effect SERPs?
As it says on the tin; Is there such thing as a good text/code ratio? And can it effect SERPs? I'm currently looking at a 20% ratio whereas some competitors are closer to 40%+. Best regards,
Technical SEO | Nov 1, 2011, 10:28 AM | ARMofficial
Sam.0
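The "text/code ratio" that question refers to is simply the share of a page's source that is visible text rather than markup. As a rough illustration (a minimal sketch using only Python's standard library; real auditing tools differ in exactly what they count), it could be computed like this:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Accumulate the visible text of a page, skipping <script>/<style> bodies."""

    def __init__(self):
        super().__init__()
        self.parts = []
        self.skip = 0  # depth inside <script>/<style>

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip:
            self.skip -= 1

    def handle_data(self, data):
        if not self.skip:
            self.parts.append(data)

def text_code_ratio(html: str) -> float:
    """Length of visible text divided by total source length (0.0 for empty input)."""
    parser = TextExtractor()
    parser.feed(html)
    text = "".join(parser.parts).strip()
    return len(text) / len(html) if html else 0.0

print(text_code_ratio("<p>Hello</p>"))  # 5 text chars out of 12 total
```

A 20% result from such a calculation would mean roughly a fifth of the page's bytes are visible text; whether that number matters for rankings is exactly what the thread above is asking.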