Can hotlinking images from multiple sites be bad for SEO?
-
Hi,
There's a very similar question already being discussed here, but it deals with hotlinking from a single site that is owned by the same person.
I'm interested whether hotlinking images from multiple sites can be bad for SEO.
The issue is that one of our bloggers has been hotlinking all the images he uses; sometimes there are 3 or 4 images per blog post, each from a different domain.
We know that hotlinking is frowned upon, but can it affect us in the SERPs?
Thanks,
James
-
Sorry, hotlinking was the wrong word to use; we're actually just embedding the images.
Is it possible that Google recognises that spammy sites (as an example) tend to embed lots of images, and therefore uses it as an indicator of spam?
Also, is poor netiquette ever taken into account? Again, maybe because Google is trying to find spammy sites?
For the record, it is something we'll be fixing (especially from a copyright point of view), but we're trying to prioritise it. If there's a potential SEO impact, we'll sort it quickly; if not, we'll deal with more pressing things first.
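For what it's worth, this is roughly the sort of clean-up script we have in mind - a rough sketch only, assuming the posts are stored as HTML, with a made-up domain and folder:

```python
import os
from urllib.parse import urlparse

import requests
from bs4 import BeautifulSoup

OUR_DOMAIN = "ourblog.example"   # hypothetical - replace with the real blog domain
IMAGE_DIR = "static/images"      # hypothetical local folder to rehost into


def rehost_images(post_html: str) -> str:
    """Download externally hosted images and point the <img> tags at local copies."""
    soup = BeautifulSoup(post_html, "html.parser")
    os.makedirs(IMAGE_DIR, exist_ok=True)

    for img in soup.find_all("img", src=True):
        src = img["src"]
        host = urlparse(src).netloc
        if not host or host == OUR_DOMAIN or host.endswith("." + OUR_DOMAIN):
            continue  # relative path or already on our own domain

        filename = os.path.basename(urlparse(src).path) or "image"
        local_path = os.path.join(IMAGE_DIR, filename)

        resp = requests.get(src, timeout=10)
        resp.raise_for_status()
        with open(local_path, "wb") as fh:
            fh.write(resp.content)

        img["src"] = f"/{IMAGE_DIR}/{filename}"  # rewrite the tag to the rehosted copy

    return str(soup)


if __name__ == "__main__":
    with open("example-post.html", encoding="utf-8") as fh:  # hypothetical post file
        print(rehost_images(fh.read()))
```

(We'd obviously want to check the licence on each image before rehosting anything.)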
-
Okay, so hotlinking is the wrong terminology to use. Do you think embedding images is taken into account by Google?
For example, would Google see spammy sites embedding lots of images, and therefore use it as an indicator of spam?
-
That's confused me too! Embedding an image from another site is hotlinking; an <a href> link doesn't have anything to do with it.
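To make the distinction concrete, here's a rough sketch (made-up URLs) of the two kinds of reference: an <a href> is only fetched when someone clicks it, while an <img src> pointing at another domain pulls the file from that host on every page view - which is exactly what hotlinking is.

```python
from bs4 import BeautifulSoup

# Made-up snippet purely for illustration
html = """
<p>See the photo below.</p>
<a href="https://othersite.example/photo.jpg">A link to someone else's photo</a>
<img src="https://othersite.example/photo.jpg" alt="An embedded (hotlinked) photo">
"""

soup = BeautifulSoup(html, "html.parser")

# Links: the target is only requested when someone clicks.
links = [a["href"] for a in soup.find_all("a", href=True)]

# Embeds: the browser requests the file from the other host on every page view.
embeds = [img["src"] for img in soup.find_all("img", src=True)]

print("links: ", links)
print("embeds:", embeds)
```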
-
Excuse me, it's late in the day. Embedding is still referencing the other site's image URL, right?
Also, what if that site changes its directory structure or something, and all the images on your site now 404?
-
Another thing to consider is that requesting images from multiple sites can add to load times. Browsers download several files in parallel from a single host over connections they keep open; every additional host means another DNS lookup and connection to set up before its images can start downloading, which can slow the overall page load.
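If you want to see the overhead for yourself, here's a rough timing sketch (the hosts are made up - substitute the domains your posts actually embed from); the repeat request to each host reuses the open connection, while every new host pays for its own DNS lookup and handshake:

```python
import time

import requests

# Made-up hosts purely for illustration - substitute the domains your posts embed from.
hosts = ["https://cdn-one.example", "https://cdn-two.example", "https://cdn-three.example"]

session = requests.Session()  # keeps connections open (keep-alive), one pool per host

for url in hosts:
    try:
        start = time.perf_counter()
        session.get(url, timeout=10)   # pays for DNS lookup + connection setup
        first = time.perf_counter() - start

        start = time.perf_counter()
        session.get(url, timeout=10)   # reuses the open connection to this host
        repeat = time.perf_counter() - start
    except requests.RequestException as exc:
        print(f"{url}: request failed ({exc})")
        continue

    print(f"{url}: first request {first:.3f}s, repeat {repeat:.3f}s")
```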
Hope this helps!
Dan
-
Sorry, I assumed you meant you were hotlinking images rather than just embedding them. If you're just using <img> tags with no href defined (so just embedding, not hotlinking), then you're right - this won't cause a problem.
-
Create and host your own image, or use a royalty-free image, so you won't suffer from someone claiming copyright infringement - that should be your biggest concern here.
-
Takeshi is right. Bandwidth costs money, so there's that as well as the copyright issue. You could also fall victim to a 'switcheroo': http://www.deuceofclubs.com/switcheroo/index.html - I've done this myself before, swapping a hotlinked image for a polite message asking the other site not to hotlink.
Google doesn't include hotlinked images in Google News, so it's something they may take into account when ranking a page in general search.
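For anyone curious how the blocking/switcheroo works on the image host's side, it's normally a rewrite rule keyed off the Referer header. The same idea as a small, purely illustrative Python (Flask) sketch, with made-up domains and filenames:

```python
from urllib.parse import urlparse

from flask import Flask, request, send_from_directory

app = Flask(__name__)

ALLOWED_HOSTS = {"ourblog.example", "www.ourblog.example"}  # made-up domains
IMAGE_DIR = "images"                                        # made-up folder


@app.route("/images/<path:filename>")
def serve_image(filename):
    referrer_host = urlparse(request.referrer or "").netloc
    # An empty referrer is a direct visit or a privacy-conscious browser; let it through.
    if referrer_host and referrer_host not in ALLOWED_HOSTS:
        # The "switcheroo": anyone hotlinking gets the polite-notice image instead.
        return send_from_directory(IMAGE_DIR, "please-dont-hotlink.png")
    return send_from_directory(IMAGE_DIR, filename)


if __name__ == "__main__":
    app.run()
```

In practice this usually lives in the web server config (Apache/nginx) rather than in application code.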
-
Surely that only works if it's an actual link, right? Simply using the <img> tag shouldn't be regarded as a link by Google?
-
You are definitely missing out on image traffic by not hosting your own images. Plus, hotlinking is poor netiquette since you are using someone else's bandwidth without their permission. If the images are copyrighted, then you could be hit by DMCA requests which can negatively impact your SEO.
-
Hi James
A lot of this will depend on the sites you're embedding images from.
It's long been part of the ranking algorithm that if you link out to sites Google views negatively - because of spam, malware, etc. - your own site can be viewed negatively too. Without knowing which sites your blogger has been pulling images from, it's hard to say, but it's worth running a check just in case.
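As a starting point for that check, here's a rough sketch (the domain and post URLs are placeholders) that lists every externally hosted image across your posts and flags any that no longer resolve, so you can see which domains you're relying on and vet them:

```python
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

# Hypothetical values - swap in your own domain and a real list of post URLs.
OUR_DOMAIN = "ourblog.example"
POST_URLS = [
    "https://ourblog.example/post-one/",
    "https://ourblog.example/post-two/",
]

external_images = {}  # image URL -> HTTP status (or error)

for post_url in POST_URLS:
    html = requests.get(post_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    for img in soup.find_all("img", src=True):
        src = urljoin(post_url, img["src"])
        host = urlparse(src).netloc
        if host == OUR_DOMAIN or host.endswith("." + OUR_DOMAIN):
            continue  # self-hosted, nothing to check

        try:
            status = requests.head(src, timeout=10, allow_redirects=True).status_code
        except requests.RequestException as exc:
            status = f"error: {exc}"
        external_images[src] = status

for src, status in sorted(external_images.items()):
    print(status, src)
```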