Hot Linking
-
We recently noticed that our website www.efurniturehouse.com is being hotlinked by axsoris.com, a Dutch website with many malware issues.
Any suggestions on how to proceed?
Thanks in advance
Tony
-
No kidding?! That's amazing. Learn something new every day. There ya go, there's your other option, Tony. Good luck!
-
Okay, so what's wrong with altering the directory where you store your image files?
Assuming a cease-and-desist letter isn't working, what other options are there?
-
You can add hotlink-prevention rules to your .htaccess file. See the example here:
http://underscorebleach.net/jotsheet/2004/06/htaccess-prevent-hotlinking
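The linked article covers this in detail, but a minimal sketch of the usual mod_rewrite approach looks like the following (this assumes Apache with mod_rewrite enabled; substitute your own domain for example.com):

```apache
RewriteEngine On
# Allow requests that send no Referer header (direct visits, some proxies/firewalls)
RewriteCond %{HTTP_REFERER} !^$
# Allow requests whose Referer is your own site
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
# Everything else requesting an image gets a 403 Forbidden
RewriteRule \.(gif|jpe?g|png)$ - [F,NC]
```

Note that allowing blank referrers is deliberate: blocking them would also block visitors whose browsers or proxies strip the Referer header.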
-
Our bandwidth usage on our site www.efurniturehouse.com was going up dramatically while the number of visitors was going down.
The site stealing our images is based in Holland and is riddled with malware. This has been going on for a while, but it has recently gotten worse. We need a reliable way to stop the hotlinking of our images and to stop being connected to a malware site.
-
Brilliant. This is why I love EGOL. I'm still laughing.
-
Some guy on eBay was hotlinking lots of my images in his auctions. These were big images in really busy auctions with lots of people viewing. I asked him to stop and he said, "FU, I am not using your images." So I edited the images to say...
Parts missing. Sold "as is".
He stopped using my images.
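This image-swapping trick can also be automated server-side: instead of returning a 403 to hotlinkers, redirect their image requests to a replacement image. A sketch, again assuming Apache with mod_rewrite, where /nolinking.png is a hypothetical replacement image you would create yourself:

```apache
RewriteEngine On
# Allow blank referrers and your own site, as in the blocking example
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
# Exclude the replacement image itself, or the rule would loop forever
RewriteCond %{REQUEST_URI} !/nolinking\.png$ [NC]
# Serve the replacement to every other image request
RewriteRule \.(gif|jpe?g|png)$ /nolinking.png [R,L,NC]
```

The hotlinking site then displays whatever you put in the replacement image, while your real images keep working for your own visitors.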
-
Maybe I misunderstood what is being hotlinked exactly. Normally, hotlinking refers to pulling a source file from your server rather than hosting it on their own, typically done with images. So are you saying your primary domain is "hotlinked"? Or do you just mean there's a link to your domain?
Please elaborate, and sorry if I'm confused.
-
Can't you just rename the image file, or whatever is being hotlinked? Or move it to a different directory? Basically, 404 it on them?