Questions created by JohannCR
-
302 multiple domains...
Hello, I have a few domain names with orthographic variations that I'd like to redirect to my main site. The problem is my registrar (OVH) only does 302 redirects, so what are my options? Can I keep a dozen 302s? Do I have to point all their DNS at my own server and handle the redirects there (that would add load to my server...)? Thanks for any ideas. Johann.
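For context, here is a minimal sketch of what I could set up if I pointed the DNS of the variant domains at my own server, assuming Apache with mod_rewrite (the domain names are placeholders):

```
# .htaccess on the server receiving the variant domains:
# 301-redirect any host other than the main domain, preserving the path.
RewriteEngine On
RewriteCond %{HTTP_HOST} !^(www\.)?main-site\.example$ [NC]
RewriteRule ^(.*)$ https://www.main-site.example/$1 [R=301,L]
```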
Technical SEO | JohannCR -
Tips to get rid of a link from an infected website?
Hi, during some netlinking analysis I found that a website linking to one of the sites I do SEO for triggers my antivirus... It seems to be infected by JS/Dldr.Scripy.A, a JavaScript virus. Since this is the first time I've dealt with this kind of problem, and I haven't found any info on the Q&A or anywhere else, I have a few questions: 1) How can I verify the threat is real and not a false positive? Is there some tool to scan the website, maybe an online virus scanner? 2) How can I contact the webmaster, since I can't safely browse the site to look for a "contact us" page? I did a WHOIS lookup, but I only got the e-mail of his hosting service; can I contact them directly? 3) Any tips or important things I should know? Thanks for your help
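For question 1, a minimal sketch of the kind of manual check I have in mind, in Python (the URL is a placeholder; this only downloads the raw HTML without executing any JavaScript, so nothing malicious should run locally):

```python
import re
import requests

# Download the raw HTML of the suspect page; requests does not
# execute JavaScript, so the script content is only inspected as text.
html = requests.get("http://suspect-site.example/", timeout=10).text

# Grep for patterns commonly found in obfuscated JS downloaders.
for pattern in (r"eval\(", r"unescape\(", r"document\.write\(", r"fromCharCode"):
    hits = re.findall(pattern, html)
    if hits:
        print(f"{pattern}: {len(hits)} occurrence(s)")
```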
Technical SEO | JohannCR -
Internal search: rel=canonical vs noindex vs robots.txt
Hi everyone, I have a website with a lot of internal search results pages indexed. I'm not asking whether they should be indexed; I know they shouldn't be, according to Google's guidelines, and they create a bunch of duplicate pages, so I want to solve this problem. The thing is, if I noindex them, the site is going to lose a non-negligible chunk of traffic: nearly 13% according to Google Analytics! I thought of blocking them in robots.txt, but that would not take them out of the index, and the pages appearing in Google's SERPs would then look empty (no title, no description), so their CTR would plummet and I would lose some traffic too... The last idea I had was a rel=canonical tag pointing to the original search page (which is empty, without results), but that would probably have the same effect as noindexing them, wouldn't it? (I've never tried it, so I'm not sure.) Of course I did some research on the subject, but each of my findings recommended only one of the three methods! One even recommended noindex plus a robots.txt block, which makes no sense, because the block would keep Google from ever seeing the noindex. Can somebody tell me which option is best for keeping this traffic? Thanks a million
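For reference, here is a sketch of the three options as I understand them (the example.com domain and /search path are placeholders for my actual URLs). The first and third are alternative tags that would go in the <head> of each search results page:

```html
<!-- Option 1: noindex (Google drops the page but can still follow its links) -->
<meta name="robots" content="noindex, follow">

<!-- Option 3: canonical pointing back to the empty search page -->
<link rel="canonical" href="https://www.example.com/search">
```

And the robots.txt version, which blocks crawling but does not remove pages already in the index:

```
# Option 2: robots.txt
User-agent: *
Disallow: /search
```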
Technical SEO | JohannCR -
How to noindex lots of content properly: bluntly or progressively?
Hello Mozers! I'm in doubt, so I thought: why not ask for help? Here's my problem: I need to improve the SEO of a website that consists of a huge number of pages (1 million+) with poor content. Basically it's like a catalog: you select a brand, then a product series, then the product, and then fill out a request form (sorry for the cryptic description, I can't be more precise). Besides the classic SEO work, part of what (I think) I need to do is noindex some useless pages and rewrite the important ones with great content, but for the noindexing part I'm hesitant about the how. There are about 200,000 pages with no visits in the past year, so I guess they're pretty much useless junk that would be better off noindexed. But the webmaster is afraid that noindexing that many pages will hurt the site's long tail (in case of future visits), so he wants to check the SERP position of every one of them and only eliminate those that are in the top 3 (for those, he thinks, there's no hope of improvement, since they already rank well and still get no visits). I think that would waste a lot of time and resources for nothing, and I'd advise noindexing them regardless of their position. The problem is I lack the experience to be sure of it, or of how to do it: is it wise to bluntly noindex 200,000 pages in one go (isn't that a bad signal for Google?), or should we do it progressively over a few months? Thanks a lot for your help! Johann.
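For the progressive option, here's a minimal sketch of what I have in mind, assuming we export the 200,000 URLs to a text file (the file name, batch size, and weekly schedule are hypothetical):

```python
# Split the URLs to noindex into weekly batches, so the change
# rolls out progressively instead of hitting 200,000 pages at once.

BATCH_SIZE = 20_000  # hypothetical: ten weekly batches

with open("pages_to_noindex.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

batches = [urls[i:i + BATCH_SIZE] for i in range(0, len(urls), BATCH_SIZE)]

for week, batch in enumerate(batches, start=1):
    # Each week, the CMS would add <meta name="robots" content="noindex">
    # to the pages listed in that week's file.
    with open(f"noindex_week_{week:02d}.txt", "w") as out:
        out.write("\n".join(batch) + "\n")
```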
Technical SEO | JohannCR