What is the difference between using .htaccess file and httpd.conf in implementing thousands of 301 redirections?
-
What is the best solution in terms of website loading time or server load? Thanks in advance!
-
Do you have at least a guess on the percentage of improvement?
-
I had a discussion about that question with our programmers and they all told me that they would gladly use httpd.conf if they could... so as far as a case study goes, this is just the plain opinion of our IT team.
-
Hi mrcensorious! Do you have a basis or a case study for this?
-
httpd.conf is better in terms of page speed: directives in the main server configuration are parsed once at startup, whereas .htaccess files have to be located and re-parsed on every single request, which adds up when you have thousands of redirect rules...
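For a redirect list of that size, the usual httpd.conf pattern is a RewriteMap backed by a hashed (DBM) lookup file; the map can only be declared in the server configuration, not in .htaccess. A minimal sketch, assuming Apache with mod_rewrite enabled; the file paths and map name here are just placeholders:

```apache
# httpd.conf (or a <VirtualHost> block): RewriteMap cannot be declared in .htaccess.
RewriteEngine On

# redirects.map is generated from a plain-text file of "old-path new-path" pairs
# (keys include the leading slash), for example:
#   httxt2dbm -i redirects.txt -o redirects.map
# The DBM hash gives one lookup per request instead of scanning thousands of rules.
RewriteMap redirectmap "dbm:/etc/apache2/redirects.map"

# If the requested path has an entry in the map, send a permanent (301) redirect.
RewriteCond ${redirectmap:$1} !=""
RewriteRule ^(.*)$ ${redirectmap:$1} [R=301,L]
```

The .htaccess alternative is thousands of individual Redirect 301 lines that Apache re-reads and scans in order on every request to that directory tree, which is exactly where the loading-time and server-load difference shows up.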
Related Questions
-
Question about "sneaky" vs. non-sneaky redirects?
One of my client's biggest keyword competitors is using what I believe to be sneaky redirects. The company is a large, international corporation that has a local office. They use a totally unrelated domain name for local press and advertising, but there is no actual website on it: the domain used in those backlinks automatically redirects to the corporate website. Is this sneaky or not?
White Hat / Black Hat SEO | | JCon7110 -
Removing duplicated content using only NOINDEX at large scale (80% of the website).
Hi everyone, I am taking care of a large "news" website (500k pages), which got a massive hit from Panda because of duplicated content (70% was syndicated content). I recommended that all syndicated content should be removed and the website should focus on original, high-quality content. However, this was implemented only partially: all syndicated content is set to NOINDEX (they think it is good for the user to see standard news alongside the original HQ content). Of course it didn't help at all. No change after months. If I were Google, I would definitely penalize a website that has 80% of its content set to NOINDEX because it is duplicated. I would consider this site "cheating" and not worthwhile for the user. What do you think about this "theory"? What would you do? Thank you for your help!
White Hat / Black Hat SEO | | Lukas_TheCurious0 -
Multiple sites in the same niche (Should we redirect these to our Main Site)
I will keep this short and sweet. We have some websites in the same niche but want to focus only on our newest site (basically all the information that was being posted on the other sites will now be part of our new site). This will save us a lot of time and increase our focus on one entity. Should we 301 redirect these websites to the specific categories they focus on in the new site, or should we redirect them to the main domain?
White Hat / Black Hat SEO | | CMcMullen0 -
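If the old sites map cleanly onto sections of the new one, redirecting each old URL to its closest equivalent usually preserves more relevance than pointing everything at the homepage. A rough sketch of what that could look like on one of the old domains, assuming Apache; the paths and target domain are made up for illustration:

```apache
# .htaccess (or vhost config) on the old domain; paths and target are hypothetical.
# Point each old section at the closest matching category on the new site.
Redirect 301 /blue-widgets https://www.newsite.example/widgets/blue
Redirect 301 /red-widgets  https://www.newsite.example/widgets/red

# Anything without a close equivalent falls back to the new homepage.
# (mod_alias applies the first matching directive, so specific rules go first.)
RedirectMatch 301 ^/ https://www.newsite.example/
```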
A site is using their competitors' names in their Meta Keywords and Descriptions
I can't imagine this is a White Hat SEO technique, but they don't seem to be punished for it by Google - yet. How does Google treat the use of your competitors' names in your meta keywords/descriptions? Is it a good idea?
White Hat / Black Hat SEO | | PeterConnor0 -
All pages going through 302 redirect - bad?
So, our web development company did something I don't agree with and I need a second opinion. Most of our pages are statically cached (the CMS creates .html files), which is required because of our traffic volume. To get geotargeting to work, they've set up every page to 302 redirect to a geodetection script, and then back to the geotargeted version of the page. E.g.: www.example.com/category 302 redirects to www.example.com/geodetect.hp?ip=ip_address. Then that page 302 redirects back to either www.example.com/category, or www.example.com/geo/category for the geo-targeted version. **So all of our pages - thousands - go through a double 302 redirect. It's fairly invisible to the user, and 302 is more appropriate than 301 in this case, but it really worries me. I've done lots of research and can't find anything specifically saying this is bad, but I can't imagine Google being happy with this.** Thoughts? Is this bad for SEO? Is there a better way (keeping in mind all of our files are statically generated)? Is this perfectly fine?
White Hat / Black Hat SEO | | dholowiski0 -
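Whether the double hop is worth reworking depends on the stack, but one common alternative keeps the static caching and removes the visible redirect chain entirely: let Apache choose the geo version with an internal rewrite. A rough sketch, assuming the mod_geoip module is installed and the geo-targeted copies live under /geo/ (both of these are assumptions, not details from the question):

```apache
# Sketch assuming mod_geoip exposes the visitor's country as GEOIP_COUNTRY_CODE;
# the /geo/ directory and the country code "CA" are hypothetical.
GeoIPEnable On

RewriteEngine On

# If the visitor is in the targeted country and a geo-targeted static file exists,
# serve it via an internal rewrite (no redirect; the URL in the browser stays put).
RewriteCond %{ENV:GEOIP_COUNTRY_CODE} ^CA$
RewriteCond %{DOCUMENT_ROOT}/geo%{REQUEST_URI} -f
RewriteRule ^ /geo%{REQUEST_URI} [L]
```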
When to NOT USE the disavow link tool
I'm not here to say this is concrete and should never be done, and if you disagree with me then let's discuss. One of the biggest things out there today, especially after the second wave of Penguin (2.0), is the fear-stricken webmasters who run straight to the disavow tool after they have been hit with Penguin or noticed a drop shortly after. I had a friend whose site never felt the effects of Penguin 1.0 and thought everything was peachy. Then P2.0 hit and his rankings dropped off the map. I got a call from him that night and he was desperately asking me for help to review his site and guess what might have happened. He then tells me the first thing he did was compile a list of websites backlinking to him that might be the issue, create his disavow list and submit it. I asked him, "How long did you research these sites before you came to the conclusion they were the problem?" He said, "About an hour." Then I asked him, "Did you receive a message in your Google Webmaster Tools about unnatural linking?" He said, "No." I said, "Then why are you disavowing anything?" He said, "Um.......I don't understand what you are saying?" In reading articles, forums and even here in the Moz Q&A, I tend to think there are some misconceptions about the disavow tool from Google that do not seem to be clearly explained. Some of my findings on the tool and when to use it are purely based on logic, IMO. Let me explain. When NOT to use the tool: If you spent only an hour reviewing your backlink profile and you are too eager to wait any longer before uploading your list. Unless you have fewer than 20 root domains linking to you, you should spend a lot more than an hour reviewing your backlink profile. If you DID NOT receive a message from GWT informing you that you had some "unnatural" links (I'll explain later). If you spend a very short amount of time reviewing your backlink profile and did not look at each individual site linking to you and every link that exists, then you might be using it WAY TOO SOON. The last thing you want to do is disavow a link that actually might be helping you. Take the time to really look at each link and ask yourself this question (straight from the Google guidelines): "A good rule of thumb is whether you'd feel comfortable explaining what you've done to a website that competes with you, or to a Google employee." Studying your backlink profile: We all know when we have cheated. I'm sure 99.9% of us can admit to it at one point. Most of the time I can find backlinks from sites and look right at the owner and ask him or her, "You placed this backlink, didn't you?" I can see the guilt immediately in their eyes 🙂 Remember, not ALL backlinks you generate are bad or wrong just because you own the site. You need to ask yourself "Was this link necessary and does it apply to the topic at hand?", "Was it relevant?" and, most important, "Is this going to help other users?". These are some questions you can ask yourself before each link you place. You DID NOT receive a message about unnatural linking: This is where I think the most confusion takes place (and please explain to me if I am wrong on this). If you did not receive a message in GWT about unnatural linking, then we can safely say that Google does not think your profile contains any "fishy" links that they have determined to be spammy. So if you did not receive any message yet your rankings dropped, then what could it be? 
Well, it's still your backlinks that most likely did it, but it's more likely the "value" of previous links that now hold less or no value at all. So obviously when this value drops, so does your rank. So what do I do? Build more quality links....and watch your rankings come back 🙂
White Hat / Black Hat SEO | | cbielich1 -
Can you use the image description for IMG ALT?
ello ello! We're running an ecommerce site with thousands of products. None of the product pages have an IMG ALT. We've been thinking about an IMG ALT rule to apply to all product page images. Every image currently has a detailed caption, so the thought was: why don't we use the description as the IMG ALT? It's perfect, as it explains the image. Now the thing is the length of the descriptions; some of them come to 150-200 characters with spaces. Do you think this is too much? Also, would having the caption and the IMG ALT be the same cause issues? Have you guys employed any rules for IMG ALT in a bulk way?
White Hat / Black Hat SEO | | Bio-RadAbs0 -
Use of 301 redirects
Scenario: a dynamic page produces great results for the user but produces a long, very un-user-friendly and un-search-friendly URL: http://www.OURSITE.co.uk/s/ref=nb_sb_noss?url=search-alias%3Daps&field-keywords=loving&x=0&y=0#/ref=nb_sb_noss_1?url=search-alias%3Daps&field-keywords=lovingthebead&rh=i%3Aaps%2Ck%3Alovingthebead Solution: a 301 redirect in .htaccess. Fantastic - works a treat, BUT after the redirect the original long, ugly old URL still appears in the location field. Would really like this to show the new short, user-friendly URL. What am I doing wrong? Thank you all. CB
White Hat / Black Hat SEO | | GeezerG0
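Two things commonly explain the behaviour described in that last scenario: a RewriteRule without the R flag is an internal rewrite, so the content changes but the address bar does not; and anything after a # in the old URL is a fragment that the browser never sends to the server, so no .htaccess rule can match or remove it. A minimal sketch of the difference, assuming mod_rewrite and using made-up paths:

```apache
# .htaccess in the document root; the "pretty" path below is hypothetical.
RewriteEngine On

# Internal rewrite: the short URL quietly serves the long dynamic one, so
# visitors and search engines only ever see /beads/lovingthebead.
RewriteRule ^beads/lovingthebead/?$ /s/ref=nb_sb_noss?field-keywords=lovingthebead [L,QSA]

# External redirect: without R=301 a rule only rewrites internally; with it,
# the browser is told to move, and the location field changes to the new URL.
RewriteRule ^old-search-page$ /beads/lovingthebead [R=301,L]
```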