What is the difference between using an .htaccess file and httpd.conf to implement thousands of 301 redirects?
-
Which is the better solution in terms of website loading time and server load? Thanks in advance!
-
Do you at least have a guess at the percentage of improvement?
-
I had a discussion about that question with our programmers, and they all told me that they would gladly use httpd.conf if they could... so when you refer to a case study, this is just the plain opinion of our IT team.
-
Hi mrcensorious! Do you have a basis or a case study for this?
-
httpd.conf is better for page speed...
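For anyone weighing the two, the usual technical argument is concrete: Apache re-reads and re-parses every applicable .htaccess file on every request, and thousands of individual Redirect or RewriteRule lines are checked one by one each time, whereas httpd.conf is parsed once at server start and, in server or virtual-host context only, can use RewriteMap for a single hashed lookup per request. A minimal sketch of the httpd.conf approach, with placeholder paths and map names:

# httpd.conf (server or <VirtualHost> context -- RewriteMap is not
# allowed in .htaccess, which is one reason httpd.conf scales better)
RewriteEngine On

# Hashed map built from a plain text file of "old-path new-url" pairs
# (keys without the leading slash) using Apache's bundled tool:
#   httxt2dbm -i redirects.txt -o redirects.map
RewriteMap redirects "dbm:/etc/apache2/redirects.map"

# Issue a 301 only when the requested path has an entry in the map
RewriteCond ${redirects:$1|NONE} !=NONE
RewriteRule ^/(.+)$ ${redirects:$1} [R=301,L]

The actual percentage improvement depends on traffic, rule count, and disk caching, so it is worth benchmarking on your own setup rather than guessing.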
Related Questions
-
Infinite Scrolling on Publisher Sites - is VentureBeat's implementation really SEO-friendly?
I've just begun a new project auditing the site of a news publisher. In order to increase pageviews and thus advertising revenue, at some point in the past they implemented something so that as many as 5 different articles load per article page. All articles are loaded at the same time and, from looking at Google's cache and the errors flagged in Search Console, Google treats it as one big mass of content, not separate pages. Another thing to note is that when a user scrolls down, the URL does in fact change when you get to the next article.

My initial thought was to remove this functionality and just load one article per page. However, I happened to notice that VentureBeat.com uses something similar. They use infinite scrolling so that the other articles on the page (in a 'feed' style) only load when a user scrolls to the bottom of the first article. I checked Google's cached versions of the pages, and it seems that Google also only reads the first article, which seems like an ideal solution. This obviously has the additional benefit of speeding up page loading time.

My question is: is VentureBeat's implementation actually that SEO-friendly or not? VentureBeat have 'sort of' followed Google's guidelines on making infinite scroll search-friendly (https://webmasters.googleblog.com/2014/02/infinite-scroll-search-friendly.html) by using prev and next tags for pagination (https://support.google.com/webmasters/answer/1663744?hl=en). However, isn't the point of pagination to list multiple pages in a series (i.e. page 2, page 3, page 4, etc.) rather than just other related articles?

Here's an example: http://venturebeat.com/2016/11/11/facebooks-cto-explains-social-networks-10-year-mission-global-connectivity-ai-vr/

Would be interesting to know if someone has dealt with this first-hand or just has an opinion. Thanks in advance! Daniel
White Hat / Black Hat SEO | Daniel_Morgan1
-
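For reference, the pagination markup those Google guidelines describe is just a pair of link elements in the <head> of each page in the series (URLs below are placeholders); the open question above is whether a feed of merely related articles really constitutes such a series:

<!-- In the <head> of page 2 of a multi-page series -->
<link rel="prev" href="http://www.example.com/article/page/1">
<link rel="next" href="http://www.example.com/article/page/3">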
Spam signals from old company site are hurting new company site, but we can't undo the redirect.
My client was forced to change its domain name last year (long story). We were largely able to regain our organic rankings via 301 redirects. Recently, the rankings for the new domain have begun to plummet. Nothing specific took place that could have caused any ranking declines on the new site. However, when we analyze links to the OLD site, we are seeing a lot of link spam being built to that old domain over recent weeks and months. We have no idea where these are coming from, but they appear to be negatively impacting our new site.

We cannot dismantle the redirects, as the old site has hundreds, if not thousands, of quality links pointing to it, and many customers are accustomed to going to that home page. So those redirects need to stay in place. We have already disavowed all the spam we have found in the old Search Console, and we are continuing to do so as we find new spam links. But what are we supposed to do about this spam negatively impacting our new site? FYI, we have not received any messages in Search Console.
White Hat / Black Hat SEO | FPD_NYC1
-
Moving website and domain name without 301 Redirect or rel=canonical
I do not wish to draw attention to my company, so I am using code names. For the sake of this discussion, we are a new car dealership representing Brand X Cars. The manufacturer of Brand X Cars pushes its dealers toward a website hosting company called CarWebsites in order to maintain a level of quality and control with each dealer. However, we have found the platform to be too restricting, and we are switching to our own WordPress site.

Unfortunately, Brand X is claiming ownership of our original domain, BrandXCarDealer.net, so we have switched to BrandXCarDealer.com (which we prefer anyway). Now both websites are running, and there is duplicate content of everything. Brand X is not cooperative and will not 301 redirect to the new site, and we do not have access to the <head> of the website to add a rel=canonical. Brand X is also dragging its feet on shutting down BrandXCarDealer.net. We do still have access to change the content of the pages on the BrandXCarDealer.net site, but that is pretty much as far as our control goes.

So my question is: is there anything we can do, without using a 301 redirect or rel=canonical, to tell Google to pay attention to the new BrandXCarDealer.com rather than the old BrandXCarDealer.net? Any suggestions are appreciated. Thanks!
White Hat / Black Hat SEO | VanMaster0
-
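One hedged possibility, given that the question says the page bodies (but not the <head>) are still editable: Google is known to process JavaScript redirects and treat them as a redirect signal, though it is weaker and slower to take effect than a real 301. A sketch only, assuming the old and new sites share the same URL paths, which may not hold across different platforms:

<!-- Placed in the editable body of each page on BrandXCarDealer.net.
     The same-path assumption here is illustrative, not confirmed. -->
<script>
  window.location.replace(
    "https://www.brandxcardealer.com" +
    window.location.pathname + window.location.search
  );
</script>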
Hreflang/Canonical Inquiry for Website with 29 different languages
Hello! So I have a website (www.example.com) that has 29 subdomains (es.example.com, vi.example.com, it.example.com, etc.). Each subdomain has the exact same content for each page, completely translated into its respective language. I currently do not have any hreflang/canonical tags set up.

I was recently told that the following is the correct way to set these tags up: on each subdomain's page (es.example.com/blah-blah for this example), I need to place an hreflang tag pointing to that page itself (es.example.com/blah-blah), plus one for each of the other 28 subdomains that have that page (it.example.com/blah-blah, etc.). In addition, I need to place a canonical tag pointing to the main www. version of the page. So I would have 29 hreflang tags, plus a canonical tag.

When I brought this to a friend's attention, he said that pointing the canonical tag at the main www. version would cause the subdomains to drop out of the SERPs in their respective countries' search engines, which I obviously wouldn't want. I've tried to read articles about this, but I always end up hitting a wall and further confusing myself. Can anyone help? Thanks!
White Hat / Black Hat SEO | juicyresults0
-
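As a concrete sketch of the setup being debated (placeholder URLs, with only three of the 29 languages shown), the tag set on es.example.com/blah-blah would look like the following; the friend's warning corresponds to the choice made on the canonical line:

<!-- In the <head> of es.example.com/blah-blah; the same full set of
     hreflang annotations appears on every language version -->
<link rel="alternate" hreflang="es" href="http://es.example.com/blah-blah">
<link rel="alternate" hreflang="it" href="http://it.example.com/blah-blah">
<link rel="alternate" hreflang="vi" href="http://vi.example.com/blah-blah">
<!-- ...one line per remaining subdomain... -->

<!-- Self-referencing canonical: pointing this at www.example.com instead
     would ask Google to consolidate indexing on the www version, which is
     exactly the subdomain-dropping effect the friend warned about -->
<link rel="canonical" href="http://es.example.com/blah-blah">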
Acquiring domains to boost yours: how to redirect an acquired domain
What is the best way to redirect for maximum SEO benefit? Examples:
glaspunt.nl -> glas.nl
fietstassen.eu -> loodgieter.nl
Any technical advice on how to do a root (domain-level) redirect following SEO best practices?
White Hat / Black Hat SEO | remkoallertz0
-
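On the mechanics, a domain-level 301 is usually configured on the acquired domain's server; a minimal Apache sketch using the first example pair (the virtual host details are illustrative):

# Virtual host answering for the acquired domain
<VirtualHost *:80>
    ServerName glaspunt.nl
    ServerAlias www.glaspunt.nl
    # Path-preserving permanent redirect: /foo -> http://glas.nl/foo
    Redirect permanent / http://glas.nl/
</VirtualHost>

Where the target site has equivalent pages, page-to-page redirects are generally considered better practice than sending everything to the homepage.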
Do I need to use meta noindex for my new website before migration?
I just want to know your thoughts: is it necessary to add a meta noindex, nofollow tag to each page of my new website before migrating the old pages to new pages under a new domain? Or would it be better to just add a block in my robots.txt and then remove it once we launch the new website? Thanks!
White Hat / Black Hat SEO | esiow20130
-
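Worth noting when choosing between the two options in the question: a robots.txt block only stops crawling (blocked URLs can still end up indexed from external links), while a meta noindex is only seen if the page can be crawled, so the two should not be combined on the same URLs. Sketches of each, to be removed at launch:

<!-- Per-page option: in the <head> of every page on the unlaunched site -->
<meta name="robots" content="noindex, nofollow">

# Site-wide option: robots.txt at the new domain's root
User-agent: *
Disallow: /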
Advice on using the disavow tool to remove hacked website links
Hey everyone,

Back in December, our website suffered an attack which created links to other hacked websites with anchor text such as "This is an excellent time to discuss symptoms, fa", "Open to members of the nursing/paramedical profes", and "The organs in the female reproductive system incl". The links were only visible when looking at the cache of the page. We got these links removed and removed all traces of the attack, such as pages which were created in their own directory on our server.

Three months later, I'm finding websites linking to us with anchor text similar to the examples above; however, they're linking to the pages that were created on our server when we were attacked, which have since been removed. So one of my questions is: does this affect our site? We've seen some of our best-performing keywords drop over the last few months, and I have a feeling it's due to these spammy links.

Here's a website that links to us: http://www.fashion-game.com/extreme/blog/page-9 — if you view source or look at the cached version, you'll find a link right at the bottom left corner. We have 268 of these links from 200 domains. Contacting these sites to have the links removed would be a very long process, as most of them probably have no idea those links even exist, and I don't have the time to explain to each one how to remove the hacked files, etc.

I've been looking at using the Google disavow tool to solve this problem, but I'm not sure if it's a good idea or not. We haven't had any warnings from Google about our site being spam or having too many spam links, so do we need to use the tool? Any advice would be very much appreciated. Let me know if you require more details about our problem.
White Hat / Black Hat SEO | blagger0
-
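If the disavow route is taken, the file the tool accepts is plain text, and domain-level entries are the usual choice for hacked-site spam, since page-level URLs multiply; a sketch using the example site from the question:

# disavow.txt -- uploaded through Search Console's disavow links tool;
# lines beginning with # are comments
# Disavow every link from an entire domain:
domain:fashion-game.com
# Or disavow a single page only:
http://www.fashion-game.com/extreme/blog/page-9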
Is it outside of Google's search quality guidelines to use rel=author on the homepage?
I have recently seen a few competitors using rel=author to mark up their homepages. I don't want to follow suit if it is outside of Google's search quality guidelines, but I've seen very little on this topic, so any advice would be helpful. Thanks!
White Hat / Black Hat SEO | smilingbunny0
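For context, rel=author markup of the kind described typically takes one of these two forms (the profile URL is a placeholder, not a real profile):

<!-- Link element in the <head> of the homepage -->
<link rel="author" href="https://plus.google.com/your-profile-id">

<!-- Or an anchor in the visible page content -->
<a href="https://plus.google.com/your-profile-id" rel="author">Author Name</a>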