.htaccess and SEO
-
Hey Everyone,
New to SEOMOZ and I have an important question:
We launched a new version of our site about 6 months ago and had a TON of redirects in our .htaccess file due to a change in our permalink structure (over 2,000 easily).
Anyway, we recently went back in and consolidated those 2,000+ individual .htaccess redirect lines into regular-expression rules wherever we could find a pattern; the remaining 30 or so are still individual one-to-one redirects.
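To give a rough idea of the shape of it (a simplified illustration with a made-up pattern, not our exact rules):

    # One pattern rule in place of hundreds of one-off redirects.
    # (Hypothetical pattern - assumes old date-based permalinks like /2012/05/post-slug/)
    RedirectMatch 301 ^/[0-9]{4}/[0-9]{2}/(.+)$ http://thetechblock.com/$1

    # The handful of URLs that didn't fit any pattern stayed as individual redirects.
    Redirect 301 /some-old-page http://thetechblock.com/some-new-page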
Since doing that, it appears our search engine traffic has dropped a bit. It's not a crazy drop, but it's definitely noticeable. I'm not an SEO expert, so my question is: is this change the reason why? How long will we see this decline before we're back at normal levels? We're seeing far fewer crawl errors since making the change, so I think it's a good thing, but I just wanted to check.
The site is http://thetechblock.com if you want to take a look. Any help would be really appreciated.
-
Hi Bayan,
Sorry to hear your search engine traffic has dropped.
It might be helpful if you posted the section of the .htaccess file in question.
Here are some things I would double-check:
1. Does the .htaccess file return a 301 response code for each redirect? (Probably, but worth double-checking.) What I might try is creating a file of all your OLD URLs and running them through a crawler like Screaming Frog to verify that each one redirects to the proper URL with the correct response code - or checking them with a quick script like the one sketched after this list.
2. Did you redirect the 2,000 pages to unique URLs, or did you redirect them to a single URL (or a handful of URLs)? If you consolidated your URLs down to only a handful, this could affect your rankings.
3. Did the content and other HTML elements stay the same through the redirect? For example, did the title tags stay the same or reasonably close to the original? Big differences could cause the URLs to lose relevance and thus rankings.
4. Fewer crawl errors = good. I would check Index Status in Google Webmaster Tools to see whether the number of pages discovered/indexed matches up well with the number of URLs on your site.
5. Proper sitemaps submitted? When you change your URL structure, it's often a good idea to submit two sitemaps - one listing all your old URLs and another listing the new ones (a bare-bones example follows this list). That way, search engines will attempt to crawl the old URLs and "process" the redirects. Probably not an issue for you since the change was 6 months ago, however.
6. Finally, I'd keep my eyes open for any other factors that may have caused the drop in traffic, e.g. algorithm updates, site issues, backlink changes, and so on.
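For #1, here's the kind of quick check I mean - a rough Python sketch using the requests library, assuming you've saved your old URLs one per line in a file called old_urls.txt (the filename is just an example; Screaming Frog does the same job):

    import requests

    # Hypothetical input file: one old URL per line.
    with open("old_urls.txt") as f:
        old_urls = [line.strip() for line in f if line.strip()]

    for url in old_urls:
        # Don't follow redirects automatically - we want the first response each old URL returns.
        resp = requests.head(url, allow_redirects=False, timeout=10)
        location = resp.headers.get("Location", "(no Location header)")
        print(resp.status_code, url, "->", location)

You're looking for a 301 on every line, pointing at the exact new URL (not a chain, and not a 302). If your server doesn't answer HEAD requests, swap requests.head for requests.get.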
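And for #5, the "old URLs" sitemap doesn't need to be anything fancy - just the standard sitemap format with one entry per retired URL (the URL below is made up purely for illustration):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://thetechblock.com/2012/05/example-old-post/</loc>
      </url>
      <!-- ...one <url> entry per old URL... -->
    </urlset>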
That's all I can think of, but there may be more. Let us know if you find anything!
-
Anyone?
-
Related Questions
-
How do .com and .me affect SEO?
Hi, we have a project at https://www.shipwaves.me/ and https://www.shipwaves.com/. If we do SEO for the .me domain, will it show results as early as the .com domain does? I'm not sure which category .me domains fall into. If we follow the same strategy we use for .com domains, will the results be the same or not? Also, is there any additional strategy we should use to make SEO progress a little faster for the .me domain? Guys, please share your thoughts on this.
Algorithm Updates | LayaPaul
-
Hotel SEO, 3-pack & Search Console: How to get the right data and how to improve CTR?
Hey guys, I've been working with some hotels and I feel like there are some specific issues that need special solutions. Maybe some of you also work with hotels and face similar problems.
Question 1: Google "forces" 3-pack impressions toward OTAs like booking.com via Hotel Ads. You basically get a big blue "Book now" button and a small website button, which leads to organic CTRs below 1% despite a 1-3 position. Is there any way to improve the organic CTR? Of course we use Hotel Ads, but they offer poor analytics AND we essentially pay for our SEO performance.
Question 2: Search Console doesn't specify whether an impression comes from the 3-pack or the rest of the organic results, which leads to an average position that says very little. It's hard to evaluate the performance of meta titles and descriptions because the CTR is also mixed. What would be a better way to get this data, or do you think Google will change this at some point (the new Search Console doesn't offer it)?
Question 3: Hotel rankings are dominated by OTAs, meta-search engines, and big chains. Does anyone have experience in SEO for smaller, family-owned hotels? Any tricks for getting a steady traffic source outside of brand results? Hope there are some travel experts in here 🙂
Algorithm Updates | Maggiathor
-
SEO Friendly IFRAMES?
Hi everyone, my company is using an iframe for an About Us page because we are having coding issues with our CMS. The content is coming directly from our server. After a couple of weeks, I searched for the page in Google and noticed in the search result that the meta description was using the textual content served from the iframe on the page. Does this mean the iframe we are using is SEO-friendly? Thanks, Jon
Algorithm Updates | JMSCC
-
SEO Myth-Busters -- Isn't there a "duplicate content" penalty by another name here?
Where are that guy with the mustache in the funny hat and the geek when you truly need them? SEL (Search Engine Land) recently said that there's no such thing as a "duplicate content" penalty: http://searchengineland.com/myth-duplicate-content-penalty-259657. By the way, I'd love to get Rand, Eric, or other Mozzers (aka TAGFEE'ers) to weigh in on this if possible. The reason for this question is to double-check a possible "duplicate content"-type penalty (possibly by another name?) that might accrue in the following situation:
1 - Assume a domain has a Domain Authority of 30 (per OSE).
2 - The site on the current domain has about 100 pages, all hand-coded. It does very well in SEO because we designed it to do so. The site is about 6 years old in its current incarnation, with a very simple e-commerce cart (again, basically hand-coded). I will not name the site for obvious reasons.
3 - Business is good. We're upgrading to a new CMS (hooray!). In doing so we are implementing categories and faceted search, with plans to try to keep the site to under 100 new "pages" using a combination of rel canonical and noindex. I will also not name the CMS for obvious reasons. In simple terms, as the site is built out and launched in the next 60-90 days, assuming we have 500 products and 100 categories, that yields at least 50,000 pages - and with other aspects of the faceted search, it could easily create 10X that many.
4 - In Screaming Frog tests of the DEV site, it is quite evident that there are many tens of thousands of unique URLs that are basically the textbook illustration of a duplicate content nightmare. Screaming Frog has also been known to crash while spidering, and we've discovered thousands of URLs from live sites using the same CMS. There is no question that spiders are somehow triggering some sort of infinite page generation - and we can see that both on our DEV site and out in the wild (in Google's supplemental index).
5 - Since there is no "duplicate content penalty" and there never was, are there other risks here caused by infinite page generation? Like burning up a theoretical "crawl budget", having the bots miss pages, or other negative consequences?
6 - Is it also possible that bumping a site that ranks well with 100 pages up to 10,000 pages or more might carry a link juice penalty as a result of all this (honest but inadvertent) duplicate content? In other words, is inbound link juice and ranking power essentially divided by the number of pages on a site? Sure, it may be somewhat mediated by internal linking, but what are the actual big-dog issues here?
So has SEL's "duplicate content myth" truly been myth-busted in this particular situation? Thanks a million!
Algorithm Updates | seo_plus
-
Do Explainer Videos Help SEO?
My company makes explainer videos. I often come across a lot of (seemingly) inflated and unprovable stats about explainer videos from other companies. This article claims that "having an explainer video on your web page makes it 53% more likely to show up on the first page of Google search results." Is there any real data to back up such a claim? Do explainer videos really help SEO? How?
Algorithm Updates | WickVideo
-
ALT tags for SEO - what's the latest recommendation?
Alt tags used strategically have always been part of my SEO recommendations (relevant, under seven words, not keyword-stuffed, and focused on the primary page keyword). I have been getting mixed views on whether search engines still use them in ranking. The Q&A on this subject was last addressed in 2011 - what is the most recent approach?
Algorithm Updates | MikeSEOTruven
-
Do articles for SEO purposes have a minimum or maximum word count in order to be crawled/indexed by Google and other search engines?
Algorithm Updates | WebRiverGroup
-
Non-.com or .co versus .ca or .fm sites - in terms of SEO value
We are launching a new site with a non-traditional top-level domain. We were looking at either .ca or .in, as we are not able to get the traditional .com, .co, .net, etc. I was wondering if this has any SEO effect. Does Google/Bing treat these domains differently? Will the site be penalized? Note: my site is a US-based site targeting a US audience.
Algorithm Updates | Chaits