Meta refresh = 0 seconds
-
For a number of reasons I'm stuck doing a client-side redirect for HTML pages. Am I right in thinking that Google treats a zero-second meta refresh roughly the same as a proper 301 redirect? Anyone have experience with zero-second meta refresh redirects, good or bad?
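To be specific, what I mean is the standard zero-second meta refresh in the page head, something like this (the target URL is just a placeholder):

```html
<!-- Zero-second meta refresh, placed in the <head> of the old page.
     The browser navigates to the target URL immediately on load. -->
<meta http-equiv="refresh" content="0; url=https://www.example.com/new-page.html">
```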
-
Interesting approach, thank you.
-
We just went through a situation like this with a pretty decent-size client - 400+ .htm pages that couldn't be redirected to .aspx because we weren't able to modify the IIS settings on the server, and the URL directory paths were all different too - basically a nightmare.
Like you probably already know, it could go either way with a meta refresh. You'd probably be OK, but I'd avoid it if possible. Our solution worked really well, but it's specific to Windows servers.
Our solution was to create a spreadsheet with two columns: the left listed all the .htm pages to be redirected; the right, the new .aspx page each should 301 to. We then wrote a script to dynamically create new copies of the .htm pages and insert a runat="server" redirection snippet at the top of each, pointing to the proper destination page.
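The snippet we inserted at the top of each copied page looked roughly like this (the destination path is a placeholder filled in from the spreadsheet mapping, and this assumes the server runs .htm files through the ASP.NET handler):

```html
<!-- Inserted at the top of each copied .htm page; issues a true 301
     before any content is served. Destination path is a placeholder. -->
<script runat="server">
    void Page_Load(object sender, System.EventArgs e)
    {
        Response.Status = "301 Moved Permanently";
        Response.AddHeader("Location", "/new-directory/new-page.aspx");
        Response.End();
    }
</script>
```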
1 month out, everything looks good. No issues and the site is kickin.
-
Thanks.
-
Unfortunately, I've seen mixed reviews on this one, test-wise. The inconsistency is why we don't recommend it (as GNC said). Generally, though, I'd say it's better than nothing.
-
Thanks for the reply Cowboy.
A 301 is the ultimate destination, but it could be months or a year away for reasons beyond my control, and there is enough juice being lost to warrant a temporary solution. I've seen the references to Google and meta refreshes, which is why I posed the question, but I've also seen people say zero-second refreshes have worked.
I just want to make sure nobody had a story like: "we did that once and dropped off the index", etc. I'm thinking that the temporary gain is worth the risk if any, unless I hear differently from somebody.
-
Hi Derek: From the Moz manual: "Meta refreshes do pass some link juice but are not recommended as an SEO tactic due to usability and the loss of link juice passed."
Also, some SEOs feel that Google looks askance at their use.
There's no way to talk them into a 301 redirect, huh?
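If server access ever does open up, a per-page 301 on IIS 7+ can be configured in web.config without touching any page code - for example (both paths are placeholders):

```xml
<configuration>
  <location path="old-page.htm">
    <system.webServer>
      <!-- Permanent (301) redirect; requires IIS 7 or later -->
      <httpRedirect enabled="true"
                    destination="/new-page.aspx"
                    httpResponseCode="Permanent" />
    </system.webServer>
  </location>
</configuration>
```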