Noindex a meta refresh site
-
I have a client's site on a vanity URL (www.example.com) that is set up as a meta refresh to the client's flagship site, www22.example.com. However, Google has been including the vanity URL in its index, in some cases ahead of the flagship site. What we'd like to do is de-index that vanity URL. We added a noindex meta tag to the vanity URL, but within 24 hours (actually less) the flagship site disappeared from the index as well. When we removed the noindex, both the vanity and flagship sites came back.
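For context, the setup described above usually looks something like the sketch below (a hypothetical page using the placeholder domains from the question, not the client's actual markup):

```html
<!DOCTYPE html>
<html>
<head>
  <!-- Meta refresh: sends visitors to the flagship site after 0 seconds -->
  <meta http-equiv="refresh" content="0; url=https://www22.example.com/">
  <!-- The noindex tag that was added to de-index the vanity URL -->
  <meta name="robots" content="noindex">
  <title>Example vanity page</title>
</head>
<body></body>
</html>
```

Because Google often treats a 0-second meta refresh much like a redirect, a noindex placed on the refreshing page can effectively be applied to the destination as well, which would be consistent with the behavior described here.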
We noticed in Google Webmaster Tools that the flagship site's robots.txt file was corrupt and in need of fixing, and we are in the process of fixing that -
Question: Is there a way to noindex the vanity URL and NOT the flagship site? Did the noindex take out the flagship site as well because of the meta refresh redirect? Or did the site reappear because I did a Google fetch and then submitted the flagship home page? The robots.txt is still not corrected, so we don't believe that's tied in here.
To add to the complexity, the client is UNABLE to implement a 301 redirect, which is what I recommended initially.
Anyone have any thoughts at all, MUCH appreciated!
-
You are so welcome, and believe me, I understand how hard it can be to convince a client either not to do something or to do something differently, because from their perspective all they see is whether the page works or doesn't. It cost us $1,000 to get a developer to fix our meta refresh, and I would never have been able to convince the company to spend the money if SEOmoz hadn't provided some support and validation for the plan.
Good luck!
-
Hello Maurizio -
Yes, I do use Webmaster Tools, but I only have the flagship site set up, not the vanity URL. The real resolution does come down to implementing a 301 redirect, which the IT team wasn't (or isn't) able to do, for reasons still not clear to me.
Just goes to show how important the server-side redirect has always been!
Thanks for your response!
-
Thanks for your expert perspective, Dana! I was looking for someone to balance out or reinforce the feedback I've been trying to give all this time. I also agree with checking OSE, and of course it provides further evidence for why we can't just de-index the vanity URL, since it's built up a ton of SEO equity.
Thanks again!!
-
Hi,
I have a question first: do you use Webmaster Tools? And can you upload files to the root of these domains?
Ciao
Maurizio
-
Having just gone through the pain of fixing a meta refresh on a large e-commerce site, I would highly recommend fixing the meta refresh and replacing it with a 301 redirect instead.
I would strongly recommend against telling Google to de-index the vanity URL because, chances are, that's the URL people will remember and use when they want to link back to your client's site. The link profile is probably already fragmented: all the external links pointing to the vanity URL are giving clout to that URL, but none of it is passing to the flagship site. Have you put both URLs into OSE to compare their link profiles? Definitely do this before you make any big changes. (You may have already considered all of this, I'm not sure.) But I can tell you from experience: replacing the meta refresh with a proper 301 redirect will have a significant impact.
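For reference, the server-side fix described above is often a one-liner. A minimal sketch for Apache, assuming the vanity domain is served by Apache with mod_alias available, and using the placeholder domains from the question:

```apache
# Hypothetical .htaccess on the vanity domain (www.example.com),
# replacing the meta refresh with a permanent server-side redirect.
# Requires mod_alias; domains are placeholders from the question.
Redirect 301 / https://www22.example.com/
```

Unlike a meta refresh, a 301 is a signal search engines treat as a permanent move, so link equity pointing at the vanity URL is consolidated onto the flagship site rather than left stranded.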
Hope that helps!
Dana