Subdomain Placeholder
-
So, long story short: we are rolling out a new website earlier than expected. Unfortunately, we are being rushed, and in order to make the deadline we have decided to create a www2 subdomain and release an HTML-only version of the site for the next two weeks. During that time, the HTML site will be ported over to a Drupal 8 instance, which will then resume the www domain.
My question is: will a temporary (302) redirect from www to www2, and then back to www, screw the proverbial pooch? Is there a better way to implement a temporary site?
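For reference, a redirect like this is usually handled at the web-server level. A minimal sketch for Apache (assuming mod_rewrite is enabled, and using example.com as a stand-in for the real domain) might look like this on the www host:

```apache
# Temporary (302) redirect: send all www traffic to www2 during the interim.
# Hostnames here are hypothetical placeholders; swap in the real domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://www2.example.com/$1 [R=302,L]
```

Switching back later would mean reversing the two hostnames in the same rule (or simply removing it once Drupal is live on www).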
Feel free to probe with some questions; I know I could be clearer here.
Thanks community!
-
Can't you do it the other way around? Put the temp HTML site on www and use www2 for the Drupal instance. Once the latter is ready, remove the HTML site and activate the Drupal version. I don't really understand what you're going to do with the www version while the www2 temp version is online.
If that's not possible, I can only agree with Egol: it's quite a risk you're running. It's quite possible your rankings will take a dive. As we all know, it's easy to go down but a lot harder to regain position.
-
OK... "It all pays the same".
Good luck. I hope you do well.
-
If only I had a choice... unfortunately, marching orders are in place. We have voiced the issue, but unfortunately the answer was the same.
-
I wouldn't do this.
It could muck up your search rankings.
Just do things properly.
Related Questions
-
Google Indexing & Caching Some Other Domain in Place of Mine - Lost Ranking - Sucuri.net Found Nothing
Again I am facing the same problem with another WordPress blog. Google has suddenly started to cache a different domain in place of mine, and to cache my domain in place of that domain. Here is an example page of my site which is wrongly cached on Google (the same thing is happening with many other pages as well): http://goo.gl/57uluq

That duplicate site (protestage.xyz) appears to be fully copied from my client's site. All of its pages now return 404, but in Google's cache they show my site's pages. A site:protestage.xyz search shows only pages of my site, but when we try to open any of them we get a 404 error. My site has been scanned by Sucuri.net senior support for malware and there is none; they scanned all files, the database, etc., and no malware was found on my site.

According to Sucuri.net senior support:

"It's a known Google bug. Sometimes they incorrectly identify the original and the duplicate URLs, which results in messed-up rankings and query results. As you can see, the protestage.xyz site was hacked, not yours, and the hackers created 'copies' of your pages on that hacked site. This is why they do it: the 'copy' (doorway) redirects web searchers to a third-party site ([http://www.unmaskparasites.com/security-report/?page=protestage.xyz](http://www.unmaskparasites.com/security-report/?page=protestage.xyz)). It was not the only site they hacked, so they placed many links to that 'copy' from other sites. As a result, Google decided that the copy might actually be the original, not the duplicate. So they basically hijacked some of your pages in search results for some queries that don't include your site's domain. Nonetheless, your site still does quite well and outperforms the spammers, for example in this query: [https://www.google.com/search?q=](https://www.google.com/search?q=)%22We+offer+personalized+sweatshirts%22%2C+every+bride#q=%22GenF20+Plus+Review+Worth+Reading+If+You+are+Planning+to+Buy+It%22

But overall, I think both the Google bug and the spammy duplicates have a negative effect on your site. We see such hacks every now and then (both sides: the hacked sites and the copied sites), and here's what you can do in this situation. It's not a hack of your site, so you should focus on preventing the copying of your pages:

1. Contact the protestage.xyz site, tell them that their site is hacked, and show them the hacked pages. Hopefully they will clean their site up and your site will have unique content again. Here's their email: flang.juliette@yandex.com

2. You might want to send one more complaint to their hosting provider (OVH.NET) at abuse@ovh.net, explaining that the site they host stole content from your site (show the evidence) and that you suspect the site is hacked.

3. Try blocking the IPs of their hosting provider on your site (real visitors don't browse from server IPs). This will prevent that site from copying your content (if they do it via a script on the same server). I currently see the site using the IP address 149.202.120.102; I think it would be safe to block anything that begins with 149.202. This .htaccess snippet should help (you might want to test it):

#--------------
Order Deny,Allow
Deny from 149.202.120.102
#--------------

4. Use rel=canonical to tell Google that your pages are the original ones ([https://support.google.com/webmasters/answer/139066?hl=en](https://support.google.com/webmasters/answer/139066?hl=en)). It won't help much if the hackers keep copying your pages, because they usually replace your rel=canonical with theirs, so Google can't decide which one is real. But without rel=canonical, hackers have more chances to hijack your search results, especially if they use rel=canonical and you don't.

I should admit that this process may be quite long. Google will not return your previous rankings overnight, even if you manage to shut down the malicious copies of your pages on the hacked site. Their indexes will still carry some mixed signals (side effects of the black-hat SEO campaign), and it may take weeks before things normalize. The same is true of the opposite situation: the traffic wasn't lost right after the hackers created the duplicates on other sites. The effect builds up over time as Google collects more and more signals, and sometimes they run scheduled spam/duplicate cleanups of their index. It's really hard to tell what the last drop was, since we don't have access to Google's internals. In practice, though, if you see significant changes in Google search results, it's usually not because of something you just did; in most cases, it's because of something that Google has observed over a period of time."

Kindly help me if there is anything we can actually do to get the site indexed properly again. PS: this happened with this site earlier as well, and that time I had to change the domain to get rid of the problem after I could not find any solution for months; now it has happened again. Looking forward to a possible solution. Ankit
Intermediate & Advanced SEO | killthebillion
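To illustrate the rel=canonical advice in point 4 above: the tag goes in the head of each original page and points at that page's own preferred URL. A minimal sketch (the URL here is a placeholder, not the asker's real site):

```html
<!-- Sketch: self-referencing canonical on the original page.
     Replace the placeholder URL with the page's own preferred URL. -->
<link rel="canonical" href="http://www.example.com/original-page/" />
```

As the answer notes, this is a signal rather than a guarantee: scrapers can strip or replace it, but having it in place still tilts Google toward the original.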
Duplicate content issues from mirror subdomain: facebook.domainname.com
Hey guys,
Need your suggestions. I have got a website that has a duplicate content issue. A subdomain called facebook.asherstrategies .com comes from nowhere and is getting indexed.
Website link: asherstrategies .com
Subdomain link: facebook.asherstrategies .com
This subdomain is actually a mirror of the website, and I have no idea how it was created. I'm trying to resolve the issue but could not find a clue.
Intermediate & Advanced SEO | b2bmarketer
E-Commerce Multilanguage - Better on Subdomains?
Hi, we have an e-commerce store in English and Spanish with the same products. The URLs differ like this:
ENGLISH: www.mydomain.com/en/manufacturer-sku-productnameinenglish.html
SPANISH: www.mydomain.com/es/manufacturer-sku-productnameinspanish.html
All content on the pages is translated; the H1s, titles, keywords, descriptions, and site content itself are in the language displayed. Is there a risk of similar or near-duplicate content here in the eyes of the big G? Would it be worth implementing the different languages on subdomains or on completely different domains? Thank you, B
Intermediate & Advanced SEO | bjs2010
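One mechanism worth mentioning here, though the question doesn't raise it, is hreflang annotation, which tells Google the two URLs are language alternates of the same page rather than duplicates. A sketch reusing the URL pattern from the question (each page would carry both tags):

```html
<!-- Sketch: hreflang alternates for the English and Spanish versions of one
     product page, using the /en/ and /es/ URL structure from the question. -->
<link rel="alternate" hreflang="en" href="http://www.mydomain.com/en/manufacturer-sku-productnameinenglish.html" />
<link rel="alternate" hreflang="es" href="http://www.mydomain.com/es/manufacturer-sku-productnameinspanish.html" />
```

With properly translated content and reciprocal hreflang tags, the subfolder setup described in the question can signal language variants without moving to subdomains or separate domains.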
PDFs and images in subfolder or subdomain?
What would you recommend as best practice? Our e-commerce site has a lot of PDFs supporting the product pages. Currently they are kept on a subdomain, as are all images. Would it be better to keep them all in a subfolder? I've read that blogs hosted in a subfolder do better than on a subdomain, but what about PDFs and images? Thoughts?
Intermediate & Advanced SEO | Bio-RadAbs
A few questions regarding listings in Google Places
For an SAB (Service Area Business) with a hidden address: can you have more than one listing? Can you use a free Google Voice number? Can you forward the number to a main number? Can the listing be in an office building, such as a rented space? For a non-SAB listing with the address visible: can you use free Google Voice numbers for each listing and forward them to one main number?
Intermediate & Advanced SEO | Bryan_Loconto
Best places to seed articles, UK-based
Hi, I have written two articles for two separate businesses and markets.
1. One article is a top-10-tips piece on choosing a conservatory. How would I go about promoting and seeding it around related home-improvement websites in the UK? Use StumbleUpon?
2. I also have recipes for a restaurant which I need to seed and promote online in order to gain links and promote the restaurant. Again, which methods are best for finding places to list these recipes, related blogs, etc.? Many thanks
Intermediate & Advanced SEO | ocelot
Robots.txt disallow subdomain
Hi all, I have a development subdomain which gets copied to the live domain. Because I don't want this dev domain to get crawled, I'd like to implement a robots.txt for this domain only. The problem is that I don't want this robots.txt to disallow the live domain. Is there a way to create a robots.txt for the development subdomain only? Thanks in advance!
Intermediate & Advanced SEO | Partouter
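On the robots.txt question above: robots.txt is fetched per hostname, so a file served only on the dev subdomain has no effect on the live domain. The only catch in the setup described is making sure the dev file isn't copied over to live along with everything else. A minimal sketch (dev.example.com is a hypothetical stand-in for the real dev subdomain):

```
# robots.txt served only on the dev subdomain (e.g. dev.example.com).
# Blocks all well-behaved crawlers from the entire dev site.
User-agent: *
Disallow: /
```

The live domain would keep its own, separate robots.txt at its own root, which crawlers fetch independently.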
Google places: 7 weeks and still not verified. Reasons?
I've created Google Places entries for the business's 25 locations in 8 countries. It's been 7 weeks and Google hasn't validated my bulk upload yet. Which of the following would you say might do the trick?
URL: I have pointed to http://site.com/country/city for every location. Should I point to http://site instead?
Categories: I've created them in English, but since this is about local, I guess I should do it in the local language of each country.
Should I cancel the bulk upload and do it one country at a time? My list of countries spans Latin America, Europe, Africa and Asia, so I don't know whether manual verification is done by Google centrally and whether that might be a problem.
Otherwise the entries are (IMHO) well written, detailed and all, but I'm getting kind of desperate now because it's been 7 weeks already... thanks for your help.
Intermediate & Advanced SEO | TIBA