Discourage search engines from indexing this site AFTER a site launch
-
Hi,
I ticked "Discourage search engines from indexing this site" a few months ago, ahead of the initial release of my website. I don't want to be found by search engines until the official release (still a few months away). Do you think that leaving this box ticked will harm the website's long-term ranking or have any other repercussions? Do you have any additional advice for staying out of the rankings until the official release in a way that won't harm the website in the SERPs?
Thanks for your answers.
-
I would just leave that box in WordPress ticked, or use a meta robots noindex tag on all of your pages. When you want the site to be indexed, remove the tags and fetch your pages in Google Search Console (GSC) to request indexing.
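For reference, the manual version of that tag is a one-liner; a minimal sketch (the ticked WordPress box outputs essentially the same thing on every page):

```html
<!-- In the <head> of every page that should stay out of the index -->
<meta name="robots" content="noindex">
```

One caveat: don't also block the pages in robots.txt while the tag is in place, because Google has to be able to crawl a page to see its noindex.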
-
Hey there
I presume you're using WordPress here. From my past experience: no, that won't have a long-term detrimental effect on your site's ability to rank once the site goes live.
If you're concerned, however, you could install an "under construction" or "coming soon" page, which allows the homepage itself to be indexed but prevents your other URLs from being found/crawled (so long as you don't submit a sitemap until you're ready).
SeedProd's free plugin is highly recommended, and I've used it before to good effect: https://en-gb.wordpress.org/plugins/coming-soon/
Hope this helps.
Related Questions
-
Loss of search visibility: consecutive drops in one month. Something I did, or competitors?
I am fairly new to Moz. I co-manage a national website with about 400 common pages and separate location areas for cities in Australia. One city started its own separate website a year ago. A drop in search visibility for the whole national site and my location page started in mid-July, according to Moz stats: consecutive weekly drops of 8% > 12% > 38%. In Google Analytics, organic search has dropped 8% overall and 2% on my location page in the last month. I did minor optimisation to my page and articles using Moz in July: upped an H2 to the H1 title, tweaked the main keyword, wrote a slightly different SEO title, and included keywords in the body copy. The rankings of the target keywords went up, but other keyword rankings went down. The other thing that started in June was Facebook advertising of our blog articles (click-throughs have a high bounce rate of 95%). The office with its own website (with a similar brand name) also started doing Facebook advertising and SEO for it earlier this year. I can see their own website traffic really shot up in June/July, and they also maintained their traffic on the national site. Wondering if any of these are causing the drop, or if this is more an indicator of competitor activity or algorithms? Any ideas about causes and solutions appreciated.
Local Website Optimization | SueMclean
Local Site Linking to Corporate Site In Main Menu - Bad for SEO?
Hi, we have "local" websites for different countries (UK, DE, FR, AP, US, etc.) and a corporate website. The local websites are going to link back to the corporate website in the main menu (think about-us and terms-and-conditions kinds of pages). Any local products will have their own pages on the local website, but global products will link back to the corporate website. We will place an indication that the user is going to another website next to those menu links that go to the corporate website. Is there any drawback to this for SEO? Should we use nofollow on these links in the menu structure of the regional websites? Thanks for your help.
Local Website Optimization | UNIT4
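For what it's worth, adding the nofollow hint to such a menu link is a one-attribute change; a minimal sketch (the class name and domain are hypothetical, and the arrow is the "you are leaving this site" indication mentioned above):

```html
<!-- Menu item on a regional site pointing to the corporate site -->
<li class="menu-item">
  <a href="https://corporate.example.com/about-us/" rel="nofollow">About Us &#8599;</a>
</li>
```

That said, these are legitimate navigational links between sites under the same brand, so many SEOs would leave them followed.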
I've submitted my site to google search console, and only 6 images of 89 images have been indexed in 2 weeks. Should I be worried?
I've submitted my site to google search console, and only 6 images of 89 images have been indexed in 2 weeks. Should I be worried? My site is http://bayareahomebirth.org Images are a pretty big part of this site's content and SEO value. Thanks for your help!
Local Website Optimization | mattchew
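When images carry that much of a site's SEO value, an image sitemap can help Google discover and index them faster; a minimal sketch (the image path is a placeholder, only the domain is taken from the question above):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>http://bayareahomebirth.org/</loc>
    <image:image>
      <image:loc>http://bayareahomebirth.org/images/example-photo.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```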
Local Search Location Keyword Use
Hello. What's the best way to approach the use of location phrases within the page content itself? Say you're based in a large city but also work in smaller surrounding areas: would you target the main location, i.e. "London", on the home page and the main product/service pages directly? Or would you leave this all to deeper pages, where you can more easily add value? I can imagine that the inclusion of the location, i.e. "London", might compromise the quality of the writing and put off users from other locations. For example, on the home page, if you're targeting:
Keyword: Widgets
Location: London
"Widgets in London and Beyond" / "For the best Widgets in London come to..."
And for a key product or service page, if you're targeting:
Keyword: Car Widgets
Location: London
"Car Widgets London and Beyond" / "For the best Car Widgets in London come to..."
On deeper pages it's going to be easier to make this work, but how would you approach it on the main pages and homepage? Hope that all makes sense?
Local Website Optimization | GrouchyKids
International Sites
Hi guys, just wanting to get some feedback on best practices for an international website. The main website is a .co.uk, and they're looking to target France & Belgium. The web hosting is UK-based. Do we replicate the UK site, translate it into the local language, use a .fr domain, and have three versions of the website on three separate domains? Or do we just use the .co.uk with French & Belgian translations and have pages related to those countries? Any assistance will be appreciated.
Local Website Optimization | Cocoonfxmedia
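Whichever structure is chosen, hreflang annotations are worth knowing about here: they tell Google which URL serves which language/country pair, so the right version ranks in each market. A minimal sketch (the domains and paths are hypothetical):

```html
<!-- In the <head> of each version, cross-referencing all alternates -->
<link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/" />
<link rel="alternate" hreflang="fr-fr" href="https://www.example.fr/" />
<link rel="alternate" hreflang="fr-be" href="https://www.example.fr/be/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.co.uk/" />
```

Each version must list all the others (including itself) for the annotations to be honoured.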
Query results being indexed and providing no value to real estate website - best course of action?
Hi friends, I have a real estate website that has thousands of these types of query-results pages indexed: http://search.myrealestatewebsite.com/l/43453/New_York_City_Rentals?per=100&start=159 What would be the best course of action to ensure those do not get indexed, as most provide no value whatsoever?
1. I'm limited in what I can do in the IDX, but I do believe I can modify the URL parameters for the website in Webmaster Tools? Would this be correct? What would my parameter look like?
2. I have Webmaster Tools for the website and also for the subdomain. Which one would I submit the URL parameter to, or both?
Local Website Optimization | JustinMurray
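A sketch of the robots.txt route, with the /l/ path taken from the example URL above; note that robots.txt only blocks crawling, so URLs that are already indexed are better removed first with a noindex robots meta tag or X-Robots-Tag header, since Google can't see a noindex on a page it isn't allowed to crawl:

```
# robots.txt at the root of search.myrealestatewebsite.com
# Keep crawlers out of the parameterized search-result listings.
User-agent: *
Disallow: /l/
```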
Moving back to .com site
Hi, many thanks for all the input we have had from the Moz expert team here. We have had some great thoughts, and we have finally decided that we need to move our site to a new provider and go back to one single .com site for all our global traffic, as we cannot get around possible duplicate pages: we cannot use canonical or alternate links with our current website provider, and this has meant a big rethink in the last couple of weeks. We were running two sites: a .com, which has been running for 7 years, and a .co.uk, which was dormant from 2007 until 2013 and used from last year to serve our local customers. Domain Authority is 19 for the .com and 23 for the .co.uk. Our new site will serve 3 currencies, so we can offer £, $, and € without the need for duplicate pages or local pages. We plan (but are flexible about it) to 301 the .co.uk site to the .com, and we have enough data to ensure we can do all 301 redirects at page level from our current .co.uk site to our new .com site. Can anyone provide any SEO tips to ensure we grow our rankings when we make the switch in about 3 weeks? Many thanks, Bruce
Local Website Optimization | BruceA
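For the page-level 301s, the usual pattern on an Apache host is a single domain-wide rewrite that preserves the path; a minimal sketch (assuming an Apache host and placeholder domains, since the real ones aren't given above):

```apache
# .htaccess on the old .co.uk site: 301 every URL to the
# same path on the .com, preserving page-level link equity.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?example\.co\.uk$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

Pages whose paths change on the new site would each need an explicit rule above the catch-all.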
Launching Hundreds of Local Pages At Once or Tiered? If Tiered, In What Intervals Would You Recommend?
Greetings Mozzers, This is a long question, so please bear with me 🙂 We are an IT and management training company that offers over 180 courses on a wide array of topics. We have multiple methods by which our students can attend these courses, either in person or remotely via a technology called AnyWare. We've also opened AnyWare centers where you can physically go to a particular location near you and log into a LIVE course that might be hosted in, say, New York, even if you're in, say, LA. You get all the in-class benefits and interaction with all the students and the instructor as if you're in the classroom. Recently, we've opened 43 AnyWare centers, giving way to excellent localization search opportunities for our website (e.g. think "sharepoint training in new york", or whatever city we are located in). Each location has a physical address, phone number, and an employee working there, so we pass the standards for existence on Google Places (which I've set up). So, why all this background? Well, we'd like to start getting as much visibility as possible for queries that follow the format of "course topic area that we offer" followed by "city we offer it in." We offer 22 course topic areas and, as I mentioned, 43 locations across the US. Our IS team has created custom pages for each city and course topic area using a UI. I won't get into detailed specifics, but doing some simple math (22 topic areas multiplied by 43 locations) we get over 800 new pages that need to eventually be crawled and added to our site. As a test, we launched the pages for DC and New York 3 months ago and have experienced great increases in visibility. For example, here are the two pages for SharePoint training in DC and NY (44 local pages live right now):
http://www2.learningtree.com/htfu/usdc01/washington/sharepoint-training
http://www2.learningtree.com/htfu/usny27/new-york/sharepoint-training
So, now that we've seen the desired results, my next question is: how do we launch the rest of the hundreds of pages in a "white hat" manner? I'm a big fan of white-hat techniques and of not pissing off Google. Given the scale of the project, we also did our best to make the content as unique as possible. Yes, there are many similarities, but the courses differ, as do the addresses from location to location. After watching Matt Cutts's video here: http://searchengineland.com/google-adding-too-many-pages-too-quickly-may-flag-a-site-to-be-reviewed-manually-156058 about adding too many pages at once, I'd prefer to proceed cautiously, even if the example he uses in the video has to do with tens of thousands to hundreds of thousands of pages. We truly aim to deliver the right content to those searching in their area, so there's nothing black-hat about it 🙂 But I still don't want to be reviewed manually, lol. So, at what interval should we launch the remaining pages so as not to raise any red flags? For example, should we launch 2 cities a week? 4 cities a month? I'm assuming the slower the better, of course, but I have some antsy managers I'm accountable to, and even with this type of warning and research, I need to proceed in the right way. Thanks again, and sorry for the detailed message!
Local Website Optimization | CSawatzky
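Purely as an illustration of the scale math in the question above, a staged rollout can be sketched like this (the 4-cities-per-month batch size is an arbitrary assumption for the example, not a Google guideline):

```python
TOPICS = 22          # course topic areas
CITIES = 43          # AnyWare center locations
ALREADY_LIVE = 2     # DC and New York launched as the test

# Total city/topic pages, and what's left after the test cities.
total_pages = TOPICS * CITIES
remaining = total_pages - ALREADY_LIVE * TOPICS

# Launching 4 cities a month means 4 * 22 = 88 new pages per month.
cities_per_month = 4
months_needed = -(-(CITIES - ALREADY_LIVE) // cities_per_month)  # ceiling division

print(total_pages)    # 946
print(remaining)      # 902
print(months_needed)  # 11 months to finish at 4 cities/month
```

So the "over 800 new pages" figure checks out, and even a fairly brisk 4-cities-a-month schedule stretches close to a year.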