Re-Launched Website: Developer Forgot to Remove noindex Tags
-
Our company's website has maintained decent rankings for our primary keywords over the 12 years we've been in business. We recently had the site rebuilt from the ground up, and the developers left noindex tags on all 400+ pages when we launched. I didn't catch the error for 6 days, during which time I used the Fetch as Google feature in Webmaster Tools, submitting a site-wide fetch as well as manual submissions for our top 100 URLs. In addition, every previously indexed page had a 301 redirect set up, pointing to a destination that carried a noindex.
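To make the situation concrete: during those 6 days, a request for any old URL would 301 to a new URL whose head still carried the noindex directive. A quick check along these lines (a rough Python sketch with a placeholder URL) shows that state for any single page:

import re
import requests

OLD_URL = "https://www.example.com/old-page"  # placeholder; substitute a real old URL

# Fetch without following redirects so the 301 itself is visible.
resp = requests.get(OLD_URL, allow_redirects=False, timeout=10)
print("Status:", resp.status_code)          # expect 301
destination = resp.headers.get("Location")
print("Redirects to:", destination)

if destination:
    final = requests.get(destination, timeout=10)
    # The offending directive is a one-line meta tag in the <head>, e.g.
    # <meta name="robots" content="noindex">
    # (A simple pattern; assumes the common attribute order.)
    has_meta_noindex = re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]*noindex', final.text, re.I
    )
    # A noindex can also arrive as an HTTP header.
    header_noindex = "noindex" in final.headers.get("X-Robots-Tag", "").lower()
    print("noindex present:", bool(has_meta_noindex) or header_noindex)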
I caught the error today, and the developer removed the tags. Does anyone have experience with a similar situation? We are still ranking in the SERPs at the moment; Google is displaying our old URLs, and they are 301 redirecting just fine. But what happens now? For 6 full days we told Google not to index any of our pages, while simultaneously using the Fetch feature and contradicting ourselves.
Any words of wisdom or advice on what I can do at this point to avoid potential fallout? Thanks
-
I appreciate everyone's feedback. Very helpful; thank you for taking the time to respond. Heading over to upload a sitemap now!
Thanks again,
Kristin
-
One of our competitors, who ranked #1 for a good money term (we were #2), had a developer redo their entire site. He had noindex on every page when the new site went up.
When we saw the new site we sniffed the code, saw the noindex in there and laughed really hard.
A couple days later they dropped completely from the SERPs and we started getting all of their sales.
It took them a couple of weeks to figure out what happened. But when they fixed it, they popped right back into the SERPs at their old rankings a couple of days later.
We talk to these guys by phone occasionally. If they had called us, we would have told them how to fix it... but since they hired an expensive developer, we didn't want to stick our noses in.
-
I've dealt with similar issues, both site-wide robots.txt blocks and robots meta noindex tags. You should be fine now that the noindex tags are off and the old pages are redirecting. It may take Google a little longer to update its index with the new URLs, but otherwise I don't think you need to worry too much. Resubmit the sitemap and do another fetch on your key pages.
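If it's useful, this is the kind of quick sanity check I run after fixing an issue like this, before resubmitting anything. It's only a rough sketch (the domain and page list are placeholders) covering both failure modes mentioned above: a robots.txt block and a lingering noindex, whether in the markup or in an X-Robots-Tag header.

import re
import urllib.robotparser
import requests

SITE = "https://www.example.com"            # placeholder domain
KEY_PAGES = ["/", "/products/", "/about/"]  # substitute your top URLs

# Does robots.txt still allow Googlebot in?
robots = urllib.robotparser.RobotFileParser(SITE + "/robots.txt")
robots.read()

for path in KEY_PAGES:
    url = SITE + path
    allowed = robots.can_fetch("Googlebot", url)
    resp = requests.get(url, timeout=10)
    meta_noindex = bool(re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]*noindex', resp.text, re.I
    ))
    header_noindex = "noindex" in resp.headers.get("X-Robots-Tag", "").lower()
    verdict = "OK" if allowed and not meta_noindex and not header_noindex else "PROBLEM"
    print(f"{verdict}  {url}  crawlable={allowed}  "
          f"meta_noindex={meta_noindex}  header_noindex={header_noindex}")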
Good luck!
-
Make sure you send in a sitemap and all should be well.
I've dealt with cases where certain pages were noindexed and then had the tag removed. As long as you've fixed all your errors, things should return to normal. Think of a site going down intermittently: rankings don't get affected too much (I believe Matt Cutts confirmed this in a YouTube video).
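If you don't already have one generated automatically, a bare-bones sitemap is easy to produce. This is just a sketch (the URL list is a placeholder; pull yours from the CMS or a crawl) that writes a minimal sitemap.xml you can submit in Webmaster Tools:

from xml.sax.saxutils import escape

NEW_URLS = [
    "https://www.example.com/",
    "https://www.example.com/products/",
    "https://www.example.com/about/",
]  # placeholder list; use your real post-launch URLs

entries = "\n".join(
    f"  <url><loc>{escape(u)}</loc></url>" for u in NEW_URLS
)
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>\n"
)

with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write(sitemap)
print(f"Wrote sitemap.xml with {len(NEW_URLS)} URLs")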
-
Hi Kristin
I have no experience of this happening myself, but I would suggest that you create a full sitemap and submit it to Google Webmaster Tools ASAP.
Peter