Re-Launched Website: Developer Forgot to Remove noindex Tags
-
Our company's website has maintained decent rankings for our primary keywords over the 12 years we've been in business. We recently had the website rebuilt from the ground up, and the developers left the noindex tags on all of our 400+ pages when we launched it. I didn't catch the error for 6 days, during which time I used Google's Fetch feature, submitting a site-wide fetch as well as manual submissions for our top 100 URLs. In addition, every previously indexed page had a 301 redirect set up for it, pointing to a destination page carrying a noindex tag.
I caught the error today, and the developer removed the tags. Does anyone have experience with a situation similar to this? We are still ranking in the SERPs at this moment; the results display our old URLs, and they are 301 redirecting just fine. But what happens now? For 6 full days, we told Google not to index any of our pages while also using the Fetch feature, contradicting ourselves.
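For reference, here is roughly how the redirect chains and noindex removal could be audited in bulk - a minimal sketch, assuming Python with the requests and beautifulsoup4 packages installed, and a hypothetical old_urls.txt listing the previously indexed URLs:

```python
import requests
from bs4 import BeautifulSoup

# Hypothetical input: old_urls.txt, one previously indexed URL per line.
with open("old_urls.txt") as f:
    old_urls = [line.strip() for line in f if line.strip()]

for url in old_urls:
    resp = requests.get(url, allow_redirects=True, timeout=30)
    # Did any hop in the redirect chain use a 301?
    was_301 = any(r.status_code == 301 for r in resp.history)
    # Does the final destination still carry a robots noindex meta tag?
    robots = BeautifulSoup(resp.text, "html.parser").find("meta", attrs={"name": "robots"})
    noindex = bool(robots and "noindex" in robots.get("content", "").lower())
    print(f"{url} -> {resp.url} | 301: {was_301} | final status: {resp.status_code} | noindex: {noindex}")
```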
Any words of wisdom or advice as to what I can do at this point to avoid potential fallout? Thanks
-
I appreciate everyone's feedback. Very helpful, thank you for taking the time to respond. Heading over to upload a sitemap now!
Thanks again,
Kristin
-
One of our competitors, who ranked #1 for a good money term (we were #2), had a developer redo their entire site. The developer left noindex on every page when the new site went up.
When we saw the new site we sniffed the code, saw the noindex in there and laughed really hard.
A couple days later they dropped completely from the SERPs and we started getting all of their sales.
It took them a couple of weeks to figure out what had happened, but when they fixed it, they popped right back into the SERPs at their old rankings a couple of days later.
We talk to these guys by phone occasionally. If they had called us, we would have told them how to fix it... but since they hired an expensive developer, we didn't want to stick our noses in.
-
I've dealt with similar issues with robots.txt blocks of the entire site, as well as robots meta noindex tags. You should be fine now that you've taken the noindex tag off, and the old pages are redirecting. It may take longer for Google to update their index with the new URLs, but otherwise I don't think you need to worry too much. Maybe resubmit the sitemap and do another fetch on key pages.
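While you're at it, it's worth confirming the relaunch didn't also ship a robots.txt that blocks crawlers - a minimal sketch using Python's standard library (the domain and paths below are placeholders):

```python
from urllib.robotparser import RobotFileParser

# Quick sanity check that the relaunched site's robots.txt isn't blocking crawlers.
rp = RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")  # placeholder domain
rp.read()

for path in ["/", "/products/", "/about-us/"]:  # a few representative pages
    allowed = rp.can_fetch("Googlebot", "https://www.example.com" + path)
    print(f"{path}: {'crawlable' if allowed else 'blocked by robots.txt'}")
```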
Good luck!
-
Make sure you send in a sitemap and all should be well.
I've dealt with cases where certain pages were noindexed and then had the tag removed. As long as you've fixed all your errors, things should be back to normal. Think of it like a site going down intermittently: rankings don't get affected too much (I believe Matt Cutts confirmed this in a YouTube video).
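If you don't already have an up-to-date sitemap to submit, it's easy to generate one from a list of the new live URLs - a minimal sketch, assuming Python and a hypothetical live_urls.txt with one URL per line:

```python
from xml.sax.saxutils import escape

# Hypothetical input: live_urls.txt, one final (post-redirect) URL per line.
with open("live_urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in urls)
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>\n"
)

with open("sitemap.xml", "w") as f:
    f.write(sitemap)

print(f"Wrote sitemap.xml with {len(urls)} URLs - ready to upload and submit in Webmaster Tools.")
```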
-
Hi Kristin
I have no experience of this happening myself, but I would suggest that you create a full sitemap and submit it to Google Webmaster Tools ASAP.
Peter
Related Questions
-
Since implementing GDPR, has anyone seen website traffic plummet?
On December 1 and 2, I implemented a cookie banner on 5 of my Wix sites (https://www.sanitationsolutionsinc.com is one example) in order to be in compliance with GDPR, CCPA, and LGPD. Since then my traffic according to Wix Analytics and Google Analytics has plummeted. Anyone else have the same issue? How did you fix it?
Reporting & Analytics | Jason_Taylor
-
Why might my website's crawl rate... explode?
Hi Mozzers, I have a website with approx 110,000 pages. According to Search Console, Google will usually crawl, on average, anywhere between 500 and 1,500 pages per day. However, lately the crawl rate seems to have increased rather drastically:
9/5/16 - 923
9/6/16 - 946
9/7/16 - 848
9/8/16 - 11072
9/9/16 - 50923
9/10/16 - 60389
9/11/16 - 17170
9/12/16 - 79809
I was wondering if anyone could offer any insight into why this may be happening and whether I should be concerned? Thanks in advance for all advice.
Reporting & Analytics | Silkstream
-
Stripping referrer on website with a mix of both http and https
I know going from https to http (usually) strips referrers but I was wondering if the referrer is stripped when your website is a mix of both http and https? Say someone browses your site (on http), adds a product and then goes to your cart (https), then decides to go back to another page on your website which is http. Will this strip the referrer? Any help on this would be great, thanks!
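Whether the referrer survives that last HTTPS-to-HTTP hop generally depends on the page's referrer policy (under the older default, no-referrer-when-downgrade, it is stripped on the downgrade). A quick way to see what policy your HTTPS pages declare - a minimal sketch, assuming Python with requests and beautifulsoup4, and a placeholder cart URL:

```python
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/cart"  # placeholder HTTPS page to inspect
resp = requests.get(url, timeout=30)

# The policy can be set either as a response header or as a meta tag.
header_policy = resp.headers.get("Referrer-Policy", "(none)")
meta = BeautifulSoup(resp.text, "html.parser").find("meta", attrs={"name": "referrer"})
meta_policy = meta.get("content") if meta else "(none)"

print("Referrer-Policy header:", header_policy)
print('<meta name="referrer">:', meta_policy)
```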
Reporting & Analytics | Fitto
-
URL Formatting for Internal Link Tagging
After doing some research on internal campaign link tagging, I have seen conflicting viewpoints from analytics and SEO professionals regarding the most effective and SEO-friendly way to tag internal links for a large ecommerce site. It seems there are several common methods of tagging internal links, which can alter how Google interprets these links and indexes the URLs these links point to:
- Query parameter: using ? or & to separate a parameter like cid that will be appended to all internal-pointing links. Since Google will crawl and index these, I believe this method has the potential of causing duplicate content.
- Hash: using # to separate a parameter like cid that will be appended to all internal-pointing links.
- JavaScript: using an onclick event to pass tracking data to your analytics platform.
- Not tagging internal links: while this method will provide the cleanest possible internal link paths for Google and users navigating the site and prevent duplicate content issues, analytics will be less effective.
For those of you that manage SEO or analytics for large (1 million+ visits per month) ecommerce sites, what method do you employ and why?
Edit: For this discussion, I am only concerned with tagging links within the site that point to other pages within the same site - not links that come from outside the site or lead offsite. Thank you
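For what it's worth, if the query-parameter approach is used, one way to limit the duplicate-content risk is to strip the tracking parameter when generating canonical URLs - a minimal sketch, assuming Python and a hypothetical cid parameter:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

TRACKING_PARAMS = {"cid"}  # hypothetical internal campaign parameter

def canonical_url(url: str) -> str:
    """Return the URL with internal tracking parameters removed."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k not in TRACKING_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

print(canonical_url("https://www.example.com/widgets?cid=homepage-banner&page=2"))
# -> https://www.example.com/widgets?page=2
```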
Reporting & Analytics | RobbieFoglia
-
Is there an automated way to determine which pages of your website are getting 0 traffic?
I'm doing a content audit on my company website and want to identify pages with zero traffic. I can use GA for low traffic, but not zero traffic. I could do this manually, but it would take a long time. Are there any tools to help me identify these pages?
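One low-tech way to approximate this is to export the GA "All Pages" report for the period and diff it against a full list of the site's URLs - a minimal sketch, assuming Python and hypothetical all_pages.txt and ga_export.csv files (the export is assumed to have a "Page" column):

```python
import csv

# Hypothetical inputs: all_pages.txt (one URL path per line, e.g. from your sitemap)
# and ga_export.csv (a GA "All Pages" report export with a "Page" column).
with open("all_pages.txt") as f:
    all_pages = {line.strip() for line in f if line.strip()}

with open("ga_export.csv", newline="") as f:
    visited = {row["Page"].strip() for row in csv.DictReader(f)}

# Any page in the full list that never appears in the GA export recorded zero pageviews.
zero_traffic = sorted(all_pages - visited)
print(f"{len(zero_traffic)} pages with no recorded pageviews:")
for path in zero_traffic:
    print(path)
```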
Reporting & Analytics | Ksink
-
Client's Google Analytics account access is 'user' only from previous web developer
Just wondering if you have any advice on the following scenario: the client's GA account was set up by a previous web developer, and the client's access is 'user' level only (not admin). When we ask the previous web developer to change the access to admin, they say they can't do that, as we would then have access to all the other GA accounts within their account. Does this mean we'll have to create a new GA profile for the client and lose all historic GA data? Any guidance greatly appreciated.
Reporting & Analytics | aoifep
-
Solving link and duplicate content errors created by WordPress blog and tags?
SEOmoz tells me my site's blog (a WordPress site) has 2 big problems: a few pages with too many links, and duplicate content. The problem is that these pages seem legit the way they are, but obviously I need to fix the problem, sooooo...
Duplicate content error: this error is a result of being able to search the blog by tags. Each blog post has multiple tags, so the url.com/blog/tag pages occasionally show the same articles. Anyone know of a way to not get penalized for this? Should I exclude these pages from being crawled/sitemapped?
Too many links error: SEOmoz tells me my main blog page has too many links (both url.com/blog/ and url.com/blog-2/) - these pages have excerpts of the 6 most recent blog posts. I feel like this should not be an error... anyone know of a solution that will keep the site from being penalized by these pages? Thanks!
Reporting & Analytics | RUNNERagency
-
Automatically checking the indexation of websites
Hi guys, do you know of a tool that can check a list of websites (directories) and automatically determine whether each one is indexed in Google? The list is very long, and I would like a tool that checks them all after copying and pasting the list just once. Thanks a lot, der.rabauke
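One way to approximate this without a dedicated tool is to run a site: query for each domain through Google's Custom Search JSON API - a minimal sketch, assuming Python with requests, a hypothetical directories.txt list, and your own API key and search engine ID (results can differ somewhat from a manual site: search):

```python
import time
import requests

API_KEY = "YOUR_API_KEY"         # placeholder: a Google Custom Search API key
SEARCH_ENGINE_ID = "YOUR_CX_ID"  # placeholder: a search engine configured to search the whole web

def appears_indexed(domain: str) -> bool:
    """Rough check: does a site: query return any results via the Custom Search API?"""
    resp = requests.get(
        "https://www.googleapis.com/customsearch/v1",
        params={"key": API_KEY, "cx": SEARCH_ENGINE_ID, "q": f"site:{domain}"},
        timeout=30,
    )
    resp.raise_for_status()
    total = int(resp.json().get("searchInformation", {}).get("totalResults", "0"))
    return total > 0

with open("directories.txt") as f:  # hypothetical input: one domain per line
    for domain in (line.strip() for line in f if line.strip()):
        print(domain, "-", "indexed" if appears_indexed(domain) else "not indexed")
        time.sleep(1)  # stay well under the API's rate limits
```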
Reporting & Analytics | Lincus