Re-Launched Website: Developer Forgot to Remove noindex Tags
-
Our company's website has maintained decent rankings for our primary keywords over the 12 years we've been in business. We recently had the site rebuilt from the ground up, and the developers left noindex tags on all 400+ pages when we launched. I didn't catch the error for six days, during which time I used the Fetch as Google feature, submitting a site-wide fetch as well as manual submissions for our top 100 URLs. In addition, every previously indexed page had a 301 redirect set up, pointing to a destination that carried a noindex tag.
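For anyone unfamiliar, the tag the developers left in place looks something like this in each page's head section (an illustrative snippet, not our exact markup):

    <!-- Tells search engines not to add this page to their index -->
    <meta name="robots" content="noindex">

The same directive can also be sent as an HTTP response header (X-Robots-Tag: noindex), which is worth checking even when the meta tags all look clean.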
I caught the error today, and the developer removed the tags. Does anyone have experience with a situation like this? We are still ranking in the SERPs at the moment; the results display our old URLs, and those are 301 redirecting just fine. But what happens now? For six full days, we told Google not to index any of our pages while also using the Fetch feature, contradicting ourselves.
Any words of wisdom or advice as to what I can do at this point to avoid potential fallout? Thanks
-
I appreciate everyone's feedback. Very helpful, and thank you for taking the time to respond. Heading over to upload a sitemap now!
Thanks again,
Kristin
-
One of our competitors, who ranked #1 for a good money term (we were #2), had a developer redo their entire site. The developer had noindex on every page when the new site went up.
When we saw the new site we sniffed the code, saw the noindex in there and laughed really hard.
A couple days later they dropped completely from the SERPs and we started getting all of their sales.
It took them a couple weeks to figure out what happened. But when they fixed it they popped right back into the SERPs at old rankings a couple days later.
We talk to these guys by phone occasionally. If they had called us, we would have told them how to fix it... but since they hired an expensive developer, we didn't want to stick our noses in.
-
I've dealt with similar issues involving robots.txt blocks of an entire site, as well as robots meta noindex tags. You should be fine now that you've taken the noindex tag off and the old pages are redirecting. It may take Google longer to update its index with the new URLs, but otherwise I don't think you need to worry too much. Maybe resubmit the sitemap and do another fetch on key pages.
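If it helps, here's a rough sketch of how you could spot-check your key URLs to confirm the noindex is really gone, in both the HTML and the HTTP headers (a hypothetical script; it assumes the third-party requests library is installed, and the example.com URLs are placeholders for your own list):

    # Rough spot-check: verify noindex is gone from key pages.
    # Assumes: pip install requests; example.com URLs are placeholders.
    import re
    import requests

    urls = [
        "https://www.example.com/",
        "https://www.example.com/services/",
    ]

    for url in urls:
        resp = requests.get(url, allow_redirects=True, timeout=10)

        # Check the X-Robots-Tag HTTP header, if any.
        header = resp.headers.get("X-Robots-Tag", "").lower()

        # Rough first-pass check for a robots meta tag containing
        # "noindex" (attribute order can vary across pages).
        meta = re.search(r'<meta[^>]*noindex', resp.text, re.I)

        flag = "NOINDEX" if ("noindex" in header or meta) else "ok"
        print(resp.status_code, flag, url, "->", resp.url)

Because redirects are followed, the final URL printed for each page also confirms your 301s land where you expect.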
Good luck!
-
Make sure you send in a sitemap and all should be well.
I've dealt with cases where certain pages were noindexed and then the tag was removed. As long as you've fixed all of your errors, things should return to normal. Think of it like a site going down intermittently: rankings don't get affected too much (I believe Matt Cutts confirmed this in a YouTube video).
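For reference, a sitemap is just an XML file listing the URLs you want crawled, along the lines of this skeleton (illustrative only; example.com stands in for your own domain):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/</loc>
      </url>
      <!-- one <url> entry per page; list the new URLs, not the old ones -->
    </urlset>

Submit it through Webmaster Tools so Google recrawls the new URLs as quickly as possible.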
-
Hi Kristin
I have no experience of this happening myself, but I would suggest creating a full sitemap and submitting it to Google Webmaster Tools ASAP.
Peter