Help, my site isn't being indexed
-
Hello... We have a client whose site was getting around 17K visits a month. Last September he hired a company to redesign his website. They needed to create a copy of the site on a subdomain of another root domain, so I told them to block that content from being crawled so it wouldn't affect the production site, since it was going to be an exact replica of the content with a different design.
The development team got it backwards and blocked the production site instead (using robots.txt), so the site lost all of its organic traffic, which was 85-90% of total traffic; it now only gets a couple of hundred visits a month. At first I thought we had somehow been penalized, but when I saw the other site receiving traffic and being indexed I realized what had happened, so I fixed the robots.txt and created a 301 redirect from the subdomain to the production site.
After resubmitting sitemaps, posting links to Google+, and trying many other things, I still can't get Google to reindex the site. When I do a site:domain.com search in Google I only get 3 results. It's been almost 2 months now and I honestly don't know what to do.
Any help would be greatly appreciated
Thanks
Dan
-
If it makes you feel any better, this same situation comes up about once a month in Q&A. You're not the first, and you certainly won't be the last!
-
This is why I love the SEOmoz community: no matter how stupid the solution to your problem might be, people will let you know.
I feel like an amateur (because I am). I think I put too much trust in Yoast's plugin, because it normally warns you whenever you are blocking robots; this time it didn't, and the site was being blocked through the WordPress configuration itself.
I changed the setting, resubmitted the sitemaps, checked the code, and updated Yoast's great plugin.
Thanks guys... I SEOpromise to always check the code myself.
Dan
-
When I go to your page and look at the source code I see this line:
<meta name='robots' content='noindex,nofollow' />
You are telling the bots not to index the page or follow any links on the page. This is in the source code for your home page.
I'd go back into the WordPress settings (you are using Yoast) and make sure the site is set to be visible to search engines: under Settings > Reading, the "Discourage search engines from indexing this site" box should be unchecked.
Once you do that, verify that the tag has changed to content='index,follow', then resubmit your sitemaps via Webmaster Tools.
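If you want to confirm the change from outside WordPress, a quick script can fetch the homepage and look for any leftover noindex. This is only a rough sketch using the Python standard library, with example.com standing in as a placeholder for the real domain:

import urllib.request

# Fetch the homepage and check whether "noindex" still appears anywhere
# in the returned HTML (a crude but quick sanity check).
html = urllib.request.urlopen("http://www.example.com/").read().decode("utf-8", "ignore")

if "noindex" in html.lower():
    print("A noindex directive is still being served - check the WordPress/Yoast settings again.")
else:
    print("No noindex found - safe to resubmit the sitemaps.")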
-
Great tool, I'm taking a look right now.
thanks
Dan
-
I check GWT every day; not even one page has been indexed, and we don't have any manual action flagged by Google either.
Thanks
Dan
-
A suggestion for the future: use some type of code monitoring service, such as https://polepositionweb.com/roi/codemonitor/index.php (no relationship with the company, it's just what I use), and have it alert you to any changes in the robots.txt file on both the live and staging environments.
I was in a situation at a previous employer where the development team wasn't the best at SEO: I saw the dev site's robots.txt get pushed to the live site (and vice versa), and things were added to or removed from the robots.txt without our request or knowledge. The verification files for Google and Bing Webmaster Tools would sometimes go missing, too.
I used that code monitor to check once a day and email me if there were changes to the robots.txt or verification files on the live site, or to the robots.txt of any of our dev and staging sites (to make sure they weren't accidentally indexed). It was a huge help!
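If a paid monitoring service isn't an option, even a short script on a daily cron job covers the basics. The following is only a minimal sketch of the idea (Python standard library, made-up placeholder URLs), not the tool linked above:

import hashlib
import urllib.request

# Files to watch - placeholders, swap in your own live, dev and staging URLs.
WATCHED = [
    "http://www.example.com/robots.txt",
    "http://staging.example.com/robots.txt",
]

def fingerprint(url):
    # Fetch the file and return a hash of its contents.
    body = urllib.request.urlopen(url).read()
    return hashlib.sha256(body).hexdigest()

# Load the hashes saved by the previous run (one "url hash" pair per line).
previous = {}
try:
    with open("robots_hashes.txt") as f:
        previous = dict(line.split() for line in f if line.strip())
except FileNotFoundError:
    pass

current = {url: fingerprint(url) for url in WATCHED}

# Report anything that changed since the last run, then save today's hashes.
for url, digest in current.items():
    if url in previous and previous[url] != digest:
        print("CHANGED:", url)  # hook an email or other alert in here

with open("robots_hashes.txt", "w") as f:
    for url, digest in current.items():
        f.write(url + " " + digest + "\n")

Any change to a watched robots.txt shows up as a CHANGED line on the next run; that is where you would wire in the alert.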
-
Yes, take another look at that robots.txt file for sure. If you provided us with the domain we might be able to help you better.
Also, go into Webmaster Tools and poke around. Check how many pages are being indexed, look at sitemaps, do a fetch-as-google, etc.
-
Hi Dan
It sounds like your robots.txt is still blocking your site despite the redirects. You might be best off getting rid of the robots.txt and starting again, making sure nothing is blocked that shouldn't be.
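As a point of reference, a robots.txt that blocks nothing at all only needs two lines (assuming there are no sections of the site that genuinely need to stay out of the index):

User-agent: *
Disallow:

The single slash is the dangerous part: "Disallow: /" blocks the whole site, while an empty "Disallow:" allows everything.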
regards, David