Help! My site isn't being indexed
-
Hello. We have a client who was getting around 17K visits a month. Last September he hired a company to redesign his website. They needed to create a copy of the site on a subdomain of another root domain, so I told them to block that content so it wouldn't affect my production site, since it was going to be an exact replica of the content with a different design.
The development team did it wrong and blocked the production site (using robots.txt), so my site lost all its organic traffic, which was 85-90% of the total, and now it only gets a couple of hundred visits a month. At first I thought we had somehow been penalized; however, when I saw the other site receiving new traffic and being indexed, I realized what had happened, so I fixed the robots.txt and created 301 redirects from the subdomain to the production site.
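For anyone else hitting this: the difference between blocking everything and blocking nothing is a single character in robots.txt. I no longer have the exact file, so this is just an illustration of the kind of mistake it was:

```
# Blocks the ENTIRE site from all crawlers (what ended up on production)
User-agent: *
Disallow: /

# Blocks nothing (what production should have had)
User-agent: *
Disallow:
```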
After resubmitting sitemaps, sharing links on Google+, and many other things, I still can't get Google to reindex my site. When I do a site:domain.com search in Google I only get 3 results. It's been almost 2 months now and I honestly don't know what to do.
Any help would be greatly appreciated
Thanks
Dan
-
If it makes you feel any better, this exact problem turns up about once a month in Q&A. You're not the first, and you certainly won't be the last!
-
This is why I love the SEOmoz community: no matter how stupid the solution to your problem might be, people will let you know.
I feel like an amateur (because I am). I think I over-trusted Yoast's plugin: whenever you are blocking robots it usually warns you, but this time it didn't, and the site was being blocked through the WordPress settings.
I changed the setting, resubmitted the sitemaps, checked the code, and updated Yoast's great plugin.
Thanks guys... I SEOPromise to always check the code myself.
Dan
-
When I go to your page and look at the source code I see this line:
<meta name='robots' content='noindex,nofollow' />
You are telling the bots not to index the page or follow any links on the page. This is in the source code for your home page.
I'd go back into the WordPress settings (you are using Yoast) and make sure the site is enabled for search engine visibility.
Once you've done that, and verified that the tag has changed to content='index,follow', resubmit your sitemaps via Webmaster Tools.
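If you want to double-check the fix from outside WordPress, here's a rough Python sketch (standard library only; the sample markup below is a placeholder, not your actual page) that looks for a robots meta tag and reports whether it blocks indexing:

```python
from html.parser import HTMLParser


class RobotsMetaParser(HTMLParser):
    """Collects the content of every <meta name="robots"> tag on a page."""

    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            self.directives.append((attrs.get("content") or "").lower())


def robots_meta_blocks_indexing(html):
    """Return True if any robots meta tag on the page contains 'noindex'."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in directive for directive in parser.directives)


if __name__ == "__main__":
    # Placeholder markup; in practice you'd fetch your homepage first, e.g.
    # urllib.request.urlopen("https://example.com/").read().decode("utf-8")
    sample = "<head><meta name='robots' content='noindex,nofollow' /></head>"
    print("Blocked by meta robots:", robots_meta_blocks_indexing(sample))
```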
-
Great tool, I'm taking a look right now.
thanks
Dan
-
I check GWT every day; not even one page has been indexed, nor do we have any manual action flagged by Google.
Thanks
Dan
-
A suggestion for the future: use some type of code monitoring service, such as https://polepositionweb.com/roi/codemonitor/index.php (no relationship with the company; it's just what I use), and have it alert you to any changes in the robots.txt file on both the live and staging environments.
I was in a situation at a previous employer where the development team wasn't the best at SEO. I saw the robots.txt from the dev site get pushed to the live site, and the other way around, as well as things being added to or removed from the robots.txt without our request or knowledge. The verification files for Google and Bing Webmaster Tools would sometimes go missing, too.
I used that code monitor to check once a day and email me if there were changes to the robots.txt or verification files on the live site, or to the robots.txt of any of our dev and staging sites (to make sure they weren't accidentally indexed). It was a huge help!
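If you'd rather roll your own than pay for a service, the core of such a monitor is tiny. A minimal sketch of the idea: fingerprint the file and compare it to yesterday's fingerprint (the sample robots.txt bodies below are made up; in a real cron job you'd fetch them over HTTP and email yourself on change):

```python
import hashlib


def content_fingerprint(text):
    """Stable fingerprint of a robots.txt body (surrounding whitespace ignored)."""
    return hashlib.sha256(text.strip().encode("utf-8")).hexdigest()


def detect_change(previous_fingerprint, current_text):
    """Compare the latest fetch against the stored fingerprint.

    Returns (changed, new_fingerprint) so the caller can alert and then
    persist the new fingerprint for tomorrow's check.
    """
    current = content_fingerprint(current_text)
    return current != previous_fingerprint, current


if __name__ == "__main__":
    # Simulate two daily fetches of a site's robots.txt.
    day1 = "User-agent: *\nDisallow:\n"       # blocks nothing
    day2 = "User-agent: *\nDisallow: /\n"     # someone blocked the whole site
    baseline = content_fingerprint(day1)
    changed, _ = detect_change(baseline, day2)
    print("robots.txt changed:", changed)
```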
-
Yes, take another look at that robots.txt file for sure. If you provide us with the domain, we might be able to help you better.
Also, go into Webmaster Tools and poke around: check how many pages are being indexed, look at your sitemaps, do a Fetch as Google, etc.
-
Hi Dan
It sounds like your robots.txt is still blocking your site despite the redirects. You might be best off removing the robots.txt and starting again, making sure nothing is blocked that shouldn't be.
Regards, David