Help, my site is not being indexed
-
Hello... We have a client that had around 17K visits a month. Last September he hired a company to redesign his website. They needed to create a copy of the site on a subdomain of another root domain, so I told them to block that content so it wouldn't affect my production site, since it was going to be an exact replica of the content with a different design.
The development team did it wrong and blocked the production site (using robots.txt), so my site lost all its organic traffic, which was 85-90% of the total, and now it only gets a couple of hundred visits a month. At first I thought we had somehow been penalized, but when I saw the other site receiving new traffic and being indexed I realized what had happened, so I switched the robots.txt and created a 301 redirect from the subdomain to the production site.
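(For anyone untangling the same mistake: assuming Apache, a path-preserving 301 from the staging subdomain back to the production site is only a few lines of .htaccess. The hostnames below are placeholders, since the actual domains aren't given here.)

# .htaccess on the staging subdomain -- send every URL back to production
RewriteEngine On
RewriteCond %{HTTP_HOST} ^staging\.otherdomain\.com$ [NC]
RewriteRule ^(.*)$ https://www.production-domain.com/$1 [R=301,L]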
After resubmitting sitemaps, posting links to Google+ and many other things, I can't get Google to reindex my site. When I do a site:domain.com search in Google I only get 3 results. It's been almost 2 months now and I honestly don't know what to do.
Any help would be greatly appreciated
Thanks
Dan
-
If it makes you feel any better, this turns out to be the answer about once a month here in Q&A. You're not the first, and you certainly won't be the last!
-
This is why I love the SEOmoz community: no matter how stupid the solution to your problem might be, people will let you know.
I feel like an amateur (because I am). I think I put too much trust in Yoast's plugin, because whenever you are blocking the robots it normally tells you; this time it didn't, and the site was being blocked through the WordPress config.
I changed it, resubmitted the sitemaps, checked the code and updated Yoast's great plugin.
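For anyone else who lands here: the setting in question is the "Discourage search engines from indexing this site" checkbox under Settings > Reading, which WordPress stores in the blog_public option (and which is what makes WordPress print a noindex,nofollow meta tag on every page). If WP-CLI happens to be available on the server, you can check and fix it from the command line -- a sketch assuming a standard install:

wp option get blog_public        # 0 = search engines discouraged, 1 = visible
wp option update blog_public 1   # re-enable indexing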
Thanks guys... I SEOPromise to always check the code myself
Dan
-
When I go to your page and look at the source code I see this line:
<meta name='robots' content='noindex,nofollow' />
You are telling the bots not to index the page or follow any links on the page. This is in the source code for your home page.
I'd go back into the wordpress settings (you are using Yoast) and make sure to enable the site for search engine visibility!
Once you do that, and verify the tag has changed to content='index,follow', resubmit your sitemaps via Webmaster Tools.
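If you'd rather not eyeball the source by hand, a few lines of Python will report what the homepage is telling crawlers. This is only a rough sketch using the standard library, and the URL is a placeholder:

import re
import urllib.request

url = "https://www.example.com/"  # replace with the real homepage

with urllib.request.urlopen(url) as response:
    html = response.read().decode("utf-8", errors="replace")

# Look for <meta name="robots" content="..."> with single or double quotes
match = re.search(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)["\']',
    html,
    re.IGNORECASE,
)

if match and "noindex" in match.group(1).lower():
    print("Blocked from indexing:", match.group(1))
elif match:
    print("Meta robots tag found:", match.group(1))
else:
    print("No meta robots tag (crawlers default to index,follow)")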
-
Great tool, I'm taking a look at it right now
thanks
Dan
-
I check GWT every day; not even one page has been indexed, nor do we have any manual action flagged by Google.
Thanks
Dan
-
A suggestion for the future: use some type of code monitoring service, such as https://polepositionweb.com/roi/codemonitor/index.php (no relationship with the company, it's just what I use), and have it alert you to any changes in the robots.txt file on both the live and staging environments.
I was in a situation at a previous job where the development team wasn't the best at SEO: the robots.txt from the dev site would end up on the live site (and the other way around), and things would be added to or removed from the robots.txt without our request or knowledge. The verification files for Google and Bing Webmaster Tools would sometimes go missing, too.
I used that code monitor to check once a day and email me if there were changes to the robots.txt or verification files on the live site and the robots.txt of all of our dev and staging sites (to make sure they weren't accidentally indexed). Was a huge help!
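If a hosted service isn't an option, the same idea can be rolled yourself with a small script run from cron once a day. A rough Python sketch -- the URL and snapshot path are placeholders:

import pathlib
import urllib.request

ROBOTS_URL = "https://www.example.com/robots.txt"  # placeholder
SNAPSHOT = pathlib.Path("robots_snapshot.txt")     # last known-good copy

with urllib.request.urlopen(ROBOTS_URL) as response:
    current = response.read()

if SNAPSHOT.exists() and SNAPSHOT.read_bytes() != current:
    print("WARNING: robots.txt changed -- review it before the next crawl")
    print(current.decode("utf-8", errors="replace"))
elif not SNAPSHOT.exists():
    print("No baseline yet; saving the current robots.txt as the snapshot")
else:
    print("robots.txt unchanged")

SNAPSHOT.write_bytes(current)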
-
Yes, take another look at that robots.txt file for sure. If you provided us with the domain we might be able to help you better.
Also, go into Webmaster Tools and poke around. Check how many pages are being indexed, look at sitemaps, do a fetch-as-google, etc.
-
Hi Dan
It sounds like your robots.txt is still blocking your site despite the redirects. You might be best off getting rid of the robots.txt and starting again, making sure nothing is blocked that shouldn't be.
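For reference, a wide-open robots.txt that blocks nothing looks like this (the Sitemap line is optional and the URL is a placeholder):

User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml

The version that shuts a whole site out of crawling differs by a single character: Disallow: / blocks everything.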
regards, David