Why isn't our new site being indexed?
-
We built a new website for a client recently.
Site: https://www.woofadvisor.com/
It's been live for three weeks. Robots.txt isn't blocking Googlebot or anything.
Submitted a sitemap.xml through Webmaster Tools, but we still aren't being indexed.
Anyone have any ideas?
-
Hey Dirk,
No worries - I visited the question for the first time today and considered it unanswered, as the site is perfectly accessible from California. I like to confirm what Search Console says, as that is 'straight from the horse's mouth'.
Thanks for confirming that the IP redirect has changed - that is interesting. It is impossible for us to know when that happened; I would have expected things to get indexed quite fast once it changed.
With the extra info I'm happy to mark this as answered, but it would be good to hear from the OP.
Best,
-Tom
-
Hi Tom,
I am not questioning your knowledge - I re-ran the test on webpagetest.org and I see that the site is now accessible from a Californian IP (http://www.webpagetest.org/result/150911_6V_14J6/), which wasn't the case a few days ago (check the result at http://www.webpagetest.org/result/150907_G1_TE9/) - so there has been a change in the IP redirection. I also checked from Belgium - the site is now accessible from here as well.
I also notice that if I now do a site:woofadvisor.com search in Google, I get 19 pages indexed rather than the 2 I got a few days ago.
Apparently removing the IP redirection solved (or is solving) the indexation issue - yet this question still remains marked as "unanswered".
rgds,
Dirk
-
I am in California right now and can access the website just fine, which is why I didn't mark the question as answered - I don't think we have enough info yet. I think the 'Fetch as Googlebot' tool will help us resolve that.
You are correct that if there is no robots.txt then Google assumes the site is open, but my concern is that the developers on the team say that there IS a robots.txt file there and it has some contents. I have, on at least two occasions, come across a team that was serving a robots.txt that was only accessible to search bots (once they were doing that 'for security', another time because they misunderstood how it worked). That is why I suggested that Search Console be checked to see what shows up for robots.txt.
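One way to catch that situation - a robots.txt served only to search bots - is to fetch the file with two different User-Agent strings and compare the responses. A rough sketch (the `fetch` function is injected here so the comparison logic can be shown without a live request; in practice you'd make a real HTTP GET, e.g. with urllib.request and a custom User-Agent header):

```python
# Sketch: detect a robots.txt that is served differently to search bots
# than to browsers. `fetch(url, user_agent)` is a stand-in for a real
# HTTP GET and should return the response body, or None on a 404.
# The URL default is the site from this thread.

GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"

def robots_cloaked(fetch, url="https://www.woofadvisor.com/robots.txt"):
    """True if robots.txt differs between a browser and a Googlebot request."""
    return fetch(url, BROWSER_UA) != fetch(url, GOOGLEBOT_UA)
```

If this returns True for a site, the developers' "the file is fine" and your 404 can both be correct at the same time - which is exactly why checking Search Console's robots.txt report settles it.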
-
To be very honest - I am quite surprised that this question is still marked as "Unanswered".
The owners of the site decided to block access for all non-UK/Ireland addresses. The main Googlebot uses a Californian IP address to visit the site. Hence, the only page Googlebot can see is https://www.woofadvisor.com/holding-page.php, which has no links to the other parts of the site (this is confirmed by the webpagetest.org test with a Californian IP address).
As Google indicates, Googlebot can also use other IP addresses to crawl the site ("With geo-distributed crawling, Googlebot can now use IP addresses that appear to come from other countries, such as Australia.") - however, it is very likely that these bots do not crawl with the same frequency/depth as the main bot (the article clearly indicates: "Google might not crawl, index, or rank all of your locale-adaptive content. This is because the default IP addresses of the Googlebot crawler appear to be based in the USA.").
This can easily be solved by adding a link on /holding-page.php to the Irish/UK version, which contains the full content (accessible from all IP addresses) and can be followed to index the full site (so: only put the IP detection on the homepage, not on the other pages).
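A minimal sketch of that fix, in pseudocode-style Python (hypothetical, not the site's actual implementation - the country lookup via a GeoIP database or CDN header is assumed to happen elsewhere):

```python
# Hypothetical sketch of the suggested fix: only the homepage performs
# IP-based country detection; every other page is served to all IPs,
# so Googlebot can follow links from the holding page into the site.

ALLOWED_COUNTRIES = {"GB", "IE"}  # UK and Ireland, per the thread

def redirect_target(path, visitor_country):
    """Return a redirect URL, or None to serve the requested page normally."""
    if path == "/" and visitor_country not in ALLOWED_COUNTRIES:
        return "/holding-page.php"
    return None  # deep pages stay accessible from any country
```

With this shape, a US-based crawler hitting `/` still lands on the holding page, but a direct fetch of a deep page such as `/travel-tips.php` is served normally and can be indexed.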
The fact that the robots.txt gives a 404 is not relevant: if no robots.txt is found, Google assumes that the site can be indexed (check this link) - quote: "You only need a robots.txt file if your site includes content that you don't want Google or other search engines to index."
-
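As an aside, that documented default - no restrictions means everything may be crawled - can be illustrated with Python's standard-library robots.txt parser (a small illustration only, not part of the site in question):

```python
# Illustration: a permissive robots.txt (equivalent, for crawling
# purposes, to not having one at all) allows every URL to be fetched.
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.parse("User-agent: *\nDisallow:\n".splitlines())  # no restrictions

print(rp.can_fetch("Googlebot", "https://www.woofadvisor.com/travel-tips.php"))
# prints: True
```

So a 404ing robots.txt cannot by itself explain pages being kept out of the index - a blocking rule would have to actually be served and parsed.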
I'd be concerned about the 404ing robots.txt file.
You should check in Search Console:
- What does Search Console show in the robots.txt section?
- What happens if you fetch a page that is not indexed (e.g. https://www.woofadvisor.com/travel-tips.php) with the 'Fetch as Googlebot' tool?
I checked and do not see any obvious indicators of why the pages are not being indexed - we need more info.
-
-
I just did a quick check on your site with webpagetest.org using a Californian IP address (http://www.webpagetest.org/result/150907_G1_TE9/) - as you can see, these IPs also go to the holding page, which is logically the only page that can be indexed, as it's the only one Googlebot can access.
rgds,
Dirk
-
Hi,
I can't access your site from Belgium - I guess you are redirecting your users based on IP address. If, like me, they are not located in your target country, they are 302-redirected to https://www.woofadvisor.com/holding-page.php - which is why there is only one page indexed.
Not sure which country you are actually targeting - but could it be that you're accidentally redirecting Googlebot as well?
Check also this article from Google on IP-based targeting.
rgds
Dirk
-
Strangely, there are two pages indexed in Google Search:
The homepage and one other.
-
I noticed the robots.txt file returned a 404 and asked the developers to take a look and they said the content of it is fine.
Sometimes developers say this stuff. If you are getting a 404, demonstrate it to them.
-
I noticed the robots.txt file returned a 404 and asked the developers to take a look and they said the content of it is fine.
But yes, I'll double-check the WordPress settings now.
-
Your sitemap looked good, but when I tried to view the robots.txt file in your root, it returned a 404, so I was unable to determine if there was an issue. Could any of the settings in your WordPress installation be causing it to trip up?