Google is not indexing an updated website
-
We just relaunched a website that is 5 years old. We kept all the old URLs and articles, but for some reason Google is not picking up the new website: https://www.navisyachts.com. In Google Webmaster Tools we can see the sitemap with over 1,000 pages submitted, but it shows nothing as indexed. The site is rapidly losing traffic and rankings, yet from the SEO side everything looks fine to me. What could be wrong? I'd appreciate any help.
The new website is built on Joomla 3.4. We have it here at Moz, and other than some minor details, nothing suggests anything is wrong with the website.
Thank you.
-
Thanks, I'll check it right away.
-
Sorry, I wanted to find something more formal. The link below is a better, more structured strategy for reviewing your site than my comments above:
https://moz.com/ugc/8-reasons-why-your-site-might-not-get-indexed
Let's get the site indexed!
-
OK, then you need to start at the top.
Here's where I would start:
1. Have you checked your robots.txt file?
https://moz.com/learn/seo/robotstxt - cross-check that there are no disallow rules blocking the site.
2. Have you checked Google Webmaster Tools, to make sure the old sitemap isn't providing the road map for Google?
https://support.google.com/webmasters/answer/183669?hl=en&topic=8476&ctx=topic
Check those out and then report back. Is that OK?
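If it helps, here is a quick sanity check for step 1, using Python's standard library to test a robots.txt against Googlebot. The robots.txt content below is only an example I made up; paste in the real contents of https://www.navisyachts.com/robots.txt to check your own rules:

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt content (hypothetical); replace this with the
# actual file served at https://www.navisyachts.com/robots.txt
robots_txt = """
User-agent: *
Disallow: /administrator/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# True means Googlebot may crawl the homepage; an accidental
# "Disallow: /" left over from development would make this False.
print(rp.can_fetch("Googlebot", "https://www.navisyachts.com/"))
```

A stray site-wide Disallow is one of the most common reasons a relaunched site drops out of the index, so this is worth ruling out first.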
-
Thank you John,
As navisyachts.com is an old website, Google has over 2,000 pages indexed from the previous version. With the new one, we created a new sitemap and re-submitted to Google both the old pages and the new pages that started appearing on the site; the old sitemaps were deleted. But when I check in Google Webmaster Tools how many of these pages are indexed from the sitemap, it returns 0. I thought it might take a few days to start showing pages as indexed, but nothing is happening, and I see drops in impressions and in website traffic, so I am starting to get worried.
There are still some 404s appearing, but we have them under control.
Any idea what can be wrong or how can I make google start indexing the pages?
Thanks.
-
You can see what Google has indexed for a site by typing a site: query,
for example "site:https://www.navisyachts.com"
Sometimes it takes time, but you can ask Google to crawl your pages: in Webmaster Tools, go to Crawl > Fetch as Google.
Quick response, as I've got to do some work! Hope this helps.
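One more check worth making while you wait: confirm the URLs listed in the new sitemap are actually well-formed and reachable. A minimal sketch with a stand-in two-entry sitemap (the /news path is a made-up example; substitute the sitemap file you submitted in Webmaster Tools):

```python
import xml.etree.ElementTree as ET

# Stand-in sitemap; in practice, load the XML you submitted
# to Google Webmaster Tools.
sitemap_xml = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.navisyachts.com/</loc></url>
  <url><loc>https://www.navisyachts.com/news</loc></url>
</urlset>"""

# Sitemap elements live in the sitemaps.org namespace,
# so findall needs a namespace mapping.
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(sitemap_xml)
urls = [loc.text for loc in root.findall("sm:url/sm:loc", ns)]
print(urls)
```

Each extracted URL should then return a 200 status when requested; entries that 404 or redirect are generally not indexed from a sitemap, which could explain a "0 indexed" count.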