Why are only a few pages of my website being indexed by Google?
-
Our website www.navisyachts.com has over 3,000 pages in its sitemap, all of it unique content written by our team. However, Google Webmaster Central shows only 100 URLs indexed out of 3,500 submitted.
Can you help me understand why and how I can fix this issue?
The website is 4 years old and runs an up-to-date Joomla 3.3. Part of the content lives in the Joomla core content system and part in K2.
Thank you.
Pablo
-
Hi Dirk,
Thanks so much. I will focus my efforts on speed and try to make the website faster. The problem we have is that it is a very visual website and we use large images, as that is the purpose of the company. They are already well optimized, but I will see if I can compress them even more. In any case, I will start by optimizing the code and fixing the HTTP/HTTPS issue.
Thanks.
-
Hi,
I think performance might be the main issue for your site. Check the PageSpeed Insights results for desktop & mobile - it took the server 3.7s (!) to respond. The result on Webpagetest.org is even worse: a 12.5s time to first byte. As you can read in this article, time to first byte seems to correlate with the ranking of a site. Check whether your server is properly configured & sized. With these load times it's quite possible that Google refuses to index your heaviest pages because they offer a degraded user experience. Another reason could be that Googlebot is not going to waste its time waiting for your server to respond when crawling your pages.
There are some other quick wins here - compressing your code, optimizing images, using caching, combining CSS & JS files, ... can all help to improve the overall performance of your pages - check these articles from Google.
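As an illustration of those quick wins: on an Apache server (which most Joomla hosts use), compression and browser caching can be switched on in the site's `.htaccess` file. This is a generic sketch, not taken from the site's actual configuration - whether `mod_deflate` and `mod_expires` are available depends on the host, and the cache lifetimes are just example values:

```apache
# Gzip-compress text-based responses (requires mod_deflate)
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/css text/javascript application/javascript application/json
</IfModule>

# Let browsers cache static assets (requires mod_expires)
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType image/jpeg "access plus 1 month"
    ExpiresByType image/png  "access plus 1 month"
    ExpiresByType text/css   "access plus 1 week"
    ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```

Joomla's own Global Configuration also has a built-in Gzip compression setting and a system cache, which are worth enabling before touching `.htaccess`.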
A second issue is the duplicate content you have on the site - each page seems to exist in both an HTTP and an HTTPS version. It's better to keep one and redirect the other to the chosen version (I imagine you would want to redirect the HTTP version to HTTPS).
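A common way to implement that redirect on Apache is a permanent (301) rewrite rule in `.htaccess`. Again a hedged sketch, assuming `mod_rewrite` is enabled - it should be merged carefully with Joomla's existing rewrite rules rather than pasted blindly:

```apache
# 301-redirect every HTTP request to its HTTPS equivalent (requires mod_rewrite)
<IfModule mod_rewrite.c>
    RewriteEngine On
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
</IfModule>
```

On top of the redirect, a `rel="canonical"` tag pointing at the HTTPS URL on each page gives Google an extra signal about which version to index.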
Hope this helps,
Dirk