Specific pages won't index
-
I have a few pages on my site that Google won't index, and I can't understand why. I've looked into possible issues with robots.txt, noindex tags, redirects, canonicals, and Search Console rules. I've got nothing.
Example:
I want this page to be indexed: https://tour.franchisebusinessreview.com/services/franchisee-satisfaction-surveys/
When I Google the full URL, I get results including the non-subdomain homepage, and various pages on the subdomain, including a child page of the page I want, but not the page itself.
Any ideas? Thanks for the help!
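For anyone retracing those checks from the command line, a rough sketch (standard curl and grep only; the URL is the example above):
```
# Headers: status code, redirects, and any X-Robots-Tag header
curl -sIL https://tour.franchisebusinessreview.com/services/franchisee-satisfaction-surveys/

# HTML: a meta robots noindex or a canonical pointing somewhere else
curl -sL https://tour.franchisebusinessreview.com/services/franchisee-satisfaction-surveys/ | grep -iE 'noindex|rel="canonical"'

# robots.txt: anything blocking the /services/ path on the subdomain
curl -s https://tour.franchisebusinessreview.com/robots.txt
```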
-
You can also submit a URL to Google without using Search Console: https://www.google.co.uk/search?q=submit+url+to+google&oq=submit+url+to+google&aqs=chrome..69i57j0l5.2372j0j1&sourceid=chrome&ie=UTF-8 (search "Submit URL To Google") and then just paste your URL in. You can do this with any URL.
-
Thanks, Sergio. I'm not an expert, but I don't see why the sitemap wouldn't pick it up. The page is linked from the homepage via the top navigation, and other pages in that navigation have been indexed.
-
Thanks, Martijn and Joe. I'll try submitting the page as you recommend. Our site relaunched in November 2016, so this particular page (and many others) is new to the site as of that date.
-
I agree with Martijn here; the XML sitemap is certainly important, but you can also request that Google index this URL specifically through the 'Fetch as Google' tool. Just FETCH that URL and select "Request indexing" once it has completed.
As for why Google has not indexed this page before now, I'm not seeing too many reasons other than what has already been mentioned. When did this page go live?
-
Yep. What I want to add here is that sitemaps only help Google crawl and learn about pages when those pages are actually in the sitemap. For small sites I really doubt a sitemap has much impact on getting pages indexed.
-
The answer has to be in the website's sitemaps...
You can submit a URL to Google here: https://www.google.com/webmasters/tools/submit-url?hl=es
Have you checked whether the structure of the site is causing the sitemap to leave that page out?
-
That would definitely help. In addition, I would still look into adding the pages manually through the Submit to Index feature.
-
Thanks, Martijn. Our webmaster discovered that the page in question was not included in the sitemap we submitted to Search Console. The question of why it was left out still needs to be answered.
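For reference, once the webmaster adds the page back, the sitemap entry would look roughly like this (the lastmod date is a placeholder):
```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://tour.franchisebusinessreview.com/services/franchisee-satisfaction-surveys/</loc>
    <lastmod>2016-11-01</lastmod>
  </url>
</urlset>
```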
-
Hi,
That's not fun. Have you tried adding these pages to Google Search Console and submitting them to the index from there?
Martijn.
Related Questions
-
Over 40 pages have been removed from the index and this page has been selected as the Google-preferred canonical
More than 40 pages have been removed from the index, and this page has been selected as the Google-preferred canonical: https://studyplaces.com/about-us/ The affected pages include: https://studyplaces.com/50-best-college-party-songs-of-all-time-and-why-we-love-them/ and https://studyplaces.com/15-best-minors-for-business-majors/ As you can see, the content on these pages is totally unrelated to the content on the about-us page. Any ideas why this is happening and how to resolve it?
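For context, a common first check when Google forces canonical consolidation like this is that each affected page declares a self-referencing canonical rather than leaving Google to choose one; a minimal sketch using one of the URLs above:
```
<!-- In the <head> of the affected page, pointing at itself rather than at /about-us/ -->
<link rel="canonical" href="https://studyplaces.com/15-best-minors-for-business-majors/" />
```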
-
Paginated pages are being indexed?
I have lots of paginated pages which are being indexed. Should I add the noindex tag to page 2 onwards? The pages currently have previous and next tags in place. Page one also has a self-referencing canonical.
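For reference, the markup the question describes (prev/next tags plus a self-referencing canonical, with noindex as the open decision) typically looks like this; the URLs are placeholders:
```
<!-- Page 2 of a paginated series (placeholder URLs) -->
<link rel="canonical" href="https://www.example.com/category/page/2/" />
<link rel="prev" href="https://www.example.com/category/" />
<link rel="next" href="https://www.example.com/category/page/3/" />
<!-- Only if you decide pages 2 and onwards should stay out of the index -->
<meta name="robots" content="noindex, follow" />
```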
-
23,000 pages indexed, and I think that's bad
Thank you, thank you, Moz people! I have a successful vacation rental company that has terrible SEO, but it is getting better. When I first ran the Moz crawler and page grader, I had 35,000 errors and all F's: tons of problems with duplicate page content and titles because of inconsistent page names (mainly capitalization), plus rel canonical errors. That said, I am now down to maybe two or three errors from time to time, and I fix them every other day. The problem: my sitemap in Google Webmaster Tools shows 1,155 pages submitted and 1,541 indexed, but Google's crawl shows 23,000 pages, probably because of the duplicate errors or possibly database-driven URL parameters. How bad is this, and how do I get the count to be accurate? I have seen the Google removal tool, but I do not think that is the right fix. 2) I have hired a full-time content writer and I hope this works. My site in Google was just domain.com, but I put a 301 in place to www.domain.com because www had page authority where domain.com did not. In Webmaster Tools I had only domain.com listed, so I changed the preferred domain to www.domain.com and asked Google to crawl www.domain.com for the first time. Does anybody see any problems with this? Thank you, Moz people! Nick
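For reference, a non-www to www 301 of the kind described is often handled with an Apache rewrite along these lines; domain.com stands in for the real domain:
```
# .htaccess sketch: 301-redirect domain.com to www.domain.com (placeholder domain)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]
```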
-
Is a newly created page's PageRank 1?
Hey, I just want to know: if I create a web page, would the PageRank of that page be 1?
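For background, under the commonly quoted form of the original PageRank formula from the Brin and Page paper, a page with no inbound links gets only the baseline of 1 minus the damping factor d (commonly 0.85), not 1; toolbar PageRank is a separate, log-scaled 0-10 display:
```
PR(A) = (1 - d) + d \left( \frac{PR(T_1)}{C(T_1)} + \cdots + \frac{PR(T_n)}{C(T_n)} \right)
```
where T_1 ... T_n are the pages linking to A and C(T_i) is the number of outbound links on T_i.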
-
Crawl errors on URLs that don't normally exist
Hi, I have been having heaps (thousands) of SEOmoz crawl errors on URLs that don't normally exist, like: mydomain.com/RoomAvailability.aspx?DateFrom=2012-Oct-26&rcid=-1&Nights=2&Adults=1&Children=0&search=BestPrice These URLs are missing site IDs and other parameters, and I can't see how they are generated. Does anyone have any ideas on where Moz is finding them? Thanks, Stephen
-
Why aren't certain links showing in SEOMOZ?
Hi, I have been trying to understand our page rank and the domains that are linking to us. When I look at the list of linking domains, I see some bigger ones are missing and I don't know why. For example, we are in the Yahoo Directory with a link to trophycentral.com, but SEOmoz is not showing the link. If SEOmoz is not seeing it, my guess is Google is not either, which concerns me. There are several other high page rank domains also not showing. Does anyone have any idea why? Thanks! BTW, our domain is trophycentral.com
-
Can I format my H1 to be smaller than H2's and H3's on the same page?
I would like to create a web design with a 12px H1 and sub-headings on the page that are more like 24px. Will search engines see this and dislike it? The reason for doing it is that I want to put a generic page title in the banner and more poetic headings above the main body. Example: Small H1: Wholesale coffee, online coffee shop and London roastery. Large H2: Respect the bean... Thanks, Scott
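For reference, the sizing described in the question is plain CSS and nothing more exotic; a minimal sketch:
```
/* Generic page title in the banner, kept visually small */
h1 { font-size: 12px; }

/* Larger, more prominent sub-headings above the main body */
h2, h3 { font-size: 24px; }
```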
-
Can JavaScript affect Google's index/ranking?
We changed our website template about a month ago and since then we have experienced a huge drop in rankings, especially for our home page. We kept the same URL structure across the entire website, pretty much the same content, and the same on-page SEO. We kind of knew we would have a rank drop, but not one this huge. We used to rank with the homepage at the top of the second page, and now we have lost about 20-25 positions. What we changed is that we made a new homepage structure, more user-friendly and with much better organized information; we also have a slider presenting our main services. 80% of the content on the homepage is included inside the slideshow and 3 tabs, but all of these elements are JavaScript. The content is unique and SEO-optimized, but when I disable JavaScript it becomes completely unavailable. Could this be the reason for the huge rank drop? I used Webmaster Tools' Fetch as Googlebot tool and it looks like Google reads what's inside the JavaScript slideshow perfectly, so I was not worried until I found this on SEOmoz: "Try to avoid ... using javascript ... since the search engines will ... not index them ..." One more weird thing is that although we have no duplicate content and the entire website has been cached, for a few pages (including the homepage) the picture snippet is from the old website. All main URLs are the same; we removed some old ones that we don't need anymore, so we kept all the inbound links. The 301 redirects are properly set. But still, we have a huge rank drop. Also (not sure if this is important or not), the robots.txt file is disallowing some folders like images, modules, templates... (Joomla components). We still have some HTML errors and warnings, but way fewer than we had with the old website. Any advice would be much appreciated, thank you!
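For reference, robots.txt rules like the ones described (Joomla's typical folder disallows) look roughly like this; the exact list depends on the Joomla version:
```
User-agent: *
Disallow: /images/
Disallow: /modules/
Disallow: /templates/
```
Blocking template, module, and image folders can keep Googlebot from fetching the CSS, JavaScript, and images it needs to render the page, which is worth checking alongside the slider question.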