Google Indexing our site
-
We have 700 city pages on our site. We submitted them to Google via a sitemap at https://www.samhillbands.com/sitemaps/locations.xml, but they have only indexed 15 so far.
Yes, the content is similar on all of the pages... any thoughts on getting them to index the remaining pages?
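For reference, the locations sitemap is just a standard urlset; roughly, each entry looks like this (the city slug and date below are illustrative, not copied from the real file):
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <url> element per city page, ~700 in total -->
  <url>
    <loc>https://www.samhillbands.com/bands-by-location/san-francisco-ca/</loc>
    <lastmod>2016-04-25</lastmod>
  </url>
</urlset>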
-
Hi Brian,
809 out of 863 is over 93% of the site indexed (roughly 94%). I'd wait a little longer.
Also, I don't think it will add much value to manually submit the URLs that you think are not indexed. A reminder: WMT is usually quite delayed in the information it shows. In my case, the last record is from 4th May.
-
site:samhillbands.com/bands-by-location/
says about 809 results for /bands-by-location/
Webmaster Tools says 730 indexed out of 862 submitted.
Is there any value to manually submitting the URLs that are on my sitemap but not being indexed?
-
If it's been a couple of weeks, should I manually submit the URLs that haven't been indexed?
-
It's hard to say... our number of pages indexed was climbing, but when I checked today it said my sitemap was pending?
-
Hi Brian! Any update? Are you still waiting for your site to be indexed?
-
You submitted those yesterday? You should give it more time. I'd wait about 1-2 weeks for an accurate number. Remember that Google takes its time to index.
I've had sites with that number of pages, and some took about a week while others took over a month.
-
Thanks for the response.
I have submitted my MAIN sitemap and then submitted my LOCATION sitemap separately.
All pages are set to be indexable at the robots.txt level (nothing is disallowed).
I have submitted some individual LOCATION pages and they are being indexed.
Thoughts?
I am getting that number from search console:
# | Sitemap | Type | Processed | Issues | Items | Submitted | Indexed
1 | /sitemaps/locations.xml | Sitemap | Apr 25, 2016 | - | Web | 771 | 15
2 | /sitemaps/sitemap.xml | Sitemap index | Apr 25, 2016 | - | Web | 862 | 103
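(For reference, the sitemap.xml row above is a sitemap index; an index file of that kind typically just points at the child sitemaps, roughly like this, with an illustrative lastmod value:)
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- the index lists the child sitemaps; the child files contain the actual page URLs -->
  <sitemap>
    <loc>https://www.samhillbands.com/sitemaps/locations.xml</loc>
    <lastmod>2016-04-25</lastmod>
  </sitemap>
</sitemapindex>
-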
Hi there,
Where do you get the stat for the indexed pages?
I'm seeing about 60 indexed in the subdirectory samhillbands.com/bands-by-location/. Check it yourself by searching: site:samhillbands.com/bands-by-location/
Just to rule things out: have you submitted all sitemaps to Search Console? Are all your pages indexable?
That last one should be checked both in robots.txt (pages blocked from crawling) and on each page (a noindex meta tag).
And have you tried manually submitting a random page and then checking whether it gets indexed?
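Concretely, these are the two kinds of blocks to check for; the snippets below are generic examples of what each looks like, not something pulled from your actual files:
# in robots.txt: a Disallow rule like this would stop Google from crawling the location pages
User-agent: *
Disallow: /bands-by-location/

<!-- in the <head> of an individual page: a robots meta tag like this would keep it out of the index -->
<meta name="robots" content="noindex, follow">
-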
Related Questions
-
Start a new site to get out of Google penalties?
Hey Moz, I have several questions about whether I should start a new second site to save my online presence after a series of Google penalties. The main questions being: Is this the best way to spend my time/resources? If I'm forced to jump my company over to the new site, can Google see that and transfer the penalty? I plan on all new content (no link redirects, no dup content), so do I need to kill the original site? Are there any pros/cons I am missing?

Summary of my situation: Looking at analytics, it appears I was hit with both Penguin 2.0 and 2.1, each cutting my traffic in half, despite a link remediation campaign in the summer of 2013. There was also a manual penalty imposed on the site in the fall of 2013, which was released in early 2014. With Penguin 3.0's release at the end of 2014, the site saw a slight uptick in organic traffic, improving from essentially nothing to next to nothing. Most of the site's issues revolved around cheap $5 links from India in the 2006-09 time frame. This link building was abandoned and replaced with nothing but "letting them happen naturally" from 2010 through the 2013 penalties. Since 2013 we have done a small amount of quality articles on a monthly basis to promote the site, plus social media and continuous link remediation. In addition, the whole site has been redesigned, optimized for speed/mobile, secured, and completely rewritten.

Given all of this, the site has really only recovered to pages 2 and 3 of the SERPs for our keywords. Even after a highly circulated piece appeared on an authority site (97 DA) a few months ago, there was zero movement. It appears we have an anvil tied around our leg until Penguin 4.0. With all of the above, and no sign of when the next Penguin will be released, I ask: is it time to start investing in a new site? With no movement in 2.5 years, it's impossible to know where my current site stands, so I don't know what else I can do to improve it.

I am considering slowly building a new site that is a high-quality informational site. My thought process is that it will take a year for a new site to gain any traction with Google. If by that time my main site has not recovered, I can jump to that new site, add a commercial component, and use it as a lifeboat for my company. If I have recovered, then I have a future asset. Thanks in advance!
Intermediate & Advanced SEO | TheDude0
-
Google Indexing of Images
Our site is experiencing an issue with indexation of images. The site is real estate oriented. It has 238 listings with about 1,190 images. The site submits two versions (different sizes) of each image to Google, so there are about 2,400 images. Only several hundred are indexed. Can adding microdata improve the indexation of the images? Our sitemap is submitting images that are on no-index listing pages to Google. As a result, more than 2,000 images have been submitted but only a few hundred have been indexed. How should the sitemap deal with images that reside on no-index pages? Do images that are part of pages that are set up as "no-index" need a special "no-index" label or special treatment? My concern is that so many images that are not indexed could be a red flag showing poor quality content to Google. Is it worth investing in correcting this issue, or will correcting it result in little to no improvement in SEO? Thanks, Alan
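For reference, image entries in a sitemap use Google's image sitemap extension; a minimal sketch of one listing entry (the listing URL and image path below are invented for illustration, not from the site in question):
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <!-- one <url> per listing page, with an <image:image> entry for each photo on that page -->
  <url>
    <loc>https://www.example.com/listings/123-main-st/</loc>
    <image:image>
      <image:loc>https://www.example.com/photos/123-main-st/front-large.jpg</image:loc>
    </image:image>
  </url>
</urlset>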
Intermediate & Advanced SEO | Kingalan10
-
Is it Worthwhile to have an HTML Site Map for a Large Site?
We are a large, enterprise site with many pages (some on our CMS and some old pages that exist outside our CMS). Every month we submit various XML sitemaps. Some pages on our site can no longer be found by following links from one page to another (orphan pages). Some of those pages are important and some not. Is it worth our while to create an HTML site map? Does anyone have any recent stats or blog posts to share showing how an HTML site map may have benefited a large site? Many thanks
Intermediate & Advanced SEO | CeeC-Blogger0
-
Duplicate site (disaster recovery) being crawled and creating two indexed search results
I have a primary domain, toptable.co.uk, and a disaster recovery site for this primary domain named uk-www.gtm.opentable.com. In the event of a disaster, toptable.co.uk would get CNAMEd (DNS alias) to the .gtm site. Naturally the .gtm disaster recovery domain is an exact match of the toptable.co.uk domain. Unfortunately, Google has crawled the uk-www.gtm.opentable site, and it's showing up in search results. In most cases the gtm URLs don't get redirected to toptable; they actually appear as an entirely separate domain to the user. The strong feeling is that this duplicate content is hurting toptable.co.uk, especially as .gtm.ot is part of the .opentable.com domain, which has significant authority. So we need a way of stopping Google from crawling gtm. There seem to be two potential fixes. Which is best for this case? 1) Use robots.txt to block Google from crawling the .gtm site, or 2) canonicalize the gtm URLs to toptable.co.uk. In general Google seems to recommend a canonical change, but in this special case it seems a robots.txt change could be best. Thanks in advance to the SEOmoz community!
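A rough sketch of what each of those two options would look like (the hostnames follow the question; the page path in the canonical example is illustrative):
# Option 1: robots.txt served at https://uk-www.gtm.opentable.com/robots.txt
# blocks all crawling of the disaster-recovery host
User-agent: *
Disallow: /

<!-- Option 2: on each page of the .gtm host, point the canonical at the primary domain -->
<link rel="canonical" href="https://www.toptable.co.uk/some-restaurant-page/">
Note that the two conflict in practice: if robots.txt blocks crawling of the .gtm host, Google can never fetch those pages to see the canonical tags on them.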
Intermediate & Advanced SEO | OpenTable0
-
Google suddenly indexing and displaying URLs that haven't existed for years?
We recently noticed Google is showing approx. 23,000 indexed .jsp URLs for our site. These are ancient pages that haven't existed in years and have long been 301 redirected to valid URLs. I'm talking 6 years. Checking the SERPs the other day (and our current SEOmoz Pro campaign), I see that a few of these URLs are now replacing our correct ones in the SERPs for important, competitive phrases. What the heck is going on here? Is Google suddenly ignoring rewrite rules and redirects? Here's an example of the rewrite rules that we've used for 6+ years:
RewriteRule ^(.*)/xref_interlux_antifoulingoutboards&keels.jsp$ $1/userportal/search_subCategory.do?categoryName=Bottom%20Paint&categoryId=35&refine=1&page=GRID [R=301]
Now, this 'bottom paint' URL has been incredibly stable in the SERPs for over half a decade. All of a sudden, a Google search for 'bottom paint' (no quotes) brings up the .jsp page at position 2-3. This is just one example of something very bizarre happening. Has anyone else had something similar happen lately? Thank you
Intermediate & Advanced SEO | jamestown
-
How long does Google take to show results in the SERPs once the pages are indexed?
Hi... I am a newbie trying to optimize the website www.peprismine.com. I have 3 questions. A little background about this: initially, close to 150 pages were indexed by Google. However, we decided to remove close to 100 URLs (as they were quite similar). After the changes, we submitted the NEW sitemap (with close to 50 pages) and Google has indexed those URLs in the sitemap.
1. My pages were indexed by Google a few days back. How long does Google take to display the URLs in the SERPs once the pages get indexed?
2. Does Google give more preference to websites with more pages than those with fewer pages when displaying results in the SERPs (I have just 50 pages)? Does the NUMBER of pages really matter?
3. Does removal/change of URLs have any negative effect on ranking? (Many of these URLs were not shown on the 1st page.)
An answer from SEO experts will be highly appreciated. Thnx!
Intermediate & Advanced SEO | PepMozBot0
-
I have a .com site but I am only ranking well on Google for Canada and not the USA.
We are located in Canada but sell our products worldwide. We are ranking OK on google.ca but are not in the top 50 on google.com. Is it due to my IP address? Are there any tips you can give me to help improve my ranking on google.com? Any info you can provide will be amazing. Thanks,
Intermediate & Advanced SEO | drewzal0
-
Push for site-wide https, but all pages in index are http. Should I fight the tide?
Hi there, First Q&A question 🙂
So I understand the problems caused by having a few secure pages on a site. A few links to the https version of a page and you have duplicate content issues. While there are several posts here at SEOmoz that talk about the different ways of dealing with this issue with respect to secure pages, the majority of this content assumes that the goal of the SEO is to make sure no duplicate https pages end up in the index. The posts also suggest that https should only be used on login pages, contact forms, shopping carts, etc. That's the root of my problem. I'm facing the prospect of switching to https across an entire site. In light of other https-related content I've read, this might seem unnecessary or overkill, but there's a valid reason behind it. I work for a certificate authority, a company that issues SSL certificates, the cryptographic files that make the https protocol work. So there's an obvious need for our site to "appear" protected, even if no sensitive data is being moved through the pages.
The stronger push, however, stems from our membership of the Online Trust Alliance: https://otalliance.org/ Essentially, in the parts of the internet that deal with SSL and security, there's a push for all sites to utilize HSTS headers and force sitewide https. PayPal and Bank of America are leading the way in this initiative, and other large retailers/banks/etc. will no doubt follow suit. Regardless of what you feel about all that, the reality is that we're looking at a future that involves more privacy protection, more SSL, and more https.
The bottom line for me is: I have a site of ~800 pages that I will need to switch to https. I'm finding it difficult to map the tips and tricks for keeping the odd pesky https page out of the index to what amounts to a sitewide migration. So, here are a few general questions. What are the major considerations for such a switch? Are there any less obvious pitfalls lurking? Should I even consider trying to maintain an index of http pages, or should I start work on replacing (or have Googlebot replace) the old pages with https versions? Is that something that can be done with canonicalization, or would something at the server level be necessary? How is that going to affect my page authority in general? What obvious questions am I not asking?
Sorry to be so longwinded, but this is a tricky one for me, and I want to be sure I'm giving as much pertinent information as possible. Any input will be very much appreciated. Thanks, Dennis
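A rough sketch of the server-level piece of such a migration, assuming Apache (the question doesn't say which server is in use): a blanket 301 from http to https plus an HSTS header, with an illustrative domain. The directives are standard mod_rewrite/mod_headers.
# force all http traffic to the https version of the same URL (mod_rewrite)
<VirtualHost *:80>
    ServerName www.example.com
    RewriteEngine On
    RewriteRule ^(.*)$ https://www.example.com$1 [R=301,L]
</VirtualHost>

<VirtualHost *:443>
    ServerName www.example.com
    # HSTS: tell browsers to use https for the next year (mod_headers)
    Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains"
    # ... SSL certificate directives go here ...
</VirtualHost>
On the page side, the canonical tags on the https pages would then simply point at the https URLs themselves, so canonicalization and the server-level redirect complement rather than replace each other.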
Intermediate & Advanced SEO | dennis.globalsign0