Yellow Pages
-
We have just made a yellow pages site. In 3 weeks, Google has indexed just 1,700 pages out of 18,000. What can we do so that Google indexes all the pages, and how does the process work?
Regards
-
To get a big site indexed and keep it in the index, you must link deeply into the site at multiple points from pages with heavy PR. This forces spiders into the bottom of the site and makes them chew their way out through other pages.
These should be considered permanent links. If you remove them, the spider flow will stop and Google will eventually forget about your pages.
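A complementary step for a site this size (not mentioned in the answer above) is an XML sitemap index that splits the 18,000 URLs into smaller sitemap files and is submitted through Google Webmaster Tools, so the crawler at least has a complete list of pages. Below is a minimal sketch of generating one with the Python standard library; the domain, file names, and URL pattern are hypothetical placeholders.

```python
# Minimal sketch: split a large URL list into sitemap files plus a sitemap
# index, per the sitemaps.org protocol (max 50,000 URLs per sitemap file).
# The domain and file paths are hypothetical placeholders.
from xml.sax.saxutils import escape

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
CHUNK_SIZE = 10000  # well under the 50,000-URL limit per file

def write_sitemaps(urls, base_url="https://www.example-yellowpages.com"):
    sitemap_names = []
    for i in range(0, len(urls), CHUNK_SIZE):
        chunk = urls[i:i + CHUNK_SIZE]
        name = f"sitemap-{i // CHUNK_SIZE + 1}.xml"
        with open(name, "w", encoding="utf-8") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write(f'<urlset xmlns="{SITEMAP_NS}">\n')
            for url in chunk:
                f.write(f"  <url><loc>{escape(url)}</loc></url>\n")
            f.write("</urlset>\n")
        sitemap_names.append(name)

    # Sitemap index pointing at the individual sitemap files.
    with open("sitemap-index.xml", "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write(f'<sitemapindex xmlns="{SITEMAP_NS}">\n')
        for name in sitemap_names:
            f.write(f"  <sitemap><loc>{base_url}/{name}</loc></sitemap>\n")
        f.write("</sitemapindex>\n")

if __name__ == "__main__":
    # Hypothetical listing URLs; in practice these would come from the site's database.
    urls = [f"https://www.example-yellowpages.com/listing/{n}" for n in range(18000)]
    write_sitemaps(urls)
```

The generated sitemap-index.xml can then be submitted in Webmaster Tools or referenced from robots.txt with a Sitemap: line.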
Related Questions
-
Why does Moz give different Page Authority to the same page if a visit comes from AdWords vs. organic search?
When clicking on an AdWords ad, the landing page has a Page Authority of 26. When clicking through from organic search to the same exact landing page, the Page Authority is 37. Why is this? Does Moz, or more importantly Google, see these as the same page or as separate pages? Thanks, Tom
Moz Pro | ffctas1
Block Moz (or any other robot) from crawling pages with specific URLs
Hello! Moz reports that my site has around 380 pages with duplicate content. Most of them come from dynamically generated URLs that have some specific parameters. I have sorted this out for Google in Webmaster Tools (the new Google Search Console) by blocking the pages with these parameters. However, Moz is still reporting the same number of duplicate content pages and, to stop it, I know I must use robots.txt. The trick is that I don't want to block every page, just the pages with specific parameters. I want to do this because among these 380 pages there are some other pages with no parameters (or different parameters) that I need to take care of. Basically, I need to clean this list to be able to use the feature properly in the future. I have read through the Moz forums and found a few topics related to this, but there is no clear answer on how to block only pages with specific URLs. Therefore, I have done my research and come up with these lines for robots.txt:
User-agent: dotbot
Disallow: /*numberOfStars=0
User-agent: rogerbot
Disallow: /*numberOfStars=0
My questions: 1. Are the above lines correct, and would they block Moz (dotbot and rogerbot) from crawling only pages that have the numberOfStars=0 parameter in their URLs, leaving other pages intact? 2. Do I need to have an empty line between the two groups (between "Disallow: /*numberOfStars=0" and "User-agent: rogerbot"), or does it even matter? I think this would help many people, as there is no clear answer on how to block crawling of only pages with specific URLs. Moreover, this should be valid for any robot out there. Thank you for your help!
Moz Pro | Blacktie
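As a quick sanity check (not from the original thread): Google-style robots.txt matching treats * as a wildcard for any sequence of characters and $ as an end anchor, so a Disallow pattern can be approximated with a regular expression and tested against sample URL paths. A minimal sketch in Python; the sample paths are hypothetical.

```python
# Minimal sketch: check which URL paths a Google/Moz-style wildcard
# Disallow pattern would match. The sample paths are hypothetical.
import re

def robots_pattern_to_regex(pattern: str) -> re.Pattern:
    """Translate a robots.txt path pattern with * and $ into a regex
    anchored at the start of the path, per Google-style matching."""
    anchored_end = pattern.endswith("$")
    if anchored_end:
        pattern = pattern[:-1]
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in pattern)
    return re.compile("^" + regex + ("$" if anchored_end else ""))

rule = robots_pattern_to_regex("/*numberOfStars=0")

sample_paths = [
    "/hotels/paris?numberOfStars=0&page=2",  # contains numberOfStars=0
    "/hotels/paris?numberOfStars=3",         # different parameter value
    "/hotels/paris",                         # no parameters at all
]

for path in sample_paths:
    verdict = "blocked" if rule.search(path) else "allowed"
    print(f"{verdict:7}  {path}")
```

This only addresses the first question (what the pattern matches); whether a blank line is needed between the two User-agent groups is a parsing detail the sketch does not cover.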
I have double-checked that rel canonical is properly employed on our page, but the On Page Grader says it's not working?
I have double-checked that rel canonical is properly employed on our page, but the On Page Grader says it's not working. Here is the URL: http://www.solidconcepts.com/industries/aerospace-parts-manufacturing/ What is wrong with how we are doing things?
Moz Pro | StratasysDirectManufacturing
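One way to verify what a crawler actually sees (not from the original thread) is to fetch the raw HTML and print any rel="canonical" link elements. A minimal sketch using only the Python standard library, pointed at the URL from the question:

```python
# Minimal sketch: fetch a page and print any rel="canonical" <link> tags
# found in the raw HTML, to compare against what a grader/crawler reports.
from html.parser import HTMLParser
from urllib.request import Request, urlopen

class CanonicalFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        if tag.lower() != "link":
            return
        attr_map = {k.lower(): (v or "") for k, v in attrs}
        if "canonical" in attr_map.get("rel", "").lower():
            self.canonicals.append(attr_map.get("href", ""))

url = "http://www.solidconcepts.com/industries/aerospace-parts-manufacturing/"
req = Request(url, headers={"User-Agent": "canonical-check/0.1"})
html = urlopen(req, timeout=30).read().decode("utf-8", errors="replace")

finder = CanonicalFinder()
finder.feed(html)
print(finder.canonicals or "No rel=canonical link element found in the HTML")
```

If nothing shows up in the raw HTML, the tag may be injected by JavaScript or only present on a different URL variant, which could explain a mismatch with the grader.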
Why did SEOMoz only crawl 1 page?
I have multiple campaigns, and on a few of them SEOMoz has only crawled one page. I think this may have to do with how I set up the campaign. How do I get SEOMoz to crawl more than one page on these campaigns?
Moz Pro | HermanAdvertising
Issue with the number of pages crawled
I wanted to figure out how our friend Roger Bot works. On the first crawl of one of my large sites, the number of pages crawled stopped at 10,000 (due to the restriction on the Pro account). However, after a few weeks, the number of pages crawled went down to about 5,500. This number seemed to be a more accurate count of the pages on our site. Today, it seems that Roger Bot has completed another crawl and the number is up to 10,000 again. I know there has been no downtime on our site, and the items that we fixed on our site did not reduce or increase the number of pages we had. Just making sure there are no known issues with Roger Bot before I look deeper into our site to see if there is an issue. Thanks!
Moz Pro | cchhita
Home page not indexed by Google
Hello,
Teacherprose.com
1. Sitemap was successfully submitted via Google Webmaster Tools.
2. Site has been up for two years.
3. Site shows up in Google results for "Teacher Resume Service".
4. According to Google and SEOMoz, the home page is not indexed by Google or Bing.
I'm a novice; am I missing something obvious? Thank you, Eric
Moz Pro | monthelie1
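A quick self-check before digging further (not from the original thread) is to confirm the home page is even indexable: it should return a 200 status, carry no noindex in an X-Robots-Tag header or meta robots tag, and not be disallowed in robots.txt. A minimal sketch against the domain mentioned in the question:

```python
# Minimal sketch: basic indexability checks for a home page -
# HTTP status, X-Robots-Tag header, meta robots tag, and robots.txt.
import re
from urllib.request import Request, urlopen
from urllib.robotparser import RobotFileParser

url = "http://teacherprose.com/"

req = Request(url, headers={"User-Agent": "index-check/0.1"})
resp = urlopen(req, timeout=30)
print("Status:", resp.status)
print("X-Robots-Tag header:", resp.headers.get("X-Robots-Tag"))

html = resp.read().decode("utf-8", errors="replace")
meta_robots = re.findall(
    r'<meta[^>]+name=["\']robots["\'][^>]*>', html, flags=re.IGNORECASE
)
print("Meta robots tags:", meta_robots or "none found")

# Does robots.txt allow a generic crawler to fetch the home page?
rp = RobotFileParser()
rp.set_url("http://teacherprose.com/robots.txt")
rp.read()
print("robots.txt allows '/':", rp.can_fetch("*", url))
```

If all of these checks pass, the issue is more likely on the search engine side (crawl priority, canonicalization) than a technical block on the page itself.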
Campaign keywords (48) and on page report (17)
I go into my campaign keywords and have 48, but when I view my on page report only 17 show up. All keywords have been in there since last Friday, 2-26-11, except the 4 I added this Friday, 3-4-11.
Moz Pro | SmallFry340