This might be a silly question...
-
I have 14,000 pages on my website, but when I do a site:domain.com search on Google, it shows around 55,000.
I first thought, "Hmm, maybe it is including subdomains," so I tried site:www.domain.com and now it shows 35,000. That is still more than double the pages I have.
Any ideas why? When you filter a Google search using "site:", isn't it meant to pick up just that site's pages?
P.S. I tried using the SEOquake add-on to download the search results as a CSV file to review, but the add-on only downloads the first 100 results.
-
Thanks, I'll look at manually specifying these parameters and see if they make an impact.
-
Thank you streamline,
That's interesting, I have provided 'searchType', 'searchTerm', 'search', 'cat', 'filter2name', 'filter1name' as URL Parameters
- Are URL Parameters case-sensitive?
- Should these not be left as CRAWL - 'Let Googlebot decide', and instead be set manually as best practice? It looks like Google is still indexing them, from what you guys have found.
-
The easy way to be sure is to do a quick search on Google to see if they are ranking. If you know for sure the parameters make no difference, it's usually better to signal that explicitly through the WMT console. While Google tends to be pretty smart about these kinds of things, it can always make mistakes, so you may as well give it as much info as possible.
-
Hi there,
I am doing a crawl on the site listed in your profile (www.abdserotec.com) using Screaming Frog SEO Spider with Googlebot as the User Agent, and I am seeing many more URLs than the 14,000 pages you have. The vast majority of these excess pages are search results pages (such as http://www.abdserotec.com/search.html?searchType=BASIC&searchTerm=STEM CELL FACTOR&cat=&Filter2Name=GO&Filter2Value=germ-cell development&filterCount=2&type=&filter1name=Spec&filter1value=STEM CELL FACTOR). While these URLs are not showing up in the Google index when you search your site with the site: command, Google is still definitely accessing and crawling them. As Tuzzell just suggested, I also highly recommend configuring the parameters within GWT.
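If you don't want those search result URLs crawled at all, one blunt option is to block them in robots.txt. This is only a sketch, assuming all of your internal search pages are served through /search.html:

```
User-agent: *
Disallow: /search.html
```

Keep in mind robots.txt only stops crawling; URLs that are already indexed, or that you want consolidated rather than hidden, are better handled with the parameter settings in GWT or a noindex/canonical tag.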
-
We have 49 parameters listed, all set to 'Let Googlebot decide'. I thought adding the parameters there would stop Google from indexing those URLs? I believe our setup already does this?
-
What do you mean by "multiple ways"? We have a search page that isn't indexed, and internal links from other pages, but those wouldn't count, would they? The URL string doesn't change whether you arrive from a search page or an internal hyperlink.
-
Have you discounted URL parameters through Google Webmaster Tools? This is particularly prevalent for an ecommerce site: if you have not, Google could be looking at /page, /page?p=x, /page?p=y, etc. and counting each as a unique page. This creates obvious duplicate content issues and is easily fixed in WMT by going to:
Crawl > URL Parameters
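To see why the counts balloon, here's a small sketch of the idea. The parameter names are just illustrative examples from this thread; yours would come from your own URL structure. Each parameter combination is a distinct URL to a crawler until something (like the URL Parameters tool) tells it which parameters to discount:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Parameters that don't change the page's content (hypothetical list).
IGNORED_PARAMS = {"searchtype", "searchterm", "search", "cat",
                  "filter2name", "filter1name", "p"}

def canonicalize(url):
    """Strip known content-neutral parameters so variants collapse to
    one canonical URL -- roughly what the URL Parameters tool asks
    Google to do."""
    scheme, netloc, path, query, _fragment = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(query)
            if k.lower() not in IGNORED_PARAMS]
    return urlunsplit((scheme, netloc, path, urlencode(kept), ""))

variants = [
    "http://www.example.com/page",
    "http://www.example.com/page?p=x",
    "http://www.example.com/page?p=y",
]
print(len(set(variants)))                        # 3 distinct URLs to a crawler
print(len({canonicalize(u) for u in variants}))  # 1 page once p is discounted
```

Multiply that by 49 parameters across thousands of product pages and a 14,000-page site can easily look like 35,000+ URLs.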
Hope that helps.
-
What about multiple ways of getting to the same product?
-
There are no blog posts; it's an ecommerce site, and every product page and article page has the URL www.domain.com/.
I even looked at my GA, and it reports 14,000 pages.
If there were a tool to export all the search results, I could have manually looked into why the count is so big.
-
Hi Cyto,
Does that include your blog pages? If you have a blog, such as WordPress, then it may be picking up the different URLs that each post may have. For example, you might have the same blog post in two different categories, which would mean the post is accessible from two different URLs.
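If that's the case, the usual fix is a rel="canonical" tag pointing every variant at one preferred URL. A sketch, with a made-up URL (WordPress SEO plugins typically add this for you):

```html
<!-- In the <head> of every category/URL variant of the post -->
<link rel="canonical" href="http://www.example.com/blog/post-title/" />
```

Google then consolidates the duplicates under the canonical URL instead of counting each variant separately.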