This might be a silly question...
-
I have 14,000 pages on my website, but when I do a site:domain.com search on Google, it shows around 55,000.
I first thought, "Hmm, maybe it is including subdomains," so I tried site:www.domain.com, and now it shows 35,000. That is still more than double the pages I have.
Any ideas why? When you filter a Google search using "site:", isn't it meant to pick up just that site's pages?
P.S. I tried using the SEOquake add-on to download the search results as a CSV file to review, but the add-on only downloads the first 100 results.
-
Thanks, I'll look at manually specifying these parameters and see if they make an impact.
-
Thank you streamline,
That's interesting, I have provided 'searchType', 'searchTerm', 'search', 'cat', 'filter2name', 'filter1name' as URL Parameters
- Are URL parameters case-sensitive?
- Should these not be left as Crawl > 'Let Googlebot decide', and instead be set manually as best practice? It looks like Google is still indexing them, from what you've found.
-
An easy way to be sure is to do a quick search on Google to see if they are ranking. If you know for sure the parameters make no difference, it's usually better to specifically signal that through the WMT console. While Google tends to be pretty smart about these kinds of things, it can always make mistakes, so you may as well give it as much info as possible.
-
Hi there,
I am running a crawl on the site listed in your profile (www.abdserotec.com) with Screaming Frog SEO Spider, using Googlebot as the user agent, and I am seeing many more URLs than the 14,000 pages you have. The vast majority of the excess are search results pages (such as http://www.abdserotec.com/search.html?searchType=BASIC&searchTerm=STEM CELL FACTOR&cat=&Filter2Name=GO&Filter2Value=germ-cell development&filterCount=2&type=&filter1name=Spec&filter1value=STEM CELL FACTOR). While these URLs are not showing up when you search your site with the site: command, Google is still definitely accessing and crawling them. As Tuzzell just suggested, I also highly recommend configuring the parameters within GWT.
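One common complement to the GWT parameter settings is keeping internal search result pages out of the crawl entirely with robots.txt. A minimal sketch, assuming (as in the URL above) that all search results live under /search.html; note that robots.txt only blocks crawling, so URLs Google already knows about may stay indexed unless they also carry a noindex meta tag:

```text
User-agent: *
# Block every URL whose path starts with /search.html,
# including all parameterized variants
Disallow: /search.html
```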
-
We have 49 parameters listed, all set to 'Let Googlebot decide'. I thought adding the parameters there would prevent Google from indexing those URLs? I believe our setup already does this?
-
What do you mean by "multiple ways"? We have a search page, which isn't indexed, and internal links from other pages, but those wouldn't count, would they? The URL string doesn't change depending on whether you arrive from a search page or an internal hyperlink.
-
Have you discounted URL parameters through Google Webmaster Tools? This is particularly relevant for an ecommerce site: if you have not, Google could be treating /page, /page?p=x, /page?p=y, etc. as unique pages. That creates obvious duplicate content issues and is easily fixed in WMT by going to:
Crawl > URL Parameters
Hope that helps.
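A belt-and-braces alternative to the WMT setting is a rel=canonical tag on each parameterized variant, pointing at the clean URL. A sketch using the hypothetical URLs above, assuming /page?p=x and /page?p=y serve the same content as /page:

```html
<!-- placed in the <head> of /page, /page?p=x and /page?p=y -->
<link rel="canonical" href="http://www.domain.com/page" />
```

Google then consolidates the variants under the canonical URL, which also addresses the inflated site: count.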
-
What about multiple ways of getting to the same product?
-
There are no blog posts; it's an ecommerce site, and every product page and article page has a URL under www.domain.com/.
I even looked at my GA, and it reports 14,000 pages.
If there were a tool to export all the search results, I could have manually looked into why the count is so high.
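There isn't an official way to export every result of a site: query, but once you do have a URL list (for example, a Screaming Frog crawl export), a few lines of Python can collapse parameterized variants and show how many genuinely distinct pages it contains. A minimal sketch with hypothetical URLs:

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url):
    """Strip the query string and fragment so parameterized
    variants of the same page collapse to one URL."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))

urls = [
    "http://www.domain.com/page",
    "http://www.domain.com/page?p=x",
    "http://www.domain.com/page?p=y",
    "http://www.domain.com/search.html?searchTerm=stem+cell",
]

# Four crawled URLs collapse to two distinct pages
unique = {canonicalize(u) for u in urls}
print(len(unique))
```

Comparing the raw URL count against the deduplicated count makes it easy to see how much of the index bloat comes from parameters.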
-
Hi Cyto,
Does that include your blog pages? If you have a blog, such as WordPress, then it may be picking up the different URLs that each post can have. For example, a post filed under two different categories could be accessible from two different URLs.