This might be a silly question...
-
I have 14,000 pages on my website, but when I do a site:domain.com search on Google, it shows around 55,000 results.
I first thought, "Hmm, maybe it is including subdomains," so I tried site:www.domain.com, and it shows 35,000. That is still more than double the pages I have.
Any ideas why? When you filter a Google search with the "site:" operator, isn't it supposed to return only that site's pages?
P.S. I tried using the SEOquake add-on to download the search results as a CSV file for review, but the add-on only exports the first 100 results.
-
Thanks, I'll look at manually specifying these parameters and see whether they make an impact.
-
Thank you streamline,
That's interesting. I have provided 'searchType', 'searchTerm', 'search', 'cat', 'filter2name', and 'filter1name' as URL parameters.
- Are URL parameters case-sensitive?
- Should these not be left on Crawl > 'Let Googlebot decide', and instead be set manually as best practice? It looks like Google is still indexing them, from what you have found.
-
The easy way to be sure is to do a quick search on Google to see if they are ranking. If you know for sure the parameters make no difference, it's usually better to signal that explicitly through the WMT console. While Google tends to be pretty smart about these kinds of things, it can always make mistakes, so you may as well give it as much information as possible.
-
Hi there,
I am doing a crawl on the site listed in your profile (www.abdserotec.com) with Screaming Frog SEO Spider, using Googlebot as the user agent, and I am seeing many more URLs than the 14,000 pages you have. The vast majority of the excess pages are search-results pages (such as http://www.abdserotec.com/search.html?searchType=BASIC&searchTerm=STEM CELL FACTOR&cat=&Filter2Name=GO&Filter2Value=germ-cell development&filterCount=2&type=&filter1name=Spec&filter1value=STEM CELL FACTOR). While these URLs are not showing up in the Google index when you search your site with the site: command, Google is still definitely accessing and crawling them. As Tuzzell just suggested, I also highly recommend configuring the parameters within GWT.
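Internal search-results pages like the search.html URLs above generally shouldn't be crawlable at all. One common belt-and-braces move, alongside the GWT parameter settings, is a robots.txt rule. A minimal sketch, assuming all of the search pages live under /search.html as in the example URL:

```
User-agent: *
Disallow: /search.html
```

The Disallow prefix also covers every query-string variant of /search.html. Bear in mind robots.txt only stops crawling; if any of these URLs are already indexed, a meta robots noindex on the search template (while the pages are still crawlable) is the signal that actually drops them from the index.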
-
We have 49 parameters listed, all set to 'Let Googlebot decide'. I thought adding the parameters there would stop Google from indexing those URLs? I believe our setup already does this?
-
What do you mean by "multiple ways"? We have a search page, which isn't indexed, and internal links from other pages, but those wouldn't count, would they? It's not as if the URL string changes when a page is reached from a search page or an internal hyperlink.
-
Have you discounted URL parameters through Google Webmaster Tools? This is particularly relevant for an ecommerce site: if you have not, Google could be looking at /page, /page?p=x, /page?p=y, etc. and counting these as unique pages. This creates obvious duplicate-content issues and is easily fixed in WMT by going to:
Crawl>URL Parameters
Hope that helps.
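The parameter duplication described above (one page reachable as /page, /page?p=x, /page?p=y) is easy to check against any URL export, such as a Screaming Frog crawl. A minimal sketch in Python, using made-up example URLs, groups URLs by their bare path so the parameter variants stand out:

```python
# Sketch: group exported URLs by bare path to see how query-string
# variants inflate the apparent page count. The URLs below are
# hypothetical stand-ins for a real crawl/CSV export.
from collections import defaultdict
from urllib.parse import urlsplit

def group_by_path(urls):
    """Map each parameter-free URL to the set of full URLs that share it."""
    groups = defaultdict(set)
    for url in urls:
        parts = urlsplit(url)
        groups[parts.scheme + "://" + parts.netloc + parts.path].add(url)
    return groups

def duplicate_report(urls):
    """Return only the paths that have more than one parameter variant."""
    return {path: sorted(variants)
            for path, variants in group_by_path(urls).items()
            if len(variants) > 1}

if __name__ == "__main__":
    crawl = [
        "http://www.example.com/page.html",
        "http://www.example.com/page.html?p=1",
        "http://www.example.com/page.html?p=2",
        "http://www.example.com/other.html",
    ]
    for path, variants in duplicate_report(crawl).items():
        print(path, "->", len(variants), "variants")
```

Paths that come back with many variants are the ones worth listing under Crawl > URL Parameters (or consolidating with rel=canonical).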
-
What about multiple ways of getting to the same product?
-
There are no blog posts; it's an ecommerce site, and every product page and article page sits directly under www.domain.com/.
I even looked at my GA, and it reports 14,000 pages.
If there were a tool to export all of the search results, I could have manually looked into why the count is so big.
-
Hi Cyto,
Does that include your blog pages? If you have a blog, such as WordPress, then Google may be picking up the different URLs that each post can have. For example, you might have the same blog post in multiple categories, which would mean the post is accessible from two different URLs.
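The standard fix for that kind of category duplication is a rel=canonical tag pointing every variant at the preferred URL. A sketch, with a hypothetical example URL:

```
<!-- In the <head> of every duplicate variant of the post -->
<link rel="canonical" href="http://www.example.com/blog/post-title/" />
```

Google then consolidates the variants under the canonical URL, which also brings the site: count closer to the real page count.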