This might be a silly question...
-
I have 14,000 pages on my website, but when I do a site:domain.com search on Google, it shows around 55,000 results.
I first thought, "Hmm, maybe it is including subdomains," so I tried site:www.domain.com, and now it shows 35,000. That is still more than double the pages I have.
Any ideas why? When you filter a Google search using the site: operator, isn't it meant to return only that site's pages?
P.S. I tried using the SEOquake add-on to download the search results as a CSV file for review, but the add-on only exports the first 100 results.
-
Thanks, I'll look at manually specifying these parameters and see if they make an impact.
-
Thank you, streamline.
That's interesting. I have provided 'searchType', 'searchTerm', 'search', 'cat', 'filter2name', and 'filter1name' as URL parameters.
- Are URL parameters case-sensitive?
- Rather than leaving these on Crawl: 'Let Googlebot decide', should they be set manually as best practice? From what you have both found, it looks like Google is still crawling them.
-
The easy way to be sure is to do a quick search on Google to see whether they are ranking. If you know for sure the parameters make no difference, it's usually better to signal that explicitly through the WMT console. While Google tends to be pretty smart about these kinds of things, it can always make mistakes, so you may as well give it as much information as possible.
-
Hi there,
I am crawling the site listed in your profile (www.abdserotec.com) with Screaming Frog SEO Spider using Googlebot as the user agent, and I am seeing many more URLs than the 14,000 pages you have. The vast majority of the excess pages are search-results pages (such as http://www.abdserotec.com/search.html?searchType=BASIC&searchTerm=STEM CELL FACTOR&cat=&Filter2Name=GO&Filter2Value=germ-cell development&filterCount=2&type=&filter1name=Spec&filter1value=STEM CELL FACTOR). While these URLs are not showing up in the Google index when you search your site with the site: command, Google is still definitely accessing and crawling them. As Tuzzell just suggested, I also highly recommend configuring the parameters within GWT.
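To add to the parameter settings: if those internal search results should never be crawled at all, you could also block them in robots.txt. A minimal sketch, assuming /search.html is used only for internal site search on your setup:

    # robots.txt - keep crawlers out of internal search-result pages
    User-agent: *
    Disallow: /search.html

One caveat: robots.txt stops crawling, not indexing, so URLs that are linked from elsewhere can still appear in the index as bare listings. A meta robots noindex tag on the search template avoids that instead, but Google has to be able to crawl the page to see that tag, so don't combine the two approaches on the same URLs.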
-
We have 49 parameters listed, all set to 'Let Googlebot decide'. I thought adding the parameters there would keep Google from indexing those URLs? Doesn't our setup already do this?
-
What do you mean by "multiple ways"? We have a search page, which isn't indexed, and internal links from other pages, but those wouldn't count, would they? The URL string doesn't change depending on whether you arrive from a search page or an internal hyperlink, does it?
-
Have you discounted URL parameters through Google Webmaster Tools? This is particularly prevalent for an ecommerce site: if you have not, Google could be looking at /page, /page?p=x, /page?p=y, etc. and counting each one as a unique page. That creates obvious duplicate-content issues and is easily fixed in WMT by going to:
Crawl > URL Parameters
Hope that helps.
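To build on this: where a parameterized URL is just an alternate view of the same content, a rel=canonical tag in the page template consolidates the duplicates at the page level as well. A minimal sketch, with the /page path borrowed from the example above (hypothetical, not your actual URLs):

    <!-- in the <head> of /page?p=x and /page?p=y -->
    <link rel="canonical" href="http://www.domain.com/page" />

This only fits when the parameter doesn't meaningfully change the content; true pagination, where each page lists different items, is a different case.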
-
What about multiple ways of getting to the same product?
-
There are no blog posts; it's an ecommerce site, and every product page and article page sits directly under www.domain.com/.
I even looked at my GA, and it reports 14,000 pages.
If there were a tool to export all of the search results, I could have manually looked into the reason for the big count.
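In the absence of a SERP exporter, one rough way to size up the gap is to compare what your site declares against what Google reports. The Python sketch below, assuming the site exposes a standard XML sitemap at /sitemap.xml (that location is an assumption, as is the domain), counts the sitemap's URLs and flags any carrying query strings, which is where parameter bloat usually shows up:

    # count_sitemap.py - rough sketch: tally the URLs a sitemap declares and
    # flag parameterized ones, to compare against Google's site: estimate.
    # Assumes a plain sitemap at /sitemap.xml; a sitemap *index* file would
    # need one extra level of fetching.
    from urllib.request import urlopen
    from urllib.parse import urlparse
    import xml.etree.ElementTree as ET

    SITEMAP_URL = "http://www.domain.com/sitemap.xml"  # hypothetical location
    LOC = "{http://www.sitemaps.org/schemas/sitemap/0.9}loc"

    with urlopen(SITEMAP_URL) as response:
        tree = ET.parse(response)

    urls = [el.text.strip() for el in tree.iter(LOC) if el.text]
    parameterized = [u for u in urls if urlparse(u).query]

    print("URLs declared in sitemap:", len(urls))
    print("URLs carrying query parameters:", len(parameterized))
    for u in parameterized[:20]:  # preview a few likely duplicates
        print("  ", u)

If the sitemap count sits near 14,000 while site: reports 35,000+, the excess is almost certainly crawlable parameter variations rather than pages you actually publish.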
-
Hi Cyto,
Does that include your blog pages? If you have a blog, such as WordPress, then Google may be picking up the different URLs each post can have. For example, you might file a blog post under two categories, which would make the post accessible from two different URLs.
Related Questions
-
Domain Migration Question
Let's say there is a brand that has one primary product type at different optional tiers (think something like SMB/Enterprise/Individual). Let's also say that one year ago this brand migrated from having everything under one domain (Domain A) to moving two of their product tiers to another, new domain (Domain B). They have done some initial SEO work on Domain B and had a pretty successful migration, but it has also been decided that they will no longer offer one of these product tiers, and they intend to eventually migrate everything back under the one domain (Domain A). They are just not sure whether to do this now or later.
During the next year or so there will also likely be some re-branding/redesign stemming from this decision, meaning content changes and all the fun that goes into a migration/redesign/re-branding strategy. The timing of this has not been fully decided. Here is the question: should they a) migrate back to Domain A first and then do the redesign, or b) keep two separate domains for now, figure out the redesign/re-branding, make the content changes, and then migrate Site A over in a year or so once all changes have been made? My concern with option a) is that they migrated a little less than a year ago and would be migrating back, which I feel could have a negative impact on the content and the domain. The positive side is that this impact could be just as large even if we waited, so doing it now might be a better, more efficient use of our time if we can migrate and make content changes fairly close together or concurrently.
My concern with option b) is that the tier they no longer offer makes up the majority of that site's business and traffic, leaving us with little content that ranks well or garners much traffic. Trying to optimize for the remaining product tier by itself on its own domain could be quite hard, and then having to migrate it back to Domain A in a year or so could wipe out any small organic gains I manage to make on the applicable pages/domain. Does anybody have any input here? I am leaning towards option a) but wanted to get some other opinions. Thanks, everybody! Edit: so far this has received a lot of views but no input; I am hoping for a bit of a dialogue, so any ideas or input are welcome.
Algorithm Updates | DRSearchEngOpt
-
How can a site with two questionable inbound links outperform sites with 500-1,000 good, high-PR links?
Our site was performing at #1 for years, but in the last six months it has been pushed down to about the #5 spot. Some of the domains above us have only a handful of links, and they aren't from good sources. We don't have a Google penalty, and we try to only take links from quality domains, yet we have been pushed down the SERPs. Any suggestions?
Algorithm Updates | northerncs
-
Sitemap Question - Should I exclude old URLs, or make a separate sitemap for them?
So basically, my website is very old... 1995 old. Extremely old content still shows up when people search for things that have been outdated for 10-15+ years, and I decided not to put redirects on some of the irrelevant pages. People still hit those pages, but they bounce... I have about 400 pages that I don't want to delete or redirect. Many of them have old backlinks and hold some value, but they do interfere with my new, relevant content. If I dropped these pages into a sitemap and set the priority to zero, would that possibly help (see the sketch below)? No redirects; the content is still valid for the people looking for it, but maybe these old pages would stop showing up above my new content? Currently the old stuff is excluded from all sitemaps. I don't want to make one and have it make the problem worse. Any advice is appreciated. Thx 😄
Algorithm Updates | Southbay_Carnivorous_Plants
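For reference, a de-prioritized entry in a standard XML sitemap would look roughly like the sketch below (the URL and date are hypothetical). Keep in mind that Google treats priority as a hint at best, so this is unlikely to stop old pages outranking new ones on its own.

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/old-article-from-1995.html</loc>
        <lastmod>1999-06-15</lastmod>
        <priority>0.1</priority> <!-- hint only; range 0.0-1.0, default 0.5 -->
      </url>
    </urlset>
-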
Question about Google Algo Change on June 26
I have a client whose Google organic visits dropped significantly on June 26th. I used a chart overlay called ChartIntelligence, and it says there was an "SEOF" update on 6/26/2013. Does anyone know what this update (or any other update that day) would be, and where I might find additional info on it? I did notice that Moz's algo-change tracker listed a multi-week update on June 27, but I'm not sure where to find info on what types of things were impacted by it. Any info would be helpful.
Algorithm Updates | TopFloor
-
URL Parameters Question - Exclude? Or use a Canonical Tag?
I'm trying to figure something out, as I just finished a "new look" for an old website. It uses a custom-built shopping cart, and the system worked pretty well until about a year ago, when rankings went down. My primary traffic used to come from top-level brand pages. Each brand is sorted by the shopping cart, and a parameter extension is added so customers can click Page 1, Page 2, Page 3, etc. So, for example: http://www.xyz.com/brand.html, http://www.xyz.com/brand.html?page=1, http://www.xyz.com/brand.html?page=2, and so on. The page= value is dynamic, so the page title, metas, etc. are the same, but the products displayed are different. I don't want to exclude the page= parameter completely, as the products differ on each page and I obviously want them indexed. At the same time, my concern is that these parameters might be causing some confusion, and hence the drop I noticed in Google rankings. I'll also note that, in my market, there's no need to break these pages up to target more specific keywords. Maybe using something like this would be the appropriate measure? (See the pagination sketch below.)
Algorithm Updates | Southbay_Carnivorous_Plants
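On the "something like this" above: because each page= URL lists different products, a canonical pointing everything at brand.html would hide the later products from the index; the pagination markup Google documents, rel="prev"/"next" link tags, is the usual fit for this pattern. A sketch using the URLs from the question:

    <!-- in the <head> of http://www.xyz.com/brand.html?page=2 -->
    <link rel="prev" href="http://www.xyz.com/brand.html?page=1" />
    <link rel="next" href="http://www.xyz.com/brand.html?page=3" />
-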
Question regarding research tools
The keyword analysis tool on SEOmoz is currently down. Are there any other trustworthy tools I can use?
Algorithm Updates | uofmiamiguy
-
SEO Budgets - the million-dollar question???
Hi all, I am currently looking to revamp my SEO strategy in line with Google's latest Panda and Penguin updates, and I'm looking to appoint a new agency. With SEO changing so much over the years, and so many players in the marketplace quoting all sorts, I simply need to determine: 1) the kind of money I need to be spending on my SEO; 2) what I should be getting for that money, or, at different budget levels, what I need to focus on in priority order (a top ten of sorts); and 3) whether I should look to increase or decrease my spend over the long term. I am only a small business with a turnover of about 50-80k, and I need to cement a strategy that works long term but also shows a steady return. I have one agency quoting $99 a month, one £250, and one £750, so you can probably see my problem. Thanks in advance.
Algorithm Updates | etsgroup
-
Video SEO: YouTube, Vimeo PRO, Wistia, Longtail (BOTR) experience and questions
Video SEO is obviously changing; Google is figuring out how to do it themselves, and we are left wondering. Below we have tried to explain what we have learned about how the different services work and their characteristics (links to graphics provided). Our problem is that we are not getting congruent results from a Google site:apalytics.tv video-filtered search. We wonder how duplicate content may be affecting our results, and if it is, why YouTube would not count as a duplicate and prevent your own site's SEO efforts from working. Is YouTube special? Does that include Vimeo too? We see our own duplicate videos on multiple sites in Google results, so perhaps it is not duplicate-related? We'd appreciate your experience, or add to our questions, and we can work as a community to figure this out more definitively. Thanks! We have tried four video hosting solutions, at quite a cost in money and time:
1.) YouTube, which gets all the SEO juice and leads our clients on to other subjects or potentially competitive content. Iframes just don't get the results we are looking for.
2.) (See the Vimeo image.) Vimeo PRO, a $200-plus-per-year solution that lets us run many video carousels on our own domains, hosted on Vimeo, but we are very limited in HTML, as only CSS changes are allowed. While we were using Vimeo, we allowed the Vimeo.com community to SEO our content directly, and those pages come up often in search results. Due to duplicate-content concerns, we have since disallowed Vimeo.com from using and SEOing our content on their domain. However, we have many "portfolios" (limited micro carousel sites on our domains) that continue to carry the content. The Vimeo-hosted micro site shows only three videos on Google for site:apalytics.tv. During our testing we became concerned that duplicate content was causing issues too, so we are getting ready to shut off the many microsite domains hosted at Vimeo. (Vimeo had an old embed code that allowed a non-iframe embed, but recently discontinued it.) That makes it difficult, if not impossible, to retain SEO juice for anything other than their simple, very limited micro sites.
3.) (See the Wistia image.) Wistia, a $2,000-plus-per-year solution that provides only private video hosting, embedding various types of video content on one's own site(s). Wistia now has a free account for three videos and limited plays; it's a nice interface for SEO but is still different from BOTR. We opted for BOTR because of its many advertising-related options, but we are again trying Wistia with the free version to see whether we can figure out why our BOTR videos are not showing up as hoped. We know that Google does not promise to index and feature every video in a sitemap, but why some are there and others are not, and when, remains a mystery we are hoping to get some answers about.
4.) (See the Longtail image.) Longtail, Bits On The Run (from the JW Player authors), a $1,000-plus-per-year solution that, like Wistia, provides private hosting, but it allows a one-button YouTube upload of the same SEO metadata and content. Isn't that duplicate content? BOTR creates and submits video sitemaps for your content (a sample entry follows below), but it has not been working for us, and it has been impossible to get a definitive answer; I think they too are learning, or do not want to expose their proprietary methods (which are not yet working for us!).
Algorithm Updates | Mark_Jay_Apsey_Jr.
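For reference, a single entry in a Google video sitemap (the kind BOTR generates and submits on your behalf) looks roughly like this; the apalytics.tv paths are hypothetical, and the namespace and required fields follow Google's published video-sitemap schema:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
      <url>
        <loc>http://apalytics.tv/videos/example-page.html</loc>
        <video:video>
          <video:thumbnail_loc>http://apalytics.tv/thumbs/example.jpg</video:thumbnail_loc>
          <video:title>Example video title</video:title>
          <video:description>A short description of the video.</video:description>
          <video:content_loc>http://apalytics.tv/media/example.mp4</video:content_loc>
        </video:video>
      </url>
    </urlset>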