This might be a silly question...
-
I have 14,000 pages on my website, but when I do a site:domain.com search on Google, it shows around 55,000.
I first thought, "Hmm, maybe it is including subdomains." So I tried site:www.domain.com, and now it shows 35,000. That is still more than double the pages I have.
Any ideas why? When you filter a Google search using "site:", isn't it meant to pick up just that site's pages?
P.S. I tried using the SEOquake add-on to download the search results as a CSV file to review, but the add-on only downloads the first 100 results.
-
Thanks, I'll look at manually specifying these parameters and see if they make an impact.
-
Thank you streamline,
That's interesting. I have provided 'searchType', 'searchTerm', 'search', 'cat', 'filter2name', and 'filter1name' as URL Parameters.
- Are URL Parameters case-sensitive?
- Should these not be left as 'Let Googlebot decide', and instead be configured manually as best practice? It looks like Google is still indexing them, from what you guys have found.
-
An easy way to be sure is to do a quick search on Google to see if they are ranking. If you know for sure the parameters make no difference, it's usually better to specifically signal that through the WMT console. While Google tends to be pretty smart about these kinds of things, it can still make mistakes, so you may as well give it as much information as possible.
-
Hi there,
I am doing a crawl on the site listed in your profile (www.abdserotec.com) using Screaming Frog SEO Spider with Googlebot as the User Agent, and I am seeing many more URLs than the 14,000 pages you have. The vast majority of these excess pages are search results pages (such as http://www.abdserotec.com/search.html?searchType=BASIC&searchTerm=STEM CELL FACTOR&cat=&Filter2Name=GO&Filter2Value=germ-cell development&filterCount=2&type=&filter1name=Spec&filter1value=STEM CELL FACTOR). While these URLs are not showing up in the Google index when you search your site with the site: command, Google is still definitely accessing and crawling them. As Tuzzell just suggested, I also highly recommend configuring the parameters within GWT.
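If those search result URLs should not be crawled at all, a robots.txt rule is the simplest lever; a minimal sketch, assuming the internal search pages all live at /search.html:

```
User-agent: *
Disallow: /search.html
```

Keep in mind that robots.txt stops crawling but does not remove URLs Google already knows about; a "noindex" robots meta tag on the search results template is the stronger signal for keeping them out of the index.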
-
We have 49 parameters listed and set to 'Let Googlebot decide'. I thought adding the parameters here would prevent Google from indexing those URLs? I believe our setup already does this?
-
What do you mean by "multiple ways"? We have a search page which isn't indexed, and internal links from pages, but those wouldn't count, would they? It's not like the URL string changes from a search page or an internal hyperlink?
-
Have you discounted URL parameters through Google Webmaster Tools? This is particularly relevant for an ecommerce site because, if you have not, Google could be looking at /page, /page?p=x, /page?p=y, etc. and counting these as unique pages. This creates obvious duplicate content issues and is easily fixed in WMT by going to:
Crawl > URL Parameters
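To see the scale of the problem yourself, you can normalize a crawl's URL list by stripping the query string and counting unique paths; a rough sketch (the example URLs are hypothetical):

```python
from urllib.parse import urlparse, urlunparse

def canonical_path(url):
    """Strip the query string and fragment so parameterized
    variants collapse to the same underlying page."""
    parts = urlparse(url)
    return urlunparse((parts.scheme, parts.netloc, parts.path, "", "", ""))

# Hypothetical crawl output: three variants of one page plus one other page.
crawled = [
    "http://www.example.com/page",
    "http://www.example.com/page?p=1",
    "http://www.example.com/page?p=2&sort=asc",
    "http://www.example.com/other",
]

unique_pages = {canonical_path(u) for u in crawled}
print(len(crawled), "crawled URLs ->", len(unique_pages), "unique pages")  # 4 -> 2
```

If the unique count is close to your real page count while the raw count is huge, parameters are almost certainly what's inflating the site: number.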
Hope that helps.
-
What about multiple ways of getting to the same product?
-
There are no blog posts; it's an ecommerce site, and every product page and article page has the URL www.domain.com/.
I even looked at my GA, and it reports 14,000 pages.
If there were a tool to export all the search results, I could have manually looked into why the count is so big.
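Absent an export tool, one rough sanity check is counting the URLs in the site's own XML sitemap and comparing that against the site: number; a minimal sketch, assuming a standard sitemap format (the sample XML below is just an illustration):

```python
import xml.etree.ElementTree as ET

def count_sitemap_urls(sitemap_xml):
    """Count the <loc> entries in a standard XML sitemap string."""
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    root = ET.fromstring(sitemap_xml)
    return len(root.findall("sm:url/sm:loc", ns))

# Illustrative two-URL sitemap; in practice you would fetch your real sitemap.xml.
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/</loc></url>
  <url><loc>http://www.example.com/product-1</loc></url>
</urlset>"""

print(count_sitemap_urls(sample))  # -> 2
```

If the sitemap count matches the 14,000 from GA, the excess in the site: result is coming from URLs you never intended to publish, such as parameterized search pages.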
-
Hi Cyto,
Does that include your blog pages? If you have a blog, such as WordPress, then it may be picking up the different URLs that each post may have. For example, you might have the same blog post in different categories, which would mean the post is accessible from two different URLs.