Site: Query Question
-
Hi All,
A question about the site: query you can run on Google. I know it has lots of inaccuracies, but I like to keep a high-level view of it over time.
I was also using it to get a rough sense of how many product pages are indexed vs. the total number of pages.
What is interesting is that when I do a site: query for, say, site:www.newark.com, I get ~748,000 results returned.
When I refine it to site:www.newark.com "/dp/", I get ~845,000 results returned.
Either I am doing something stupid, or these numbers are completely backwards.
Any thoughts?
Thanks,
Ben
-
Barry Schwartz posted some great information about this in November of 2010, quoting a couple of different Google sources. In short, more specific queries can cause Google to dig deeper and give more accurate estimates.
-
Yup. Get rid of parameter-laden URLs and it's easy enough. If they hang around the index for a few months before disappearing, that's no big deal; as long as you have done the right thing it will work out fine.
Also, you're not interested in the chaff, just the bits you want to make sure are indexed. So make sure those are in sensibly titled sitemaps and it's fine (I've used this on sites with 50 million and 100 million product pages. It gets a bit more complex at that scale, but the underlying principle is the same).
-
But then on a big site (talking 4m+ products) it's usually the case that you have URLs indexed that wouldn't be generated in a sitemap, because they include additional parameters.
Ideally, of course, you rid the index of parameter-filled URLs, but it's pretty tough to do that.
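One common way to start that cleanup is to point every parameterised variant at a canonical URL with the tracking/filter parameters stripped. Here is a minimal sketch of the stripping step; the parameter names (ref, sort, the utm_ pair) and the example domain are hypothetical, not Newark's actual parameters:

```python
# Minimal sketch: strip known tracking/filter parameters from a product URL
# so that only the canonical form is exposed for indexing.
# The parameter names below are hypothetical examples.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

TRACKING_PARAMS = {"ref", "sort", "sessionid", "utm_source", "utm_medium"}

def canonicalize(url):
    """Return the URL with known tracking/filter parameters removed."""
    scheme, netloc, path, query, _fragment = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(query) if k not in TRACKING_PARAMS]
    return urlunsplit((scheme, netloc, path, urlencode(kept), ""))

print(canonicalize("http://www.example.com/dp/123?ref=aff&sort=price"))
# -> http://www.example.com/dp/123
```

The output of a function like this is what you would emit in the page's rel=canonical tag, so the parameter-laden variants consolidate over time rather than lingering in the index.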
-
Your best bet is to make sure all your URLs are in your sitemap; then you get an exact count.
I've found it handy to use multiple sitemaps, one for each subfolder (e.g. /news/ or /profiles/), so I can quickly see exactly what % of URLs are indexed from each section of my site. This is super helpful for finding errors in a specific section, or when you are working on indexing for a certain type of page.
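That per-section check can be sketched roughly as below: count the URLs submitted in each section's sitemap and compare against an indexed figure obtained elsewhere (e.g. from Search Console). The sitemap contents and the indexed count here are made up for illustration:

```python
# Minimal sketch: count URLs submitted per section sitemap and compare
# against an indexed count per section. The sitemap body and the indexed
# figure are hypothetical illustration data.
import xml.etree.ElementTree as ET

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def submitted_count(sitemap_xml):
    """Count the <loc> entries in a sitemap document."""
    root = ET.fromstring(sitemap_xml)
    return len(root.findall(f"{NS}url/{NS}loc"))

news_sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/news/a</loc></url>
  <url><loc>http://www.example.com/news/b</loc></url>
</urlset>"""

submitted = {"/news/": submitted_count(news_sitemap)}
indexed = {"/news/": 1}  # hypothetical figure per section

for section, total in submitted.items():
    pct = 100 * indexed[section] / total
    print(f"{section}: {indexed[section]}/{total} indexed ({pct:.0f}%)")
```

A sudden drop in one section's percentage, with the others steady, points you straight at the section that has a problem.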
S
-
What I've found is that the reason for this comes down to how Google's system works. Case in point: a client site I have with 25,000 actual pages. They have mass duplicate content issues. When I do a generic site: query with the domain, Google shows 50-60,000 pages. If I do an inurl: query with a specific URL parameter, I get either 500,000 or over a million.
Though that's not your exact situation, it can help explain what's happening.
Essentially, if you do a normal site: query, Google will do its best to show the content within the site that it presents to the world, based on the "most relevant" content. When you do a refined check, it naturally looks for the content that really is most relevant: the closest match to that actual parameter.
So if you're seeing more results from the refined query, it means that when someone does a general search, on any given day at any given time, Google filters out a lot of content it doesn't consider highly valuable for that particular search. Many of the extra pages that surface in your refined check have most likely been evaluated as less valuable, lower quality, or less relevant to most searches.
Even if many are great pages, their system has multiple algorithms that have to be run to assign value. What you are seeing is those processes struggling to sort it all out.
-
about 839,000 results.
-
Different data center perhaps - what about if you add in the "dp" query to the string?
-
I actually see 'about 897,000 results' for the search 'site:www.newark.com'.
-
Thanks Adrian,
I understand those areas of inaccuracy, but I didn't expect to see a refined search produce more results than the original search. That just seems a little bizarre to me, which is why I was wondering if there was a clear explanation or if I was executing my query incorrectly.
Ben
-
This is an expected 'oddity' of the site: operator. Here is a video of Matt Cutts explaining the imprecise nature of the site: operator.