Multiple Listings in Search Results Fading (Local SEO)
-
Lately I am noticing that multiple listings for a single domain seem to be fading away from results. An example is one domain being listed twice for a search phrase: the home page, say, and an internal page.
Is anyone else seeing this?
Is it safe to say Google wants to show 10 individual domains per results page?
-
Actually, in many other spaces we are seeing extreme host crowding, where you might get 8 or more listings from a single domain. Here is one example pointed out by Brett Tabke of WebmasterWorld.
-
Google has always tried to keep results to one listing per domain on the SERP unless there are very few results or little quality content for the query.
Instead, it uses the sitelinks feature to show more relevant content from the same domain, rather than listing another separate result from that domain.
Related Questions
-
Is it bad from an SEO perspective that cached AMP pages are hosted on domains other than the original publisher's?
Hello Moz, I am thinking about starting to utilize AMP for some of my website. I've been researching this AMP situation for the better part of a year and I am still unclear on a few things. What I am primarily concerned with in terms of AMP and SEO is whether or not the original publisher gets credit for the traffic to a cached AMP page that is hosted elsewhere. I can see the possible issues with this from an SEO perspective, and I am pretty sure I have read about how SEOs are unhappy about this particular aspect of AMP in other places.

On the AMP project FAQ page you can find this, but there is very little explanation: "Do publishers receive credit for the traffic from a measurement perspective? Yes, an AMP file is the same as the rest of your site – this space is the publisher's canvas."

So, let's say you have an AMP page on your website example.com: example.com/amp_document.html. And a cached copy is served with a URL format similar to this: https://google.com/amp/example.com/amp_document.html. Then how does the original publisher get credit for the traffic? Is it because there is a canonical tag from the AMP version to the original HTML version?

Also, while I am at it, how does an AMP page actually get into Google's AMP Cache (or any other cache)? Does Google crawl the original HTML page, find the AMP version, and then just decide to cache it from there? Are there any other issues with this that I should be aware of? Thanks!

Algorithm Updates | Brian_Dowd
-
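For what it's worth, the simplified cache URL format quoted in the AMP question can be sketched mechanically. This is purely an illustration of that simplified format, not the real AMP Cache naming scheme (the actual Google AMP Cache serves pages from `cdn.ampproject.org` subdomains), and the function name is invented for this sketch:

```python
from urllib.parse import urlparse

def amp_cache_url(origin_url: str) -> str:
    """Map an origin AMP page URL to the simplified cache-style URL
    from the question, e.g. http://example.com/amp_document.html ->
    https://google.com/amp/example.com/amp_document.html."""
    parsed = urlparse(origin_url)
    return f"https://google.com/amp/{parsed.netloc}{parsed.path}"

print(amp_cache_url("http://example.com/amp_document.html"))
# https://google.com/amp/example.com/amp_document.html
```

Attribution back to the publisher works the way the question guesses: the AMP document carries a `rel="canonical"` link to the original HTML page, which is also how Google discovers the AMP version in the first place (via a `rel="amphtml"` link on the original page).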
SEO Myth-Busters -- Isn't there a "duplicate content" penalty by another name here?
Where is that guy with the mustache in the funny hat and the geek when you truly need them? So SEL (Search Engine Land) said recently that there's no such thing as a "duplicate content" penalty: http://searchengineland.com/myth-duplicate-content-penalty-259657. By the way, I'd love to get Rand or Eric or other Mozzers (aka TAGFEE'ers) to weigh in here on this if possible. The reason for this question is to double-check a possible "duplicate content" type penalty (possibly by another name?) that might accrue in the following situation.

1. Assume a domain has a 30 Domain Authority (per OSE).
2. The site on the current domain has about 100 pages, all hand coded. Things do very well in SEO because we designed it to do so. The site is about 6 years in the current incarnation, with a very simple e-commerce cart (again, basically hand coded). I will not name the site for obvious reasons.
3. Business is good. We're upgrading to a new CMS (hooray!). In doing so we are implementing categories and faceted search, with plans to try to keep the site to under 100 new "pages" using a combination of rel canonical and noindex. I will also not name the CMS for obvious reasons. In simple terms, as the site is built out and launched in the next 60-90 days, if we assume 500 products and 100 categories, that yields at least 50,000 pages; with other aspects of the faceted search, it could easily create 10X that many.
4. In Screaming Frog tests of the DEV site, it is quite evident that there are many tens of thousands of unique URLs that are basically the textbook illustration of a duplicate content nightmare. Screaming Frog has also been known to crash while spidering, and we've discovered thousands of URLs of live sites using the same CMS. There is no question that spiders are somehow triggering some sort of infinite page generation, and we can see that both on our DEV site as well as out in the wild (in Google's Supplemental Index).
5. Since there is no "duplicate content penalty" and there never was, are there other risks here caused by infinite page generation, like burning up a theoretical "crawl budget", having the bots miss pages, or other negative consequences?
6. Is it also possible that bumping a site that ranks well for 100 pages up to 10,000 pages or more might very well incur a linkjuice penalty as a result of all this (honest but inadvertent) duplicate content? In other words, is inbound linkjuice and ranking power essentially divided by the number of pages on a site? Sure, it may be somewhat mediated by internal page linkjuice, but what are the actual big-dog issues here?

So has SEL's "duplicate content myth" truly been myth-busted in this particular situation? Thanks a million!
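To make the explosion in point 3 concrete, here is a hypothetical sketch (invented facet names and URL patterns, not the asker's actual CMS) of how a modest number of facet values multiplies into thousands of crawlable URLs, and how a rel-canonical rule of the kind the asker plans to use collapses them back to the category pages:

```python
from itertools import product

# Hypothetical facet dimensions: every combination of category, brand,
# and sort order gets its own crawlable URL.
categories = [f"cat{i}" for i in range(100)]
brands = [f"brand{i}" for i in range(10)]
sorts = ["price-asc", "price-desc", "newest"]

urls = [f"/shop/{c}/{b}?sort={s}" for c, b, s in product(categories, brands, sorts)]
print(len(urls))  # 3000 crawlable URLs from only 113 facet values

def canonical_for(url: str) -> str:
    """Canonicalize every sorted/filtered variant to the bare category URL,
    so ranking signals consolidate on 100 pages instead of thousands."""
    path = url.split("?")[0]       # drop the sort parameter
    category = path.split("/")[2]  # keep only the category segment
    return f"/shop/{category}/"

print(len({canonical_for(u) for u in urls}))  # 100 canonical targets
```

The same arithmetic is why "500 products and 100 categories" in the question balloons to 50,000+ URLs: every extra independent facet dimension multiplies, rather than adds to, the URL count.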
Algorithm Updates | seo_plus
-
Does using parent pages in WordPress help with SEO and/or indexing for SERPs?
I have a law office and we handle four different practice areas. I used to have multiple websites (one for each practice area) with keywords in the actual domain name, but based on the recommendation of SEO "experts" a few years ago, I consolidated all the webpages into one single website (based on the rumors at the time that Google was going to focus on authorship and branding in the future, rather than keywords in URLs or titles). Needless to say, Google authorship was dropped a year or two later and "branding" never took off.

Overall, having one website is convenient and generally makes SEO easier, but there's been a huge drawback: when my page comes up in SERPs after searching for "attorney" or "lawyer" combined with a specific practice area, the practice area landing pages don't typically come up in the SERPs; only the front page comes up. It's as if Google recognizes that I have some decent content, and Google knows that I specialize in multiple practice areas, but it directs everyone to the front page only. Prospective clients don't like this and it causes my bounce rate to be high. They like to land on a page focusing on the practice area they searched for.

Two questions:

1. Would using parent pages (e.g. http://lawfirm.com/divorce/anytown-usa-attorney-lawyer/ vs. http://lawfirm.com/anytown-usa-divorce-attorney-lawyer/) be better for SEO? The research I've done up to this point appears to indicate "no": it doesn't make much difference as long as the keywords are in the domain name and/or URL. But I'd be interested to hear contrary opinions.
2. Would using parent pages (same examples as above) be better for indexing in Google SERPs? For example, would it make it more likely that someone searching for "anytown usa divorce attorney" would actually end up in the divorce section of the website rather than on the front page?
Algorithm Updates | micromano
-
Please help explain this (Question about search results)
What's up, SEOs? I'm new to the SEO world and had a quick question. I just installed the MozBar and did a Google search: "What is Google Voice". I attached an image of the results I received. Can someone explain how MacWorld's article outranked Google's when Google's Page Authority and Domain Authority are both so much stronger than MacWorld's? This is in addition to Google having many more links. This is basic, but any insight will be very helpful. Thanks, guys!
Algorithm Updates | Petbrosia
-
“Service Location” in Lieu of Separate NAP to Avoid Merge on Google+Local?
A client has two businesses out of the same address, same phone: an eat-in restaurant and a catering service. He has a separate website for each. He's dying to optimize the catering, although long-term he wants to optimize both. For the moment, Google only knows the restaurant, and his only social media presence is set up as the restaurant as well; thus the links to his social media, even from the catering site, link to his restaurant accounts. I think he has two options:

1. Really do separate them. Get a different address (suite # or use his home address?) and phone. Set up new, separate social media. Register both, separately, at all the directories, etc.
2. Merge them both into the restaurant site and have the restaurant offer both eat-in and catering. Have some pages on the site optimized for lunch and others for catering, with the home page covering both. Register the one domain with all the directories, and the social media under the restaurant, but with a description that includes both lunch and catering as services offered.

Variation on #2: Continue to have Google show the address, since it's a restaurant, but add the "service location" area to show as well, for the catering part.

My questions are:

1. If he kept the two websites separate, would hiding the address and just using a "service location" area for the catering one keep Google happy? I mean, could he keep the same address (although I suppose he'd still have to get a new phone) and set up the catering entry to show only the service area? And if he did that, would Google not merge them? In directories, though, he'd still be listing both the restaurant and the catering separately but under the same address, so maybe this is a silly scenario anyway. What do you think?
2. Which option would you choose?
3. Are there any other better options?
4. In the #2 scenario, if a directory allows registry under one category, would you choose "restaurant" or "catering", or sometimes one and sometimes the other?
Thank you for your insight!
Algorithm Updates | rayvensoft
-
When did Google include display results per page into their ranking algorithm?
It looks like the change took place approx. 1-2 weeks ago. Example: a search for "business credit cards" with search settings at "never show instant results" and "50 results per page" yields a SERP with a total of 5 different domains in the top 10 (4 domains have multiple results). With the slider set at "10 results per page", there are 9 different domains, with only 1 having multiple results. I haven't seen any mention of this change; did I just miss it? Are they becoming that blatant about forcing as many page views as possible for the sake of serving more ads?
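If you want to reproduce this comparison yourself, the measurement is just "count distinct hosts in the top N results". A minimal sketch, with made-up placeholder URLs standing in for a copied SERP:

```python
from urllib.parse import urlparse

def unique_hosts(result_urls, top_n=10):
    """Return the distinct hosts (www-stripped) among the first top_n results,
    in order of first appearance."""
    hosts = []
    for url in result_urls[:top_n]:
        host = urlparse(url).netloc.removeprefix("www.")
        if host not in hosts:
            hosts.append(host)
    return hosts

serp = [
    "https://www.creditcards.example/business",
    "https://www.creditcards.example/business/compare",
    "https://bank.example/cards",
    "https://www.creditcards.example/business/rewards",
    "https://issuer.example/small-business",
]
print(len(unique_hosts(serp)))  # 3 distinct hosts across these 5 results
```

Running this against the top 10 of a 10-per-page SERP versus the top 10 of a 50-per-page SERP would quantify the host-crowding difference described above.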
Algorithm Updates | BrianCC
-
Does the use of an underscore in filenames adversely affect SEO?
We have had a page which, until recently, was ranked first or second by Google UK (and also worldwide) for the term "Snowbee". It is now no longer in the top 50. I ran a page optimization report on the URL and had a very good score. The only criticism was that I had used an atypical character in the URL; the only unusual character was an underscore ("_"). We use the underscore in most file names without apparent problems with search engines. In fact, they are automatically created in HTML file names by our ecommerce software, and other pages do not seem to have been so adversely affected. Should we discontinue this practice? It will be difficult, but I'm sure we can overcome it if this is the reason why Google has marked us down. I attach images of the SEO report pages.
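Google's long-standing URL guidance is to prefer hyphens over underscores as word separators, because `snow_bee` tends to be tokenized as a single word while `snow-bee` is read as two. An underscore is unlikely to explain a drop from #2 out of the top 50 on its own, but if you do migrate, a hypothetical normalizer (this helper is invented for illustration, not part of any real SEO tool) would look something like:

```python
import re

def normalize_slug(filename: str) -> str:
    """Lowercase a filename and replace underscores/whitespace in its stem
    with hyphens, preserving the extension."""
    stem, dot, ext = filename.rpartition(".")
    stem = stem if dot else filename
    slug = re.sub(r"[_\s]+", "-", stem.lower())
    return f"{slug}.{ext}" if dot else slug

print(normalize_slug("Snowbee_Fly_Fishing.html"))  # snowbee-fly-fishing.html
```

Note that renaming live URLs only helps if each old underscore URL 301-redirects to its hyphenated replacement; otherwise you trade a cosmetic issue for broken inbound links.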
Algorithm Updates | FFTCOUK
-
How does my blog help with SEO?
Hi, I have recently put a WordPress blog on my site and have employed a few blog writers, each putting up 2 or 3 posts per week. Their brief so far has been to write interesting, humorous, and topical articles. Stupid as it may seem, I have done this only because the general consensus seemed to be "you must have a blog for SEO". Does it help? Assuming it does:

- Should I post the same articles to my Facebook page and/or anywhere else?
- Should the articles have anchor text linking back to my site?
- What should I do to make it work well?

Thanks in advance, Andy
Algorithm Updates | First-VehicleLeasing