Double Listings On Page One
-
I've been noticing a trend over the past month and a half. My sites that used to get more than one page listed in certain SERPs are now being adjusted. It almost looks manual, but I know it is most likely a change in the algorithm. For example, a SERP where my site used to show two different sub-pages at #4 and #6 now has one page pushed up to #3 while the other is pushed back past the first page.
I'm not worried about penalization or loss of value. I have been seeing this across many of my clients' sites. I just wanted to confirm that others are seeing it as well (so I'm not going crazy) and/or whether Google has made any announcements or leaks regarding this shift.
Maybe it's just my sites coming of age or something, but I would love to be able to explain it more knowledgeably than with a "Google might be doing this".
BTW - This is not affecting any of my brand SERPs.
-
I used to have lots of #1 - #2 and even #1 - #2 - #3 - (sometimes #4) listings.
I still have some - but not as many.
Over the past few months Google has still been allowing some of these, but it is much harder to get two of your pages listed in the top ten positions of the SERPs.
You can really stack them up on the second and third page... but Google seems to be forcing more domain diversity in the top ten positions.
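If you track your own SERPs, one quick way to see this shift is to count how many top-ten results each domain holds per query. A minimal sketch in Python, using a made-up list of result URLs (all domains here are illustrative, not real rankings):

```python
from urllib.parse import urlparse
from collections import Counter

# Hypothetical top-10 result URLs for a single query (illustrative only).
serp_urls = [
    "https://example-a.com/page-1",
    "https://example-a.com/page-2",
    "https://example-b.com/guide",
    "https://example-c.com/post",
    "https://example-d.com/article",
    "https://example-e.com/review",
    "https://example-f.com/howto",
    "https://example-g.com/faq",
    "https://example-h.com/blog",
    "https://example-i.com/news",
]

def domain_counts(urls):
    """Count how many results each domain holds in the list."""
    return Counter(urlparse(u).netloc for u in urls)

counts = domain_counts(serp_urls)

# Domains holding more than one top-10 listing (the "double listings"):
doubled = {domain: n for domain, n in counts.items() if n > 1}
print(doubled)  # {'example-a.com': 2}
```

Run over your rank-tracker exports before and after an update, a drop in the size of `doubled` across your tracked queries would be consistent with Google enforcing more domain diversity in the top ten.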
-
The Google Penguin update made two major changes to the algorithm:
1. It penalized many sites that it felt were gaming the rankings.
2. It rewarded trusted sites with better rankings.
The net result of these two changes is that trusted sites not only obtain several rankings on the first page but also get multiple rankings on subsequent search results pages. This doesn't leave a lot of SERP space for the rest of the competition.
-
That's it!!! I'm not crazy. Now I am happy. I really have to pay more attention to that main blog.
-
Yeah, I remembered reading something on their blog.
"More domain diversity. [launch codename "Horde", project codename "Domain Crowding"] Sometimes search returns too many results from the same domain. This change helps surface content from a more diverse set of domains."
http://insidesearch.blogspot.com/2012/05/search-quality-highlights-53-changes.html
-
Have you come across any documented change in the way they are returning SERPs?
-
I've seen this as well. It seems like Google wants more diversity.