Double Listings On Page One
-
I've been noticing a trend over the past month and a half. My sites that used to get more than one page listed in certain SERPs are now being adjusted. It almost looks manual, but I know it's most likely a change in the algorithm. For example, a SERP where my site used to show two different sub-pages at #4 and #6 now shows one page pushed up to #3 while the other gets pushed back past the first page.
I'm not worried about penalization or loss of value. I have been seeing this across many of my clients' sites. I just wanted to confirm that others are seeing it as well (so I'm not going crazy) and/or whether Google has made any announcements or leaks regarding this shift.
Maybe it's just my sites coming of age or something, but I would love to be able to explain it more knowledgeably than with a "Google might be doing this".
BTW - this is not affecting any of my brand SERPs.
-
I used to have lots of #1 - #2 and even #1 - #2 - #3 - (sometimes #4) listings.
I still have some - but not as many.
Over the past few months Google has still allowed some of these, but it has become much harder to get two of your pages listed in the top ten positions of the SERPs.
You can really stack them up on the second and third page... but Google seems to be forcing more domain diversity in the top ten positions.
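If you want to track this yourself rather than eyeball it, here's a minimal sketch. It assumes you already have the ranked result URLs from somewhere (e.g., exported from a rank tracker) and just groups the top-ten positions by hostname so double (or triple) listings stand out. The `serp` list below is made-up example data, and the hostname handling is deliberately naive (it only strips a leading "www.", so subdomains count as separate hosts):

```python
from urllib.parse import urlparse
from collections import defaultdict

def domain_diversity(result_urls, top_n=10):
    """Group SERP positions by host to spot double (or triple) listings.

    result_urls: ranked list of result URLs, position 1 first.
    Returns {host: [positions]} for the top_n results.
    """
    positions = defaultdict(list)
    for pos, url in enumerate(result_urls[:top_n], start=1):
        host = urlparse(url).netloc.lower()
        if host.startswith("www."):  # crude normalization, not registrable-domain logic
            host = host[4:]
        positions[host].append(pos)
    return dict(positions)

# Hypothetical SERP snapshot for one keyword
serp = [
    "https://example.com/page-a",
    "https://other.com/",
    "https://example.com/page-b",
    "https://third.com/article",
]

for host, spots in domain_diversity(serp).items():
    if len(spots) > 1:
        print(f"{host} holds positions {spots}")
```

Run it against the same keyword's results week over week and you can see exactly when a domain drops from two top-ten slots to one.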
-
The Google Penguin update made two major changes to the algorithm.
1. It penalized many sites that it felt were gaming the rankings.
2. It rewarded trusted sites with better rankings.
The net result of these two changes is that trusted sites will not only obtain several rankings on the first page, but will get multiple rankings on all subsequent search results pages. This doesn't leave a lot of SERP space for the rest of the competition.
-
That's it!!! I'm not crazy. Now I am happy. I really have to pay more attention to that main blog.
-
Yeah, I remembered reading something on their blog.
"More domain diversity. [launch codename "Horde", project codename "Domain Crowding"] Sometimes search returns too many results from the same domain. This change helps surface content from a more diverse set of domains."
http://insidesearch.blogspot.com/2012/05/search-quality-highlights-53-changes.html
-
Have you come across any documented change in the way they are returning SERPs?
-
I've seen this as well. It seems like Google wants more diversity.