Duplicate Domain Listings Gone?
-
This morning, in several of the SERPs I track, I'm noticing that domains which formerly had multiple pages listed on pages 1-3 for the same keyword are now reduced to one listing per domain.
I'm hoping this change is permanent and widespread, as it's a significant boon to my campaigns, but I'm wondering if anyone else here has seen this in their SERPs or knows what I'm talking about...?
An example of what I mean by "duplicate domain listings" (in case my wording is confusing here):
Search term "Product Item"
Pages ranking:
domain-one.com/product-item.html
domain-one.com/product-item-benefits.html
etc...
-
Interesting, thanks for your insight as always, EGOL. Upon further research I have found a few double listings, but they have been for specific software, and both listings are from the developer's domain. So that makes sense to me.
Either way, it seems the algo is making exceptions for certain domains depending on the keyword and the domain's authority for that particular search term.
-
Based upon the topics that I watch, Google recently increased the domain diversity of the SERPs by cutting back on the number of double, triple, quadruple, etc. listings.
You can still get two or three pages showing on the first page of the SERPs, but it seems to be a lot harder. I had never considered "keyword cannibalization" to be a problem, but I am starting to see it for some of the keywords that I am after.
In my retail areas, informative content is now dominating the SERPs.
-
Maybe.
I have the #1 position for a corner of the market, but I could not get a second page onto the front page. Since I already had #1, I made a second site and then held #1 and #2; so I made another, and I now have three listings on the first page. Once you have #1, this seems to be the way to go.
-
Yeah, I remember that a long time ago they said they were going to do this, and then in a few of my SERPs it never took effect. So I complained here, and EGOL convinced me that "if I can't beat 'em, join 'em."
Well, it turned out I couldn't join 'em either, but I hate the concept anyway, so that's okay.
Anyway, for months and months these domains have had duplicate page listings on page 1 and beyond, and it's been killing me. Today they're gone. So perhaps they just turned the dial up on the algo?
-
Some time ago, Google made a change that did just this: it tried to get more domains onto the front page rather than many pages from the same domain.
This was a few years back, so I'm not sure what you are seeing today; it may be that the domains were penalized in some other way.
Related Questions
-
When subdomains take away traffic from search, will this help or hurt the main website's rankings?
Hi all, We have some landing pages on our subdomains which are getting ranked for our brand-related queries and taking away the traffic, as we don't have pages on the main site to rank for those search queries. I would like to know if this scenario hurts or helps our main website's ranking, as the traffic to the main website is getting diverted to the subdomain. Thanks
Algorithm Updates | vtmoz
-
New Website, Old Domain - Still Poor Rankings after 1 Year - Tagging & Content the Culprit?
I've run a live wedding band in Boston for almost 30 years that used to rank very well in organic search. I was hit by the Panda updates in August of 2014, and rankings literally vanished. I hired an SEO company to rectify the situation and create a new WordPress website, which launched January 15, 2015. I kept my old domain: www.shineband.com. Rankings remained pretty much non-existent. I was then told that 10% of my links were bad. After lots of grunt work, I sent in a disavow request in early June via Google Webmaster Tools. It's now mid-October, and rankings have remained pretty much non-existent. Without much experience, I got Moz Pro to help take control of my own SEO and identify some problems (over 60 pages of medium-priority issues: title tag character length and meta description). Also, some helpful reports by www.siteliner.com and www.feinternational.com both mentioned a duplicate content issue. I had old blog posts from a different domain (now 301 redirecting to the main site) migrated to my new website's internal blog, http://www.shineband.com/best-boston-wedding-band-blog/, as suggested by the SEO company I hired. It appears that by doing that, the older blog posts show as pages in the back end of WordPress with the poor meta and title issues, AS WELL AS probably creating a primary reason for the duplicate content issues (with links back to the site). Could this most likely be viewed as spamming or an (unofficial) SEO penalty? As SEO companies far and wide try daily to persuade me to hire them to fix my rankings, I can't say I trust much. My plan: put most of the old blog posts into the Trash via WordPress, rather than try to optimize each of the 60+ pages by adjusting tagging, titles, and duplicate content. Nobody really reads a quick post from 2009... I believe this could be beneficial and that those pages are more hurtful than helpful. Is that a bad idea, not knowing if those pages carry much juice? I realize my domain authority isn't great. No grand expectations, but is this a good move? What would be my next step afterwards? Some kind of resubmitting of the site? This has been painful, business has fallen, and I can't throw more dough at this. THANK YOU!
Algorithm Updates | Shineband
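For reference, the domain-level 301 redirect mentioned in the question above typically comes down to a few lines of server configuration. A minimal sketch for Apache with mod_rewrite, where olddomain.com is a hypothetical stand-in for the actual old domain (which isn't named in the question):

# .htaccess on the old domain: permanently redirect every request
# to the matching path on the main site (a 301 passes most link equity)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?olddomain\.com$ [NC]
RewriteRule ^(.*)$ http://www.shineband.com/$1 [R=301,L]
-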
Test site is live on Google but it duplicates existing site...
Hello - my developer has just put a test site up on Google which duplicates my existing site (the main URL is www.mydomain.com, and he's put it up at www.mydomain.com/test/). "...I've added /test/ to the disallowed urls in robots.txt" is how he put it. So all the site URLs are replicated and live on Google with /test/ added, so that he can block them in robots.txt. In all other ways the test site duplicates all content, etc. (until I get around to making some tweaks next week, that is). Is this a bad idea, or should I be OK? The last thing I want is a duplicate content or some other Google penalty just because I'm tweaking an existing website! Thanks in advance, Luke
Algorithm Updates | McTaggart
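A minimal sketch of the robots.txt rule the developer describes, assuming the file sits at the site root. One caveat worth knowing: Disallow only blocks crawling, and URLs that are already indexed can linger in the index, so password-protecting the test copy or adding a noindex directive is the safer fence:

# https://www.mydomain.com/robots.txt
User-agent: *
Disallow: /test/
-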
Understanding Google's PMD (Partial Match Domain) policy...
Hi, If your business name contains keywords, is that an issue? Some companies have keyword-based brand names... So what is Google's policy regarding EMDs (exact match domains) and PMDs? What happens when the company name has a keyword in it? If anyone could help clarify, I would appreciate it. Thanks, Ben
Algorithm Updates | bjs2010
-
Does Google use data from Gmail to penalize domains, and vice versa?
Has anyone noticed issues with Gmail deliverability and spam inboxing happening around the same time as other large Google updates? For example, if Google blasted your site in Panda or Penguin, has anyone seen them carry the same judgement over into Gmail deliverability and blacklist your domain?
Algorithm Updates | Eric_edvisors
-
Why am I not getting on Map Listing Results?
Greetings Mozzers, To my knowledge I'm doing everything that is "required" to start showing up in the map results when searching for something local; however, we never seem to appear in the map results (A, B, C, D, etc.). We have a Google+ page, have submitted to Google Places (received the PIN and entered it), made our address identical across high-authority map listing directories (GetListed.org), increased citations throughout the web, optimized keywords for categories in Google Places, added schema.org HTML markup for the address and meta address tags, and have consistent reviews being written by unique visitors on review sites (Yelp, Google+, etc.). Am I missing a major component? Any advice would be great, as I feel like I'm hitting many notes that should translate into a map result, even for keywords that aren't incredibly difficult, where we rank #1 above the map results every time. Thanks, and I hope all have a great weekend!
Algorithm Updates | MonsterWeb28
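For reference, the schema.org address markup the question mentions usually amounts to a small block of structured data on the contact or home page. A minimal JSON-LD sketch, with a LocalBusiness type and entirely placeholder business details (inline microdata attributes in the HTML are an equivalent alternative syntax):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Business",
  "telephone": "+1-617-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Boston",
    "addressRegion": "MA",
    "postalCode": "02110"
  }
}
</script>
-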
Is all duplicate content bad?
We were badly hit by Panda back in January 2012. Unfortunately, it is only now that we are trying to recover. CASE 1: We develop software products. We send out a 500-1000 word description of each product to various download sites so that they can add it to their product listings, so there are several hundred download sites with the same content. How does Google view this? Did Google penalize us for this reason? CASE 2: In the above case, the product description does not match any content on our website. However, there are several software download sites that copy and paste content from our website as the product description, so in this case the duplicate content does match our website. How does Google view this? Did Google penalize us for this reason? Along with all the download sites, there are also software piracy and crack sites that carry the duplicate content. So, should I remove duplicate content only from the piracy and crack sites, or also from the genuine download sites? Does Google reject all kinds of duplicate content, or does it depend on who hosts it? Confused 😞 Please help.
Algorithm Updates | Gautam.Jain
-
Domain Authority and Google keywords
Hi there, We have a domain authority of 33; one of our competitors has an authority of 10, yet they appear to rank higher in many keyword searches on Google. Is there a reason for this? Our site is 5 months old, and their site is over 3 years old. Thanks for your feedback 🙂
Algorithm Updates | PHDAustralia68