Do omitted results shown by Google always mean that a website has duplicate content?
-
My page was appearing in the top 10 Google results for a particular query, but now it only appears after clicking through to the "omitted results" link.
My website lists businesses by locality, and sometimes the results for different localities are the same because we show results from nearby areas when the locality a user searched has fewer than 15 businesses.
Will this be considered "duplicate content"? If so, what steps can be taken to resolve the issue?
-
A URL might go into the supplemental index when:
- The content is not unique.
- The page has no content at all, or very little.
- The page isn't judged to have meaningful content, e.g. sitemap, contact, or Terms and Conditions pages.
- Pages are missing titles/meta descriptions, or have duplicate ones.
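A quick way to spot that last issue is to fetch your pages and group URLs by title. A minimal sketch using only Python's standard library (the pages and URLs here are made-up placeholders; in practice you'd fetch the HTML yourself):

```python
from collections import defaultdict
from html.parser import HTMLParser

class TitleParser(HTMLParser):
    """Collects the text inside the <title> tag."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def find_duplicate_titles(pages):
    """pages: dict of url -> raw HTML. Returns titles shared by more than one URL."""
    by_title = defaultdict(list)
    for url, html in pages.items():
        parser = TitleParser()
        parser.feed(html)
        by_title[parser.title.strip()].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

# Hypothetical pages for illustration:
pages = {
    "/dallas": "<html><head><title>Office Space</title></head></html>",
    "/austin": "<html><head><title>Office Space</title></head></html>",
    "/plano":  "<html><head><title>Office Space in Plano</title></head></html>",
}
print(find_duplicate_titles(pages))
# {'Office Space': ['/dallas', '/austin']}
```

Any title that maps to more than one URL is a candidate for a rewrite or a canonical tag.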
-
Hi Prashant,
Yes - any URLs that differ are different URLs in Google's eyes, unless the only difference comes after a # symbol.
So if you have www.example.com/key#value12345 and www.example.com/key#valuexyzabc, Google sees these as the same URL, i.e. www.example.com/key. Everything after the # character is ignored.
Any other difference in the query string means the URL has changed, and if the pages at those URLs serve the same content, it's duplicate content.
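You can see the same fragment-stripping behaviour in Python's standard library, which splits a URL at the # the same way:

```python
from urllib.parse import urldefrag

# Everything after '#' is a fragment; stripping it leaves the URL Google indexes.
for u in ("www.example.com/key#value12345", "www.example.com/key#valuexyzabc"):
    url, fragment = urldefrag(u)
    print(url, "->", fragment)
# Both lines print the same base URL: www.example.com/key
```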
I hope this helps.
Cheers,
Jane
-
Thanks Jane,
Will the following URLs be considered two different URLs?
1. www.example.com/key=value1&key2=value2
2. www.example.com/key2=value2&key=value1
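The two URLs above differ only in parameter order. One way a site could avoid serving the same page under both forms is to normalize the query string before linking or redirecting. A sketch (using a hypothetical well-formed URL with a ? separator):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def normalize_query(url):
    """Sort query parameters so that parameter order no longer distinguishes URLs."""
    parts = urlsplit(url)
    query = urlencode(sorted(parse_qsl(parts.query)))
    return urlunsplit(parts._replace(query=query))

a = normalize_query("http://www.example.com/page?key=value1&key2=value2")
b = normalize_query("http://www.example.com/page?key2=value2&key=value1")
print(a == b)  # True: both normalize to the same URL
```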
-
Thanks David,
I found that a few of these URLs were not crawled by Googlebot for a month or so. When I checked the last-crawled date using the "cache:" operator, I saw these pages were crawled again only recently, which is probably why they're back in the top 10 results (the main index).
I have one question: when does a URL go into the supplemental index?
-
Hi Prashant,
This sounds like removal due to duplication rather than DMCA - the omission is usually noted as being because of DMCA notices if they are the reason, e.g. http://img.labnol.org/images/2008/07/googlesearchdmcacomplaint.png
Google likely sees these as duplicates, or near-dupes, as David has said.
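Google's exact method isn't public, but near-duplicate detection is often illustrated with shingling: compare the sets of overlapping word n-grams ("shingles") from two pages and measure their Jaccard similarity. A toy sketch with made-up page text:

```python
def shingles(text, n=3):
    """Set of overlapping n-word sequences (shingles) from the text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a, b):
    """Similarity of two shingle sets: |A ∩ B| / |A ∪ B|."""
    return len(a & b) / len(a | b)

page_a = "businesses in Dallas plus nearby businesses shown to fill the list"
page_b = "businesses in Plano plus nearby businesses shown to fill the list"
score = jaccard(shingles(page_a), shingles(page_b))
print(round(score, 2))  # a high score suggests near-duplicate pages
```

Two locality pages that share most of their listings would score close to 1.0, which is exactly the situation the OP describes.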
-
Digital Millennium Copyright Act being used here? No.
OP, it does sound like you have duplicate content issues. See what you can do to make those omitted pages more unique.
-
It's most likely because someone has filed a DMCA takedown against that Google search result. Log in to your Google Webmaster Tools account and you should see a notification from Google about it.