Duplicate Content
-
I have a question about duplicate content (auto-generated text).
Will Google consider Page 1 and Page 2 below as duplicate content?
Page 1.
You will find all the Amazon coupon codes and Amazon discount codes currently available listed below, if Amazon doesn't currently have any coupons available you may want to check for Amazon deals or find related coupon codes or promotional codes for similar online stores selling the same products as amazon.
We always have the latest coupon codes for Amazon which are updated daily, so if you can't find any Amazon coupons here then you won't find them anywhere else.
Shop online today at Amazon, and take advantage of the coupon codes that Amazon currently has on offer, these coupon codes, offer codes, and promo codes for Amazon may never be available again.
Page 2.
You will find all the Target coupon codes and Target discount codes currently available listed below, if Target doesn't currently have any coupons available you may want to check for Target deals or find related coupon codes or promotional codes for similar online stores selling the same products as Target.
We always have the latest coupon codes for Target which are updated daily, so if you can't find any Target coupons here then you won't find them anywhere else.
Shop online today at Target, and take advantage of the coupon codes that Target currently has on offer, these coupon codes, offer codes, and promo codes for Target may never be available again. -
Sent you a PM
-
Hi,
Thanks so much!
Is it possible to get in touch with you by email or Skype? 930240C809194680B0F8E988F699E00B.PROTECT # WHOISGUARD # COM (email used for thatswhatphilsaid)
-
Will each page with 300 unique words be fine in Google's eyes?
If you have 300 words on each page, as long as it's useful content that people are sticking around to read, then you should be okay. Your end goal should be to provide value to your visitors. If 300 words is plenty of content for the subject of your pages, then you're okay. If you have a blog about quantum physics and you only write 300 words per page... you might not be so okay anymore.
After the text is removed, is there any chance to recover from Panda?
If your site is penalized by Panda and you make adjustments to fix the issues you were penalized for, yes, you can certainly recover. It's possible that duplicate content isn't your only issue, and there may be more to fix. Again, this is assuming you're penalized by Panda. I found a really good post about Panda recovery a couple of weeks ago. Lucky for you, I bookmarked it! http://www.ventureharbour.com/panda-recovery-a-guide-to-recovering-googles-panda-update/
What about page title and page meta description?
I wouldn't personally write my titles and meta descriptions like that. It's probably a good idea to vary them and make them a bit more unique from one another. If I'm being totally honest, I think your example title tags might work for Google; that would be up to you, though, if you're willing to take that chance. If everything else on your site is fantastic and your only issue is those types of title tags, I really don't think Google would give you a problem. Either way, the best thing to do (obviously) is make them more unique. I'm not a personal fan of them being too similar, but I have seen it done like that on a site before and the pages ranked just fine (they were pretty low-competition keywords, though). Edit: This is the only question I'm not that sure about... your examples might be okay, but I don't want to give you bad advice.
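For illustration only - the wording below is invented for this example and is not from the thread - two store pages' title and description tags could be differentiated along these lines:
<!-- Page 1 (hypothetical wording): -->
<title>Amazon Coupon Codes - Verified Promo Codes for March 2014</title>
<meta name="description" content="Hand-tested Amazon promo codes, free-shipping offers and daily deals, checked every morning in March 2014.">
<!-- Page 2 (hypothetical wording): -->
<title>Target Discount Codes - Weekly Ad Deals and Promo Codes, March 2014</title>
<meta name="description" content="Browse this week's Target coupons, printable discount codes and clearance finds, refreshed daily.">
The point of the sketch is simply that each pair shares nothing beyond the store name and the date.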
This is my second question on Moz, and you answered both of them.
Hooray! I hope I'm helping you out. I've made it a goal of mine to make it into the top 50 in Moz points before the end of 2014. -
Thanks, Philip,
So I need to get rid of this kind of text (it was an example).
Will each page with 300 unique words be fine in Google's eyes?
After the text is removed, is there any chance to recover from Panda?
What about page title and page meta description?
Amazon Coupons and Discount Codes for March 2014
Latest Amazon coupons, promo codes and discounts for you to save! Last updated: March 2014.
Target Coupons and Discount Codes for March 2014
Latest Target coupons, promo codes and discounts for you to save! Last updated: March 2014.
Still duplicate content?
This is my second question on Moz, and you answered both of them.
-
The answer is a big, fat, juicy YES. That is the epitome of duplicate content.
You need to write content that is completely unique from the other page. You cannot trick Google; the Panda will bite you hard.
Related Questions
-
Does hover-over content index well?
I notice increasing cases of portfolio-style boxes in site designs (especially WordPress templates) where you have an image and text appears on hover (sorry for my basic terminology). Does this text which appears on hover have much search engine value, or, because it doesn't immediately appear on page load, does it carry slightly less weight, like tabbed content? Any advice appreciated. Thanks, Neil
On-Page Optimization | neilhenderson -
Duplicate Content
I'm currently working on a site that sells appliances. Currently, there are thousands of "issues" with this site, many of them dealing with duplicate content. Now, the product pages can be viewed in "List" or "Grid" format. As lists, they have very little in the way of content. My understanding is that the duplicate content arises from different URLs serving the same content. For instance, the site might have a different URL when told to display 9 items than when told to display 15. This could then be solved by inserting rel=canonical. Is there a way to crawl a site and get a list of all possible duplicates? This would be much easier than slogging through every iteration of the options and copying down the URLs. Also, is there anything I might be missing in terms of why there is duplicate content? Thank you.
On-Page Optimization | David_Moceri -
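As a reference for the list/grid question above, a minimal sketch of the rel=canonical approach it mentions; the URL and parameters are invented for illustration:
<!-- Placed in the <head> of every list/grid/page-size variation,
     e.g. /appliances?view=grid&show=15, all pointing at one preferred URL: -->
<link rel="canonical" href="http://www.example.com/appliances/">
Each display variation then consolidates to the one URL you actually want indexed.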
Issue: Duplicate Page Content
Hello SEO experts, I'm facing a duplicate page content issue on my website. My website is an apartment rental website; when clients search apartments for availability, it automatically generates URLs with the same content. I've already blocked these URLs in the robots.txt file but I'm still facing the same issue. Kindly guide me on what I can do. Here are some example links. http://availability.website.com/booking.php?id=17&bid=220
On-Page Optimization | KLLC
http://availability.website.com/booking.php?id=17&bid=242
http://availability.website.com/booking.php?id=18&bid=214
http://availability.website.com/booking.php?id=18&bid=215
http://availability.website.com/booking.php?id=18&bid=256
http://availability.website.com/details.php?id=17&bid=220
http://availability.website.com/details.php?id=17&bid=242
http://availability.website.com/details.php?id=17&pid=220&bid=220
http://availability.website.com/details.php?id=17&pid=242&bid=242
http://availability.website.com/details.php?id=18&bid=214
http://availability.website.com/details.php?id=18&bid=215
http://availability.website.com/details.php?id=18&bid=256
http://availability.website.com/details.php?id=18&pid=214&bid=214
http://availability.website.com/details.php?id=18&pid=215&bid=215
http://availability.website.com/details.php?id=18&pid=256&bid=256
http://availability.website.com/details.php?id=3&bid=340
http://availability.website.com/details.php?id=3&pid=340&bid=340
http://availability.website.com/details.php?id=4&bid=363
http://availability.website.com/details.php?id=4&pid=363&bid=363
http://availability.website.com/details.php?id=6&bid=367
http://availability.website.com/details.php?id=6&pid=367&bid=367
http://availability.website.com/details.php?id=8&bid=168
http://availability.website.com/details.php?id=8&pid=168&bid=168
Thanks and waiting for your response. -
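For the availability question above, a hedged sketch of one common fix: emitting a rel=canonical tag from the PHP template so the redundant pid variations (which the example links suggest merely duplicate bid) consolidate onto one URL per listing. File and parameter names mirror the example links, but the code itself is illustrative only:
<?php
// Hypothetical snippet near the top of the <head> output in details.php:
// keep id and bid, drop the redundant pid, and declare one canonical URL.
$id  = isset($_GET['id'])  ? (int) $_GET['id']  : 0;
$bid = isset($_GET['bid']) ? (int) $_GET['bid'] : 0;
$canonical = 'http://availability.website.com/details.php?id=' . $id . '&bid=' . $bid;
echo '<link rel="canonical" href="' . htmlspecialchars($canonical) . '">' . "\n";
?>
Whether id plus bid really is the canonical pair depends on what pid means in this booking system, so that assumption should be checked before rolling it out.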
Does schema.org assist with duplicate content concerns?
The issue of duplicate content has been well documented, and there are lots of articles suggesting to noindex archive pages in WordPress-powered sites. Schema.org allows us to mark up our content, including marking a component's URL. So my question, simply, is: is no-indexing archive (category/tag) pages still relevant when considering duplicate content? These pages are in essence a list of articles, which can be marked up as an article or blog posting, with the URL of the main article and all the other cool stuff the schema gives us. Surely Google et al. are smart enough to recognise these article listings as gateways to the main content, therefore removing duplicate content concerns. Of course, whether or not doing this is a good idea will be subjective and based on individual circumstances - I'm just interested in whether or not the search engines can handle this appropriately.
On-Page Optimization | MarkCA -
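For the schema.org question above, a minimal microdata sketch of the kind of listing markup it describes; the markup and URL are invented for illustration and would need adapting to the theme's actual archive template:
<!-- One entry in a category/tag archive listing: -->
<article itemscope itemtype="http://schema.org/BlogPosting">
  <h2 itemprop="headline">
    <a itemprop="url" href="http://www.example.com/2014/03/post-title/">Post title</a>
  </h2>
  <p itemprop="description">Short excerpt of the post...</p>
</article>
The itemprop="url" property is what the question refers to as marking a component's URL.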
Duplicate meta descriptions
Hi all, I'm using Yoast's SEO plugin, and when I run an On-Page report card here on SEOmoz it says there are 2 description tags. I've been trying to fix this but can't (I'm new!). Does anyone have any ideas on this? Thanks, Elaine
On-Page Optimization | elaineryan -
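One common cause of that report - stated here as an assumption about this particular site, not a diagnosis - is a theme that hard-codes its own description tag while the SEO plugin outputs another, so the rendered head ends up with two, roughly like this invented example:
<head>
  <title>Example Page Title</title>
  <meta name="description" content="Description hard-coded in the theme's header template">
  <!-- ...and a second one generated by the SEO plugin: -->
  <meta name="description" content="Description written in the plugin's meta box">
</head>
Viewing the page source and searching for name="description" will show whether that is what is happening.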
Prevent indexing of dynamic content
Hi folks! I discovered a bit of an issue with a client's site. Primarily, the site consists of static HTML pages; however, within one page (a car photo gallery), a line of PHP code dynamically generates 100 or so pages comprising the photo gallery - all with the same page title and meta description. The photo gallery script resides in the /gallery folder, which I attempted to block via robots.txt - to no avail. My next step will be to include a robots noindex tag within the head section of the HTML page, but I am wondering if this will stop the bots dead in their tracks, or will they still be able to pick up on the pages generated by the call to the PHP script residing a bit further down on the page? Dino
On-Page Optimization | SCW -
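For the dynamic gallery question above, a hedged sketch (file name and parameters invented) of the two usual ways to keep script-generated pages out of the index:
<?php
// Hypothetical top of the gallery script, e.g. /gallery/photo.php.
// Option 1: send a noindex directive as an HTTP header, before any output.
header('X-Robots-Tag: noindex, follow');

// Option 2: print a robots meta tag inside the generated <head>.
echo '<meta name="robots" content="noindex, follow">' . "\n";
?>
One caveat worth noting: either directive is only seen if crawlers are allowed to fetch the page, so a robots.txt block on /gallery can actually stop them from ever taking effect.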
Duplicate content and the Moz bot
Hi, does our little friend at SEOmoz follow the same rules as the search engine bots when he crawls my site? He has sent thousands of errors back to me with duplicate content issues, but I thought I had removed these with nofollow, etc. Can you advise, please?
On-Page Optimization | JamieHibbert -
Duplicate Content using templates
Hi, our website is designed using a template, which means the header and footer are consistent across all pages. Only the body content is unique on each page. Is the Google bot able to see that the header and footer content is defined by the common template? Will this have any impact in terms of duplicate content? For example, we have a two-line text in the footer that summarizes the services we provide. Because the same text is in the footer of all pages, I am concerned about creating duplicate content. Finally, does it make sense to include keywords in the header and footer of the template? Will it have any positive or negative SEO impact?
On-Page Optimization | petersen