Duplicate Content
-
I have a question about duplicate content (auto-generated text).
Will Google consider Page 1 and Page 2 below as duplicate content?
Page 1.
You will find all the Amazon coupon codes and Amazon discount codes currently available listed below, if Amazon doesn't currently have any coupons available you may want to check for Amazon deals or find related coupon codes or promotional codes for similar online stores selling the same products as amazon.
We always have the latest coupon codes for Amazon which are updated daily, so if you can't find any Amazon coupons here then you won't find them anywhere else.
Shop online today at Amazon, and take advantage of the coupon codes that Amazon currently has on offer, these coupon codes, offer codes, and promo codes for Amazon may never be available again.
Page 2.
You will find all the Target coupon codes and Target discount codes currently available listed below, if Target doesn't currently have any coupons available you may want to check for Target deals or find related coupon codes or promotional codes for similar online stores selling the same products as Target.
We always have the latest coupon codes for Target which are updated daily, so if you can't find any Target coupons here then you won't find them anywhere else.
Shop online today at Target, and take advantage of the coupon codes that Target currently has on offer, these coupon codes, offer codes, and promo codes for Target may never be available again.
-
Sent you a PM
-
Hi,
Thanks so much!
Is it possible to get in touch with you by email or Skype? 930240C809194680B0F8E988F699E00B.PROTECT # WHOISGUARD # COM (email used for thatswhatphilsaid)
-
Will each page with 300 words of unique content be fine in Google's eyes?
If you have 300 words on each page, as long as it's useful content that people are sticking around to read, then you should be okay. Your end goal should be to provide value to your visitors. If 300 words is plenty of content for the subject of your pages, then you're okay. If you have a blog about quantum physics and you only write 300 words per page... you might not be so okay anymore.
After the text is removed, is there any chance to recover from Panda?
If your site is penalized by Panda, and you make adjustments to fix the issues you were once penalized for, then yes, you can certainly recover. It's possible that duplicate content isn't your only issue, and there may be more to fix. Again, this is assuming you're penalized by Panda. I found a really good post about Panda recovery a couple of weeks ago. Lucky for you, I bookmarked it! http://www.ventureharbour.com/panda-recovery-a-guide-to-recovering-googles-panda-update/
What about page titles and meta descriptions?
I wouldn't personally write my titles and meta descriptions like that. It is probably a good idea to vary them up and make them a bit more unique from one another. If I'm being totally honest, I think your example title tags might work for Google. That would be up to you, though, if you're willing to take that chance. If everything else on your site is fantastic, and your only issue is those types of title tags, I really don't think Google would give you a problem. Either way, the best thing to do (obviously) is make them more unique. I'm not a personal fan of them being too similar, but I have seen it done like that on a site before and the pages ranked just fine (they were pretty low-competition keywords, though). Edit: This is the only question I'm not that sure about... your examples might be okay, but I don't want to give you bad advice.
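To make "more unique" a little more concrete, here is a rough sketch (nothing Moz- or Google-specific; the page URLs, titles, and the 0.8 threshold are invented for illustration) of how you might flag title tags that are nearly identical to one another, using Python's standard difflib:

```python
# Rough sketch: flag pages whose title tags are nearly identical.
# The URLs, titles, and the 0.8 threshold are made-up examples, not a Google rule.
from difflib import SequenceMatcher
from itertools import combinations

titles = {
    "/amazon-coupons": "Amazon Coupons and Discount Codes for March 2014",
    "/target-coupons": "Target Coupons and Discount Codes for March 2014",
}

for (url_a, title_a), (url_b, title_b) in combinations(titles.items(), 2):
    ratio = SequenceMatcher(None, title_a, title_b).ratio()
    if ratio > 0.8:
        print(f"{url_a} and {url_b} have very similar titles ({ratio:.0%} match)")
```

The same check works for meta descriptions; the point is simply to catch templated pages where only the store name changes.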
This is my second question on Moz and you answered both of them.
Hooray! I hope I'm helping you out. I've made it a goal of mine to make it to the top 50 in Moz Points before the end of 2014.
-
Thanks Philip,
So I need to get rid of this kind of text (it was an example).
Will each page with 300 words of unique content be fine in Google's eyes?
After the text is removed, is there any chance to recover from Panda?
What about page titles and meta descriptions? For example:
Amazon Coupons and Discount Codes for March 2014
Latest Amazon coupons, promo codes and discounts for you to save! Last updated: March 2014.
Target Coupons and Discount Codes for March 2014
Latest Target coupons, promo codes and discounts for you to save! Last updated: March 2014.
Is this still duplicate content?
This is my second question on Moz and you answered both of them.
-
The answer is a big, fat, juicy, YES. That is the epitome of duplicate content.
You need to write content that is completely unique from the other page. You cannot trick Google; the Panda will bite you hard.
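To see why pages like the Amazon and Target examples above read as near-duplicates, here is a minimal word-shingle comparison sketch. It is purely illustrative: Google's actual duplicate detection is not public and is far more sophisticated, and the helper functions below are invented for this example.

```python
# Rough near-duplicate check using word shingles and Jaccard overlap.
# Purely illustrative -- not how Google actually measures duplication.

def shingles(text, size=3):
    """Return the set of overlapping `size`-word sequences in `text`."""
    words = text.lower().split()
    return {" ".join(words[i:i + size]) for i in range(len(words) - size + 1)}

def jaccard(a, b):
    """Jaccard overlap of two sets: 0.0 = nothing shared, 1.0 = identical."""
    return len(a & b) / len(a | b) if (a | b) else 0.0

# Paste the full text of each page here; these are truncated placeholders.
page_1 = "You will find all the Amazon coupon codes and Amazon discount codes currently available listed below ..."
page_2 = "You will find all the Target coupon codes and Target discount codes currently available listed below ..."

overlap = jaccard(shingles(page_1), shingles(page_2))
print(f"Shingle overlap: {overlap:.0%}")
# With the full templated pages, where only the store name changes, the
# overlap comes out high -- which is what duplicate content looks like
# to a crawler.
```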
-
Moz Tools
Chat with the community about the Moz tools.
-
SEO Tactics
Discuss the SEO process with fellow marketers
-
Community
Discuss industry events, jobs, and news!
-
Digital Marketing
Chat about tactics outside of SEO
-
Research & Trends
Dive into research and trends in the search industry.
-
Support
Connect on product support and feature requests.
Related Questions
-
Duplicated content on the product pages
Hi, do you think these pages have duplicate content?
https://www.nobelcom.com/Afghanistan-phone-cards/from-Romania-235-2.html
https://www.nobelcom.com/Afghanistan-phone-cards-2.html
https://www.nobelcom.com/Afghanistan-Cell-phone-cards-401.html
https://www.nobelcom.com/Afghanistan-Cell-phone-cards/from-Romania-235-401.html
And also, how much impact will it have on a Panda update? I'm trying to figure out whether all the product pages (which are set up the same way as the ones above) are the reason for a Panda penalty.
On-Page Optimization | Silviu
-
301 redirected Duplicate Content, still showing up as duplicate after new crawl.
We launched a site where key landing pages were not showing up in Google. After running the SEOmoz crawl, it returned a lot of duplicate pages, which may explain this. The actual URL of the page is /design, and it was telling me the following were dupes:
/design/family-garden-design
/design/small-garden-design
/design/large-rural-garden-design
/Design
All of these URLs were in fact pointing to the /design landing page. I 301 redirected all of the pages, so they all now resolve to /design. After running another crawl the day after doing this, it's still showing up as duplicate content on SEOmoz. Does SEOmoz evaluate the new changes right away?
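One way to double-check the redirects themselves, independent of any crawler's schedule, is to request each old URL and confirm it returns a single 301 whose Location header points at /design. A minimal sketch, assuming the third-party requests package and a placeholder domain (the real domain isn't given in the question); crawl-based tools will only pick the change up on their next crawl in any case:

```python
# Check that each old garden-design URL returns a 301 pointing at /design.
# The domain is a placeholder; requires `requests` (pip install requests).
import requests

BASE = "https://www.example.com"  # placeholder domain
old_paths = [
    "/design/family-garden-design",
    "/design/small-garden-design",
    "/design/large-rural-garden-design",
    "/Design",
]

for path in old_paths:
    response = requests.get(BASE + path, allow_redirects=False, timeout=10)
    location = response.headers.get("Location", "")
    print(f"{path}: {response.status_code} -> {location}")
    # Expect a 301 with Location pointing at /design; a 302, a 200, or a
    # chain of hops is worth fixing before the next crawl.
```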
On-Page Optimization | iterate
-
What content is appropriate here?
Hello, I've got a dozen good articles in my article section, but nobody is landing on them. Should we write articles about our products? Won't that compete with our product pages?
On-Page Optimization | BobGW
-
Duplicate meta descriptions
Hi all, I'm using Yoast's SEO plugin, and when I run an On-Page Report Card here on SEOmoz it says there are two description tags. I've been trying to fix this but can't (I'm new!). Does anyone have any ideas on this? Thanks, Elaine
On-Page Optimization | elaineryan
-
Article on site and distribution, is it duplicate content?
I was always taught to place all original articles on the site, let them get indexed by Google, and then put them out for distribution through various press release outlets. With the latest Penguin update, how does this practice work out concerning duplicate content? In theory, I wrote the article, so I should get credit for it on my site first, then push it through various distribution outlets to get it out to my targeted audience in my niche. Typing out loud, I would tend to think that if the article is on my site first then I would get credit, and any others following would be hit for duplicate content if in fact Google considered it a dupe violation. Any input on this? Am I on track, or am I heading for a train wreck?
On-Page Optimization | anthonytjm
-
How could I avoid the "Duplicate Page Content" issue on the search result pages of a webshop site?
My webshop site was just crawled by Roger, and it found 683 "Duplicate Page Content" issues. Most of them are result pages of different product searches that are not really identical, but very similar to each other. Do I have to worry about this? If yes, how could I make the search result pages different? Is there any solution for this? Thanks, Zoltan
On-Page Optimization | csajbokz
-
Blog content on homepage - Dupe Content Penalty?
Hi all, I am working on a website which has a blog at domain.com/blog/. On the homepage they are currently looping the latest 5 blog posts in a 'Latest News' tab. Is this therefore classed as dupe content, and would it be penalized by Google? Should I recommend they use excerpts instead of full articles and simply loop the excerpts on the homepage? The website is built on WordPress. Thanks, Woody
On-Page Optimization | seowoody
-
Magento Layered Navigation & Duplicate Content
Hello dear SEOmoz, I would like to ask for your help with something I am not sure of. Our ecommerce website is built with Magento. I have found many problems so far, and I know that there will be many more in the future. Currently, I am trying to find the best way to deal with the duplicate content that is produced by the layered navigation (size, gender, etc.). I have done a lot of research in order to understand which might be the best practice, and I found the following options:
1. Block layered navigation URLs in Google Webmaster Tools (apparently this works for Google only)
2. Block these URLs with the robots.txt file
3. Make the links no-follow
4. Make the links JavaScript links from Magento
5. Avoid including these links in the XML sitemap
6. Avoid including these links in the A-Z product index
7. Canonical tag
8. Meta tags (noindex, nofollow)
Question: If I turn the layered navigation links into JavaScript links from the Magento admin, the layered navigation links are still found by the crawlers, but they look like this: http://www.mysite.com/# instead of: http://www.mysite.com/girls-basics.html?gender_filte... Can these new URLs (http://www.mysite.com/#) solve the duplicate content problems with the layered navigation, or do I need to implement other practices too to make sure that everything is done right? Kind regards, Stefanos Anastasiadis
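For the canonical-tag and meta-robots options in the list above, a quick way to see what a filtered URL currently declares is to fetch it and inspect its head. A minimal sketch, assuming the third-party requests and beautifulsoup4 packages; the filter URL below is hypothetical, since the real query string is truncated in the question:

```python
# Spot-check what a layered-navigation (filtered) URL currently declares.
# The filter URL is hypothetical; requires `requests` and `beautifulsoup4`.
import requests
from bs4 import BeautifulSoup

filtered_url = "http://www.mysite.com/girls-basics.html?gender_filter=girls"  # hypothetical

html = requests.get(filtered_url, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

canonical = soup.find("link", rel="canonical")
robots_meta = soup.find("meta", attrs={"name": "robots"})

print("canonical:", canonical.get("href") if canonical else "none")
print("meta robots:", robots_meta.get("content") if robots_meta else "none")
# A canonical pointing at the unfiltered category URL, or a noindex,follow
# robots meta, is what keeps these filter variations from competing with
# the main category page.
```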
On-Page Optimization | alexandalexaseo