Duplicate content due to credit card testing
-
I recently launched a site, http://www.footballtriviaquestions.co.uk, which uses PayPal. To test the PayPal functionality I set up a zapto.org domain via a dynamic DNS (permanent IP) service that points directly to the computer I developed the website on.
It appears that Google has now indexed the zapto.org website.
Will this cause problems for my main website? The zapto.org site contains what is pretty much an exact duplicate of the content held on the main site.
I've looked in Google Webmaster Tools for the main website and it doesn't flag any duplicate content, but I'm currently not in the top 50 on Google for "football trivia questions", despite SEOmoz giving my home page an A rating. The page does rank at position 16 in both Yahoo and Bing. This seems odd to me, although I do have very few backlinks pointing to my site.
If the duplicate content is likely to be causing me problems, what would be the best way to knock the zapto.org results out of Google's index?
-
Thanks for the info, John; I've added the tags you suggested. Does anyone know how likely this is to have affected the main site's Google rankings?
Also, is there any way to tell whether Google has removed the zapto site from its index? I've been running 'site:<sitename>.zapto.org' searches over the past few days and have seen no decrease in the number of pages indexed.
-
Yes, it will be seen as duplicate content: the two sites are essentially identical.
First, I would immediately add a rel="canonical" tag to each indexed page on the zapto.org site, pointing to the corresponding page on your main website. I would also add a noindex meta tag to the zapto pages; see the example below.
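As a rough sketch (the zapto hostname and page path here are made up for illustration), the head of each test page would carry something like:

```html
<!-- In the <head> of a page on the test domain, e.g. http://example.zapto.org/quiz.html -->
<!-- Tell search engines the live page is the canonical version of this content -->
<link rel="canonical" href="http://www.footballtriviaquestions.co.uk/quiz.html" />
<!-- Ask search engines not to keep this test copy in their index -->
<meta name="robots" content="noindex, nofollow" />
```

The canonical URL should point at the matching page on the live site for each page, not just at the home page.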
You could also set up 301 redirects on the zapto site so that every request is sent to the corresponding page on your main website; a sketch follows.
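A minimal sketch, assuming the test machine runs Apache with mod_rewrite enabled (the hostname pattern is illustrative):

```apache
# .htaccess on the zapto test site
RewriteEngine On
# Any request that arrives on a *.zapto.org hostname...
RewriteCond %{HTTP_HOST} \.zapto\.org$ [NC]
# ...is permanently redirected to the same path on the live domain
RewriteRule ^(.*)$ http://www.footballtriviaquestions.co.uk/$1 [R=301,L]
```

A 301 also tells Google to consolidate the indexed zapto URLs onto the live pages, so the test results should drop out of the index over time.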
Lastly, you could simply take the zapto site offline or request its removal; it's just a subdomain of zapto.org, or at least that is how it appears in the search results.
Hope that helps