I have a duplicate content problem
-
The developer who built the website for my business, Premier Martial Arts Austin, disappeared and never set things up so that every URL begins with www., so I now have a duplicate content problem and don't want to be penalized for it. I tried to set the preferred version in Webmaster Tools but can't get it to verify that I'm the site owner. Any idea what to do?
-
Thanks for the help!
-
Hey Steve,
if you are searching for a good CMS, take a look at Contao (www.contao.org), a German open source CMS. I run it on a couple of sites, such as www.waescherei-suche.de. It's site-based and has everything out of the box (newsletter, calendar entries, user management, etc.). There are tons of plugins and upgrades for nearly all kinds of purposes. If you run a small site with a few pages, this might be what you are looking for.
Sebastian
EDIT: it's available in English.
-
Thank you for helping me through this issue. I was worried I'd face a penalty for the duplicate content and am glad to get it corrected. This site uses Drupal as its CMS, and it's been a bear trying to change, add, or update anything. There are a number of problems with this site: it doesn't output the description tags even though I have them filled out (you can confirm in View Source that no description tags appear). Other things I've wanted to change, like the Facebook button and alt tags, I haven't been able to change either, not by myself anyway.
I'm working on a new WordPress website to replace this one. Again, thank you for your help!
Sincerely,
Steve
-
Hi Steve.
Log into Google WMT and go to the Home page. From there press the "Add a Site" button. You will most likely see the non-www version of your site listed. Add the www version of your site.
Next, go through the process of confirming your site. When your site was originally set up your website developer probably added the code to your web page or web server, so it may already be there. Try to confirm the code without actually adding anything to your site or server. If you receive an error, then go ahead and follow Google's instructions.
Once your site is verified, you will then be able to go to Site Configuration > Settings > Set Preferred Domain.
Decide which version you prefer for your site and stick with it. It looks like you are already set up to use "www", so I would recommend using that URL style unless you have a specific reason to change it.
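For completeness, the usual mechanism behind a preferred domain is a server-side 301 redirect from one hostname to the other. A typical Apache .htaccess sketch for redirecting non-www to www (assuming mod_rewrite is enabled; the domain is a placeholder, so swap in your own):

```apache
RewriteEngine On
# Match requests whose Host header is the bare domain (no www), case-insensitively
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
# Permanently redirect them to the same path on the www hostname
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

A 301 (rather than 302) tells search engines the move is permanent, so link equity consolidates onto the www version.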
EDIT: I can't help offering a bit of feedback regarding your site. These are just suggestions, so feel free to disregard any you don't care for:
-
Change the "Facebook" block to the same color as Facebook pages. Your current green color blends in too much with everything else. It needs to stand out and be easy to find.
-
Update your copyright to 2011
-
Add a meta description tag to your pages. This tag won't help you rank better, but it will often be visible to users and may influence whether they click through to your site.
-
Add ALT tags to your images, and try to align your image names with your keywords when possible. Presently they have names such as "teens" where "teen martial arts" might more accurately describe the image, and help you rank better.
You have other opportunities, but the above will help get you moving in the right direction. If you have a chance, I highly recommend reading the SEO Beginners Guide as it contains a lot of great information.
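As a rough sketch of how you could audit the last two points (missing meta descriptions and missing alt text) yourself, here is a minimal checker using only Python's standard library; the sample HTML is made up for illustration:

```python
# Flag pages that lack a meta description or contain images without alt text.
from html.parser import HTMLParser

class SEOAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.has_meta_description = False
        self.images_missing_alt = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "description":
            self.has_meta_description = True
        if tag == "img" and not attrs.get("alt"):
            self.images_missing_alt.append(attrs.get("src", "(no src)"))

# Hypothetical page: no meta description, one image without alt text
html = """<html><head><title>Demo</title></head>
<body><img src="teens.jpg"><img src="kids.jpg" alt="kids martial arts class"></body></html>"""

audit = SEOAudit()
audit.feed(html)
print("meta description present:", audit.has_meta_description)  # False
print("images missing alt:", audit.images_missing_alt)          # ['teens.jpg']
```

In practice you would fetch each URL from your sitemap and feed the response body through the same parser.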
-
-
I would not worry too much about that info. You rank well for a term like "austin tx martial arts" so everything seems good.
On a side note: if you ever want to reuse content and check how similar two pages are, you could use a service like http://utext.rikuz.com/en/ to test it algorithmically.
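For a quick local check, you don't strictly need an external service; a sketch using Python's standard difflib (the two sample sentences are invented for illustration):

```python
# Rough text-similarity ratio between two page texts (0.0 = no overlap, 1.0 = identical)
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Return a 0..1 similarity ratio between two texts."""
    return SequenceMatcher(None, a, b).ratio()

page_a = "Premier Martial Arts offers classes for kids and adults in Austin."
page_b = "Premier Martial Arts offers classes for teens and adults in Austin."
print(similarity(page_a, page_b))  # close to 1.0 for near-duplicates
```

A dedicated service will be more robust (it normalizes markup, word order, and so on), but this is handy for spot checks.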
Hope this answers your questions.
-
Thank you for your help, but according to the Crawl Diagnostics for this website's SEOmoz campaign, I have 16 pages of duplicate content. Should I disregard this info?
-
A site:pmaaustin.com search showed 19 pages indexed, all of them with www.
Requesting a non-www version of a page returned:
HTTP/1.1 301 Moved Permanently
Date: Mon, 04 Jul 2011 05:16:49 GMT
Server: Apache
Location: http://www.pmaaustin.com/index.php?q=node/18
So everything looks good.
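The header check above can be expressed as a small sketch: given the status code and Location header returned by the non-www URL, confirm it is a permanent redirect onto the preferred www host. The sample values come from the response quoted above; the helper function is hypothetical.

```python
# Verify a response is a 301 onto the preferred hostname
from urllib.parse import urlparse

def is_canonical_redirect(status: int, location: str, preferred_host: str) -> bool:
    """True if the response permanently redirects onto the preferred host."""
    return status == 301 and urlparse(location).hostname == preferred_host

status, location = 301, "http://www.pmaaustin.com/index.php?q=node/18"
print(is_canonical_redirect(status, location, "www.pmaaustin.com"))  # True
```

The same check, run over every URL in a sitemap, quickly shows whether a site's hostname canonicalization is consistent.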