What's the best way to tackle duplicate pages in a blog?
-
We installed a WP blog on a website, and the URLs below are just one example. All of them lead to the same content. What's the best way to resolve this?
http://www.calmu.edu/blog/
http://www.calmu.edu/blog/calmu-business-spotlight-veev/
http://www.calmu.edu/blog/category/business-buzz/
Thanks..
-
No, almost all blogs have this kind of problem; Joomla does too. But in Joomla I'm more accustomed to solving it. On Joomla I use YOOtheme ZOO, which is a great blog tool, and for SEO I use AceSEF, which integrates with ZOO. I can fix this duplicate problem easily in the AceSEF ZOO extension.
But with WP you will find a lot of SEO plugins that will solve this. :D
-
Thanks Joao. Are you saying blogs on the Joomla platform do not have this problem?
-
Do you think it would work? What are the negative aspects of using a redirect?
-
I did see this page, but I was unsure how to guide my developer on adding canonical links. Can you help?
-
Thanks a lot @ParagonDigital. This issue was detected in the SEOmoz crawl of the website, and it lists all of these pages as duplicates.
-
Hi Sangeeta,
It looks like you are using the WordPress All in One SEO Pack, which should take care of most duplicate pages in categories and archives using canonicalization.
Both of these pages:
http://www.calmu.edu/blog/calmu-business-spotlight-veev/
http://www.calmu.edu/blog/category/business-buzz/
have this line added by the SEO plugin, which tells the bots that the page in the blog directory is the real page and that the category and archive versions should be treated as the same page:
rel="canonical" href="http://www.calmu.edu/blog/calmu-business-spotlight-veev/" />
The home page of a blog is always going to have some duplication with the most recent posts. The only thing I know of that you could do for that is to have WordPress display an excerpt for each post with a "read more" link on the home page, instead of the whole post.
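For reference, a minimal sketch of what that change could look like in the theme's home-page loop (index.php or home.php); this is a generic WordPress loop for illustration, not your actual theme code:
<?php if ( have_posts() ) : while ( have_posts() ) : the_post(); ?>
  <h2><a href="<?php the_permalink(); ?>"><?php the_title(); ?></a></h2>
  <?php the_excerpt(); // prints a short blurb instead of the full post content ?>
  <a href="<?php the_permalink(); ?>">Read more</a>
<?php endwhile; endif; ?>
Alternatively, inserting the <!--more--> tag into each post body gets the same truncation on the home page without editing the theme.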
-
I'd make sure you have canonical links too. Google has a page dedicated to them here: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=139394
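If it helps when briefing your developer: the element is a single line in the <head> of each duplicate URL, pointing at the version you want indexed. Using the post from this thread as the example:
<!-- placed in the <head> of the category/archive copies of the post -->
<link rel="canonical" href="http://www.calmu.edu/blog/calmu-business-spotlight-veev/" />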
-
Usually, creating a 301 redirect in your .htaccess file is an effective way of dealing with duplicate content. You could use a free redirect generator if you're not too familiar with writing .htaccess rules.
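As a rough sketch only (the duplicate path below is made up for illustration; you would substitute the real duplicate URL and its preferred target), a single rule in an Apache .htaccess file looks like this:
# hypothetical example: permanently redirect one duplicate URL to the preferred version
Redirect 301 /blog/some-duplicate-path/ http://www.calmu.edu/blog/calmu-business-spotlight-veev/
Note that on a WordPress blog you generally wouldn't redirect the category and archive pages themselves, since visitors use them for navigation; canonical tags are the gentler fix for those.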
-
I think this plugin solves your problem. I don't use WordPress a lot (I prefer Joomla), but I've heard good things about this one for WP:
http://wordpress.org/extend/plugins/platinum-seo-pack/
You can also use a script like this in header.php to do it; the tags echoed below are just one way to fill it in (a self-referencing canonical on the main views, and noindex everywhere else), so adjust them to your needs:
if ( ( is_home() && ( $paged < 2 ) ) || is_single() || is_page() || is_category() ) {
    // example: a self-referencing canonical tag for the page being viewed
    echo '<link rel="canonical" href="http://' . $_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI'] . '" />';
} else {
    // example: keep other duplicate views (archives, paged views, etc.) out of the index
    echo '<meta name="robots" content="noindex,follow" />';
}
I would prefer to use the plugin, because you can do other things that are good for on-page SEO with it.
Related Questions
-
A single page on my client's website is not getting crawled, and its new changes are not being indexed. What could be the possible reason?
I made several changes on the client's website on different pages: changed titles, added content on a few pages, and moved the blog from a subdomain to a subdirectory. Everything is crawled, but there is one page on the website (not part of the blog) that isn't getting crawled by Google and picking up changes. The last crawl of the website was 2 days ago, whereas that page was last crawled on 30th Sep. I just wanted to know the possible reasons, and has anyone encountered this before?
On-Page Optimization | | MoosaHemani0 -
Should I remove 'local' landing pages? Could these be the cause of a traffic drop (duplicate content)?
I have a site that gets most of its traffic from reasonably competitive keywords, each with its own landing page. In order to gain more traffic I also created landing pages for counties in the UK, and then for towns within each county. Each county has around 12 town landing pages within it. This means I've added around 200 extra pages to my site to try and generate more traffic from long-tail keywords. I think this may have caused an issue, in that it's impossible for me to create unique content for each town/county, so I took a 'shortcut' by creating unique content for each county and using the same content for the towns within it. That means I have lots of pages with the same content and only slightly different page titles, varying by town name. I've duplicated this over about 15 counties, meaning I have around 200 pages with only about 15 actually unique pages among them. I think this may actually be harming my site. These pages have been indexed for about a year, and I noticed a drop in traffic of about 50% around 6 months ago. Having looked at my analytics, these town and county pages actually only account for about 10% of traffic. My question is: should I remove these pages, and by doing so should I expect an increase in traffic again?
On-Page Optimization | | SamCUK0 -
What's better for SEO: a page per review or a page with all reviews?
I was wondering what's better for SEO. We have a platform where consumers can read and write reviews. The question is: is it better to have one page per company with all the reviews on it, or should we have separate pages for each review? Example: iTunes has a company page with all the reviews on it, but not the whole review; you have to click through to view the whole review (a new page), which is our current situation. What if we placed the whole reviews on the company page, so you don't have specific pages for the reviews? Hopefully someone can help us out. Contact me if it's not clear or you want more detailed information. Kind regards
On-Page Optimization | | MozzieJr0 -
Description tag not showing in the SERPs because the page is blocked by robots, but the page isn't blocked. Any help?
While checking some SERP results for a few pages of a site this morning, I noticed that some pages were returning this message instead of a description tag: "A description for this result is not available because of this site's robots.txt." The odd thing is the page isn't blocked in robots.txt. The page is using the Yoast SEO plugin to populate meta data, though. Anyone else had this happen and have a fix?
On-Page Optimization | | mac22330 -
H2's vs Meta description
In some of my SERP results the H2s are showing up instead of the meta description. I have read that H2s aren't really valid anymore. Can someone clarify this for me?
On-Page Optimization | | dhanson240 -
Long tail traffic - what is the best way to go back and add focus to repetitive long tail keywords?
Hey everybody, so our niche doesn't have a million and a half searches per month, which makes a handful of visitors look mighty enticing to a CMO. Our price point is very high too, so to the question of whether it's worth taking the time to put a whole new content strategy in line for a few new visitors, the answer is yes. Now's the hard part: how on earth do I make 1,000 pages for similar topics? Is making new pages the best way to go about this? (Probably so, right? It's the only thing I can see that would certainly increase the likelihood of being more relevant; plus, if I don't, I will be missing out on the benefits of beefing up our site AND the opportunity to more specifically answer a user's query.) With phrases like "keyword" and "aftermarket keyword," the searcher is asking for two totally separate collections of results. I'm always reading about the importance of being there throughout the buyer's complete purchasing/research process, which makes me think that considering doing anything other than creating unique pages is simply missing out. Suggestions? Massive content strategy help? Anybody? Thanks, TA
On-Page Optimization | | TylerAbernethy0 -
How do I avoid duplicate content and page title errors when using a single CMS for a website?
I am currently hosting a client site on a CMS with both a Canadian and a USA version of the website. We have the .com as the primary domain, and the .ca is redirected from the registrar to the Canadian home page. The problem I am having is that my campaign produces errors for duplicate page content and duplicate page titles. Is there a way to set up the two versions on the CMS so that these errors do not get produced? My concern is getting penalized by search engines. Appreciate any help. Mark Palmer
On-Page Optimization | | kpreneur0 -
Best way to optimize site for Google Maps?
I am working with a site right now and they are ranked #1 for many keyword phrases based on their location and service. Their service is an insurance agency, so they rank #1 for many keywords like "Miami Insurance" or "South Florida Insurance Agency" (they're not actually ranked for Miami, just giving an example). I also include their address on every page of the site (maybe that better helps Google Maps?). The problem I am having is that when searching the full keyword phrase they rank number 1, but when searching just "insurance" while in the area they rank for, they do not come up. I hear that there might be specific ways to optimize for this. What I would like to know is what I would have to do to optimize for Google Maps, and everything I possibly can do. I am good with search engine optimization but have never really dabbled much with Google Maps; I always thought they just ranked you based on your address.
On-Page Optimization | | WhiteHat120