Duplicate Footer Content
-
A client I just took over is having some duplicate content issues. At the top of each page he has about 200 words of unique content. Below this are three big tables of text that talk about his services, history, etc. These tables are pulled into the middle of every page using PHP. So he has the exact same three big tables of text across every page.
What should I do to eliminate the duplicate content? I thought about removing the script and just rewriting the tables of text on every page... Is there a better solution?
Any ideas would be greatly appreciated.
Thanks!
-
We currently have footer content in a single PHP include file that is pulled into every page and contains the following:
-
The three most recent tweets from our Twitter feed
-
Snippets of our three most recent blog posts
-
Navigation links to our main pages (essentially the same as our main navigation in the header)
Is this bad practice for footer content?
-
-
If it can be useful copy that is unique to every page, it might make sense to do it. If not, it might be easier to just convert it into an image as Streamline recommended and instead add other copy on the page that is more in sync with what the page is about.
Presuming the copy in the middle is a "Why Buy From Us" kind of thing, how many different ways can you really write it while still being useful to the user?
-
Yeah, it's about 40 pages. So... I think we will just have to rewrite the content. He does want to rank these pages. They are actually ranking in the top 10 now with about 80% duplicate content.
thanks for the help!
-
If you can rewrite the text on each page, then that's what I would do. But if the site has thousands of pages and it's not possible to rewrite the content on every single page, then I'd suggest either moving each section of text to its own dedicated page (like History would be its own page with its own link in the navigation) OR if you think the content is important enough to be seen on every page, then you could try putting the text as an image...
Your site won't necessarily be penalized for having the same text on every page but you certainly won't benefit from it since the key is to make every single page as unique and high quality as possible.
Hope that helps!
Related Questions
-
Magento Multistore and Duplicate Content
Hey all, I am currently optimizing a Magento Multistore running with two store views (one per language). Now when I switch from one language to another, the URL shows: mydomain.de/.../examplepage.html?___store=german&___from_store=english The same page can also be reached by just entering mydomain.de/.../examplepage.html The question is: does Google consider this duplicate content, or is it nothing to worry about? Or should I just do a dynamic 301 redirect from the first version to the second? I read about some hacks posted in different Magento forums, but as I am working for a customer I want to avoid hacks. Also, setting "Add Store Code to Urls" didn't help.
Technical SEO | dominator
-
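One common, non-hacky way to handle parameterized duplicates like this (a general sketch, not Magento-specific advice — the path shown is hypothetical since the real one is elided above) is a rel=canonical tag on both URL variants pointing at the clean URL:

```html
<!-- Hypothetical sketch: placed in the <head> of BOTH URL variants
     (the ?___store=... version and the clean version), telling search
     engines which URL should receive the indexing/ranking signals -->
<link rel="canonical" href="http://mydomain.de/some-path/examplepage.html" />
```

With the canonical in place, the parameterized version is treated as a duplicate of the clean URL rather than competing with it.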
Duplicate Page Content / Rel Canonical
Hi, the diagnostics show me that I have 590 instances of Duplicate Page Content, but when it shows the Rel Canonical I have over 1,000. So does that mean I have no Duplicate Page Content problem? Please help.
Technical SEO | Joseph-Green-SEO
-
Duplicate Content on SEO Pages
I'm trying to create a bunch of content pages, and I want to know if the shortcut I took is going to penalize me for duplicate content.

Some background: we are an airport ground transportation search engine (www.mozio.com), and we constructed several airport transportation pages listing the providers in a particular area. The problem is, sometimes multiple of the same providers serve the same places in a region. For instance, NYAS serves both JFK and LGA, and obviously SuperShuttle serves ~200 airports. So this means every airport's page has the SuperShuttle box. All the provider info is stored in a database with tags for the airports they serve, and then we dynamically create the page. A good example follows:

http://www.mozio.com/lga_airport_transportation/
http://www.mozio.com/jfk_airport_transportation/
http://www.mozio.com/ewr_airport_transportation/

All three of those pages have a lot in common. They started out working decently, but as I added more and more pages, the efficacy of them went down on the whole. Does what I've done qualify as "duplicate content", and would I be better off getting rid of some of the pages or somehow consolidating the info into a master page? Thanks!
Technical SEO | moziodavid
-
Tags and Duplicate Content
Just wondering: for a lot of our sites we use tags as a way of regrouping articles / news / blogs, so all of the info on, say, "government grants" can be found on one page. These /tag pages often come up with duplicate content errors. Is it a big issue, and how can we minimise it?
Technical SEO | salemtas
-
404's and duplicate content.
I have real estate based websites that add new pages when new listings come on the market and then delete pages when a property is sold. My concern is that a significant number of 404s are created, and the listing pages that are added are going to be the same as others in my market who use the same IDX provider. I could go with a different IDX provider that uses an iframe, which doesn't create new pages, but I used an iframe before and my time on site was 3 min with 2.5 pages per visit; now it's 6+ min with 7.5 pages per visit. The new pages mean fresh content daily, so which is better: fresh content and better on-site metrics (with the 404s), or fewer 404s, no duplicate content, and weaker on-site metrics? Any thoughts on this issue? Any advice would be appreciated.
Technical SEO | AnthonyLasVegas
-
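One way to soften the 404 side of this trade-off (a hedged sketch only — the URL pattern below is made up, and this assumes Apache with mod_alias) is to return "410 Gone" for sold-listing URLs, which tells crawlers the page was removed deliberately rather than broken:

```apacheconf
# Hypothetical .htaccess sketch: serve 410 Gone (instead of 404)
# for URLs under a made-up /listings/sold/ path once a property sells
RedirectMatch gone ^/listings/sold/
```

An alternative with the same mechanism is a 301 redirect from each sold listing to its parent area page, which preserves any links the listing earned.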
Is it possible to be penalized as duplicate content for one keyword but not another?
I help develop an online shopping cart, and after a request from management about some products not showing up in the SERPs, I was able to pinpoint it down to mostly a duplicate content issue. It's a no-brainer, as sometimes new products are inserted with text copied from the manufacturer's website. I recently stumbled across an odd problem, though. When we partially rewrote the content to seem unique enough, it seemed to remedy the issue for some keywords and not others.

A) If you search the company name, our category listing shows as #1, ahead of the manufacturer's website. We always did rank for this term.
B) If you search the product name, our product page is listed #3, behind two other listings which belong to the manufacturer.
C) If you search the keywords together as "company product", we are still being filtered out as duplicate content. When I allow the filtered results to show, we are ranking #4.

It's been a full month since the changes were indexed. Before I rewrite the content even further, I thought I would ask to see if anyone has any insight as to what could be happening.
Technical SEO | moondog604
-
Using robots.txt to deal with duplicate content
I have two sites with duplicate content issues. One is a WordPress blog. The other is a store (Pinnacle Cart). I cannot edit the canonical tag on either site. In this case, should I use robots.txt to eliminate the duplicate content?
Technical SEO | bhsiao
-
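If robots.txt is the only lever available, a minimal sketch would disallow the duplicate URL patterns (the paths below are hypothetical examples, not taken from either site):

```
# Hypothetical robots.txt sketch: block crawling of duplicate paths
User-agent: *
Disallow: /tag/
Disallow: /*?sort=
```

One caveat worth knowing: robots.txt blocks crawling, not indexing — already-indexed duplicate URLs can remain in the index, and blocked pages can't pass signals back to the originals the way a canonical tag would. It's a fallback, not a replacement for rel=canonical.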
Up to my you-know-what in duplicate content
Working on a forum site that has multiple versions of the URL indexed. The WWW version is a top 3 and 5 contender in the Google results for the domain keyword. All versions of the forum have the same PR, but the non-WWW version has 3,400 pages indexed in Google, and the WWW version has 2,100. Even worse, there's a completely separate domain (PR4) that has the forum as a subdomain, with 2,700 pages indexed in Google. The duplicate content gets completely overwhelming to think about when it comes to the PR4 domain, so I'll just ask what you think I should do with the forum. Get rid of the subdomain version and sometimes link between two obviously related sites, or get rid of the highly targeted keyword domain? Also, what's better: having the targeted keyword on the front of Google with only 2,100 indexed pages, or having lower rankings with 3,400 indexed pages? Thanks.
Technical SEO | Hondaspeder
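Whichever hostname wins, the standard fix for the WWW vs. non-WWW split is a site-wide 301 so only one version stays indexed. A hedged sketch, assuming Apache with mod_rewrite enabled (redirecting non-WWW to WWW here, though the choice of direction is the site owner's):

```apacheconf
# Hypothetical .htaccess sketch: 301-redirect every non-www request
# to the www hostname, consolidating the two indexed versions
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]
```

The subdomain copy on the PR4 domain would need its own redirect (or canonical tags) pointing at the chosen hostname, since this rule only covers the forum's own domain.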