Best Way to Incorporate FAQs into Every Page - Duplicate Content?
-
Hi Mozzers,
We want to incorporate a 'Dictionary' of terms onto quite a few pages on our site, similar to an FAQ system.
The 'Dictionary' has 285 terms in it, with about 1 sentence of content for each one (approximately 5,000 words total).
The content is unique to our site and not keyword stuffed, but I am unsure what Google will think about us having all this shared content on these pages.
I have a few ideas about how we can build this, but my higher-ups really want the entire dictionary on every page. Thoughts?
Image of what we're thinking here - http://screencast.com/t/GkhOktwC4I
Thanks!
-
Me too! Where all my mozzers at?
-
After talking to our dev team here, we try to never use them, for a few reasons:
Bad for SEO
Linking/bookmarks
Difficulty with debugging
No real performance gains

I would really consider a separate page instead, one that could have some real SEO value, with a few good terms and explanations on it.
Would like to hear other opinions on this...
-
Another option I am thinking of is to include this section in an iFrame, since I know iFrames are not read by search engines.
What do you think about that solution?
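For what it's worth, a minimal sketch of what that embed might look like, assuming the dictionary lives on its own page at a hypothetical /dictionary.html (the path and dimensions are made up for illustration, not from this thread):

```html
<!-- Hypothetical embed: /dictionary.html is an assumed path -->
<iframe src="/dictionary.html"
        title="Dictionary of terms"
        width="100%" height="400">
  <!-- Fallback for browsers that don't render frames -->
  <a href="/dictionary.html">View the dictionary of terms</a>
</iframe>
```

The fallback link also gives crawlers and screen readers a path to the standalone dictionary page.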
-
OK, I see. It may be more useful to have them as a separate page, but that is probably a whole different debate and highly subjective.
So what I have looked at is this...
...since there are so many legitimate uses for hiding content with display: none; when creating interactive features, sites aren't automatically penalised for content that is hidden this way (so long as it doesn't look algorithmically spammy). Google's Webmaster guidelines also make clear that a good practice, when using content that is initially hidden for legitimate interactivity purposes, is to also include the same content in a <noscript> tag, and Google recommend that if you design and code for users, including users with screen readers or JavaScript disabled, then 9 times out of 10 good, relevant search rankings will follow (though their specific advice seems more written for cases where JavaScript writes new content to the page).

"JavaScript: Place the same content from the JavaScript in a <noscript> tag. If you use this method, ensure the contents are exactly the same as what's contained in the JavaScript, and that this content is shown to visitors who do not have JavaScript enabled in their browser."

So, best practice seems pretty clear.

What I can't find out, however, is the simple factual matter of whether hidden content is indexed by search engines (but with potential penalties if it looks 'spammy'), whether it is ignored, or whether it is indexed but with a lower weighting (like <noscript> content apparently is: http://webmasters.stackexchange.com/questions/1685/is-content-inside-a-noscript-tag-indexed-by-search-indexes).

That was from another SEO site. What I would say is that Google doesn't 'penalise' for duplicate content, so would it be a disaster to try it, see if it gets picked up as dupe, and then change it if necessary?
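As a rough sketch of the pattern that quote describes (the element ids, the toggle wiring, and the sample term are all assumptions for illustration, not from the guidelines themselves):

```html
<button id="show-dictionary" type="button">Show dictionary</button>

<div id="dictionary" style="display: none;">
  <dl>
    <dt>Term</dt>
    <dd>One-sentence definition of the term.</dd>
  </dl>
</div>

<noscript>
  <!-- With JavaScript disabled the toggle can't run, so force the content visible -->
  <style>#dictionary { display: block !important; }</style>
</noscript>

<script>
  // Reveal the hidden dictionary when the user asks for it
  document.getElementById('show-dictionary').addEventListener('click', function () {
    document.getElementById('dictionary').style.display = 'block';
  });
</script>
```

Here the content is in the HTML for everyone (users and crawlers alike); only its visibility depends on JavaScript, which is closer to the "legitimately hidden for interactivity" case than to cloaking.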
-
Thanks for the response.
Here is a crude image of what we're thinking - http://screencast.com/t/GkhOktwC4I
The text would be hidden and displayed via JavaScript, so it would not really affect the user's experience in a negative way.
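A rough sketch of that hide/show behaviour, kept as a tiny pure helper plus some hypothetical wiring (the function and element names are made up for illustration):

```javascript
// Pure helper: given an element's current display value,
// return the value that toggles it between hidden and shown.
function nextDisplay(current) {
  return current === 'none' ? 'block' : 'none';
}

// Hypothetical wiring: toggle a dictionary entry's visibility on click.
function toggleEntry(el) {
  el.style.display = nextDisplay(el.style.display);
}

console.log(nextDisplay('none'));  // 'block'
console.log(nextDisplay('block')); // 'none'
```

Keeping the toggle rule in a pure function makes the behaviour easy to test without a browser.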
-
Hi,
How would it be displayed? Wouldn't it be just as useful to have it open in a new window that the user could keep open? If you need to display 285 terms, each with a sentence of explanation, the user would not be able to see the content they were actually interested in.
You could then have a link to it on each page....
Do your 'higher-ups' embrace user experience and how it affects people's browsing? Maybe there's some educating to do... Good luck!
Not sure if that helped, but just my opinion.