Repeating Content Within Code On Many Pages
-
Hi,
This is sort of a duplicate content issue, but not quite. I'm concerned with the way our code is written and whether it could cause problems in the future.
On many of our pages (thousands), users have the option to post comments. We have a link which opens a JavaScript pop-up with our comment guidelines. It's a 480-word document of original text, but it's preloaded into the source code of every page it appears on.
The content on these pages will be relatively thin at first, and many will have thin content throughout. I'm afraid that with so many of our pages looking the same in both code and on-site content, we'll have issues down the line.
Admittedly, I've never dealt with this issue before, so I'm curious: is having a 480-word piece of text in the source code of so many pages an issue, or will Google consider it part of the template, similar to footers/sidebars/headers?
If it is an issue, we can easily make it an actual pop-up hosted on a SINGLE page.
Thanks!
-
Jane/Martin,
Thanks for the responses. I just wanted to verify what I was thinking before making a move. I agree with both of your points, and I'm going to try to persuade the coders to make this an actual pop-up drawn from a single static page.
Thanks again!
-
Hi,
Agreed that if the pop-up draws the content from a single page / URL, there's no duplicate content problem. JavaScript would still be required to pull the content in and produce the pop-up: as we understand it, Google and other search engines are not too keen on executing JS functions, so they often won't be served the content. And even if they are, drawing that content from its own URL means it is not included in the source code or seen as being on the page.
Whether this content could hurt you as it is currently included is hard to prove either way, but drawing it from a separate source with JS would alleviate the potential problem.
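To make the single-URL approach concrete, here is a minimal sketch (the path /comment-guidelines.html and the markup are hypothetical, not the poster's actual code) of a pop-up that fetches the guidelines from one shared URL on click, so the 480 words never appear in any page's source:

```javascript
// Hypothetical sketch: the guidelines live at one assumed URL
// (/comment-guidelines.html) instead of being embedded in every page.

// Pure helper: wrap the fetched guidelines in the pop-up markup.
function buildPopupHtml(guidelinesHtml) {
  return '<div class="guidelines-popup">' + guidelinesHtml + '</div>';
}

// Fetch the guidelines only when the user clicks the link,
// then append the pop-up to the page.
function showGuidelines() {
  fetch('/comment-guidelines.html')            // single canonical copy
    .then(function (res) { return res.text(); })
    .then(function (html) {
      var popup = document.createElement('div');
      popup.innerHTML = buildPopupHtml(html);
      document.body.appendChild(popup);
    });
}
```

Each page would then carry only the link that calls showGuidelines(), not the guidelines text itself.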
-
Hi,
Since you know it could cause issues, I would always try to get ahead of the problem and go with a pop-up that uses its own single page. It will probably also save on load time, too, so it's the better option in multiple respects.
Related Questions
-
Selective 301 redirections of pages within folders
Redirection Puzzle - it's got me puzzled, anyhow! The finished website has just been converted from an old aspx affair to a WordPress site. Some directory structures have changed significantly; there appears to be a load of older medical articles that have not been added back in, and it sounds unlikely that they will be. Therefore unmatched old news articles need to be pointed to the top news page to keep hold of any link value they may have accrued.
The .htaccess file starts with iThemes Security's code, followed by the main WordPress block, and I have added the user redirects to the final section of the file. I have been through the redirects and rewrites line by line to verify them, and the following sections are giving me problems. This is probably just my aging brain failing to grasp basic logic; if I can tap into anybody's wisdom for a bit of help I would appreciate it. My eyes and brain have gone to jelly. I have used htaccesscheck.com to check the underlying syntax and ironed out the basic errors that I had previously missed. The bulk of the redirects are working correctly.
#Here there are some very long media URLs which are absent on the new site, and I am simply redirecting visiting spiders to the page that will hold media in future. The media items refuse to redirect:
Line 408: redirect 301 /Professionals/Biomedicalforum/Recordedfora/Rich%20Media%20http:/kplayer.kcl.ac.uk/ess/echo/presentation/15885525-ff02-4ab2-b0b9-9ba9d97ca266 http://www.SITENAME.ac.uk/biomedical-forum/recorded-fora/
Line 409: redirect 301 /Professionals/Biomedicalforum/Recordedfora/Quicktime%20http:/kplayer.kcl.ac.uk/ess/echo/presentation/15885525-ff02-4ab2-b0b9-9ba9d97ca266/media.m4v http://www.SITENAME.ac.uk/biomedical-forum/recorded-fora/
Line 410: redirect 301 /Professionals/Biomedicalforum/Recordedfora/Mp3%20http:/kplayer.kcl.ac.uk/ess/echo/presentation/15885525-ff02-4ab2-b0b9-9ba9d97ca266/media.mp3 http://www.SITENAME.ac.uk/biomedical-forum/recorded-fora/
#Old site pagination URLs redirected to the new "news" top-level page - Here I am simply pointing all the pagination URLs for the news section that were indexed to the main news page. These work, but they append the pagination code on to the new visible URL. Have I got the syntax correct in this version of the line to suppress the appended garbage?
RewriteRule ^/LatestNews.aspx(?:.*) http://www.SITENAME.ac.uk/news-events/latest-news/? [R=301,L]
#On the old site, many news directories (effectively a blog) contained articles that are unmatched on the new site; these have been redirected to the new top-level news (blog) page. In this section I became confused about whether to use RedirectMatch or RewriteRule to point the articles in each year directory back to the top-level news page. When I have added a RedirectMatch command, it has been disabling the whole site, despite my syntax check telling me it is syntactically correct. Currently I'm getting a 404 for any of the old URLs in these year-by-year directories, instead of a successful redirect. I suspect regex lingo is not clicking for me 😉 My logic here was: rewrite any aspx file in the directory to the latest-news page at the top. This is my latest attempt to rectify the fault. Am I nearer with my syntax or my logic?
(The actual URLs and paths have been substituted, but the structure is the same.) So what I believe I have set up is: in an earlier section, news posts that have been recreated on the new site are redirected one-to-one, and those are working successfully. If a matching URL is not found, when the parsing of the file reaches the line for the 1934 directory it should catch any remaining .aspx URL request, rewrite it to the latest-news page as a 301, and stop processing this block of commands. The subsequent commands in this block repeat the process for the other year groups of posts. Clearly I am failing to comprehend something, and illumination would be gratefully received.
RewriteRule ^/Blab/Blabbitall/1934/(.*).aspx http://www.SITENAME.ac.uk/news-events/latest-news/ [R=301,L]
#------Old site 1933 unmatched articles redirected to new news top level page
RewriteRule ^/Blab/Blabbitall/1933/(.*).aspx http://www.SITENAME.ac.uk/news-events/latest-news/ [R=301,L]
#------Old site 1932 unmatched articles redirected to new news top level page
RewriteRule ^/Blab/Blabbitall/1932/(.*)/.aspx http://www.SITENAME.ac.uk/news-events/latest-news/ [R=301,L]
#------Old site 1931 unmatched articles redirected to new news top level page
RewriteRule ^/Blab/Blabbitall/1931/(.*)/.aspx http://www.SITENAME.ac.uk/news-events/latest-news/ [R=301,L]
#------Old site 1930 unmatched articles redirected to new news top level page
RewriteRule ^/Blab/Blabbitall/1930/(.*)/.aspx http://www.SITENAME.ac.uk/news-events/latest-news/ [R=301,L]
Many thanks if anyone can help me understand the logic at work here.
Technical SEO | TomVolpe
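For the RewriteRule lines quoted in the question above, a hedged sketch of the likely fix (assuming the substituted paths are accurate): in per-directory (.htaccess) context, mod_rewrite strips the leading slash before matching, so patterns beginning with `^/` never match at all, which would explain the 404s. The literal dot in ".aspx" also wants escaping, and a trailing "?" on the target is what discards any appended query string:

```apache
# Sketch only, against the substituted paths above.
# .htaccess per-directory context: note there is NO leading slash in the patterns.
RewriteEngine On

# Pagination URLs -> main news page; the trailing "?" drops appended parameters
RewriteRule ^LatestNews\.aspx http://www.SITENAME.ac.uk/news-events/latest-news/? [R=301,L]

# Unmatched 1930-1934 year-directory articles -> top-level news page
# (one character-class rule can replace the five per-year rules)
RewriteRule ^Blab/Blabbitall/193[0-4]/.*\.aspx$ http://www.SITENAME.ac.uk/news-events/latest-news/? [R=301,L]
```

This would also explain why the original `^/Blab/Blabbitall/1932/(.*)/.aspx` variants never fire: besides the leading slash, the stray `/` before `.aspx` requires a slash that the real URLs don't contain.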
Search Term not appearing in On-Page Grader but is in code?
Hi, I have never had this issue in the past, but I am currently working on a website, http://hrfoundations.co.uk. I didn't develop this website, and the pages are in separate folders on the FTP, which I have never come across before, but hey ho. Anyway, I have added the search term (HR Consultancy West Midlands) to the about-us page in the h1, title and description; however, when I run the On-Page Grader it only says that I have entered it within the body, which I haven't even done yet. When I view the page source it has definitely been uploaded and the search terms are there. I have tried checking the spelling in case I spelt it incorrectly, and I still can't seem to find the issue. Has anyone else experienced this in the past? I'm going to go away now and check the code for any unclosed divs etc. to see if that may be causing any issues. Thanks Chris
Technical SEO | chrissmithps
-
How Many Words To Make Content 'unique?'
Hi All, I'm currently working on creating a variety of new pages for my website. These pages are based upon different keyword searches for cars, for example used BMW in London, used BMW in Edinburgh, and many more similar variations. I'm writing some content for each page so that they're completely unique to each other (the cars displayed on each page will also be different, so this would not be duplicated either). My question is really: how much content do you think I'll need on each page, and what is optimal? What would be the minimum you might need? Thanks for your help!
Technical SEO | Sandicliffe
-
50,000 pages or a page with parameters
I have a site with about 12k pages on a topic... each of these pages could use another several pages to go into deeper detail about the topic. So, I am wondering: for SEO purposes, would it be better to have something like 50,000 new pages, one for each sub-topic, or to have one page that I would pass parameters to, with the page built on the fly in code-behind? The drawback to the one page with parameters is that the URL would not be static, but the effort to implement would be minimal. I am also not sure how Google would index a single page with parameters. The drawback to the 50k-pages model is the dev effort, and possibly committing some faux pas by unleashing so many links to my internal pages. I might also have to mix aspx with html because my project can't be that large. Anyone here ever have this sort of choice to make? Is there a third way I am not considering?
Technical SEO | Banknotes
-
Duplicate page content
Hello, The pro dashboard crawler bot thing that you get here reports the mydomain.com and mydomain.com/index.htm as duplicate pages. Is this a problem? If so how do I fix it? Thanks Ian
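The usual remedy for this pattern (a hedged sketch, assuming an Apache server and that index.htm is the actual default filename) is a 301 redirect collapsing the index.htm variant onto the bare URL, so only one version remains crawlable:

```apache
# Sketch only: 301 /index.htm (at any depth) to the bare directory URL
RewriteEngine On
RewriteRule ^(.*/)?index\.htm$ /$1 [R=301,L]
```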
Technical SEO | jwdl
-
How to avoid duplicate content penalty when our content is posted on other sites too ?
For recruitment company sites, job ads are posted multiple times on their own sites and on other sites too. These are the same ads (the job description is identical) posted on different sites. How do we avoid a duplicate content penalty in this case?
Technical SEO | Personnel_Concept
-
Have a client that migrated their site; went live with noindex/nofollow and for last two SEOMoz crawls only getting one page crawled. In contrast, G.A. is crawling all pages. Just wait?
Client site is 15+ pages. The new site had noindex/nofollow removed prior to the last two crawls.
Technical SEO | alankoen123
-
Too Many On-Page Links
Hello. My SEOmoz report this week tells me that I have about 500 pages with "Too Many On-Page Links". One of the examples is this one: https://www.theprinterdepo.com/hp-9000mfp-refurbished-printer (104 links). If you check, all our products have a RELATED products section, and on some of them the related products can number up to 40. I wonder how I can solve this. I thought that putting nofollow on the links to the related products might fix all of these warnings? Does putting NOFOLLOW affect SEO?
Technical SEO | levalencia1