I have a website for my small business, and hope to improve the search results position for 5 landing pages. I recently modified my website to make it responsive (mobile friendly). I was not able to use Bootstrap; the layout of the pages is a bit unusual and doesn't lend itself to the options Bootstrap provides.
Each landing page has 3 main divs: one for desktop, one for tablet, and one for phone.
The text content displayed in each div is the same. Only one of the 3 divs is visible at a time; the user's screen width determines which one is shown.
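For context, the switching is done with plain CSS media queries along these lines (the breakpoints and exact selectors below are just illustrative, not my production stylesheet):

    /* By default (wide screens) only the desktop div is shown */
    #tablet, #phone { display: none; }

    /* On mid-width screens show only the tablet div */
    @media (max-width: 1024px) {
      #desktop, #phone { display: none; }
      #tablet { display: block; }
    }

    /* On narrow screens show only the phone div */
    @media (max-width: 600px) {
      #desktop, #tablet { display: none; }
      #phone { display: block; }
    }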
When I wrote the HTML for the page, I didn't want each div to contain identical text. I worried that when Google indexed the page it would see the same text 3 times and conclude that keyword spamming was occurring. So I put the text in just one div, and when the page loads, jQuery copies the text from that first div into the other two divs.
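The copy step is a small bit of jQuery that runs on page load, roughly like this (again, the selectors are illustrative):

    // Once the DOM is ready, copy the desktop div's content into the other two divs
    $(function () {
        $('#tablet, #phone').html($('#desktop').html());
    });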
But now I've learned that when Google indexes a page it looks at both the page that is served AND the page that is rendered. In my case the rendered page, after it loads and the jQuery code runs, contains duplicate text content in three divs. So perhaps my approach of having the served page contain just one div with text content fails to help, because Google examines the rendered page, which has the duplicate text in three divs.
Here is the layout of one landing page, as served by the server.
    <div id="desktop">
        1000 words of text goes here.
    </div>

    <div id="tablet">
        <!-- No text. jQuery will copy the text from div id="desktop" into here. -->
    </div>

    <div id="phone">
        <!-- No text. jQuery will copy the text from div id="desktop" into here. -->
    </div>
=====================================================================================
My question is: Will Google conclude that keyword spamming is occurring because of the duplicate content the rendered page contains, or will it realize that only one of the divs is visible at a time and that the duplicate content is there only to achieve a responsive design?
Thank you!