Does JavaScript-generated content count as regular content?
-
On the website mentioned below, the content is generated using JavaScript, and the content consists of Unicode characters. The content is created as you scroll down. Will this affect SEO? https://www.myweirdtext.com/
-
I think it will affect you, because search engines will not find the text associated with your website. That is to say, the crawler will try to read the text you have generated, and if it is generated by JavaScript, that text does not exist until there is an interaction with the user.
The search engine will only read your source code, not the text generated afterwards.
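A minimal sketch of the situation described above (the markup, file name, and function here are assumptions, not the actual site's code): the text only exists after the script runs, so anything that reads the raw HTML without executing JavaScript never sees it.

```javascript
// Raw HTML as served: the content container is empty.
const rawHtml = '<div id="content"></div><script src="app.js"></script>';

// What app.js would do in the browser once the user scrolls
// (simulated as a string transform so it runs outside a browser):
function renderOnScroll(html) {
  return html.replace(
    '<div id="content"></div>',
    '<div id="content">Unicode text generated on scroll</div>'
  );
}

const renderedHtml = renderOnScroll(rawHtml);

// A crawler that does not execute JavaScript sees only rawHtml:
console.log(rawHtml.includes('Unicode text generated on scroll'));
// A user (or a rendering crawler) sees renderedHtml:
console.log(renderedHtml.includes('Unicode text generated on scroll'));
```

Comparing `view-source:` against the rendered DOM in the browser's dev tools shows the same gap on a real page.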
Related Questions
-
Potential duplicate content issue?
We have a category on our website for PVC rolls to buy as standard 50m rolls (this includes 15 products in the category). We're also releasing PVC rolls to buy per metre (10m roll/25m roll etc...), again with 15 products, which we are adding as a separate category as it makes more sense for our customers and removes the risk of having too many options. Would using the same description be bad practice for SEO? The product is exactly the same just available in different roll sizes, but we definitely do not want to combine categories as it doesn't work for our customers. Any help or suggestions would be appreciated, thanks.
On-Page Optimization | RayflexGroup
Duplicate Content - Pricing Plan tables
Hey guys, We're faced with a problem that we want to solve. We're working on the designs for a few pages for a drag & drop email builder we're currently working on, and we will be having the same pricing table on several pages (much like Moz does). We're worried that Google will take this as duplicate content and not be very fond of it. Any ideas about how we could integrate the same flow without potentially harming ranking efforts? And NO, re-writing the content for each table is not an option. It would do nothing but confuse the heck out of our clients. 😄 Thanks everybody!
On-Page Optimization | andy.bigbangthemes
Creating a .cn site with the existing site content
Hi all, I'm planning to create a .cn site. If I simply translate the existing content on my site (.com.au) into Chinese, do you think Google will see the .cn site as a duplicate of the main site? Will this cause any duplicate content issues? Thanks
On-Page Optimization | QuantumWeb62
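Not stated in the thread, but the standard way to tell search engines that the .cn pages are translations of the .com.au pages (rather than duplicates) is hreflang annotations in the head of each page; the domains and path below are placeholders:

```html
<link rel="alternate" hreflang="en-au" href="https://www.example.com.au/widgets/" />
<link rel="alternate" hreflang="zh-cn" href="https://www.example.cn/widgets/" />
```

Each language version should carry the full set of annotations, including one pointing back at itself.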
Duplicate Content
I run a business directory where all the businesses are listed, and I am having an issue with duplicate content. I have categories, like A, B, C. For a business in Category A, users can filter by different locations: by state, city, or area. So if they filter by state, and the state has 10 businesses that are all in one city, both of the pages (the state-filtered and the city-filtered) are the same. What can I do to avoid that? A canonical URL tag, or changing the page metas and body text? Please help 🙂
On-Page Optimization | Adnan4SEO
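One way to act on the canonical-tag idea mentioned in the question (a sketch only; the URL scheme and parameter names are assumptions): have every narrower filter page declare the broader listing it duplicates as its canonical.

```javascript
// Derive one canonical URL for filter pages that can show the same list:
// drop the narrower filters so state- and city-filtered views share a canonical.
function canonicalFor(pageUrl) {
  const u = new URL(pageUrl);
  u.searchParams.delete('city');
  u.searchParams.delete('area');
  const qs = u.searchParams.toString();
  return u.origin + u.pathname + (qs ? '?' + qs : '');
}

// Both filtered views resolve to the same canonical URL:
console.log(canonicalFor('https://example.com/category-a?state=NY&city=Albany'));
console.log(canonicalFor('https://example.com/category-a?state=NY'));
```

The resulting URL would then go into a `<link rel="canonical" href="…">` tag in the head of each filtered page.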
Webmaster tools content keywords conundrum
I'm working on optimising a phrase that is made up of two words. I've noticed in Webmaster Tools that the two words are listed separately under the content keywords section. This is fine, except that the two words are listed at very different significance levels, 2 and 18. Drilling deeper, it shows that both these words have two variants. The word in position 2 occurs 483 times and the word in position 18 occurs 60 times. Sadly, the phrase is commercially sensitive, so I can't share it here. Should I be looking to include the weaker word more frequently on the site? In anchor text? Or is this normal distribution? Would optimising the weaker word risk the wrath of Panda?
On-Page Optimization | Hannahm24
Do quotation marks in content affect SERPs?
Some of my art object products have words and phrases engraved on them. The words relate to the images on the product. In the product descriptions, I have been putting quotes around the entire list. Would I get better long-tail results if I didn't use the quotation marks? In other words, do the quotes make everything between them an exact-match phrase? For example:
Current product description:
The words around the edge of the lazy susan read, "Explore nature. Dream big. Take time to smell the flowers. Enjoy the changing seasons. Seize the day. Relish the night. Live life to the fullest." Thank you for helping with this; all comments on how to present this kind of content are welcome. - Stephen
On-Page Optimization | stephenfishman
Do JavaScript pseudo-links dilute link juice?
Hi, on our ecommerce site, we use multiple pseudo-links for the layered navigation (to filter by color, size, etc.), so that Google doesn't crawl every combination of filters. I know this kind of link doesn't pass link juice and doesn't get crawled (provided you hide the target URLs in your JavaScript). But since there is an "onclick" property, I'm afraid that Google could understand that these are links and treat them the same way as nofollowed links (not following them, but diluting link juice anyway). Do you know if this is the case? Thanks,
On-Page Optimization | Strelok
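For readers unfamiliar with the pattern the question describes, a sketch (the markup and URL are made up, not the site's actual code): the filter element is clickable, but there is no `<a href>` in the markup, so no crawlable link target appears in the HTML.

```javascript
// A pseudo-link: a plain element with an onclick handler and no href attribute.
const filterMarkup =
  '<span class="filter" data-color="red" onclick="applyFilter(\'red\')">Red</span>';

// The target URL exists only inside the script, never in the markup;
// in the browser, the handler would pass this URL to location.assign().
function applyFilter(color) {
  return '/catalog?color=' + encodeURIComponent(color);
}

console.log(filterMarkup.includes('href')); // no link target in the HTML
console.log(applyFilter('red'));            // URL built only at click time
```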
Duplicate content issue with dynamically generated url
Hi, for those who have followed my previous question, I have a similar one regarding dynamically generated URLs. From this page http://www.selectcaribbean.com/listing.html the user can make a selection according to various criteria. Six results are presented, and then the user can go to the next page. I know I should probably rewrite URLs such as these: http://www.selectcaribbean.com/listing.html?pageNo=1&selType=&selCity=&selPrice=&selBeds=&selTrad=&selMod=&selOcean= but since all the results presented are basically generated on the fly for the convenience of the user, I am afraid Google may consider this an attempt to generate more pages, as there are already pages for each individual listing. What is my solution for this? Nofollow these pages? Block them through robots.txt?
On-Page Optimization | multilang
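Of the two options the question ends with, a robots.txt rule is the simpler one to sketch (the path is taken from the question; worth testing before deploying, since blocked pages also can't pass any canonical hints):

```
User-agent: *
# Blocks every parameterized variant of the listing page
# (prefix match on "/listing.html?") while leaving
# /listing.html itself crawlable.
Disallow: /listing.html?
```

Alternatively, a `<meta name="robots" content="noindex,follow">` tag on the filtered result pages keeps them crawlable but out of the index.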