Getting pages that load dynamically into the SEs
-
SEO'ers,
I'm dealing with an issue I can't figure out the best way to handle. I'm working on a website that shows the definitions of words, which are loaded dynamically from an open source such as wiktionary.org.
When you visit a particular page to see the definition of a word, say www.example.com/dictionary/example/, the definition is there. However, how can we get all of the definition pages indexed in search engines? The WordPress sitemap plugin is not picking up these pages automatically (I guess because they're dynamic), but when using a sitemap crawler the pages are detected.
Can anybody give advice on how to go about getting the 200k+ pages indexed in the SEs? If it helps, here's a reference site that seems to load its definitions dynamically and has succeeded in getting its pages indexed: http://www.encyclo.nl/begrip/sample
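One way to get plugin-blind dynamic pages in front of crawlers is to generate the sitemap yourself from the same word list that drives the pages, then submit it in Search Console. The sitemap protocol caps each file at 50,000 URLs, so 200k+ pages need several files plus a sitemap index. A minimal sketch (the example.com domain, URL pattern, and sitemap filenames are placeholders, not the poster's actual setup):

```python
# Build XML sitemaps for dynamically generated definition pages.
# Files are capped at 50,000 URLs each, so a large word list is split
# into numbered files plus a sitemap index that references them.
# Domain and URL pattern below are placeholders for illustration.
from xml.sax.saxutils import escape

BASE = "https://www.example.com/dictionary"
MAX_URLS = 50_000

def build_sitemaps(words):
    """Return (index_xml, [sitemap_xml, ...]) for the given word list."""
    sitemaps = []
    for i in range(0, len(words), MAX_URLS):
        chunk = words[i:i + MAX_URLS]
        urls = "".join(
            f"<url><loc>{BASE}/{escape(w)}/</loc></url>" for w in chunk
        )
        sitemaps.append(
            '<?xml version="1.0" encoding="UTF-8"?>'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
            f"{urls}</urlset>"
        )
    index_entries = "".join(
        f"<sitemap><loc>https://www.example.com/sitemap_{n}.xml</loc></sitemap>"
        for n in range(1, len(sitemaps) + 1)
    )
    index = (
        '<?xml version="1.0" encoding="UTF-8"?>'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
        f"{index_entries}</sitemapindex>"
    )
    return index, sitemaps
```

Regenerating these files on a schedule (or whenever the word list changes) sidesteps the plugin entirely.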
-
I see what you mean there - thanks for sharing your expertise and views on this issue. Much appreciated
-
The only way I'd let those pages be indexed is if they had unique content on them AND/OR provided value in other ways besides just providing the Wiki definition. There are many possibilities for doing this, none of them scalable in an automated fashion, IMHO.
You could take the top 20% of those pages (based on traffic, conversions, revenue...) and really customize them by adding your own definitions and elaborating on the origin of the word, etc... Beyond that you'd probably see a decline in ROI.
-
Everett, yes that's correct. I will go ahead and follow up on what you said. I do still wonder what the best way would be to go about getting it indexed - if I wanted to do that in the future. If you could shed some light on how to go about that, I'd really appreciate it. Thanks so much in advance!
-
It appears that your definitions are coming from wiktionary.org and are therefore duplicate content. If you were providing your own definitions I would say keep the pages indexable, but in this case I would recommend adding a noindex, follow robots meta tag to the html header of those pages.
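For reference, the tag Everett describes is a single element in the `<head>` of each definition page; a minimal sketch:

```html
<!-- Keeps the page out of the index while still letting crawlers
     follow the links on it -->
<meta name="robots" content="noindex, follow">
```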
-
Hi Everett, I've been looking at the index for word definitions and there are so many pages that are very similar to each other. It's worth giving it a shot, I think. If you can provide feedback, please do. Here's the domain: http://freewordfinder.com. The dictionary is an addition for users who'd like to see what a word means after they've found a word from random letters. You can do a search at the top to see the results, then click through to the definition of the word. Thanks in advance
-
Ron,
We could probably tell you how to get those pages indexed, but then we'd have to tell you how to get them removed from the index when Google sees them all as duplicate content with no added value. My advice is to keep them unindexed, but if you really want them to be indexed tell us the domain and I'll have a look at how it's working and provide some feedback.
-
Hi Keri, did you think that the site might get penalized because it would in essence be duplicate content from another site? Even though the source is linked from the page? Please let me know your thoughts when you can
-
No, they currently do not have additional information on them. They are simply better organized on my pages compared to the 3rd party's. The unique information is what drives visitors to the site, and from those pages it links to the definitions just in case they're interested in understanding the meaning of a word. Does that help?
-
Do the individual pages with the definitions have additional information on them, or are they just from a third party, with other parts of the site having the unique information?
-
Hi Keri, thanks for your response. Well, I see what you're saying. The pages that show the definition pulled from the 3rd party are actually supplementary to the solution the site provides (core value). Shouldn't that make a difference?
-
I've got a question back for you that's more of a meta question. Why would the search engines want to index your pages? If all the page is doing is grabbing information from another source, your site isn't offering any additional value to the users, and the search engine algos aren't going to see the point in sending you visitors.
Related Questions
-
Which pages to put hreflang on?
Hi, we are running a site which is a directory of phone spammers' numbers. It contains descriptions, comments, and so on. We are currently present in 9 countries. The websites all have the same structure, but, of course, the spam numbers in each country are different. If I want to tell Google that our website is available in several locations/languages, do I only put my hreflang tag on the start page? Thanks
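For what it's worth, hreflang is a page-level annotation: it belongs on every page that has a genuine equivalent on the other country sites, not only the start page, and every page in a set must list all alternates, including itself, reciprocally. A hypothetical sketch for a start-page set across two of the nine countries (domains invented for illustration):

```html
<!-- Placed identically in the <head> of every page in the set;
     each page lists all alternates, including itself (invented URLs) -->
<link rel="alternate" hreflang="de-DE" href="https://www.example.de/" />
<link rel="alternate" hreflang="de-AT" href="https://www.example.at/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```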
International SEO | Roverandom Thomas
-
Anybody experience with speeding up loading time for visitors from China mainland?
I made some speed tests and noticed that our website loads 10 times slower for visitors from China mainland. Does anybody have experience with speeding up loading time for visitors from China? We operate the Chinese version of our website in a subdirectory, and we have no interest in registering a company in China in order to get the ICP number. We are currently using Cloudflare, which should have a node in HK, and serving static content via Rackspace. Does disabling Google Analytics and Facebook widgets really make a difference? (Ideally we would like to avoid this.)
International SEO | lcourse
-
Low Index: 72 pages submitted and only 1 Indexed?
Hi Mozers, I'm pretty stuck on this and wondering if anybody else can give me a heads up on what might be causing the issues. I have 3 top-level domains: NZ, AU, and USA. For some odd reason I seem to be having a real issue with these pages indexing, and also with the sitemaps, and I'm considering hiring someone to get the issue sorted, as neither I nor my developer can seem to find the issues. I have attached an example of the sitemap_au.xml file. As you can see, only 1 page has been indexed out of the 72 that were submitted. Basically, because we host all of our domains on the same server, I was told last time that our sitemaps were possibly being overwritten, hence the reason why we have sitemap_au.xml, and it's the same for the other sitemap_nz.xml and sitemap_us.xml. I also originally had a sitemap.xml for each. Another issue I am having is that the meta tag descriptions for each home page in USA and AU are showing the meta tag for New Zealand, but when you look into the .com and .com.au code, the meta tag descriptions are all different, as you can see here http://bit.ly/1KTbWg0 and here http://bit.ly/1AU0f5k Any advice around this would be so much appreciated! Thanks Justin
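One way to keep per-domain sitemaps from being confused when several ccTLDs share a server is to declare each file explicitly in that domain's own robots.txt and submit each one separately in Search Console. A sketch with assumed filenames (not the poster's actual paths):

```
# robots.txt served on the .com.au domain (filenames assumed)
User-agent: *
Sitemap: https://www.example.com.au/sitemap_au.xml
```

Each host then advertises only its own sitemap, so an overwrite on one domain can't silently affect the others.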
International SEO | edward-may
-
Google Webmaster showing error for [hreflang='x-default']
Hi There! We are using the [hreflang='x-default'] tag to target language-specific countries on our site, but Google Webmaster Tools is showing errors even though the implementation was made as per Google's guidelines. One thing is not clear, and we are not sure whether it is the reason behind the errors: they show up only on those pages where URL parameters are used. For example, on https://www.sitegeek.com/a2hosting?grank=open, 'grank=' is defined as a parameter, and the 'hreflang' tags are present on that page. The same 'hreflang' tags are also used on the page https://www.sitegeek.com/a2hosting [without parameters]. But there is no error on the second URL, where there are no parameters; the error shows only on the first URL, where parameters are present. Is this the issue or not? Any suggestions on how to remove it? -- Rajiv
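A common cause of hreflang errors on parameterized URLs (an assumption here, not confirmed from the question) is that the crawled URL doesn't match the URLs referenced in the hreflang set. One usual fix is to canonicalize the parameterized version to the clean URL and reference only clean URLs in the annotations; a sketch using the URLs from the question:

```html
<!-- In the <head> of https://www.sitegeek.com/a2hosting?grank=open -->
<!-- Point the parameterized URL at the clean version... -->
<link rel="canonical" href="https://www.sitegeek.com/a2hosting" />
<!-- ...and reference only the clean, canonical URL in hreflang -->
<link rel="alternate" hreflang="x-default" href="https://www.sitegeek.com/a2hosting" />
```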
International SEO | gamesecure
-
How do you get the "real" organic traffic from direct traffic?
Please check the following article: http://www.searchenginejournal.com/study-shows-organic-search-responsible-64-web-traffic/111791/ I hope you guys have some ideas on how to extract the "real" organic traffic from direct. Thanks in advance!
International SEO | zpm2014
-
Subfolders and 301's
Hello all, Quite simply, I'm stuck. Well, I think I am. We are about to launch a whole new International side of our website. We're an education job board www.eteach.com for schools in the UK and a little internationally. Now that the business is growing we want to make our brand more global. All the big bosses wanted to create a brand new website called www.eteachinternational.com. I managed to persuade them to not to do that and instead use a subfolder approach off of our well established and strong domain www.eteach.com (phew). However, now I'm getting a little lost in making sure I don't duplicate my content. We have a staffroom section on our website which basically has lots of relevant content for people searching how to become a teacher, e.g. www.eteach.com/how-to-become-a-teacher. We also want this same content on the international subfolder, as it will still be relevant content for international teachers. However... Do I have to completely re-write the content (which I'm trying to avoid as it will be very similar) or can I put in a rel=canonical to the already existing pages? So basically (I know this HTML isn't right, it's just for visual's sake!): www.eteach.com/international/how-to-become-a-teacher rel=canonical --> www.eteach.com/how-to-become-a-teacher I understand this gives all the authority to the original page, not the international one, but I'm fine with that (unless anyone can suggest anything else?)
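In real markup, the canonical the poster sketches would be a link element in the `<head>` of the international copy (assuming https and these exact paths):

```html
<!-- In the <head> of www.eteach.com/international/how-to-become-a-teacher -->
<link rel="canonical" href="https://www.eteach.com/how-to-become-a-teacher" />
```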
International SEO | Eteach_Marketing
-
International Landing Page Strategy
Hello, I'm looking for some insight in an area that I don't have much experience in - hoping the community can help! We are a healthcare staffing company serving clients in the U.S. (www.bartonassociates.com). We are interested in attracting clients in Australia and New Zealand. I'm wondering if anyone has experience with best practices for doing so (both from an SEO and PPC perspective). Would it be best to purchase .au and .nz domains for these landing pages and link back to our US site for more information (or even recreate a modified version of our US site for .au and .nz)? My concern here is duplicate content issues, among other things. Or, would it be better to create Australia- and New Zealand-focused landing pages on our US site and drive PPC there? My concern here is that we would never get organic traffic from Australia and New Zealand to our US site, in light of the competition. Also, the messaging would be a bit mixed if targeting all three countries. Our core terms are "locums" and "locum tenens". I greatly appreciate any insight from you guys. Thanks, Jason
International SEO | ba_seomoz
-
What's the best strategy for checking international rankings?
Hi There- I am looking to optimize sites serving the UK and Australia markets. I feel like I have a good handle on how to go about doing that, but what I am fuzzy on is: what's the best way to monitor the SERPs for the keywords I am targeting? I know based on experience that if I just search google.com.au from here in the States, my results will be 'americanized' and may not accurately reflect what someone searching from Australia would see. Are there any good tools or tactics for seeing what searchers in the countries I am focusing on would see? Thanks! Jason
International SEO | phantom