Getting dynamically loaded pages indexed in the search engines
-
SEOers,
I'm dealing with an issue I can't figure out the best way to handle. I'm working on a website that shows the definitions of words, which are loaded dynamically from an open source such as wiktionary.org.
When you visit a particular page to see the definition of a word, say www.example.com/dictionary/example/, the definition is there. However, how can we get all the definition pages indexed in the search engines? The WordPress sitemap plugin is not picking up these pages automatically - I assume because they're dynamic - but when using a sitemap crawler the pages are detected.
Can anybody give advice on how to go about getting the 200k+ pages indexed in the search engines? If it helps, here's a reference site that seems to load its definitions dynamically and has succeeded in getting its pages indexed: http://www.encyclo.nl/begrip/sample
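For reference, I understand the sitemap protocol caps each file at 50,000 URLs, so listing 200k+ pages would mean several sitemap files plus a sitemap index submitted to the search engines. A rough Python sketch of what I have in mind for generating them - the word list, domain, and file names are placeholders, not my actual setup:

```python
import math

DOMAIN = "https://www.example.com"        # placeholder domain
WORDS = ["example", "sample", "lexicon"]  # stands in for the full word list
URLS_PER_FILE = 50000                     # hard cap from the sitemap protocol

def write_sitemaps(words):
    """Write one sitemap file per 50k URLs, plus an index that lists them."""
    n_files = math.ceil(len(words) / URLS_PER_FILE)
    for i in range(n_files):
        chunk = words[i * URLS_PER_FILE:(i + 1) * URLS_PER_FILE]
        with open(f"sitemap-{i + 1}.xml", "w", encoding="utf-8") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for word in chunk:
                f.write(f"  <url><loc>{DOMAIN}/dictionary/{word}/</loc></url>\n")
            f.write("</urlset>\n")
    # The index file is the single URL you submit (e.g. in robots.txt)
    with open("sitemap-index.xml", "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for i in range(n_files):
            f.write(f"  <sitemap><loc>{DOMAIN}/sitemap-{i + 1}.xml</loc></sitemap>\n")
        f.write("</sitemapindex>\n")

write_sitemaps(WORDS)
```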
-
I see what you mean there - thanks for sharing your expertise and views on this issue. Much appreciated
-
The only way I'd let those pages be indexed is if they had unique content on them AND/OR provided value in other ways besides just providing the Wiki definition. There are many possibilities for doing this, none of them scalable in an automated fashion, IMHO.
You could take the top 20% of those pages (based on traffic, conversions, revenue...) and really customize them by adding your own definitions and elaborating on the origin of the word, etc... Beyond that you'd probably see a decline in ROI.
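If it helps, picking that top 20% is just a sort over whatever analytics export you have. A quick sketch, assuming a CSV export with hypothetical "url" and "sessions" columns:

```python
import csv

def top_pages(analytics_csv, fraction=0.2):
    """Return the top fraction of pages by traffic from an analytics export."""
    with open(analytics_csv, newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))  # expects 'url' and 'sessions' columns
    rows.sort(key=lambda r: int(r["sessions"]), reverse=True)
    cutoff = max(1, int(len(rows) * fraction))
    return [r["url"] for r in rows[:cutoff]]

# The 20% of definition pages most worth customizing by hand
priority = top_pages("definition_pages.csv")
```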
-
Everett, yes that's correct. I will go ahead and follow up on what you said. I do still wonder what the best way would be to go about getting it indexed - if I wanted to do that in the future. If you could shed some light on how to go about that, I'd really appreciate it. Thanks so much in advance!
-
It appears that your definitions are coming from wiktionary.org and are therefore duplicate content. If you were providing your own definitions I would say keep the pages indexable, but in this case I would recommend adding a noindex, follow robots meta tag to the HTML head of those pages.
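In practice that's a single tag in the head of each definition page, and since those pages are generated dynamically anyway, the same code that renders them can emit it. A minimal sketch - the is_third_party flag is hypothetical, standing in for however your templates know where a definition came from:

```python
def robots_meta(is_third_party: bool) -> str:
    """Return the robots meta tag to emit in a definition page's <head>.

    noindex keeps the duplicate page out of the index, while follow
    still lets link equity flow through to the rest of the site.
    """
    if is_third_party:
        return '<meta name="robots" content="noindex, follow">'
    return ""  # pages with your own unique definitions stay indexable

print(robots_meta(True))  # -> <meta name="robots" content="noindex, follow">
```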
-
Hi Everett, I've been looking at the index for word definitions and there are so many pages that are very similar to each other. It's worth giving it a shot, I think. If you can provide feedback, please do. Here's the domain: http://freewordfinder.com. The dictionary is an addition for users who'd like to see what a word means after they've found a word from random letters. You can do a search at the top to see the results, then click through to the definition of the word. Thanks in advance
-
Ron,
We could probably tell you how to get those pages indexed, but then we'd have to tell you how to get them removed from the index when Google sees them all as duplicate content with no added value. My advice is to keep them unindexed, but if you really want them to be indexed tell us the domain and I'll have a look at how it's working and provide some feedback.
-
Hi Keri, do you think the site might get penalized because it would in essence be duplicate content from another site, even though the source is linked from the page? Please let me know your thoughts when you can.
-
No, they currently do not have additional information on them. They are simply better organized on my pages compared to the 3rd party. The unique information is what drives visitors to the site, and from those pages it links to the definitions just in case they're interested in understanding the meaning of a word. Does that help?
-
Do the individual pages with the definitions have additional information on them, or are they just from a third party, with other parts of the site having the unique information?
-
Hi Keri, thanks for your response. Well, I see what you're saying. The pages that show the definition pulled from the 3rd party are actually supplementary to the solution the site provides (core value). Shouldn't that make a difference?
-
I've got a question back for you that's more of a meta question. Why would the search engines want to index your pages? If all the page is doing is grabbing information from another source, your site isn't offering any additional value to the users, and the search engine algos aren't going to see the point in sending you visitors.
Related Questions
-
Why would a site lose rankings in the U.S. while maintaining rankings in other English locations (Canada & Australia)?
What would cause a site to lose ranking in the U.S. while maintaining top (1st page) positions in other English-language countries such as Canada or Australia? Is this purely Penguin-related because of the location of backlinks, or are there other significant factors that could be in play? Would this rule out Panda as a cause, because it's simply an "English language" targeted algo and not location dependent like backlinks (Penguin)? Appreciate any insights.
International SEO | ResumeGenius
-
Thai Characters in URLs
Does anyone have experience with non-Latin characters in URLs? We've launched a website in Thailand and picked Thai characters for the URLs. However, when you copy one, it turns into something like this: http://www.imoneythailand.com/บัตรเครดิต Can it impact our website's crawlability? Also, is a keyword in the URL a ranking factor for non-Latin languages? Thanks in advance for your help!
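To show what I mean by "it turns into something like this": the Thai characters get percent-encoded as UTF-8 bytes when the URL is copied or transmitted. A quick demonstration with Python's standard library, using the path from our site:

```python
from urllib.parse import quote, unquote

path = "/บัตรเครดิต"  # the Thai path from the URL above

encoded = quote(path)       # what browsers actually transmit / paste
decoded = unquote(encoded)  # round-trips back to the readable Thai form

print(encoded)          # percent-encoded UTF-8, e.g. /%E0%B8%9A...
print(decoded == path)  # True - no information is lost either way
```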
International SEO | imoney
-
What language to use for URLs for Russian-language pages?
Hi, our site is in English, Spanish, Danish and Russian. The URLs are individual to the language they are in, but of course Russian contains some non-Latin characters, so I decided not to use them in the URLs. Any advice on how to create the URLs for Russian-language pages? Thanks.
International SEO | bjs2010
-
International Landing Page Strategy
Hello, I'm looking for some insight in an area that I don't have much experience in - hoping the community can help! We are a healthcare staffing company serving clients in the U.S. (www.bartonassociates.com). We are interested in attracting clients in Australia and New Zealand, and I'm wondering if anyone has experience with best practices for doing so (both from an SEO and a PPC perspective). Would it be best to purchase .au and .nz domains for these landing pages and link back to our US site for more information (or even recreate a modified version of our US site for .au and .nz)? My concern here is duplicate content issues, among other things. Or would it be better to create Australia- and New Zealand-focused landing pages on our US site and drive PPC there? My concern here is that we would never get organic traffic from Australia and New Zealand to our US site, in light of the competition. Also, the messaging would be a bit mixed if targeting all three countries. Our core terms are "locums" and "locum tenens". Greatly appreciate any insight from you guys. Thanks, Jason
International SEO | ba_seomoz
-
Duplicated 404 Pages (Travel Industry)
Our website has created numerous "future pages" with no alt tag or class tag that are showing up as 404 pages. To make matters worse, they are causing duplicate 404 pages because we have different languages. Visitors can't find the 404s, but the search bots can. Would it be better to remove the links, add them to robots.txt, or add a nofollow/noindex tag? These are examples: http://www.solmelia.com/nGeneral.LINK_FAQ http://www.solmelia.com/nGeneral.LINK_HOTELESDESTINOS_BODAS http://www.solmelia.com/nGeneral.LINK_CONDICIONES http://www.solmelia.com/nGeneral.LINK_MAPSITE http://www.solmelia.com/nGeneral.LINK_HOTELESDESTINOS_EMPRESA
International SEO | Melia
-
What is the best way to make a country-specific IP redirect for only product pricing pages?
My website has 3 services, and their prices will be different for US/EU/developed-world countries versus Asian/African countries. Apart from the pricing pages, everything else remains the same. I want to use an IP-based redirect, but I've heard this is called cloaking and is used by black-hat guys. What instructions should I give to my web developer so the site looks right to Google/search bots and correctly shows visitors the intended prices? Are there any cautions to take? Thanks for your time.
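One pattern I've read about that's supposed to avoid the cloaking risk entirely: don't redirect at all, serve the same URL and content to everyone, and only vary the displayed price by country, with a default for unknown IPs (bots included). A sketch using the MaxMind geoip2 library - the price table and database path are made-up placeholders. Is this the right approach?

```python
import geoip2.database
from geoip2.errors import AddressNotFoundError

# Hypothetical price table: same URL and content for everyone,
# only the displayed price varies by visitor country.
PRICES = {"default": 49, "US": 49, "DE": 45, "IN": 19}

reader = geoip2.database.Reader("GeoLite2-Country.mmdb")  # placeholder path

def price_for(ip: str) -> int:
    """Pick a displayed price from the visitor's country; never redirect."""
    try:
        country = reader.country(ip).country.iso_code
    except AddressNotFoundError:
        country = None  # unknown IPs (and many bots) fall back to default
    return PRICES.get(country, PRICES["default"])
```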
International SEO | RyanSat
-
I need suggestions. We're helping a big journal improve their external links; even though they have a site with over 10 million monthly visits, their external links are weak. Any suggestions?
Please let us know where we can find information on how to improve external links for a very big journal site. Thanks.
International SEO | carloscontinua
-
Multi-lingual SEO: Country-specific TLDs, or migration to a huge .com site?
Dear SEOmoz team, I'm an in-house SEO looking after a number of sites in a competitive vertical. Right now we have our core example.com site translated into over thirty different languages, with each one sitting on its own country-specific TLD (so example.de, example.jp, example.es, example.co.kr, etc.). Though we're using a template system so that changes to the .com domain propagate across all languages, over the years things have become more complex in quite a few areas. For example, the level of analytics script hacks and filters we have created in order to channel users through to each language profile is now bordering on the epic. For a number of reasons we've recently been discussing the cost/benefit of migrating all of these languages into the single example.com domain. On first look this would appear to simplify things greatly; however, I'm nervous about what effect this would have on our organic search traffic. All these separate sites have cumulatively received years of on- and off-site work, and even if we went through the process of setting up page-for-page redirects to their new home on example.com, I would hate to lose all this hard work (and business) if we saw our rankings tank as a result of the move. So I guess the question is, for an international business such as ours, which is the optimal site structure in the eyes of the search engines: local sites on local TLDs, or one mammoth site with language identifiers in the URL path (or subdomains)? Is Google still so reliant on the TLD for geo-targeting search results, or is it less of a factor in today's search engine environment? Cheers!
International SEO | linklater