Getting pages that load dynamically indexed in the search engines
-
SEO'ers,
Am dealing with an issue I cannot figure out the best way to handle. Working on a website that shows the definitions of words which are loaded dynamically from an open source. Source such as: wiktionary.org
When you visit a particular page to see the definition of the word, say; www.example.com/dictionary/example/ the definition is there. However, how can we get all the definition pages to get indexed in search engines? The WordPress sitemap plugin is not picking up these pages to be added automatically - guess because it's dynamic - but when using a sitemap crawler pages are detected.
Can anybody give advice on how to go about getting the 200k+ pages indexed in the SE's? If it helps, here's a reference site that seems to load it's definitions dynamically and has succeeded in getting its pages indexed: http://www.encyclo.nl/begrip/sample
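For reference, here's the kind of script I was thinking of using to build sitemap files for all of the definition URLs myself (the domain and paths are placeholders, and the word list would come from my own database). Note the sitemap protocol caps each file at 50,000 URLs, so a big list has to be split:

```python
# Sketch: generate sitemap files for a large list of dynamic definition pages.
# BASE and the word list are placeholders; swap in your own domain and data.
from xml.sax.saxutils import escape

BASE = "https://www.example.com/dictionary/"
PER_FILE = 50000  # sitemap protocol limit per file


def write_sitemaps(words, prefix="sitemap"):
    """Write one or more sitemap XML files and return their filenames."""
    files = []
    for i in range(0, len(words), PER_FILE):
        chunk = words[i:i + PER_FILE]
        name = f"{prefix}-{i // PER_FILE + 1}.xml"
        with open(name, "w", encoding="utf-8") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for w in chunk:
                f.write(f"  <url><loc>{escape(BASE + w + '/')}</loc></url>\n")
            f.write("</urlset>\n")
        files.append(name)
    # For 200k+ URLs you'd also write a sitemap index file pointing at the
    # chunks and submit that to Search Console; omitted here for brevity.
    return files
```
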
-
I see what you mean there - thanks for sharing your expertise and views on this issue. Much appreciated
-
The only way I'd let those pages be indexed is if they had unique content on them AND/OR provided value in other ways besides just providing the Wiki definition. There are many possibilities for doing this, none of them scalable in an automated fashion, IMHO.
You could take the top 20% of those pages (based on traffic, conversions, revenue...) and really customize them by adding your own definitions and elaborating on the origin of the word, etc... Beyond that you'd probably see a decline in ROI.
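For illustration, picking that top 20% from an analytics export could be scripted along these lines (the data structure and function name are hypothetical; in practice you'd export pageviews, conversions, or revenue from your analytics tool):

```python
# Sketch: select the top fraction of pages by some metric (e.g. pageviews).
def top_pages(traffic, fraction=0.2):
    """traffic: dict mapping URL -> metric value; returns the top fraction of URLs."""
    ranked = sorted(traffic, key=traffic.get, reverse=True)
    cutoff = max(1, int(len(ranked) * fraction))  # always keep at least one page
    return ranked[:cutoff]
```

Those are the pages worth hand-customizing; the rest stay noindexed.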
-
Everett, yes that's correct. I will go ahead and follow up on what you said. I do still wonder what the best way would be to go about getting it indexed - if I wanted to do that in the future. If you could shed some light on how to go about that, I'd really appreciate it. Thanks so much in advance!
-
It appears that your definitions are coming from wiktionary.org and are therefore duplicate content. If you were providing your own definitions I would say keep the pages indexable, but in this case I would recommend adding a "noindex, follow" robots meta tag to the HTML head of those pages.
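That tag goes in the head of the definition page template, e.g.:

```html
<!-- Keep the page out of the index, but let crawlers follow its links -->
<meta name="robots" content="noindex, follow">
```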
-
Hi Everett, I've been looking at the index for word definitions and there are so many pages that are very similar to each other. It's worth giving it a shot, I think. If you can provide feedback, please do. Here's the domain: http://freewordfinder.com. The dictionary is an addition for users who'd like to see what a word means after they've found a word from random letters. You can do a search at the top to see the results, then click through to the definition of the word. Thanks in advance
-
Ron,
We could probably tell you how to get those pages indexed, but then we'd have to tell you how to get them removed from the index when Google sees them all as duplicate content with no added value. My advice is to keep them unindexed, but if you really want them to be indexed tell us the domain and I'll have a look at how it's working and provide some feedback.
-
Hi Keri, did you think that the site might get penalized because it would in essence be duplicate content from another site? Even though the source is linked from the page? Please let me know your thoughts when you can
-
No, they currently do not have additional information on them. They are simply better organized on my pages compared to the 3rd party. The unique information is what drives visitors to the site, and from those pages it links to the definitions just in case they're interested in understanding the meaning of a word. Does that help?
-
Do the individual pages with the definitions have additional information on them, or are they just from a third party, with other parts of the site having the unique information?
-
Hi Keri, thanks for your response. Well, I see what you're saying. The pages that show the definition pulled from the 3rd party are actually supplementary to the solution the site provides (core value). Shouldn't that make a difference?
-
I've got a question back for you that's more of a meta question. Why would the search engines want to index your pages? If all the page is doing is grabbing information from another source, your site isn't offering any additional value to the users, and the search engine algos aren't going to see the point in sending you visitors.