Getting pages that load dynamically indexed in the search engines
-
SEO'ers,
I'm dealing with an issue I can't figure out the best way to handle. I'm working on a website that shows word definitions, which are loaded dynamically from an open source such as wiktionary.org.
When you visit a particular page to see the definition of a word, say www.example.com/dictionary/example/, the definition is there. However, how can we get all the definition pages indexed in search engines? The WordPress sitemap plugin is not picking up these pages automatically - I'm guessing because they're dynamic - but when using a sitemap crawler the pages are detected.
Can anybody give advice on how to go about getting the 200k+ pages indexed in the search engines? If it helps, here's a reference site that seems to load its definitions dynamically and has succeeded in getting its pages indexed: http://www.encyclo.nl/begrip/sample
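If the plugin can't see the dynamic pages, one workaround is to generate the sitemap files yourself from the word list in your own database and submit them in Search Console. A minimal sketch, assuming you can enumerate the words yourself; the domain, URL pattern, and file names below are placeholders:

```python
# Sketch: build XML sitemaps for dynamically generated definition pages.
# Assumes the word list comes from your own database; the domain, path,
# and file names are hypothetical placeholders.
from xml.sax.saxutils import escape

BASE = "https://www.example.com/dictionary/"
MAX_URLS = 50_000  # per-file URL limit in the sitemaps.org protocol


def build_sitemaps(words):
    """Yield (filename, xml_text) pairs, one file per 50k words."""
    for i in range(0, len(words), MAX_URLS):
        chunk = words[i:i + MAX_URLS]
        entries = "\n".join(
            f"  <url><loc>{escape(BASE + w)}/</loc></url>" for w in chunk
        )
        xml = (
            '<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{entries}\n</urlset>\n"
        )
        yield f"sitemap-definitions-{i // MAX_URLS + 1}.xml", xml
```

With 200k+ URLs you'd end up with five or so files, which you can tie together with a sitemap index file and regenerate whenever the word list changes.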
-
I see what you mean there - thanks for sharing your expertise and views on this issue. Much appreciated.
-
The only way I'd let those pages be indexed is if they had unique content on them AND/OR provided value in other ways besides just providing the Wiki definition. There are many possibilities for doing this, none of them scalable in an automated fashion, IMHO.
You could take the top 20% of those pages (based on traffic, conversions, revenue...) and really customize them by adding your own definitions and elaborating on the origin of the word, etc... Beyond that you'd probably see a decline in ROI.
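That "top 20%" cut can be sketched with a simple sort over per-page traffic. A hypothetical example, assuming you've exported page paths and visit counts from your analytics tool:

```python
# Sketch: pick the top fraction of definition pages by traffic, to decide
# which ones are worth customizing by hand. The data is a placeholder for
# an export from your analytics tool.

def top_pages(traffic_by_url, fraction=0.2):
    """Return the top `fraction` of URLs, ranked by visits (at least one)."""
    ranked = sorted(traffic_by_url, key=traffic_by_url.get, reverse=True)
    cutoff = max(1, int(len(ranked) * fraction))
    return ranked[:cutoff]
```

The same ranking could use conversions or revenue instead of raw visits; whatever metric best reflects which pages deserve unique content first.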
-
Everett, yes that's correct. I will go ahead and follow up on what you said. I do still wonder what the best way would be to go about getting it indexed - if I wanted to do that in the future. If you could shed some light on how to go about that, I'd really appreciate it. Thanks so much in advance!
-
It appears that your definitions are coming from wiktionary.org and are therefore duplicate content. If you were providing your own definitions I would say keep the pages indexable, but in this case I would recommend adding a noindex, follow robots meta tag to the HTML head of those pages.
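For reference, that tag goes in the `<head>` of each definition page and looks like this:

```html
<!-- Keeps the duplicated definition pages out of the index while still
     letting crawlers follow the links on them -->
<meta name="robots" content="noindex, follow">
```

In a WordPress setup, most SEO plugins expose a per-page or per-template setting that emits this tag, so you may not need to touch the theme directly.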
-
Hi Everett, I've been looking at the index for word definitions and there are so many pages that are very similar to each other. It's worth giving it a shot, I think. If you can provide feedback, please do. Here's the domain: http://freewordfinder.com. The dictionary is an addition for users who'd like to see what a word means after they've found a word from random letters. You can do a search at the top to see the results, then click through to the definition of the word. Thanks in advance.
-
Ron,
We could probably tell you how to get those pages indexed, but then we'd have to tell you how to get them removed from the index when Google sees them all as duplicate content with no added value. My advice is to keep them unindexed, but if you really want them to be indexed tell us the domain and I'll have a look at how it's working and provide some feedback.
-
Hi Keri, do you think the site might get penalized because it would in essence be duplicate content from another site? Even though the source is linked from the page? Please let me know your thoughts when you can.
-
No, they currently do not have additional information on them. They are simply better organized on my pages compared to the 3rd party's. The unique information is what drives visitors to the site, and from those pages it links to the definitions just in case they're interested in understanding the meaning of a word. Does that help?
-
Do the individual pages with the definitions have additional information on them, or are they just from a third party, with other parts of the site having the unique information?
-
Hi Keri, thanks for your response. Well, I see what you're saying. The pages that show the definition pulled from the 3rd party are actually supplementary to the solution the site provides (core value). Shouldn't that make a difference?
-
I've got a question back for you that's more of a meta question. Why would the search engines want to index your pages? If all the page is doing is grabbing information from another source, your site isn't offering any additional value to the users, and the search engine algos aren't going to see the point in sending you visitors.