Getting dynamically loaded pages indexed in the search engines
-
SEO'ers,
I'm dealing with an issue I can't figure out the best way to handle. I'm working on a website that shows definitions of words, loaded dynamically from an open source such as wiktionary.org.
When you visit a page to see the definition of a word, say www.example.com/dictionary/example/, the definition is there. However, how can we get all of the definition pages indexed in search engines? The WordPress sitemap plugin is not picking these pages up automatically - I assume because they're dynamic - but a sitemap crawler does detect them.
Can anybody advise on how to go about getting the 200k+ pages indexed in the search engines? If it helps, here's a reference site that seems to load its definitions dynamically and has succeeded in getting its pages indexed: http://www.encyclo.nl/begrip/sample
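(For reference: the sitemap protocol caps each file at 50,000 URLs, so 200k+ pages would need several child sitemaps referenced from a sitemap index. A minimal sketch, with hypothetical file names on example.com:)

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- sitemap index referencing per-chunk sitemaps, each under 50,000 URLs -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-definitions-1.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-definitions-2.xml</loc>
  </sitemap>
</sitemapindex>
```

Each child file then lists the individual definition URLs (www.example.com/dictionary/example/ and so on).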
-
I see what you mean there - thanks for sharing your expertise and views on this issue. Much appreciated
-
The only way I'd let those pages be indexed is if they had unique content on them AND/OR provided value in other ways besides just providing the Wiki definition. There are many possibilities for doing this, none of them scalable in an automated fashion, IMHO.
You could take the top 20% of those pages (based on traffic, conversions, revenue...) and really customize them by adding your own definitions and elaborating on the origin of the word, etc... Beyond that you'd probably see a decline in ROI.
-
Everett, yes that's correct. I will go ahead and follow up on what you said. I do still wonder what the best way would be to go about getting it indexed - if I wanted to do that in the future. If you could shed some light on how to go about that, I'd really appreciate it. Thanks so much in advance!
-
It appears that your definitions are coming from wiktionary.org and are therefore duplicate content. If you were providing your own definitions I would say keep the pages indexable, but in this case I would recommend adding a noindex, follow robots meta tag to the <head> of those pages.
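That tag looks like this (a minimal example):

```html
<!-- keep the page out of the index but let crawlers follow its links -->
<meta name="robots" content="noindex, follow">
```

If the definition pages are generated from one template, adding it to that template covers all 200k+ pages at once; the same directive can also be sent as an X-Robots-Tag HTTP header.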
-
Hi Everett, I've been looking at the index for word definitions and there are so many pages that are very similar to each other. It's worth giving it a shot, I think. If you can provide feedback, please do. Here's the domain: http://freewordfinder.com. The dictionary is an addition for users who'd like to see what a word means after they've built it from random letters. You can do a search at the top to see the results, then click through to the definition of the word. Thanks in advance
-
Ron,
We could probably tell you how to get those pages indexed, but then we'd have to tell you how to get them removed from the index when Google sees them all as duplicate content with no added value. My advice is to keep them unindexed, but if you really want them to be indexed tell us the domain and I'll have a look at how it's working and provide some feedback.
-
Hi Keri, do you think the site might get penalized because it would, in essence, be duplicate content from another site? Even though the source is linked from the page? Please let me know your thoughts when you can
-
No, they currently do not have additional information on them. They are simply better organized on my pages compared to the 3rd party's. The unique information is what drives visitors to the site, and from those pages it links to the definitions just in case they're interested in understanding the meaning of a word. Does that help?
-
Do the individual pages with the definitions have additional information on them, or are they just from a third party, with other parts of the site having the unique information?
-
Hi Keri, thanks for your response. Well, I see what you're saying. The pages that show the definition pulled from the 3rd party are actually supplementary to the solution the site provides (core value). Shouldn't that make a difference?
-
I've got a question back for you that's more of a meta question. Why would the search engines want to index your pages? If all the page is doing is grabbing information from another source, your site isn't offering any additional value to the users, and the search engine algos aren't going to see the point in sending you visitors.
Related Questions
-
Getting accurate Geo Location traffic stats in Google Analytics - HELP
One of our clients services the US and the UK, but having looked at the report over an extended period of time we can still see that the vast majority of traffic is coming from the US. For example, our last report for March indicated that there were over 3,000 users in the US but only 6 in the UK. We know that Google Analytics works out a user's location based on where their IP is located and not their physical location, and that this means the data needs to be taken with a pinch of salt, as it won't always represent what you expect. That being said, we know that the traffic figures for Europe are largely inaccurate and would like to get some more accurate stats to report on. Is there a way to do so at all within Google Analytics?
International SEO | Wagada
Include mobile and international versions of pages to sitemap or not?
My pages already have alternate and hreflang references to point to international and mobile versions of the content. If I add 5 language desktop versions and 5 language mobile versions as https://support.google.com/webmasters/answer/2620865?hl=en explains, my sitemap will get bulky. What are the pros and cons of referencing all page versions in the sitemap versus including just the general (English/desktop) version?
International SEO | poiseo
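(For context, the sitemap format that Google article describes repeats the full set of alternates inside every `<url>` entry, which is why it gets bulky fast. A minimal sketch with hypothetical example.com URLs:)

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://www.example.com/en/</loc>
    <!-- every <url> entry repeats one xhtml:link per language/device variant -->
    <xhtml:link rel="alternate" hreflang="en" href="http://www.example.com/en/"/>
    <xhtml:link rel="alternate" hreflang="de" href="http://www.example.com/de/"/>
  </url>
</urlset>
```

With 5 desktop plus 5 mobile variants, each entry carries ~10 alternate lines, so the file size grows multiplicatively with the number of languages.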
Thai Characters in URLs
Does anyone have experience with non-Latin characters in URLs? We've launched a website in Thailand and picked Thai characters for the URLs. However, when you copy one, it turns into something like this: http://www.imoneythailand.com/บัตรเครดิต Can it impact our website's crawlability? Also, is a keyword in the URL a ranking factor for non-Latin languages? Thanks in advance for the help!
International SEO | imoney
Are my translated pages damaging my ranking?
Hi there, I have a site in English but with duplicates in different languages. The first problem is that these translated versions of my site receive no ranking on Google stars (while the English one does) - why is this? The second problem is that SEOmoz counts the errors on my site and then duplicates this error count for all the translated versions of my site - meaning I have a huge number of errors (too many on-page links). Add to this the fact that I use affiliate IDs to track different types of traffic to my site - so all page URLs, in English and other languages, with an affiliate ID on the end of the URL count as errors. This means I have a huge number of on-page errors indicated by SEOmoz, plus no ranking for my translated pages - I think this is really harming my overall ranking and site trust. What are your opinions on this?
International SEO | sparkit
How well do Google's crawlers understand foreign websites?
I speak 5 languages and therefore have the opportunity to do on-page SEO and content writing for 5 different cultures. This question, to me, has much to do with the way Google Translate works. It doesn't, trust me! Which then makes me wonder how the web crawlers, which are designed with English in mind, can fairly and equally attribute the same ranking points to a foreign website. Since Google seems to use semantic search technology, I'm wondering if foreign sites have it easier or not. Any ideas?
International SEO | MassivePrime
Risks of migrating TLDs to subfolders
Hi Guys, I am thinking of migrating our .co.nz and .co.uk websites into subfolders on our .com website (e.g. .com/uk and .com/nz). Do you think this is a risky strategy with regard to our performance in the localised search engines, or should the consolidation of these websites and their link authority into the .com help us move up the rankings? We are thinking of doing this in the next week. We have some really good rankings in the local versions of Google, but we also have plenty of phrases sitting just on page 2, and I was hoping this might help boost them onto page 1. Has anyone else had experience migrating TLD sites to subfolders on a .com, and if so, what was your experience of the impact on search rankings in the local versions of Google, and the timeframe for these changes to take effect? Did you have any negative results?
International SEO | ConradC
Migration from TLDs to .com subfolders
Hi Guys, We currently operate five websites: 1 on .co.uk, 1 on .co.nz, 1 on .de, 1 on .com (geo-targeted to the USA) and 1 on .com/au (targeted at Australia). Open Site Explorer currently credits our .co.uk with 212 unique domains linking to us; our .com has 130, our .co.nz has 110 and our .de (which is new) has around 10. We have a website on .com/au targeting Australia, and we have gained around 30 - 40 links into this subfolder. Our rankings in Google Australia for this website are fantastic, and it would appear to me that we have inherited all the domain authority of our .com. The UK is currently our most important market, and we operate a website on a .co.uk there. Our main competitors there have around 300 - 400 unique domains linking to them. What I am thinking of doing is deploying our UK content onto our .com root domain (which is currently geo-targeted at the US, a really small market for us), redirecting all of the .co.uk pages to the root folder of the .com, and changing the geo-targeting of the .com to the UK. Additionally, I was going to migrate our .co.nz and our .de websites into .com/nz/ and .com/de/ subfolders. I will also create a new .com/us/ folder for the US. I can only go off the fact that the only subfolder website we have (.com/au) has been very successful for us. Do you think migrating all of these websites onto the .com domain using subfolders will provide a meaningful boost to our rankings by virtue of having more backlinks into one domain? Are there any big risks in doing so, and how long would you expect the redirects and changes to be picked up by Google? I really appreciate any help and comments on this. Kind Regards
International SEO | ConradC (Conrad Cranfield)
What is the best way to do country-specific IP redirects for product pricing pages only?
My website has 3 services, and their prices will be different for US/EU/developed-world countries versus Asian/African countries. Apart from the pricing page, everything else stays the same. I want to use an IP-based redirect. I heard this kind of thing is called cloaking and is used by black-hat guys. What instructions should I give my web developer so the site looks right to Google/search bots and correctly shows visitors the intended prices? Are there any cautions to be aware of? Thanks for your time
International SEO | RyanSat