Getting dynamically loaded pages indexed in search engines
-
SEOers,
I'm dealing with an issue I can't figure out the best way to handle. I'm working on a website that shows word definitions, which are loaded dynamically from an open source such as wiktionary.org.
When you visit a particular page to see the definition of a word, say www.example.com/dictionary/example/, the definition is there. But how can we get all of the definition pages indexed in search engines? The WordPress sitemap plugin is not picking these pages up automatically - I assume because they're dynamic - yet a sitemap crawler does detect them.
Can anybody advise on how to get the 200k+ pages indexed in search engines? If it helps, here's a reference site that seems to load its definitions dynamically and has succeeded in getting its pages indexed: http://www.encyclo.nl/begrip/sample
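To illustrate what I mean, here's a rough sketch of the kind of supplemental sitemap I imagine the site would need. The table name, column, and URL pattern below are placeholders, not the site's actual setup:

```php
<?php
// Rough sketch of a supplemental sitemap for the dynamic definition pages.
// Assumptions: the word list lives in a custom table (name and column here
// are hypothetical) and definitions resolve under /dictionary/{word}/.
// Note: a single sitemap file is capped at 50,000 URLs, so 200k+ words
// would have to be split across several files under a sitemap index.
global $wpdb;

$slugs = $wpdb->get_col("SELECT slug FROM {$wpdb->prefix}dictionary_words");

header('Content-Type: application/xml; charset=utf-8');
echo '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
echo '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n";
foreach ($slugs as $slug) {
    $loc = home_url('/dictionary/' . rawurlencode($slug) . '/');
    echo '  <url><loc>' . esc_url($loc) . '</loc></url>' . "\n";
}
echo '</urlset>' . "\n";
```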
-
I see what you mean there - thanks for sharing your expertise and views on this issue. Much appreciated!
-
The only way I'd let those pages be indexed is if they had unique content on them AND/OR provided value in other ways beyond just the Wiktionary definition. There are many possibilities for doing this, none of them scalable in an automated fashion, IMHO.
You could take the top 20% of those pages (based on traffic, conversions, revenue...) and really customize them by adding your own definitions, elaborating on the origin of the word, and so on. Beyond that you'd probably see a decline in ROI.
-
Everett, yes, that's correct. I'll go ahead and follow up on what you said. I do still wonder, though, what the best way would be to get these pages indexed - if I wanted to do that in the future. If you could shed some light on how to go about that, I'd really appreciate it. Thanks so much in advance!
-
It appears that your definitions are coming from wiktionary.org and are therefore duplicate content. If you were providing your own definitions I would say keep the pages indexable, but in this case I would recommend adding a noindex, follow robots meta tag to the HTML head of those pages.
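For reference, the tag itself is just <meta name="robots" content="noindex, follow"> placed in the <head>. Below is a minimal sketch of one way to output it in WordPress; the /dictionary/ path check is an assumption, so adjust it to however those pages are actually identified:

```php
<?php
// Minimal sketch: emit a noindex,follow robots meta tag on the dynamic
// definition pages. The /dictionary/ path check is an assumption -
// swap in whatever actually identifies those pages on your site.
add_action('wp_head', function () {
    if (strpos($_SERVER['REQUEST_URI'], '/dictionary/') === 0) {
        echo '<meta name="robots" content="noindex, follow">' . "\n";
    }
});
```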
-
Hi Everett, I've been looking at the index for word definitions and there are so many pages that are very similar to each other. It's worth giving it a shot, I think. If you can provide feedback, please do. Here's the domain: http://freewordfinder.com. The dictionary is an addition for users who'd like to see what a word means after they've found a word from random letters. You can do a search at the top to see the results, then click through to the definition of the word. Thanks in advance
-
Ron,
We could probably tell you how to get those pages indexed, but then we'd have to tell you how to get them removed from the index once Google sees them all as duplicate content with no added value. My advice is to keep them unindexed, but if you really want them indexed, tell us the domain and I'll have a look at how it's working and provide some feedback.
-
Hi Keri, do you think the site might get penalized because it would, in essence, be duplicate content from another site? Even though the source is linked from the page? Please let me know your thoughts when you can.
-
No, they currently do not have additional information on them. They are simply better organized on my pages compared to the 3rd party. The unique information is what drives visitors to the site, and from those pages it links to the definitions just in case they're interested in understanding the meaning of a word. Does that help?
-
Do the individual pages with the definitions have additional information on them, or are they just from a third party, with other parts of the site having the unique information?
-
Hi Keri, thanks for your response. Well, I see what you're saying. The pages that show the definition pulled from the 3rd party are actually supplementary to the solution the site provides (core value). Shouldn't that make a difference?
-
I've got a more meta question back for you. Why would the search engines want to index your pages? If all the page is doing is grabbing information from another source, your site isn't offering any additional value to users, and the search engine algos aren't going to see the point in sending you visitors.