SEO-Friendly Method to Load XML Content onto Page
-
I have a client who has about 100 portfolio entries, each with its own HTML page.
Those pages aren't getting indexed because of the way the main portfolio menu page works: It uses javascript to load the list of portfolio entries from an XML file along with metadata about each entry. Because it uses javascript, crawlers aren't seeing anything on the portfolio menu page.
Here's a sample of the javascript used (one line of many):
// load project xml
try {
    var req = new Request({
        method: 'get',
        url: '/data/projects.xml',
Normally I'd have them just manually add entries to the portfolio menu page, but part of the metadata being loaded is a set of project characteristics used to filter which portfolio entries are shown on the page, such as client type (government, education, industrial, residential, etc.) and project type (based on the service provided). It's similar to the filtering you'd see on an e-commerce site. This has to stay, so the page needs to remain dynamic.
I'm trying to summarize the alternate methods they could use to load that content onto the page instead of javascript (I assume server-side solutions are the only option, unless there's something I'm unaware of). I'm aware that PHP could probably load all of the portfolio entries from the XML file on the server side. I'd like recommendations on other possible solutions. Please feel free to ask any clarifying questions.
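To show what the server-side route looks like in practice, here's a minimal sketch. It's written in Python for brevity, but the pattern is identical in PHP or any server-side language: parse the XML on the server, apply the facet filter there, and emit plain HTML links that crawlers can see. The `projects.xml` structure below is a hypothetical stand-in, since the real schema isn't shown in the question.

```python
import xml.etree.ElementTree as ET

# Hypothetical stand-in for /data/projects.xml -- the real schema isn't shown.
SAMPLE = """<projects>
  <project client-type="government" project-type="design">
    <title>City Hall Renovation</title>
    <url>/portfolio/city-hall.html</url>
  </project>
  <project client-type="education" project-type="planning">
    <title>Campus Master Plan</title>
    <url>/portfolio/campus-plan.html</url>
  </project>
</projects>"""

def render_links(xml_text, client_type=None):
    """Parse the portfolio XML and emit plain HTML links.

    If client_type is given, only matching projects are rendered --
    the same faceted filtering the javascript does, but done server-side,
    so the initial HTML already contains real, crawlable <a> tags.
    """
    root = ET.fromstring(xml_text)
    links = []
    for project in root.findall("project"):
        if client_type and project.get("client-type") != client_type:
            continue
        title = project.findtext("title")
        url = project.findtext("url")
        links.append('<li><a href="%s">%s</a></li>' % (url, title))
    return "<ul>\n%s\n</ul>" % "\n".join(links)

print(render_links(SAMPLE))                             # all projects
print(render_links(SAMPLE, client_type="government"))   # one facet
```

The javascript filtering UI can then be layered on top of this server-rendered list, so the page stays dynamic for visitors while crawlers get the plain links.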
Thanks!
-
As a response to my own question, I received some other good suggestions to this issue via Twitter:
- @__jasonmulligan__ suggested XSLT
- @__KevinMSpence__ suggested "...easiest solution would be to use simplexml --it's a PHP parser for lightweight XML" & "Just keep in mind that simplexml loads the doc into memory, so there can be performance issues with large docs."
- Someone suggested creating a feed from the XML, but I don't think that adds much benefit beyond an extra step, since you'd still need a way to pull that content onto the page.
- There were also a few suggestions for ways to convert the XML feed to another solution like JSON on the page, but those were really outside the scope of what we were looking to do.
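To make the XSLT suggestion above concrete: a stylesheet applied on the server (PHP ships with an XSLT processor) can turn the existing XML straight into crawlable HTML links. This is a minimal sketch assuming a hypothetical `<projects>`/`<project>` structure with `<title>` and `<url>` children, since the real schema isn't shown:

```xml
<?xml version="1.0"?>
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <!-- Emit one plain HTML link per portfolio project -->
  <xsl:template match="/projects">
    <ul>
      <xsl:for-each select="project">
        <li>
          <a href="{url}"><xsl:value-of select="title"/></a>
        </li>
      </xsl:for-each>
    </ul>
  </xsl:template>
</xsl:stylesheet>
```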
Final recommendation to the client was to just add text links manually beneath all of the Javascript content, since they were only adding a few portfolio entries per year, and it would look good in the theme. A hack, perhaps, but much faster and more cost-effective. Otherwise, I would have recommended they go with PHP plus the simplexml suggestion above.
-
I think you need to find a developer who understands progressive enhancement, so that the page degrades gracefully. You'll need to deliver the page using something server-side (PHP?) and then add the bells and whistles later.
I'm guessing the budget won't cover moving the entire site/content onto a database/cms platform.
How does the page look in Google Webmaster Tools (Labs > Instant Preview)? That might give you a nice visual way to explain the problem to the client.
-
Site was done a year or two ago by a branding agency. To their credit, they produced clean and reasonably-well documented code, and they do excellent design work. However, they relied too heavily on Flash and javascript to load content throughout the site, and the site has suffered as a result.
Site is entirely HTML, CSS, & Javascript and uses Dreamweaver template files to produce the portfolio entry pages, which then propagate into the XML files, which then get loaded by the rest of the site.
I wouldn't call it AJAX - I think it loads all of the XML file and then uses the filters to display appropriate content, so there are no subsequent calls to the server for more data.
User interface is great, and makes it easy to filter and sort by relevant portfolio items. It's just not indexable.
-
What's the reason it was implemented this way in the first place? Is the data being exported from another system in a particular way?
What's the site running on - is there a CMS platform?
Is it javascript because it's doing some funky ajax driven "experience" or are they just using javascript and the xml file to enable you to filter/sort based on different facets?
Final silly question - how's the visitor expected to interact with them?
-
Try creating an XML sitemap with all the entries, then spin that into an HTML sitemap and a portfolio page listing the entries by type. It's a bit of work, but it will probably work best.
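For reference, an XML sitemap is just a flat list of the portfolio URLs in the standard sitemaps.org format (the domain and date below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/portfolio/project-name.html</loc>
    <lastmod>2011-05-01</lastmod>
  </url>
  <!-- ...one <url> entry per portfolio page... -->
</urlset>
```

Since the portfolio data already lives in projects.xml, generating this (and the HTML version) from that file should be straightforward.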
-
Thanks Doug,
I forgot to mention it above, but I am definitely mentioning other workaround methods of getting the content indexed, specifically:
- XML Sitemap
- Cross-linking - there are plenty of other opportunities to link throughout the site that haven't been taken yet - so that's high on the list.
- Off-site deep link opportunities are also large and will be addressed.
- The projects aren't totally linear, so we can't use next/previous in this example, but that's a good idea as well.
Those aside, there is a fundamental issue with the way the data is working now and I want to address the ideal solution, since it's within the client's budget to have that content redesigned properly.
-
While this doesn't directly answer the question: could you generate an XML sitemap (I take it the portfolio data is being generated from something?) to help Google find and index the pages?
Is there any cross linking between the individual portfolio pages or at least a next/previous?
(My first thought would have been the php route.)