SEO-Friendly Method to Load XML Content onto Page
-
I have a client who has about 100 portfolio entries, each with its own HTML page.
Those pages aren't getting indexed because of the way the main portfolio menu page works: it uses JavaScript to load the list of portfolio entries, along with metadata about each entry, from an XML file. Because the content is injected with JavaScript, crawlers see nothing on the portfolio menu page.
Here's a sample of the JavaScript used (one line out of many):
    // load project xml
    try {
        var req = new Request({
            method: 'get',
            url: '/data/projects.xml',
            // ... (snippet continues)
Normally I'd have them just add entries to the portfolio menu page manually, but part of the metadata being loaded is a set of project characteristics used to filter which portfolio entries are shown on the page, such as client type (government, education, industrial, residential, etc.) and project type (based on the service that was provided). It's similar to the filtering you'd see on an e-commerce site. This has to stay, so the page needs to remain dynamic.
I'm trying to summarize the alternative methods they could use to load that content onto the page instead of JavaScript (I assume server-side solutions are the only ones I'd want, unless there's another option I'm unaware of). I'm aware that PHP could probably parse the XML file and render all of their portfolio entries on the server side. I'd like to get some recommendations on other possible solutions. Please feel free to ask any clarifying questions.
Thanks!
-
In response to my own question, I received some other good suggestions for this issue via Twitter:
- @jasonmulligan suggested XSLT (there's a rough sketch of that approach below this list).
- @KevinMSpence suggested "...easiest solution would be to use simplexml --it's a PHP parser for lightweight XML" & "Just keep in mind that simplexml loads the doc into memory, so there can be performance issues with large docs."
- Someone suggested creating a feed from the XML, but I don't think that adds much benefit beyond another step, since you'd still need a way to pull that content onto the page.
- There were also a few suggestions for ways to convert the XML to another format on the page, like JSON, but those were really outside the scope of what we were looking to do.
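For anyone curious about the XSLT route, here's a bare-bones sketch. It assumes PHP's xsl extension is available, and portfolio.xsl is a hypothetical stylesheet that turns the project entries into HTML links:

    <?php
    // Sketch of the XSLT suggestion: transform projects.xml into HTML on the server.
    // Requires PHP's xsl extension; portfolio.xsl is a hypothetical stylesheet name.
    $xml = new DOMDocument();
    $xml->load($_SERVER['DOCUMENT_ROOT'] . '/data/projects.xml');

    $xsl = new DOMDocument();
    $xsl->load($_SERVER['DOCUMENT_ROOT'] . '/portfolio.xsl');

    $proc = new XSLTProcessor();
    $proc->importStylesheet($xsl);

    // The output is plain HTML, so crawlers see the portfolio list with no JavaScript.
    echo $proc->transformToXml($xml);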
Final recommendation to the client was to just add text links manually beneath all of the JavaScript content, since they were only adding a few portfolio entries per year and it would look fine in the theme. A hack, perhaps, but much faster and more cost-effective. Otherwise, I would have recommended they go with PHP plus the simplexml suggestion from above.
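To make that concrete, here's a minimal sketch of what the simplexml approach could look like. The element names (project, url, title) and the clientType attribute are assumptions on my part, since the actual schema of /data/projects.xml isn't shown here:

    <?php
    // Minimal sketch: parse projects.xml on the server and emit crawlable HTML.
    // Element and attribute names are assumptions -- adjust them to the real schema.
    // Per the caveat above, simplexml loads the whole document into memory.
    $projects = simplexml_load_file($_SERVER['DOCUMENT_ROOT'] . '/data/projects.xml');

    if ($projects === false) {
        exit('Could not load portfolio data.');
    }

    echo "<ul class=\"portfolio\">\n";
    foreach ($projects->project as $project) {
        // data-* attributes let the existing client-side filtering keep working
        printf(
            "  <li data-client-type=\"%s\"><a href=\"%s\">%s</a></li>\n",
            htmlspecialchars((string) $project['clientType']),
            htmlspecialchars((string) $project->url),
            htmlspecialchars((string) $project->title)
        );
    }
    echo "</ul>\n";

Since the links are in the initial HTML, crawlers can see them, and the JavaScript filtering can hang off the data-* attributes instead of fetching the XML itself.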
-
I think you need to find a developer who understands progressive enhancement, so that the page degrades gracefully. You'll need to deliver the page using something server-side (PHP?) and then add the bells and whistles on top.
I'm guessing the budget won't cover moving the entire site/content onto a database/CMS platform.
How does the page look in Google Webmaster Tools (Labs > Instant Preview)? It might give you a nice visual way to explain the problem to the client.
-
The site was done a year or two ago by a branding agency. To their credit, they produced clean, reasonably well-documented code, and they do excellent design work. However, they relied too heavily on Flash and JavaScript to load content throughout the site, and the site has suffered as a result.
The site is entirely HTML, CSS, & JavaScript, and it uses Dreamweaver template files to produce the portfolio entry pages, which then propagate into the XML files, which in turn get loaded by the rest of the site.
I wouldn't call it AJAX - I think it loads the entire XML file up front and then uses the filters to decide which content to display, so there are no subsequent calls to the server for more data.
The user interface is great and makes it easy to filter and sort the relevant portfolio items. It's just not indexable.
-
What's the reason it was implemented this way in the first place? Is the data being exported from another system in a particular way?
What's the site running on - is there a CMS platform?
Is it JavaScript because it's doing some funky Ajax-driven "experience", or are they just using JavaScript and the XML file to let visitors filter/sort by different facets?
Final silly question - how is the visitor expected to interact with the portfolio entries?
-
Try creating an XML sitemap with all the entries, spin that into an HTML sitemap version, and also build a portfolio page that lists entries by type. It's a bit of work, but it will probably work best.
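Something along these lines would do it for the sitemap half; I'm assuming the entries live in the projects.xml file mentioned in the question and that each project has a url element (adjust to the real schema and domain):

    <?php
    // Rough sketch: spin the existing projects.xml into an XML sitemap.
    // The project/url structure and the domain are assumptions -- adjust as needed.
    $projects = simplexml_load_file($_SERVER['DOCUMENT_ROOT'] . '/data/projects.xml');

    header('Content-Type: application/xml; charset=utf-8');
    echo "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n";
    echo "<urlset xmlns=\"http://www.sitemaps.org/schemas/sitemap/0.9\">\n";
    foreach ($projects->project as $project) {
        printf(
            "  <url><loc>%s</loc></url>\n",
            htmlspecialchars('http://www.example.com' . (string) $project->url)
        );
    }
    echo "</urlset>\n";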
-
Thanks, Doug,
I forgot to mention it above, but I am definitely including other workaround methods of getting the content indexed, specifically:
- XML Sitemap
- Cross-linking - there are plenty of other opportunities to link throughout the site that haven't been taken yet, so that's high on the list.
- Off-site deep-link opportunities are also significant and will be addressed.
- The projects aren't totally linear, so we can't use next/previous links in this case, but that's a good idea as well.
Those aside, there is a fundamental issue with the way the data works now, and I want to address the ideal solution, since it's within the client's budget to have that content redesigned properly.
-
While helpfully not answering the question: could you generate an XML sitemap (I take it the portfolio data is being generated from something?) to help Google find and index the pages?
Is there any cross-linking between the individual portfolio pages, or at least next/previous links?
(My first thought would have been the PHP route.)