SEO-Friendly Method to Load XML Content onto Page
-
I have a client who has about 100 portfolio entries, each with its own HTML page.
Those pages aren't getting indexed because of the way the main portfolio menu page works: it uses JavaScript to load the list of portfolio entries from an XML file, along with metadata about each entry. Because it uses JavaScript, crawlers aren't seeing anything on the portfolio menu page.
Here's a sample of the JavaScript used (one line of many):
// load project xml
try {
    var req = new Request({
        method: 'get',
        url: '/data/projects.xml',
Normally I'd have them just manually add entries to the portfolio menu page, but part of the metadata being loaded consists of project characteristics used to filter which portfolio entries are shown on the page, such as client type (government, education, industrial, residential, etc.) and project type (depending on the type of service provided). It's similar to the filtering you'd see on an e-commerce site. This has to stay, so the page needs to remain dynamic.
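For illustration, the faceted filtering described above might look something like this in plain JavaScript. The entry data and field names here are made up for the sketch, not taken from the client's actual XML:

```javascript
// Hypothetical portfolio entries, mirroring the metadata in projects.xml
var projects = [
  { title: 'City Hall Renovation', clientType: 'government',  projectType: 'renovation' },
  { title: 'Lakeside Residence',   clientType: 'residential', projectType: 'new-construction' },
  { title: 'University Lab Wing',  clientType: 'education',   projectType: 'addition' }
];

// Return only the entries matching every selected facet
function filterProjects(entries, facets) {
  return entries.filter(function (entry) {
    return Object.keys(facets).every(function (key) {
      return entry[key] === facets[key];
    });
  });
}

// e.g. show only government work
var govOnly = filterProjects(projects, { clientType: 'government' });
```

The point is that the filter logic itself is trivial and can run against server-rendered markup just as easily as against XML loaded over the wire.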
I'm trying to summarize the alternate methods they could use to load that content onto the page instead of JavaScript (I assume server-side solutions are the only ones I'd want, unless there's another option I'm unaware of). I know PHP could probably load all of the portfolio entries from the XML file on the server side, but I'd like recommendations on other possible solutions. Please feel free to ask any clarifying questions.
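As a rough sketch of the server-side idea (shown here in Node-style JavaScript rather than PHP, with an invented XML structure since the real projects.xml isn't shown): read the XML once on the server and emit plain HTML links, so crawlers see real markup with no script required.

```javascript
// Hypothetical projects.xml content; the real file's structure is unknown
var xml =
  '<projects>' +
  '<project url="/portfolio/city-hall.html" client="government">City Hall Renovation</project>' +
  '<project url="/portfolio/lakeside.html" client="residential">Lakeside Residence</project>' +
  '</projects>';

// Crude regex extraction for the sketch only; a real implementation
// would use a proper XML parser (e.g. simplexml in PHP)
function renderPortfolioLinks(xmlText) {
  var re = /<project url="([^"]+)" client="([^"]+)">([^<]+)<\/project>/g;
  var html = '<ul class="portfolio">\n';
  var m;
  while ((m = re.exec(xmlText)) !== null) {
    html += '  <li data-client="' + m[2] + '"><a href="' + m[1] + '">' + m[3] + '</a></li>\n';
  }
  return html + '</ul>';
}
```

Because the links land in the initial HTML, crawlers can index them, and the existing filter JavaScript can still read the same data attributes to show and hide entries client-side.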
Thanks!
-
As a response to my own question, I received some other good suggestions to this issue via Twitter:
- @jasonmulligan suggested XSLT
- @KevinMSpence suggested "...easiest solution would be to use simplexml -- it's a PHP parser for lightweight XML" and "Just keep in mind that simplexml loads the doc into memory, so there can be performance issues with large docs."
- Someone suggested creating a feed from the XML, but I don't think that adds much benefit beyond another step, since you'd still need a way to pull that content onto the page.
- There were also a few suggestions for converting the XML to another format like JSON on the page, but those were really outside the scope of what we were looking to do.
Final recommendation to the client was to just add text links manually beneath all of the JavaScript content, since they were only adding a few portfolio entries per year, and it would look good in the theme. A hack, perhaps, but much faster and more cost-effective. Otherwise, I would have recommended they go with PHP plus the simplexml suggestion from above.
-
I think you need to find a developer who understands progressive enhancement, so that the page degrades gracefully. You'll need to deliver the page using something server-side (PHP?) and then layer the bells and whistles on top.
I'm guessing the budget won't cover moving the entire site/content onto a database/cms platform.
How does the page look in Google Webmaster Tools (Labs > Instant Preview)? That might give you a nice visual way to explain the problem to the client.
-
Site was done a year or two ago by a branding agency. To their credit, they produced clean and reasonably well-documented code, and they do excellent design work. However, they relied too heavily on Flash and JavaScript to load content throughout the site, and the site has suffered as a result.
Site is entirely HTML, CSS, and JavaScript, and uses Dreamweaver template files to produce the portfolio entry pages, which then propagate into the XML files, which are then loaded by the rest of the site.
I wouldn't call it AJAX - I believe it loads the entire XML file up front and then uses the filters to display the appropriate content, so there are no subsequent calls to the server for more data.
User interface is great, and makes it easy to filter and sort by relevant portfolio items. It's just not indexable.
-
What's the reason it was implemented this way in the first place? Is the data being exported from another system in a particular way?
What's the site running on - is there a CMS platform?
Is it JavaScript because it's doing some funky AJAX-driven "experience", or are they just using JavaScript and the XML file to let visitors filter/sort by different facets?
Final silly question - how's the visitor expected to interact with them?
-
Try creating an XML sitemap with all the entries, spin that into an HTML sitemap version and also a portfolio page with a list of entries by type. It's a bit of work, but will probably work best.
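Generating that sitemap from the existing entry list is mechanical. A minimal sketch (URLs here are hypothetical placeholders):

```javascript
// Build a minimal XML sitemap from a list of portfolio entry URLs
function buildSitemap(urls) {
  var entries = urls.map(function (u) {
    return '  <url><loc>' + u + '</loc></url>';
  }).join('\n');
  return '<?xml version="1.0" encoding="UTF-8"?>\n' +
         '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n' +
         entries + '\n' +
         '</urlset>';
}

var sitemap = buildSitemap([
  'http://example.com/portfolio/city-hall.html',
  'http://example.com/portfolio/lakeside.html'
]);
```

The same list of URLs can feed the HTML sitemap page, so both stay in sync with the XML data.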
-
Thanks Doug,
I forgot to mention it above, but I am definitely recommending other workaround methods of getting the content indexed, specifically:
- XML Sitemap
- Cross-linking - there are plenty of other opportunities to link throughout the site that haven't been used yet, so that's high on the list.
- There are also plenty of off-site deep-linking opportunities, which will be addressed.
- The projects aren't totally linear, so we can't use next/previous in this example, but that's a good idea as well.
Those aside, there is a fundamental issue with the way the data is working now and I want to address the ideal solution, since it's within the client's budget to have that content redesigned properly.
-
While this helpfully doesn't answer the question: could you generate an XML sitemap (I take it the portfolio data is being generated from something?) to help Google find and index the pages?
Is there any cross linking between the individual portfolio pages or at least a next/previous?
(My first thought would have been the php route.)