Category Archives: xapi

Using xAPI to support blended simulation

OLab and OpenLabyrinth have always been good at providing the contextual glue that holds together various simulation modalities. Here are some examples of projects where OpenLabyrinth has supported blended simulation activities:

  • Virtual Spinal Tap – uses haptic simulation to model the feeling of needle insertion
  • Rushing Roulette – timed tasks with a $30 Arduino lie detector!
  • Crevasse Rescue – multiple teams & disciplines, with high & low fidelity simulators
  • R. Ed Nekke – bookending around Laerdal mannequin scenario

But now, with xAPI providing the background data links to a Learning Record Store, it is much easier to do this across a wider range of tools and platforms. Some of the above-mentioned projects relied on a very sophisticated array of high-speed networks, at considerable cost.

Doing this now with xAPI is proving to be much more flexible, scalable and cost-effective. To support haptic projects like Virtual Spinal Tap, we are now working with the Medbiq Learning Experience Working Group on an xAPI Haptics Profile. Check it out and give us feedback.
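
For a flavour of what that looks like in practice, here is a minimal sketch of a haptic event being wrapped as an xAPI statement and posted to an LRS. The endpoint, credentials, verb and extension IRIs are all placeholders of our own invention – settling on a shared vocabulary for exactly this kind of statement is what the Haptics Profile work is about.

```python
import requests
from datetime import datetime, timezone

# Placeholder LRS endpoint and credentials - substitute your own.
LRS_ENDPOINT = "https://lrs.example.org/xapi"
LRS_AUTH = ("key", "secret")

statement = {
    "actor": {"mbox": "mailto:learner@example.org", "name": "Test Learner"},
    # Hypothetical verb and extension IRIs; the real vocabulary is what the
    # Medbiq Haptics Profile is working to standardize.
    "verb": {"id": "https://example.org/verbs/inserted-needle",
             "display": {"en-US": "inserted needle"}},
    "object": {"id": "https://example.org/activities/virtual-spinal-tap",
               "definition": {"name": {"en-US": "Virtual Spinal Tap"}}},
    "result": {"extensions": {
        "https://example.org/extensions/insertion-force-newtons": 2.7}},
    "timestamp": datetime.now(timezone.utc).isoformat(),
}

resp = requests.post(f"{LRS_ENDPOINT}/statements", json=statement,
                     auth=LRS_AUTH,
                     headers={"X-Experience-API-Version": "1.0.3"})
resp.raise_for_status()  # on success the LRS returns the new statement id(s)
```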

OpenLabyrinth at the OHMES Symposium

Coming up this week on Wed 22nd Feb and Thu 23rd, we are hosting the annual symposium for

OHMES: Office of Health & Medical Education Scholarship

at the Cumming School of Medicine. We have some great keynote speakers, including Lorelei Lingard, Kevin Eva and Stella Ng. For full details on the program, check out

We have lots of interest this year – we hope you have registered already.

One of the things that we will be demonstrating at this year’s symposium is the continuing work we are doing with our Rushing Roulette stress tests.

Check out this page for more info on how we are combining multiple activity streams, using xAPI and a Learning Record Store (LRS), OpenLabyrinth, and a cheap $30 Arduino board.

You can also use this shortcode link to reach that same page:

OpenLabyrinth stress testing at CHES scholarship day

On Wed, 5th October, the Centre for Health Education Scholarship (CHES) at UBC held its annual scholarship symposium, in Vancouver.

There were many interesting sessions, including a stirring keynote address from Rachel Ellaway (Professor, Education Research, University of Calgary).

OpenLabyrinth featured in a few presentations at the CHES symposium, including a short presentation on Activity Metrics by David Topps and Corey Albersworth.

In one of the afternoon demonstration sessions, we were able to show our Arduino stress-detector kit in action to conference participants. Here we have a short video of the Arduino sensors being calibrated.

This was the same basic setup as that first shown at the Medbiq Conference in Baltimore earlier this year. However, for this conference, no expense was spared: we splurged $29.99 on a second Arduino device. Yes, it nearly broke the budget!

We also managed to set up the software on both Windows 10 and OS X Yosemite, which highlights the platform independence of the Eclipse IDE that we used for collecting the Arduino data and sending it to the LRS.

Here we have a short video of the OpenLabyrinth stress test in action. Our participant is playing a rapid-fire series of case vignettes on the Mac, while the Arduino sensors connected to the Windows machine are recording real-time data on her heart rate and Galvanic Skin Response.

We initially created this project as a simple technical demonstration that one could cheaply and easily combine Arduino hardware, OpenLabyrinth, and xAPI statement collection into the GrassBlade Learning Record Store. We had only intended to show that such collection from multiple activity streams was feasible with the time and resources available to the average education researcher, i.e. not much.
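
For anyone wanting to reproduce the pipeline, it boils down to three steps: read sensor values off the Arduino’s serial port, wrap each reading in an xAPI statement, and POST it to the LRS. Our workshop code ran in the Eclipse IDE, but here is a rough Python sketch of the same idea, assuming the Arduino prints comma-separated "heart_rate,gsr" lines over USB serial; the port name, LRS endpoint, credentials and extension IRIs are all invented for illustration.

```python
import serial            # pyserial
import requests
from datetime import datetime, timezone

# All of these values are assumptions for the sketch.
PORT = "/dev/ttyACM0"                       # Arduino USB serial port
LRS = "https://lrs.example.org/xapi"        # GrassBlade (or any) LRS endpoint
AUTH = ("key", "secret")

def to_statement(heart_rate: float, gsr: float) -> dict:
    """Wrap one sensor reading as an xAPI statement."""
    return {
        "actor": {"mbox": "mailto:participant@example.org"},
        "verb": {"id": "https://example.org/verbs/measured",
                 "display": {"en-US": "measured"}},
        "object": {"id": "https://example.org/activities/stress-sensors"},
        "result": {"extensions": {
            "https://example.org/extensions/heart-rate-bpm": heart_rate,
            "https://example.org/extensions/gsr-microsiemens": gsr}},
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

with serial.Serial(PORT, 9600, timeout=5) as arduino:
    while True:
        line = arduino.readline().decode("ascii", errors="ignore").strip()
        if not line:
            continue                        # skip empty reads / timeouts
        hr, gsr = (float(v) for v in line.split(","))
        requests.post(f"{LRS}/statements", json=to_statement(hr, gsr),
                      auth=AUTH,
                      headers={"X-Experience-API-Version": "1.0.3"})
```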

We were delighted to find that the stress detector was much more sensitive than we anticipated and will be useful in real-world research.

Medbiq xAPI workshop technical report

We just published the interim technical report from our xAPI workshop at the Medbiq annual conference. (We also have an updated report, stored internally here: Medbiq xAPI Workshop Report, which corrects a few minor errors in the original.)

Medbiq xAPI Arduino Sensors

As we mentioned in our earlier posts, we were really pleased by the participation at the workshop. We just heard from Medbiq that it was really well received and the evaluations were very positive.

We created this much more detailed Technical Report so that others who may be interested in exploring what you can do with xAPI and Arduino sensors can follow our processes and the challenges we faced. This will hopefully provide enough detail that other groups can make similar explorations. Please feel free to contact us through this site if you are interested in this area of research and development.

OpenLabyrinth officially recognized as TinCan/xAPI Adopter

More on the xAPI stuff… and perhaps a wee bit of clarification about terminology.

OpenLabyrinth was just admitted to the official group of Tin Can Adopters.

Tin Can API was the original name given by Rustici Software. It is now more properly known as the Experience API, or xAPI, but many still call it Tin Can; the terms are synonymous. Advanced Distributed Learning (ADL) was the group that first commissioned the development of xAPI from Rustici, so I guess they get to name it.

But most importantly, the API will remain open and non-proprietary.

OpenLabyrinth has H5P widgets

A couple of weeks ago, we described how we were using H5P widgets here on our WordPress web site. Well, now we also have them fully integrated into OpenLabyrinth itself.

H5P logo

So, what’s the big deal, I hear you say… Well, it means that we now have a whole new way of interacting with our users. It makes our nodes and pages much richer, with some nicely crafted HTML5 interactive content.

There are many pre-built H5P widgets on their main web site, which you can then easily modify to include your own content. We won’t bore you with descriptions of everything they have because H5P does it better. But the really cool part is that you can download H5P widgets from other web sites and insert them into your own cases and pages.

Given the interest in our recent work on Activity Metrics and xAPI, we are also delighted that H5P widgets provide xAPI tracking. So you can study how your learners interact with your widgets and cases in even greater detail.
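
Because the H5P statements land in the same LRS as everything else, pulling them back out for analysis is just a standard xAPI statements query. As a small sketch – the endpoint, credentials and activity IRI below are invented – something like this would list the recent statements for one widget:

```python
import requests

# Placeholder LRS endpoint/credentials and a hypothetical H5P activity IRI.
LRS = "https://lrs.example.org/xapi"
AUTH = ("key", "secret")

params = {
    "activity": "https://example.org/h5p/quiz-widget-42",
    "related_activities": "true",   # include statements about sub-questions
    "since": "2016-05-01T00:00:00Z",
    "limit": 100,
}
resp = requests.get(f"{LRS}/statements", params=params, auth=AUTH,
                    headers={"X-Experience-API-Version": "1.0.3"})
resp.raise_for_status()

for st in resp.json()["statements"]:
    verb = st["verb"].get("display", {}).get("en-US", st["verb"]["id"])
    print(st["timestamp"], st["actor"].get("name", "anonymous"), verb)
```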

Activity metrics at Medbiquitous conference

It’s certainly conference season around here. The Medbiq Annual Conference is coming up again soon in Baltimore, May 15-17, 2016.

Medbiq logo

Following on from previous years, activity streams and learning analytics will again feature prominently. OpenLabyrinth will be heavily used in a workshop we are holding about the Experience API (xAPI), along with some interesting widgets and gadgets to track/stress your learners.

This will make a nice extension of some of the other work we have recently presented on big data principles applied to educational metrics, at the Ottawa Conference and CCME over the past month.

Come and play – we’ll make you sweat!

OpenLabyrinth’s timings tighten up

We are pleased to announce an interesting new development on our OpenLabyrinth test site. We are experimenting with timestamps that have millisecond accuracy – this opens up the tool to a whole range of new research areas.

For example, you can now start looking at reaction times or which player was first to the buzzer in competitive team scenarios. Lots more fun stuff.

Previously in OpenLabyrinth, all of our participants’ activities when playing a case were recorded in its database, but the timestamps for each activity point were only kept to the nearest second. For most purposes, this is just fine.

But now we are able to track these same activity points much more accurately: the internal database now records timestamps to microsecond precision. For anyone who works with such research, it will be clear that you also have to account for the tiny fraction of a second between an activity and the moment it is stored, including the processing time in between. There are established techniques for accommodating these timing offsets.
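
As a tiny illustrative sketch (the two timestamps and the 120 ms offset below are made up), this is the kind of reaction-time calculation the higher-resolution timestamps make possible:

```python
from datetime import datetime

# Two hypothetical activity points from a session log, stored as ISO 8601
# timestamps with sub-second precision.
shown    = datetime.fromisoformat("2016-03-01T16:05:12.304221+00:00")
answered = datetime.fromisoformat("2016-03-01T16:05:13.118947+00:00")

reaction_ms = (answered - shown).total_seconds() * 1000
print(f"raw reaction time: {reaction_ms:.3f} ms")

# Any systematic delay between the click and the moment the row is written
# (network and server processing) can be estimated and subtracted;
# the 120 ms figure here is purely illustrative.
estimated_offset_ms = 120.0
print(f"corrected reaction time: {reaction_ms - estimated_offset_ms:.3f} ms")
```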

So, if you have an interest in taking advantage of this greater timing accuracy in one of your projects, please contact us.

OpenLabyrinth and BIG data

The Ottawa Conference, an international medical education conference held every two years (and only occasionally in Ottawa: 2002, 2014…), has its main focus on educational assessment.

This year, as we noted in a previous post, there has been a lot of interest in Big Data in medical education. Now, before I am laughed out of the house by real big data scientists, I hasten to add that the amounts of data generated by medical education are still tiny compared to those from genomics, protein folding or the ginormous stuff from the Large Hadron Collider.

But size isn’t everything.

There are various V’s attributed to big data – initially three, though the list keeps growing (and is somewhat controversial, but I won’t get into that digression):

  • Volume
  • Velocity
  • Variety

While our volumes are several orders of magnitude smaller than those of the big players, it is the principles that matter. What we have been finding is that these principles are very useful and usable even when applied to personal learning data. Just before the conference, we posted some test pages about Precision Education. This theme came up over and over again at the Ottawa Conference, with some fascinating insights generated from such data sources.

If you want a nice, easy-to-understand overview of some of the key principles of big data, I suggest (again) that you take a look at Kenneth Cukier’s presentation at TED.