Category Archives: Sakai

All the Connecting Dots

We’re planning an upgrade of our LMS. Consider these “dots” and their connections:

  • The old gradebook is out. No further development. When do we take it away from our instructors? How many times do we notify them first?
  • We’re in the middle of changing video on demand (VOD) providers. The old one was to have remained available until Jan. 2019, but it does not provide a standard LTI integration and therefore isn’t ready for the upgraded LMS version.
  • Also, there is no straight migration of content between the old VOD and the new one. Our instructors want to take their content with them. There are 576 sites with media that needs to be migrated. This needs to be decoupled from any and all LMS upgrades. I wish.
  • We’ve been running this LMS since 2011 with that aforementioned gradebook. There is no archival method for old sites in this LMS, nor in any other LMS I’m aware of. Nor should anyone ever expect that an LMS is a system of record. But instructors do. So when we upgrade to the new version without the old gradebook, none of their old sites will have grades. Just a blank screen. Can we live with that?
  • There’s one best time frame for an LMS upgrade, even one that mostly streamlines performance and usability without adding new features or seriously changing workflows. That time is Commencement weekend, a mere 5 and a half weeks from now.
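The 576-site media migration above is, at bottom, an inventory-and-worklist problem. As a rough sketch (the site records, the service label “K”, and the `media_host` field are all hypothetical placeholders, not our actual tooling):

```python
# Hypothetical sketch: build a migration worklist of course sites whose
# embedded media still lives on the old VOD service ("K") and must move
# to the new one ("P"). The site records below are stand-ins, not real data.

def build_worklist(sites):
    """Return sorted IDs of sites that still have media on the old service."""
    return sorted(s["id"] for s in sites if s.get("media_host") == "K")

def batch(worklist, size):
    """Chunk the worklist so migrations can be scheduled in waves."""
    return [worklist[i:i + size] for i in range(0, len(worklist), size)]

if __name__ == "__main__":
    sites = [
        {"id": "FA18-101", "media_host": "K"},
        {"id": "FA18-102", "media_host": "P"},   # already migrated
        {"id": "SP17-200", "media_host": "K"},
        {"id": "SU16-300", "media_host": None},  # no media at all
    ]
    todo = build_worklist(sites)
    print(todo)            # sites still on "K"
    print(batch(todo, 2))  # migration waves
```

Batching matters here: 576 sites is too many to migrate in one go, but small enough to clear in scheduled waves before the old service is retired.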


Instead of focusing on the technical details, all the splotches that need to be corralled, tested, organized, cleaned up and formed into a pleasing whole… let’s see if it helps to imagine this from our constituents’ point of view. What do they need to know and when?

  • Their current gradebook is going away – even in old course sites they haven’t seen in a long time. Now would be a good time to check their records and export gradebooks from long ago, since instructors are responsible for those records, not the LMS, nor those who run it.
    • From Fall 2018 course sites on, the old gradebook will no longer be available.
    • From that same time, old course sites also won’t have grades records.
    • We’re working on a copy of all those sites running on the old server software version, so that we can take requests from anyone who missed these messages and get them spreadsheet exports of those old gradebooks. But that old server software won’t stay up forever. We’re thinking maybe another year?
  • They’re going to like the new gradebook. And we’ll help. In fact, we’ve been offering it as an alternative since last May, so even if it wasn’t in use by any given instructor, LMS support staff already knows a good deal about it and the kinds of questions people have.
  • The upgrade to the LMS? Not really of consequence this time. People will care more about their gradebooks and their embedded video service.
  • How fast can we migrate their content from the old (“K”) to the new (“P”) video on demand/playback service? Because of course we can’t take away K without delivering on an alternative!
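The spreadsheet exports of old gradebooks mentioned above boil down to flattening grade records into CSV. A minimal sketch, assuming a simple in-memory representation (the dict layout and field names here are illustrative only, not Sakai’s actual gradebook schema):

```python
import csv
import io

# Hypothetical sketch: flatten an old gradebook into spreadsheet-friendly
# CSV text. The {student: {assignment: score}} layout is illustrative,
# not Sakai's real schema.

def gradebook_to_csv(gradebook):
    """Render {student: {assignment: score}} as CSV, one row per student."""
    assignments = sorted({a for scores in gradebook.values() for a in scores})
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow(["Student"] + assignments)
    for student in sorted(gradebook):
        scores = gradebook[student]
        # Missing assignments become blank cells rather than zeros.
        writer.writerow([student] + [scores.get(a, "") for a in assignments])
    return out.getvalue()

if __name__ == "__main__":
    gb = {
        "jdoe": {"Essay 1": 88, "Quiz 1": 95},
        "asmith": {"Essay 1": 92},  # never took Quiz 1 -> blank cell
    }
    print(gradebook_to_csv(gb))
```

The point of the blank-cell choice is archival fidelity: a request fulfilled a year from now should show what the gradebook recorded, not a guessed zero.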

Sometime in August I hope to deliver on a cohesive orchestrated totality.


What I Didn’t Know About Using Community Source Software (Sakai) in Higher Ed

“How to set expectations for change management” – that’s it. That’s what I didn’t know.

Consider the kinds of changes you habitually take from a vendor of proprietary software that you maintain in your own higher ed Data Center.

The hypothetical change is:

  • New cool features advertised to you, or …
  • Bug fixes (you’d found them or you hadn’t, in other words, you cared deeply or not at all), or …
  • Enhancements.

Qualities of the change:

  • The change has already been deployed and tested many times over by the software vendor in environments, with data, very similar to yours.
  • You schedule when you want to accept/install the change and make it available to your users, usually based on your academic schedule.

What your users expect-

  • They won’t get it before the regularly scheduled break, even if they knew about it.
  • Didn’t know about it anyway because the vendor doesn’t market to them but to the people who manage the system for them.
  • The vendor is an uncaring large collection of cogs anyway, so no point in asking for an enhancement.

What your users do-

  • Blog, tweet and facebook their complaints vociferously but without expecting more than a good venting session.
  • Write you emails about how your vendor doesn’t care.
  • Log in after upgrades and harrumph that the thing that used to annoy them so greatly is finally fixed.

 

Contrast this with open/community source:

What your users expect-

  • They will be heard if they connect with the community.
  • You are connecting with the community on their behalf, and that will carry weight because the community is small.
  • The developers working on their behalf will automatically do a better job than the vendor because they work for higher ed institutions.

What your users do-

  • Demand bug fixes and enhancements.
  • Expect them to be applied frequently.
  • Suggest enhancements and expect them to be executed in amazingly beautiful ways.

It’s a media, media, media world.

If you’re doing academic research, you can now cite a Tweet.

From the MLA:

[Image: MLA’s example of how to cite a tweet]

If you do project management, make it visual. In my workplace we’re seeing these “SCRUM boards” on every available wall. Some even include “buns in the oven” (the photo of the ultrasound is an example of media embedding):

[Image: a SCRUM board wall, ultrasound photo included]

If you want to make a point, use an Infographic (fancy name for a collage that’s informative, right?):

[Image: infographic profiling a Twitter user]

Marketers always use media; your technology project might want to use it to help spin the change:

[Image: “Sakai – A Moving Story”]

Stalking Sakai

I’m new to the open source model. To supporting it. To participating in the community. To seeing how it’s built and how features are added. But I’ve been watching for nigh unto 15 years. And I’m here to tell you: higher ed is generally bullish on software derived from this open source model.

It’s almost as if open source were the answer to all the budgetary, visionary, and advocacy issues we all face. From Community Jr. College to State School to Private – we’ve summoned open source to give us more freedom, more features, more revenue, more integration points, more responsiveness to our constituencies, and more control of our destinies.

Software derived from and supported by the open source model is under investigation by more and more institutions of higher ed. Cautiously in some cases, but under investigation nevertheless.

Sakai began around 2004 as a collaboration among the University of Michigan, Indiana University, MIT and Stanford. By 2005, Foundation staff on the Sakai CLE numbered 5 people, with salaries based mostly on contributions from higher ed IT.

Institutions joined up. Commercial affiliates formed. Synergies developed. The coalition worked diligently. Advocated. Listened. Built. Deployed. Software developed by higher ed for higher ed and ‘owned’ by all.

Very cool.

Except when too many institutions want to take and not give back.

That was the message I was shocked to internalize last week when one of the chief Sakai advocates and architects of these past 8 years, Dr. Chuck Severance, defended his decision to take employment from – Blackboard. He took a position at Blackboard that furthers his goals (shared by the Sakai community) of making learning technologies interoperable. Below, website by website, is a visual of his considerable breadth of reach: from development acknowledgements at Moodlerooms and Blackboard’s Edugarage, to standards work at IMS Global and thought leadership published by Gilfus, Delta Initiatives, Campus Technology, edu1World and Inside Higher Ed. (As well as a frequently referenced though ‘unpublished’ work…!)

[Image: Dr. Chuck Severance’s web presence, site by site]

But today, according to Dr. Chuck, as of about 4 months ago, the Sakai Foundation staff actively working on the Sakai CLE (now at version 2.9) numbers zero. Instead, the only remaining dedicated release management resources moving the release forward come from commercial affiliates, NOT higher ed.

In Dr. Chuck’s call-to-action posting last week, he asks, “Does it bother you that about 40 higher educations stopped supporting the Sakai Foundation over the past five years?” We remember the past five years: in budgetary terms, everyone ran for the hills and dug in where we could. The difficulty is that if higher ed doesn’t sustain this effort, who will?

He goes on to ask, “Are you uncomfortable that for-profit companies already provide all of the long-term committed resources for the Sakai CLE product?”

I am. I am very uncomfortable. Are you?

Blackboard, SunGard and rSmart : A Client’s Take on Support, Oh My.

“Support” in the context of this posting is that service for which a client contracts that includes a way to report issues to a vendor such that the vendor is expected to respond with a resolution. That’s just basic, right?

Unfortunately, with technology vendors, the evaluation of which vendor to choose for one’s initial purchase is so complex that it cannot and does not include an evaluation of the structures and processes the vendor has in place to provide that support. This leads to wide disparity among technology vendors in how they resource support and what they think support does for their bottom line.

I know these three companies fairly well from personal experience. Here’s this year’s report card from me:

  • Blackboard    A
  • rSmart             B+
  • SunGard         D

In every single encounter, the following is what the support structure must deliver. Here are my ratings for these three vendors on a 1-5 scale; 1 being exemplary and 5 indicating you’d be looking for another vendor if it were your cellphone service:

Parameter               Bb   rSmart   SunGard
Timeliness               1        1         5
Expectation Management   1        4         4
Trustworthiness          1        1         3
Accuracy                 2        3         4
Knowledge                2        3         5
Meaningfulness           2        3         4
Totals                   9       15        25
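Since 1 is exemplary and 5 is worst, the totals are simple column sums, and lower wins. A quick sanity check of the arithmetic, with the scores copied from the table above:

```python
# Sanity-check the report-card totals (1 = exemplary, 5 = worst; lower wins).
# Scores are in the order: Timeliness, Expectation Management,
# Trustworthiness, Accuracy, Knowledge, Meaningfulness.
ratings = {
    "Bb":      [1, 1, 1, 2, 2, 2],
    "rSmart":  [1, 4, 1, 3, 3, 3],
    "SunGard": [5, 4, 3, 4, 5, 4],
}
totals = {vendor: sum(scores) for vendor, scores in ratings.items()}
print(totals)  # {'Bb': 9, 'rSmart': 15, 'SunGard': 25}
```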

So vendors, I’ll say it straight out: your clients expect to be able to contact you quickly, receive immediate word that you’re working on their issue, and have a meaningful resolution in a timeframe commensurate with their urgency.

We want what you tell us to be truthful and timely, and we don’t want promises that guesstimate to a greater degree of accuracy than you’re able to deliver. If you say the service pack that includes my issue’s resolution will be available “soon,” tell us whether that means 3rd quarter or next year, and get it to us by then. If you say “April,” you’ll lose points for Accuracy, Trustworthiness and Meaningfulness if you then deliver in May. And God forbid that what you deliver should end up being nothing like what you told us it would be!

“Meaningful” – let’s talk about that specifically. No one wants repeated “We’re working on it” messages. Even if you personalize those messages, they aren’t meaningful.

“Knowledge” – SunGard has a lot of knowledge, but I can’t locate it myself in their huge knowledgebase site – so what good is that?

“Expectation Management” – I’m not sure I have any help for you vendors on this one. On the one hand, I appreciate multiple ways of communicating with you. On the other hand, if none of the people who say they’re going to escalate my problem actually do, or if they attempt it but don’t understand my problem well enough to communicate it, I’m still looking for someone who can. I should also remind you that saying “no” is sometimes the right answer. But if you say that to a new client, there could be a deeper issue: the fact that they asked the question at all probably means you didn’t manage their expectations well when you sold them whatever they bought from you in the first place.

When I started this post I thought I had a bias toward smaller companies. It’s true that larger companies with complex products often have unwieldy support structures that make self-service all but impossible. However, larger companies that keep it personal and pay attention to quality follow-up can still effectively meet my needs.

At least when they’re not in the middle of a merger.

Making this easy: LMS Evaluations

My brain just did a flashback as my fingers poised over the keyboard ready to begin this post. The song, “War is a Science,” from Pippin has started to syncopate through my skull:

the rule that every gen-er-al
kno-ws by he-art:
it’s smarter to be lucky
than it’s lucky to be smart!

 

I value ‘smart’ (O how I value smart!) but in my experience we overthink LMS evaluations.

It’s not about a Request for Proposal process. It’s not about a comparison of features. It’s not about the best software package out there. It’s not even primarily a decision of open source vs proprietary, although this exercise may help you characterize your institution’s culture as one or the other and that will get you started down the right track …

LMS Evaluations are like any other decision you have to make for your institution. It’s about trusting that the software you choose matches the way your institution does things.

It’s a cultural decision. Wasn’t always. But these days the market is mature enough that all these packages (Canvas, Sakai, Blackboard, Desire2Learn, Moodle) can do pretty much the same thing. It’s the way they do them that you care about. It’s the way your institution plans to use and support the software that will make the implementation project a smashing success or an unadopted disaster.

This is what you need to plan for and do:

1). Select the group of people at your institution that you trust to make this decision for you. Is it an already existing faculty committee? Maybe it’s composed of appointees chosen to represent each college or department, with central IT or the Library thrown in because they have to run it?

2). Create and follow whatever rigor or metrics these people will need to document and communicate their decision for maximum buy-in.

That’s it.

No kidding. Institutions often publish their final LMS Eval reports. Read them. The variety they represent is as wide as the cultures of the institutions that created them. They’re not all smart. But the successes are the lucky ones who chose software that matches their institutions’ culture.

Pros and Cons of hosting Sakai outside of your institution

The University of Notre Dame is moving from a proprietary LMS we host in our own Data Center to an open source system, Sakai, hosted with rSmart. Two big changes we’re lumping together. Ask yourself …

What advantages do you expect to gain when switching from a proprietary system to an open source system?

Does outsourcing the system’s management undercut those advantages?

What we’ve found so far (6 months):

Trade-offs:

  • Fixes/enhancements: Some can still be deployed faster than with a proprietary system, but not as fast as we expected. rSmart, or any other provider, will still have tested version combinations and will be reluctant to share with you the risk of deploying a tool version in a less-tested Sakai version.
  • Staffing: You can redeploy your app admin to direct faculty support and do away with the sys admin, DBA, etc. You didn’t have developers before, and by hosting, any development (even customizations) you want to contribute now will be problematic unless you still build an in-house development/test instance.
  • TCO: You may find the costs of licensing and hosting yourself versus paying no license and hosting elsewhere to be very similar. You are rearranging your human resources, though, which could bring advantages to your faculty despite the similar cost of ownership.
  • SIS integration: Always more difficult when your ‘home data’ has to be shared with someone off-site. Particularly bad at the moment, as the industry transitions from former methods of SIS-to-LMS integration to the new LIS 2.0 standard.
  • User Acceptance Testing: This part of fixes/enhancements involves back-and-forth communication and management of Help Desk ticketing between you and your host vendor. You depend on the ability to use a test or second instance with your live data, but the synchronization between live and test is no longer handled by you, but by your vendor.