I’m calling this thread of screen captures from the Twitter stream “Training” — valuable comments today related to faculty training and workshops.
My brain just did a flashback as my fingers hovered over the keyboard, poised to begin this post. The song “War is a Science,” from Pippin, has started to syncopate through my skull:
the rule that every gen-er-al
kno-ws by he-art:
it’s smarter to be lucky
than it’s lucky to be smart!
I value ‘smart’ (O how I value smart!) but in my experience we overthink LMS evaluations.
It’s not about a Request for Proposal process. It’s not about a comparison of features. It’s not about the best software package out there. It’s not even primarily a decision of open source vs proprietary, although this exercise may help you characterize your institution’s culture as one or the other and that will get you started down the right track …
LMS Evaluations are like any other decision you have to make for your institution. It’s about trusting that the software you choose matches the way your institution does things.
It’s a cultural decision. It wasn’t always. But these days the market is mature enough that all these packages (Canvas, Sakai, Blackboard, Desire2Learn, Moodle) can do pretty much the same things. It’s the way they do them that you care about. It’s the way your institution plans to use and support the software that will make the implementation project a smashing success or an unadopted disaster.
This is what you need to plan for and do:
1) Select the group of people at your institution that you trust to make this decision for you. Is it an existing faculty committee? Maybe it’s composed of appointees representing each college or department, with central IT or the Library thrown in because they have to run it?
2) Create and follow whatever rigor or metrics these people will need to document and communicate their decision for maximum buy-in.
No kidding. Institutions often publish their final LMS Eval reports. Read them. The variety they represent is as wide as the cultures of the institutions that created them. They’re not all smart. But the successes are the lucky ones who chose software that matches their institutions’ culture.
The dots I’m connecting here:
As there are two kinds of ePortfolios, there are now two classes of systems that help facilitate learning. I’ve blogged before on what we should call the one system. Well, I think the industry has chosen to continue to have both. On the one hand, we have the learning management system (LMS), the child of the course management system (CMS). This type of system is managed by an institution of higher learning to provide controlled interaction for formal learning situations resulting in grades and a diploma. We can continue to call it an LMS. To be clear, there are a number of products which could be configured to meet this goal, or configured to meet the next goal. What we call it depends on whom it’s configured to serve: the student or the institution.
We now begin to see systems, or collections of systems, serving the purpose of fostering and collecting informal and formal learning of students and adults alike. They have these attributes in common:
Student-centric. I can configure how I want to be messaged when a new grade is posted or a new assignment is given. I can set up my own blogs, wikis, and links to other places I learn outside my institution; I can receive messages relevant to my learning interests and formal classes; and I can form my own ad hoc groups.
Account life. While this system may be hosted by a formal institution of higher learning, provision is made for me either to keep access beyond graduation or to take the account with me. Copyrighted course material can still be accessed after the course is over.
Privacy. There is none. Everyone understands that to learn from others, I have to share what I know — or what I don’t know (my questions). By giving up my privacy in a given domain, I can earn my own “cred” as a learner in that field. I need to willingly agree to carry that risk myself.
Shall we call these kinds of systems Collaborative Learning Environments (CLEs)?
The University of Notre Dame is moving from a proprietary LMS we host in our own Data Center to an open source system, Sakai, hosted with rSmart. Two big changes we’re lumping together. Ask yourself …
What advantages do you expect to gain when switching from a proprietary system to an open source system?
Does outsourcing the system’s management undercut those advantages?
What we’ve found so far (6 months):
- Fixes/enhancements: these can still be deployed faster than with a proprietary system, but not as fast as we expected …
  - rSmart, or another provider, will still have tested version combinations and will be reluctant to share with you the risk of deploying a tool version in a less-tested Sakai version.
- Staffing: you can redeploy your app admin to direct faculty support and do away with the sys admin, DBA, etc.
  - You didn’t have developers before, and with hosting, any development (even customizations) you want to contribute will be problematic unless you still maintain an in-house development/test instance.
- TCO: you may find the cost of licensing and hosting yourself versus paying no license but hosting elsewhere to be very similar.
  - You are rearranging your human resources, though, which could bring advantages to your faculty despite the similar total cost of ownership.
- SIS integration: always more difficult when your ‘home data’ has to be shared with someone off-site.
  - This is particularly bad at the moment as the industry transitions from older methods of SIS-LMS integration to the new LIS 2.0 standard.
- Fixes/enhancements also involve User Acceptance Testing, with back-and-forth communication and Help Desk ticketing managed between you and your host vendor.
  - You depend on a test (second) instance populated with your live data, but the synchronization between live and test is no longer handled by you; it’s handled by your vendor.
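To make the SIS-integration point concrete: with a hosted LMS, your ‘home data’ typically leaves campus as some kind of enrollment feed. Here’s a minimal, hypothetical sketch of that idea — export enrollments from the local SIS to a CSV feed that would then be pushed to the vendor. Every name here (the `enrollments` table, its columns, the section IDs) is invented for illustration; a real integration would follow the vendor’s documented feed format or the LIS web services, and the example uses an in-memory SQLite database purely to stand in for the campus SIS.

```python
import csv
import io
import sqlite3

def export_enrollment_feed(conn, out):
    """Write a CSV enrollment feed (one row per user/section pair).

    Table and column names are invented for illustration; a real feed
    would match the hosted vendor's documented format.
    """
    writer = csv.writer(out)
    writer.writerow(["user_id", "section_id", "role", "status"])
    rows = conn.execute(
        "SELECT user_id, section_id, role, status FROM enrollments "
        "ORDER BY user_id"
    )
    count = 0
    for row in rows:
        writer.writerow(row)
        count += 1
    return count

# Stand-in for the campus SIS: an in-memory database with two enrollments.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE enrollments (user_id, section_id, role, status)")
conn.executemany(
    "INSERT INTO enrollments VALUES (?, ?, ?, ?)",
    [("jdoe", "FA11-CSE-101-01", "Student", "active"),
     ("asmith", "FA11-CSE-101-01", "Instructor", "active")],
)

buf = io.StringIO()
n = export_enrollment_feed(conn, buf)
print(n)  # prints 2
# buf.getvalue() is what would be pushed off-site to the vendor --
# the step where 'home data' leaves your campus.
```

The off-site transfer (SFTP, a vendor API, or an LIS service call) is exactly the part that becomes harder to test and troubleshoot once the receiving system isn’t yours.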
Just a few notes, probably of significance only to myself, on the work involved when the technology around a piece of software changes even as that software is being brought to end of life…
Blackboard’s CE and Vista products are scheduled to be de-supported in January 2013. Notre Dame will continue to run Vista as we bring another system alongside it and move our faculty and students over.
- Meanwhile Oracle has de-supported (as of June 2010) Oracle 10g, the database version on which many institutions have been running their Bb Vista databases.
- Meanwhile Oracle has acquired Sun (and Java with it) and issued Update 29 of Java 6 (available for PCs Oct 10th and for Mac 10.6.8 and Mac 10.7.1 shortly thereafter), with which Bb Vista 8 doesn’t play well.
- Meanwhile Oracle 11g is being deployed as a clustered database (RAC), a feature some of this older software wouldn’t have dreamed of.
This just makes keeping the old girl running that much more of an effort.
This week here at Notre Dame we validated our Bb Vista 8 Dev environment to service pack 6 (SP6) on our Oracle 11g RAC database farm. Here were our tests:
- NDCustom copy content tool (uses siapi)
  - Perl, cron, DB link to Banner, permissions, UI display: all looks good.
- Created supersections with our NDCustom job (uses siapi)
  - Same as above. Passed.
- Took a quiz while stopping the database on one node (no failover).
  - System Exception error, but the session remained open and previously saved answers persisted. When the database ‘returned,’ saves continued. The logs showed repeated messages as the app tried to reconnect to the database. Passed.
- Took a quiz while gracefully failing over database nodes.
  - No System Exception error. The session remained open and everything saved correctly. The only indications the DB node had failed over were netstat -a showing connections close on one node and open on another, plus one unpinned-connection error in the logs. RAC works!
- JMS real-time messaging server failover
  - Still fails, with the same behavior as always. We recommend the WebLogic setting that leaves Target set to a single non-migratable node.
- Background jobs: Garbage Collection
  - Deleted hundreds of courses, then checked timing and completion. No essential change in performance: GC completed, taking over an hour. The nightly job on our live 10g system averages 2 hours; we anticipate the same on 11g.
- Background jobs: Content Index Search
  - No essential change between 10g and 11g. Works. Passes.
- Background jobs: Tracking Event
  - No essential change between 10g and 11g. Works. Passes.
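The quiz-during-outage behavior we observed — session stays open, the app keeps retrying the database, and saves resume once the node returns — is essentially client-side retry with backoff. Here’s a generic, hypothetical sketch of that pattern (this is not Vista’s actual code; the function and the simulated `save_answer` are invented for illustration):

```python
import time

def with_retry(operation, is_transient, attempts=5, base_delay=0.01,
               sleep=time.sleep):
    """Run `operation`, retrying transient failures with exponential backoff.

    `is_transient` decides which exceptions are worth retrying, e.g. a
    dropped connection while a RAC database node is failing over.
    """
    for attempt in range(attempts):
        try:
            return operation()
        except Exception as exc:
            if not is_transient(exc) or attempt == attempts - 1:
                raise  # permanent error, or out of attempts
            sleep(base_delay * (2 ** attempt))  # back off, then reconnect

# Simulate a quiz save that fails twice while a DB node is down, then succeeds.
calls = {"n": 0}
def save_answer():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("node down")
    return "saved"

result = with_retry(save_answer, lambda e: isinstance(e, ConnectionError),
                    sleep=lambda _: None)  # skip real sleeps for the demo
print(result)  # prints: saved
```

The graceful-failover case is the same loop from the client’s perspective, just with the retries succeeding so quickly that the user never sees an error.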