Tag Archives: lms

Sakai is alive and well and on the move

Like Sakai, this tiger is alive, staring directly at you, and ready to spring

Sakai is open source. So yeah, many open source projects come and go.

But other open source software embeds itself in your infrastructure and is constantly maintained, improved, and evolving. Examples: the Apache web server, the Central Authentication Service (CAS), Git, Linux, and Sakai.

Don’t believe me? Monitor the commits on GitHub. Look at contributions from Western Ontario, the University of Virginia, and Longsight (a commercial affiliate). Oh, and globally? Check out Flying Kite of Australia (busy enough that they don’t bother with a website but use LinkedIn: https://www.linkedin.com/company/flying-kite-au) and Entornos de Formación of Spain (big enough to have an English-language website: https://www.edf.global/).

So let’s see how Sakai stacks up against the caveats Sam Saltis raises in his recent blog post on Core dna:

#1. Cost. Sam says, “Open software providers are also increasingly charging for extras like add-ons, integration, and additional services, which can negate any cost-saving advantages in some cases. In the end, rather than being free, you are still paying for a service with open source software.”

And he’s right. This is happening. But not with Sakai. Sakai is community-based and community-maintained. It can be hosted anywhere you like, including by a commercial hosting provider or in your own institutional AWS, Google, or Azure cloud. It can also be multi-tenanted, so you could share the costs of an instance. With Sakai you control the cost by controlling your choices.

#2. Service. Sam says, “Open source software relies on a loyal and engaged online user community to deliver support via forums and blogs, but this support often fails to deliver the high level of response that many consumers expect (and can receive with proprietary software).”

And he’s right. But it’s also a strength. Sakai has a loyal and engaged online user community which is highly responsive. I dare you to engage with us on our lists. You’ll get an answer within 24 hours for sure, but often within minutes(!)

#3. Innovation. Sam says, “Open source software provides a large amount of flexibility and freedom to change the software without restriction. This innovation, however, may not be passed on to all users and it is debated whether customized changes to the original source code can limit the future support and growth of the software. Once more, open source software providers often struggle to attract large-scale research and development.”

I have to answer this one for the Sakai community in parts: a) Yes, you are able to change Sakai without restriction. Truthfully, we like that control. b) Don’t do it in a vacuum. Engage with the community to innovate in a direction we all can use. That’s what we do. c) The Sakai community is a member of a larger open source umbrella organization, Apereo. This has been a good move for us, spurring even more innovation. The Apereo Foundation is like a ‘braintrust’ for higher ed open source software projects. Check it out.

#4. Usability. Sam says, “Usability is often a major area of criticism for open source software because the technology is generally not reviewed by usability experts and caters to developers rather than the vast majority of layperson users. User guides are not required by law and are therefore often ignored. When manuals are written, they are often filled with jargon that is difficult to follow.”

And he’s right. And admittedly, in the past this has plagued Sakai software as well. We reached a tipping point in community involvement about three years ago when we held a little “Sakai Camp” for the first time and discovered those who came were about half developers and half instructional designers, LMS admins, and people representing accessibility and usability concerns. More and more instructional designers and faculty are being heard from on the lists. Even the formerly developer-dominated list, sakai-dev@apereo.org, now has regular members who are NOT developers but, again, are people who care about usability.

Sakai is now reviewed by usability and accessibility experts.

Sakai has robust user guides created by users, not developers.

#5. Security. And finally, Sam says, “With individual users all around the world developing the software, there is a lack of continuity and common direction that prevents effective communication. Once more, the software is not always peer-reviewed or validated, meaning that a programmer can embed a backdoor Trojan into the software while the user is none the wiser.”

This is just plain bogus. This is not how the most popular open source projects are maintained. Maybe it was back in the days of the wild west, but today I think the majority of open source software has nailed this one. I’d like to think Sakai was among the first.

I’m not a developer. But I care about the software I administer for Notre Dame, so I know that the people who run and develop Sakai keep current on exploits and test Sakai against them. When an exploit is identified, a patch comes out very quickly and becomes part of the main branch, from which anyone who implements Sakai can pull.
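Here’s a minimal sketch of what “keeping current” can look like from an admin’s chair. This is not an official Sakai tool; it assumes you follow the sakaiproject/sakai repository on GitHub and that the deployed version string below is whatever your own instance reports.

```python
# Minimal sketch (not an official Sakai tool): compare a locally recorded
# version string against the latest tagged release on GitHub, so you notice
# quickly when a maintenance or security release lands upstream.
import json
import urllib.request

DEPLOYED_VERSION = "23.1"  # hypothetical: whatever your instance is running


def latest_release_tag(repo: str = "sakaiproject/sakai") -> str:
    """Ask the public GitHub API for the most recent release tag."""
    url = f"https://api.github.com/repos/{repo}/releases/latest"
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    return data["tag_name"]


if __name__ == "__main__":
    latest = latest_release_tag()
    print(f"Deployed: {DEPLOYED_VERSION}  Latest upstream: {latest}")
    # Tag naming varies by project, so treat this as a nudge, not an oracle.
    if latest.lstrip("v") != DEPLOYED_VERSION:
        print("A newer release exists -- check the release notes for security fixes.")
```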

Finally, as for “continuity and common direction” – well, let’s let my Sakai peers speak to that in the comments.

Sam Saltis’ full article.

 


rSmart and SunGard Re-evaluate their Partnership

The partnership announced a little over a year ago [SunGard Press Release, Sept. 28, 2010] has certainly not deteriorated; it has simply undergone a metamorphosis in light of work with several clients. While the partnership was initially envisioned to include sales support and subscriptions utilizing SunGard resources, the past year has shown that clients find that structure more difficult, not easier.

Chris Coppola of rSmart, in a recent email, put it this way: “We just didn’t see the benefits we thought we would from a sales perspective.”

 The two service providers continue to pool their resources when assisting clients in integrating their rSmart-hosted Sakai CLE with SunGard’s SIS product lines. In this respect they continue to learn from each other, and find this to be the true value-add for their clients.

 The University of Notre Dame is one of those clients, on track to put our rSmart-hosted CLE into full production with real-time and batch integrations to our SunGard Banner SIS. We’re enjoying full implementation support from both SunGard and rSmart engineers in standing up the Banner Event Publisher, a new SunGard technology, with Sakai as its first fully configured subscriber.

Tufts University, University of South Alabama, Notre Dame and Sakai: a middle way?

Representatives of our three schools, along with Nate Angell of rSmart, began conferencing a few weeks ago to discuss our mutual desire to provide multiple methods of integrating SunGard Banner and Sakai.

As many of you know, to date there are two methods to provision one’s LMS or CLE.

A. Administratively create course site shells each semester for all, or a majority, of your course offerings. When Instructors log in, their course and its enrollments are already in place and managed by the SIS integration.

B. Allow (change the verb here depending on your perspective…) Instructors to create their own course sites and to choose which groups of students (rosters) are given access.

Tufts, Notre Dame, and the University of South Alabama are currently drafting their requirements for a middle way, a third method (sketched in code after the list below) in which:

–  all course site shells are created and instructors given access

– instructors wanting the flexibility to re-arrange which groups of students (rosters) can access their course are free to create a new course shell and to add various rosters to it (still maintaining enrollment synchronization with Banner).

  1. Multiple sections can then interact in a single course site.
  2. The rosters from the pre-built crosslisted course sites could also be re-combined if instructors are teaching multiple sections of those.
  3. Final grades can be submitted from Sakai back to Banner no matter whether the instructor is using a pre-built course site or one they created and added rosters to themselves.
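To make that middle way concrete, here is a rough, purely hypothetical Python sketch of the data model under discussion. Nothing here is Sakai or Banner code; the class and method names are invented to show how an instructor-created shell could keep SIS-managed rosters attached while still pushing final grades back per section.

```python
# Illustrative sketch only -- not Sakai or Banner code.
from dataclasses import dataclass, field


@dataclass
class Roster:
    """One Banner section's enrollment, kept in sync by the SIS feed."""
    section_id: str
    students: set[str] = field(default_factory=set)


@dataclass
class CourseShell:
    site_id: str
    instructor: str
    prebuilt: bool                      # True = administratively created shell
    rosters: list[Roster] = field(default_factory=list)

    def add_roster(self, roster: Roster) -> None:
        # Instructors may combine sections (including crosslisted ones) here;
        # enrollment stays SIS-managed because the Roster itself is the feed.
        self.rosters.append(roster)

    def submit_final_grades(self, grades: dict[str, str]) -> dict[str, dict[str, str]]:
        # Grades go back to Banner per section, regardless of whether the
        # shell was prebuilt or created by the instructor.
        return {
            r.section_id: {s: grades[s] for s in r.students if s in grades}
            for r in self.rosters
        }
```

The point of the sketch is simply that the unit of synchronization is the roster, not the site, so instructors get layout flexibility without losing enrollment integrity.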

The requirements process is in its early stages. Is your institution interested? Please comment.

Deliverables to include in your LMS transition project

I just sent these tweets in succession, then realized I would lose them, and so would anyone following along with an LMS transition six months or a year from now. (You’re welcome, Janel!) Here they are, consolidated into a proper post:

1. [Tweet image: LMS_Deliverable2]

2. [Tweet image: LMS_Deliverable3]

3. [Tweet image: LMS_Deliverable4]

4. [Tweet image: LMS_Deliverable1]

 

Let me elaborate a bit.

#1 Without the ability to copy a production environment, its configuration and all its courses (database) to a non-production environment, you will never be able to thoroughly test configuration changes, new tools you might add, or system performance under peak loads without impacting your faculty and students.

#2 Please don’t think you can make do without at least one non-production TEST instance. This is not the place to cut your budget. While end users typically don’t know about this ‘hidden’ environment, a decision not to have one, and not to support testing before something is okayed as part of your faculty/student tool mix, WILL impact your institution.

#3 In our process just this week we received as a deliverable a Word doc capturing all of our discussions around the best way to configure our LMS for Notre Dame. When I received the document I immediately recognized its format as completely unmanageable as documentation of a ‘known state’ of a system in use. The settings you start with WILL change over time, and you need a configuration document in a format that can easily be reviewed, changed, built to, and tested to. You will find, as you change your settings, that some of them depend on each other; a spreadsheet lets you indicate which sets of configuration settings work together and which combinations are problematic.
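For example, here is a minimal sketch of the kind of check a spreadsheet-style configuration document makes possible. The CSV columns (setting, value, depends_on, required_value) and the file name are hypothetical, not a Sakai convention; the point is that dependencies between settings become something you can test automatically rather than rediscover by accident.

```python
# Minimal sketch: flag settings whose prerequisite setting is not in the
# expected state, based on a hypothetical spreadsheet exported to CSV.
import csv


def check_config(path: str = "lms_config.csv") -> list[str]:
    """Return a list of human-readable dependency problems."""
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    current = {r["setting"]: r["value"] for r in rows}
    problems = []
    for r in rows:
        dep = r.get("depends_on")
        if dep and current.get(dep) != r.get("required_value"):
            problems.append(
                f"{r['setting']}={r['value']} expects {dep}={r['required_value']}, "
                f"found {dep}={current.get(dep)!r}"
            )
    return problems


if __name__ == "__main__":
    for problem in check_config():
        print(problem)
```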

Notes on Penn State Learning Design Podcast #4: Throw out the LMS?

Jeff Swain and Brian Young title it “Baby and the Bathwater.”


Full podcast available from iTunes: EdTech episode #4, Baby & the Bathwater.

These snippets of their conversation stuck out to me.

“If we implement it as we did in the past, if we support it as we did in the past, we will end up with what we had in the past.”  (Brian’s voice I think)


The LMSs of today pretty much all have the same functionality. In what way are they flawed? What is it about an LMS as a tool that still needs to change? Is it:

– the workflow required of an Instructor to accomplish a pedagogical goal?

– the fact that it’s a closed system which doesn’t allow students to interact with other students studying the same thing?

– the flexibility to make some parts open and others closed to the students taking it that term (in that particular section, or all sections taught by the same Instructor, or all sections across several Instructors)?

They love blogs (it sounds like for teaching, for research, and as a reflection method).

How does an LMS relate to the groundswell of Program Assessment? LMS repositories aren’t built to provide artifacts and data for assessing the program… but why not?

What about from the student point of view? Why don’t we map out their entire program so that over time they can see how each course supports the program’s goals, how far they’ve come? How far yet to go?


“100% of the issues with any of the LMSs are due to lack of planning on the Instructor’s part.” (around 26 minutes) – Brian Young
