Planet Sakai

November 26, 2014

Dr. Chuck

Tracing History from the “Imitation Game” to the Modern-Day Internet (#IHTS)

In a sense, Alan Turing’s cryptography, code-breaking and computer science work at Bletchley Park, featured in the Imitation Game movie, was the kickoff for the modern-day Internet as well as modern-day electronic computing technologies. For the first time in history, communication was essential for survival, and applying computation to understanding communication was critical to success or failure in World War II. Unprecedented funding was poured into research in mathematics, computer science, social science, linguistics, and many other fields. Bletchley Park was one of the world’s first great large-scale cross-disciplinary research labs. The work done there had a tremendous impact on the outcome of World War II and the shape of our world to the present day.

If you are interested in learning how we got from Bletchley Park to today’s Internet, I would invite you to attend my free self-paced Internet History, Technology, and Security course on Coursera.

IHTS was one of the first 20 pioneering MOOCs when Coursera rolled out in 2012 (yes, two years seems like a long time ago). And now IHTS is one of the first Coursera courses to pioneer a new self-paced format that allows students to start and take courses at any time and at their own pace.

We have initially soft-launched IHTS so students can view all of the lectures and supplementary materials. Over the coming months, we will be adding quizzes and other assessments so that the self-paced offering includes all the features of the previous scheduled, cohort-based offerings on Coursera – except with no deadlines :).

The course is a mix of lectures and interviews with Internet innovators. All of the course materials are open and available under a CC-BY Creative Commons License to allow reuse of the lecture materials.

I hope to see you in class.

by Charles Severance at November 26, 2014 10:45 AM

November 24, 2014

Steve Swinsburg

Movember 2014

It’s the tail end of Movember, just a few days to go, and my team has raised over a thousand bucks for the Movember Foundation!

What is Movember, you ask? It’s about raising awareness of men’s health issues like depression, testicular cancer and prostate cancer. In Australia, the life expectancy of men is 5 years less than that of women, 50% of men struggle with mental health issues at some point, and 50% of men will be diagnosed with cancer by age 85.

50%.

1 in 2.

Either you or me.

Fuck that.

I’ve been doing Movember for the past 6 years to try to tackle this issue and have raised a few grand in doing so. This year I set up a team with my workmates and we’ve collectively raised over $1000 already, with more donations promised this week. Our original goal was $1000; with your help we can make it $1500.

All donations are tax deductible and you can donate here:
https://www.movember.com/au/donate/payment/member_id/51531/

Here’s a pic of my latest Mo efforts for your viewing pleasure. You can see past Movember efforts on my Movember page.

[Photo: my Mo progress as of 24 November 2014]

by steveswinsburg at November 24, 2014 10:56 AM

Adam Marshall

FAO: Local WebLearn Coordinators in your department, faculty or college

  • Are you new in the role of Local WebLearn Coordinator?
  • Do you perhaps need a refresher about the do’s and don’ts of managing your unit’s WebLearn presence?
  • Did you know that you must never add external accounts to the Administration Site, since this could cause a serious security breach?

A lunchtime overview session has been scheduled for Local WebLearn Coordinators. Come along to find out more about your role and recent changes.

Date: Thursday 27 November 2014

Time: 12:30-13:30

Venue: IT Services, 13 Banbury Road

Booking: http://courses.it.ox.ac.uk/detail/TOVS

We hope to see you there.

by Adam Marshall at November 24, 2014 09:27 AM

November 21, 2014

Michael Feldstein

WCET14 Student Panel: What do students think of online education?

Yesterday at the WCET14 conference in Portland I had the opportunity, along with Pat James, to moderate a student panel.[1] I have been trying to encourage conference organizers to include more opportunities to let students speak for themselves – becoming real people with real stories rather than nameless aggregations of assumptions. WCET stepped up with this session. And my new favorite tweet[2]:

As I called out in my introduction, we talk about students, we characterize students, we listen to others talk about students, but we don’t do a good job in edtech of talking with students. There is no way that a student panel can be representative of all students, even for a single program or campus.[3] We’re not looking for statistical answers, but we can hear stories and gain understanding.

These four students were working adults (and I’m including a stay-at-home mom in this category) taking undergraduate online programs. They were quite well-spoken and self-aware, which made for a great conversation, including comments that might surprise some on the potential of faculty-student interaction:

A very surprising (to me) comment on class size:

And specific feedback on what doesn’t work well in online courses:

To help with viewing of the panel, here are the primary questions / topics of discussion:

The whole student panel is available on the Mediasite platform:

Thanks to the help of the Mediasite folks, I have also uploaded a Youtube video of the full panel:


  1. Pat is the executive director of the California Community College Online Education Initiative (OEI) – see her blog here for program updates.
  2. I’m not above #shameless.
  3. As can be seen from this monochromatic panel, which might make sense for Portland demographics but not from a nationwide perspective.

The post WCET14 Student Panel: What do students think of online education? appeared first on e-Literate.

by Phil Hill at November 21, 2014 09:37 PM

November 20, 2014

Adam Marshall

WebLearn upgraded to version 2.10-ox1.3 on 18 November 2014

WebLearn was upgraded on 18th November 2014 to version 2.10-ox1.3. For more information about this release and other minor changes, please look at the detailed release notes, or contact the WebLearn Team.

If you would like to suggest further improvements then please do so by contributing to the WebLearn User Voice feedback service.

Improvements

  • The pop-up help has been partitioned into ‘Student’ and ‘Instructor’ guides and has been reorganised so that tools have their correct names and there are no entries for tools that are not offered at Oxford
  • Humanities courses now appear in the Researcher Training Tool
  • Improved error message when an assignment cannot be routed via Turnitin
  • My Dashboard has been changed back to My Workspace (as users found it confusing)
  • The Contact Us Tool has been improved
  • Opting to hide yourself in a site (My Workspace > Preferences > Privacy) will prevent your name from being listed as a site contact for content-related problems
  • The Quick Links menu now behaves just like the My Sites menu
  • Lessons tool can now include quizzes from the Tests tool
  • Improvements have been made to the top banner so that the height can be reduced if desired

Bug Fixes

  • Attachments are now correctly stored in the archive when sending an email with attachments to the Email Archive
  • The new(ish) WYSIWYG (rich-text) editor is now available for the Surveys Beta tool in Internet Explorer 11
  • The Researcher Training Tool (RTT) Module Administration page now works correctly with Internet Explorer 11
  • An email is no longer sent when a site is created
  • The problems with the Site Members tool should now be fixed
  • The Forums tool will no longer send out an email containing three identical messages
  • The word “Minimise”, which appeared in Internet Explorer 11 when you scrolled, has been suppressed

by Adam Marshall at November 20, 2014 02:43 PM

Michael Feldstein

In Which I (Partially) Disagree with Richard Stallman on Kuali’s AGPL Usage

Since Michael is making this ‘follow-up blog post’ week, I guess I should jump in.

In my latest post on Kuali and its usage of the AGPL license, the key argument is that this license choice is central to understanding the Kuali 2.0 strategy – protecting KualiCo as a new for-profit entity in its future work to develop multi-tenant cloud-hosting code.

What I have found interesting is that in most of my conversations with Kuali community people, even those who are disillusioned, they seem to think the KualiCo creation makes some sense. The real frustration and pushback have been about how decisions are made, how decisions have been communicated, and how the AGPL license choice will affect the community.

In the comments, Richard Stallman chimed in.

As the author of the GNU General Public License and the GNU Affero General Public License, and the inventor of copyleft, I would like to clear up a possible misunderstanding that could come from the following sentence:

“Any school or Kuali vendor, however, that develops its own multi-tenant cloud-hosting code would have to relicense and share this code publicly as open source.”

First of all, thinking about “open source” will give you the wrong idea about the reasons why the GNU AGPL and the GNU GPL work as they do. To see the logic, you should think of them as free software licenses; more specifically, as free software licenses with copyleft.

The idea of free software is that users of software deserve freedom. A nonfree program takes freedom away from its users, so if you want to be free, you need to avoid it. The aim of our copyleft licenses is to make sure all users of our code get freedom, and encourage release of improvements as free software. (Nonfree improvements may as well be discouraged since we’d need to avoid them anyway.) See http://gnu.org/philosophy/free-software-even-more-important.html.

I don’t use the term “open source”, since it rejects these ethical ideas. (http://gnu.org/philosophy/open-source-misses-the-point.html.) Thus I would say that the AGPL requires servers running modified versions of the code to make the source for the running version available, under the AGPL, to their users.

The license of the modifications themselves is a different question, though related. The author of the modifications could release the modifications under the AGPL itself, or under any AGPL-compatible free software license. This includes free licenses which are pushovers, such as the Apache 2.0 license, the X11 license, and the modified BSD license (but not the original BSD license — see
http://gnu.org/licenses/license-list.html).

Once the modifications are released, Kuali will be able to get them and use them under whatever license they carry. If it is a pushover license, Kuali will be able to incorporate those modifications even into proprietary software. (That’s what makes them pushover licenses.)

However, if the modifications carry the AGPL, and Kuali incorporates them into a version of its software, Kuali will be bound by the AGPL. If it distributes that version, it will be required to do so under the AGPL. If it installs that version on a server, it will be required by the AGPL to make the whole of the source code for that version available to the users of that server.

To avoid these requirements, Kuali would have to limit itself to Kuali’s own code, others’ code released under pushover licenses, plus code for which it gets special permission. Thus, Kuali will not have as much of a special position as some might think.

See also http://gnu.org/philosophy/assigning-copyright.html
and http://gnu.org/philosophy/selling-exceptions.html.

Dr Richard Stallman
President, Free Software Foundation (gnu.org, fsf.org)
Internet Hall-of-Famer (internethalloffame.org)
MacArthur Fellow

I appreciate this clarification and Stallman’s participation here at e-Literate, and it is useful to understand the rationale and ethics behind AGPL. However, I disagree with the statement “Thus, Kuali will not have as much of a special position as some might think”. I do not think he is wrong, per se, but the combination of both the AGPL license and the Contributor’s License Agreement (CLA) in my view does ensure that KualiCo has a special position. In fact, that is the core of the Kuali 2.0 strategy, and their approach would not be possible without the AGPL usage.

Note: I have had several private conversations that have helped me clarify my thinking on this subject. Besides Michael with his comment to the blog, Patrick Masson and three other people have been very helpful. I also interviewed Chris Coppola from KualiCo to understand and confirm the points below. Any mistakes in this post, however, are my own.

It is important to understand the two different methods of licensing at play – distributing code under the AGPL license and contributing code to KualiCo through a CLA (Kuali has a separate CLA for partner institutions and a Corporate CLA for companies).

  • Distribution – Anyone can download the Kuali 2.0 code from KualiCo and make modifications as desired. If the code is used privately, there is no requirement for distributing the modified code. If, however, a server runs the modified code, the reciprocal requirements of AGPL kick in and the code must be distributed (made available publicly) with the AGPL license or a pushover license. This situation is governed by the AGPL license.
  • Contribution – Anyone who modifies the Kuali 2.0 code and contributes it to KualiCo for inclusion into future releases of the main code grants a license with special permission to KualiCo to do with the code as they see fit. This situation is governed by the CLA and not AGPL.

I am assuming that the future KualiCo multi-tenant cloud-hosting code is not separable from the Kuali code. In other words, the Kuali code would need modifications to allow multi-tenancy.

For a partner institution, their work is governed by the CLA. For a company, however, the choice on whether to contribute code is mutual between that company and KualiCo, in that both would have to agree to sign a CLA. Another company may choose to do this to ensure that bug fixes or Kuali enhancements get into the main code and do not have to be reimplemented with each new release.

For any contributed code, KualiCo can still keep their multi-tenant code proprietary as their special sauce. For distributed code under AGPL that is not contributed under the CLA, the code would be publicly available and it would be up to KualiCo whether to incorporate any such code. If KualiCo incorporated any of this modified code into the main code base, they would have to share all of the modified code as well as their multi-tenant code. For this reason, KualiCo will likely never accept any code that is not under the CLA – they do not want to share their special sauce. Chris Coppola confirmed this assumption.

This setup strongly discourages any company from directly competing with KualiCo (vendor protection) and is indeed a special situation.

The post In Which I (Partially) Disagree with Richard Stallman on Kuali’s AGPL Usage appeared first on e-Literate.

by Phil Hill at November 20, 2014 12:32 AM

November 19, 2014

Michael Feldstein

A Weird but True Fact about Textbook Publishers and OER

As I was perusing David Kernohan’s notes on Larry Lessig’s keynote at the OpenEd conference, one statement leapt out at me:

Could the department of labour require that new education content commissioned ($100m) be CC-BY? There was a clause (124) that suggested that the government should check that no commercial content should exist in these spaces. Was argued down. But we were “Not important” enough to be defeated.

It is absolutely true that textbook publishers do not currently see OER as a major threat. But here’s a weird thing that is also true:

These days, many textbook publishers like OER.

Let me start with full disclosure. For 18 months, I was an employee of Cengage Learning, one of the Big Three textbook publishers in US higher education. Since then, I have consulted for textbook publishers on and off. Pearson is a current client, and there have been others. Make of that what you will in terms of my objectivity on this subject, but I have been in the belly of the beast. I have had many conversations with textbook publisher employees at all levels about OER, and many of them truly, honestly like it. They really, really like it. As a rule, they don’t understand it. But some of them actually see it as a way out of the hole that they’re in.

This is a relatively recent thing. Not so very long ago, you’d get one of two reactions from employees at these companies, depending on the role of the person you were talking to. Editors would tend to dismiss OER immediately because they had trouble imagining that content that didn’t go through their traditional editorial vetting process could be good (fairly similarly to the way academics would dismiss Wikipedia as something that couldn’t be trusted without traditional peer review). There were occasional exceptions to this, but always for very granular content. Videos, for example. Sometimes editors saw (or still see) OER as extra bits—or “ancillary materials,” in their vernacular—that could be bundled with their professionally edited product. That’s the most that editors typically thought about OER. At the executive level, every so often they would trot out OER on their competitive threat list, look at it for a bit, and decide that no, they don’t see evidence that they are losing significant sales to OER. Then they would forget about it for another six months or so. Publishers might occasionally fight OER at a local level, or even at a state level in places like Washington or California where there was legislation. But in those cases the fight was typically driven by the sales divisions that stood to lose commissions, and they were treated like any other local or regional competition (such as home-grown content development). It wasn’t viewed as anything more than that. For the most part, OER was just not something publishers thought a lot about.

That has changed in US higher education as it has become clear that textbook profits are collapsing as students find more ways to avoid buying new books. The traditional textbook business is clearly not viable in the long term, at least in that market, at least at the scale and margins that the bigger publishers are used to making. So these companies want to get out of the textbook business. A few of them will say that publicly, but many of them say it among themselves. They don’t want to be out of business. They just want to be out of the textbook business. They want to sell software and services that are related to educational content, like homework platforms or course redesign consulting services. But they know that somebody has to make the core curricular content in order for them to “add value” around that content. As David Wiley puts it, content is infrastructure. Increasingly, textbook publishers are starting to think that maybe OER can be their infrastructure. This is why, for example, it makes sense for Wiley (the publisher, not the dude) to strike a licensing deal with OpenStax. They’re OK with not making a lot of money on the books as long as they can sell their WileyPlus software. Which, in turn, is why I think that Wiley (the dude, not the publisher) is not crazy at all when he predicts that “80% of all US general education courses will be using OER instead of publisher materials by 2018.” I won’t be as bold as he is to pick a number, but I think he could very well be directionally correct. I think many of the larger publishers hope to be winding down their traditional textbook businesses by 2018.

How particular OER advocates view this development will depend on why they are OER advocates. If your goal is to decrease curricular materials costs and increase the amount of open, collaboratively authored content, then the news is relatively good. Many more faculty and students are likely to be exposed to OER over the next four or five years. The textbook companies will still be looking to make their money, but they will have to do so by selling something else, and they will have to justify the value of that something else. It will no longer be the case that students buy closed textbooks because it never occurs to faculty that there is another viable option. On the other hand, if you are an OER advocate because you want big corporations to stay away from education, then Larry Lessig is right. You don’t currently register as a significant threat to them.

Whatever your own position might be on OER, George Siemens is right to argue that the significance of this coming shift demands more research. There’s a ton that we don’t know yet, even about basic attitudes of faculty, which is why the recent Babson survey that everybody has been talking about is so important. And there’s a funny thing about that survey which few people seem to have noticed:

It was sponsored by Pearson.

The post A Weird but True Fact about Textbook Publishers and OER appeared first on e-Literate.

by Michael Feldstein at November 19, 2014 07:44 PM

November 17, 2014

Adam Marshall

WebLearn unavailable on Tuesday 18th November 2014 from 7-9am

WebLearn will be upgraded to version 2.10-ox1.3 on Tuesday 18th November 2014, 7-9am. There will be no service during this period.

We apologise for any inconvenience that this essential work may cause.

by Adam Marshall at November 17, 2014 02:43 PM

November 10, 2014

Dr. Chuck

Riding My Way Back – Veterans Day – And my own Big-Screen Film Debut

Tomorrow is Veterans Day and I will be attending a film screening of “Riding My Way Back” Tuesday November 11 at 7PM at the Celebration Cinema in South Lansing.

Riding My Way Back (http://www.ridingmywayback.com) is a documentary film about a veteran who came back from Iraq and Afghanistan with traumatic brain injury (TBI) and Post-Traumatic Stress Disorder (PTSD), and how his relationship with a horse named “Fred” helped him rebuild his life.

In addition to Riding My Way Back, we will be showing “CHUM Families”, a documentary about parents and children who are part of the C.H.U.M. Therapeutic Riding family. I produced the film, and it is the first time any of my work will be shown in a big-screen cinema.

Here is a preview of the CHUM Families movie on YouTube.

The proceeds from the showing will go to support the programs at C.H.U.M. Therapeutic Riding (www.chumtherapy.net).

I hope to see you there.

by Charles Severance at November 10, 2014 05:52 PM

October 29, 2014

Steve Swinsburg

Sakai Wicket Maven Archetype updated and released to Maven Central

The Sakai Wicket Maven Archetype has been updated to the latest version of Wicket 6, with styling issues fixed for the latest Sakai portal. It has also been released to Maven Central.

The Sakai Wicket Maven Archetype allows you to generate a sample Sakai app via a single Maven command. The app demonstrates how to get a Sakai tool styled, internationalised and registered, how to set up your own APIs, and how to wire them up with Spring and inject them via annotations. The app also includes multi-database support via Spring JDBC. It could easily be used as a base for a real tool.

Generate an app:

mvn archetype:generate \
  -DarchetypeGroupId=org.sakaiproject.maven-archetype \
  -DarchetypeArtifactId=sakai-wicket-maven-archetype \
  -DarchetypeVersion=1.5.0 \
  -DgroupId=org.sakaiproject.example \
  -DartifactId=exampleApp
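
Once generated, a typical next step is to build the app and deploy it into a local Sakai Tomcat. A rough sketch (assuming the generated POM configures the Sakai Maven plugin, and substituting the path to your own Tomcat):

cd exampleApp
# Build and deploy into a local Sakai Tomcat (path is an assumption; adjust to your install)
mvn clean install sakai:deploy -Dmaven.tomcat.home=/path/to/your/tomcat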

More info here:

https://confluence.sakaiproject.org/display/BOOT/Sakai+Wicket+Maven+Archetype

by steveswinsburg at October 29, 2014 07:46 PM

September 30, 2014

Ian Boston

AppleRAID low level fix

Anyone who uses AppleRAID will know how often it declares that a perfectly healthy disk is no longer a valid member of a RAID set. What you may not have experienced is when it won’t rebuild. For a striped set, practically the only solution is a backup. For a mirror there are some things you can do. Typically, when diskutil or the GUI won’t repair the set, the low-level AppleRAID.kext won’t load, or will load and then fail, reporting that it can’t get a controller object. In the logs you might also see that the RAID set is degraded or just offline. If it’s really bad, Disk Utility and diskutil will hang somewhere in the kernel, and you won’t be able to get a clean reboot.

Here is one way to fix it:

Unplug the disk subsystem causing the problem.

Reboot; you may have to pull the plug to get a shutdown.

Once up, move AppleRAID.kext to a safe place, e.g.

mkdir ~/kext
sudo mv /System/Library/Extensions/AppleRAID.kext ~/kext

Watch the logs to see that kextcache has rebuilt the cache of kernel extensions. You should see something like:

30/09/2014 13:21:37.518 com.apple.kextcache[456]: /: helper partitions appear up to date.
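
One way to follow the log while you wait is a sketch along these lines (assuming the default system log location on OS X of that era):

# Follow the system log and filter for kextcache activity
tail -f /var/log/system.log | grep kextcache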

When you see that, you know that if you plug in the RAID subsystem the kernel won’t be able to load AppleRAID.kext, and so you will be able to manipulate the disks.

Plug in the RAID subsystem and check that it didn’t load the kernel extension:

kextstat | grep AppleRAID

You will now be able to run diskutil list, and you should see your disks listed as Apple RAID disks, e.g.

$ diskutil list
...
/dev/disk2
   #:                       TYPE NAME          SIZE       IDENTIFIER
   0:      GUID_partition_scheme              *750.2 GB   disk2
   1:                        EFI               209.7 MB   disk2s1
   2:                 Apple_RAID               749.8 GB   disk2s2
   3:                 Apple_Boot Boot OS X     134.2 MB   disk2s3
/dev/disk3
   #:                       TYPE NAME          SIZE       IDENTIFIER
   0:      GUID_partition_scheme              *750.2 GB   disk3
   1:                        EFI               209.7 MB   disk3s1
   2:                 Apple_RAID               749.8 GB   disk3s2
   3:                 Apple_Boot Boot OS X     134.2 MB   disk3s3


At this point the disks are just plain disks; the AppleRAID kernel extension isn’t managing them. Verify with:

$ diskutil appleRAID list
No AppleRAID sets found
$

Since you can’t use them as a RAID any more, and so can’t use the diskutil appleRAID delete command to convert the RAID set into normal disks, you have to trick OS X into mounting the disks. To do this you need to edit the partition table, without touching the data on the disk. You can do this with gpt:

$ sudo gpt show disk2
      start       size  index  contents
          0          1         PMBR
          1          1         Pri GPT header
          2         32         Pri GPT table
         34          6
         40     409600      1  GPT part - C12A7328-F81F-11D2-BA4B-00A0C93EC93B
     409640 1464471472      2  GPT part - 52414944-0000-11AA-AA11-00306543ECAC
 1464881112     262144      3  GPT part - 426F6F74-0000-11AA-AA11-00306543ECAC
 1465143256          7
 1465143263         32         Sec GPT table
 1465143295          1         Sec GPT header
$ sudo gpt show disk3
      start       size  index  contents
          0          1         PMBR
          1          1         Pri GPT header
          2         32         Pri GPT table
         34          6
         40     409600      1  GPT part - C12A7328-F81F-11D2-BA4B-00A0C93EC93B
     409640 1464471472      2  GPT part - 52414944-0000-11AA-AA11-00306543ECAC
 1464881112     262144      3  GPT part - 426F6F74-0000-11AA-AA11-00306543ECAC
 1465143256          7
 1465143263         32         Sec GPT table
 1465143295          1         Sec GPT header
$

According to https://developer.apple.com/library/mac/technotes/tn2166/_index.html, the partition at index 2 with a partition type of 52414944-0000-11AA-AA11-00306543ECAC is an Apple_RAID partition. It’s actually HFS+ with some other settings. Those settings get removed when converting it from RAID to non-RAID, but to get the volume mounted we can just change the partition type. First delete the entry from the partition table, then recreate it with the HFS+ type at exactly the same size:

$ sudo gpt remove -i 2 disk2
disk2s2 removed
$ sudo gpt add -b 409640 -s 1464471472 -t 48465300-0000-11AA-AA11-00306543ECAC disk2
disk2s2 added

OS X will mount the disk. It will probably tell you that it has been mounted read-only, and can’t be repaired. At that point you need to copy all the data off onto a clean disk, using rsync.
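
A rough sketch of that copy (the volume names here are assumptions – check diskutil list or ls /Volumes for your real mount points; -a preserves permissions and timestamps):

# Trailing slashes copy the *contents* of the source volume into the destination
sudo rsync -av /Volumes/RaidVolume/ /Volumes/CleanDisk/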

Once that is done you can do the same with the second disk and compare the differences between your two RAID members. When you have all the data back, you can consider whether to leave AppleRAID.kext disabled or use it again. I know what I will be doing.

by Ian at September 30, 2014 01:29 PM

September 26, 2014

Steve Swinsburg

TextWrangler filters to tidy XML and tidy JSON

I work with XML and JSON a lot, often as the input to or output from web services. Generally it is unformatted, so before I can read the data I need it formatted and whitespaced. So here are some TextWrangler filters to tidy up XML and JSON documents.

#!/bin/sh
# Pretty-print XML read from stdin, indenting with tabs
XMLLINT_INDENT=$'\t' xmllint --format --encode utf-8 -

Save this into a file called Tidy XML.sh
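
To sanity-check the filter from a terminal before installing it, you can pipe a fragment through the same command (the output shown is what xmllint should produce with a tab indent):

$ echo '<a><b>hi</b></a>' | XMLLINT_INDENT=$'\t' xmllint --format --encode utf-8 -
<?xml version="1.0" encoding="utf-8"?>
<a>
	<b>hi</b>
</a>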

#!/usr/bin/python
# Read stdin, strip whitespace from each line, parse the result as JSON,
# and pretty-print it with sorted keys and two-space indentation.
import fileinput
import json
print json.dumps(json.loads(''.join([line.strip() for line in fileinput.input()])), sort_keys=True, indent=2)

Save this into a file called Tidy JSON.py
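
The JSON filter can be checked the same way (keys come back sorted and two-space indented; exact whitespace may vary slightly across Python versions):

$ echo '{"b": 1, "a": [1, 2]}' | python "Tidy JSON.py"
{
  "a": [
    1,
    2
  ],
  "b": 1
}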

Drop these into ~/Library/Application Support/TextWrangler/Text Filters. You can then run them on a file within TextWrangler by choosing Text > Apply Text Filter > [filter].

by steveswinsburg at September 26, 2014 12:31 AM

September 25, 2014

Apereo OAE

Apereo OAE Heron is now available!

The Apereo Open Academic Environment (OAE) project team is extremely proud to announce the next major release of the Apereo Open Academic Environment: OAE Heron, or OAE 9.

OAE Heron is a landmark release that introduces the long-awaited folders functionality, allowing sets of content items to be collected, organised, shared and curated. OAE Heron also provides full support for Shibboleth access management federations and brings improvements to activities, (email) notifications and the REST API documentation. Next to that, OAE Heron also ships with a wide range of overall usability improvements.

Changelog

Folders

Using the personal and group libraries, Apereo OAE has always allowed collaboration to grow organically, reflecting how most of our collaborations work in real life. Individual content items could be shared with people and groups, making those items available in their respective libraries. This has always tested extremely well in usability testing, and not requiring items to be organised upfront has been seen to reduce the obstacles to collaboration.

However, sustained usage and usability testing have also highlighted a number of challenges with this approach. First of all, it was difficult to group items that logically belong together (e.g. a set of field trip pictures) and share and interact with them as a single unit. Next to that, heavy use of the system was showing that libraries could become quite hard to manage and were clearly lacking some form of organisation.

Therefore, OAE Heron introduces the long-awaited folders functionality, a feature we've been working on for an extended period of time and which has gone through many rounds of usability testing. OAE Folders allow a set of content items to be grouped into a folder. This folder can be shared with other people and groups and has its own permissions and metadata. A folder also has its own thumbnail picture based on the items inside the folder, and folders will generate helpful activities, notifications and emails.

OAE Folders also stay true to the OAE philosophy, and therefore content items are never bound to a folder. This means that the items in a folder can still be used as independent content and can be shared, discussed, etc. individually. It also means that a content item can belong to multiple folders at the same time, opening the door to re-mixing content items and content curation, and allowing interesting new folders to be created from existing folders and content items.

Whilst maintaining the ability to grow collaboration organically, OAE Folders allow for a better and more logical organisation of content items and open the door to many interesting content re-use scenarios.

Shibboleth federations

Many countries around the world now expose their own Shibboleth access management federation. This provides an organised and managed way in which an application can be offered to many institutions at the same time, directly integrating with the institutional Single Sign On systems.

OAE Heron makes it possible for an OAE installation to become a recognised Service Provider for one or more of these federations. This dramatically simplifies the tenant creation process for an institution that's a member of one of these access management federations, making it possible to set up an OAE tenant with full Shibboleth SSO integration in a matter of minutes.

Email improvements

OAE Heron introduces significant email notification improvements for those users that have their email preference set to Immediate. OAE was already capable of aggregating a series of actions that happened in quick succession into a single email. OAE Heron makes this possible over a longer period of time, and will hold off sending an email until a series of events that would otherwise generate multiple email notifications has finished. This dramatically cuts down the number of emails that are sent out by OAE and provides a more intelligent email update to users.

The display of email notifications on mobile devices has also been improved significantly, making the content of the email much easier to read.

Activity improvements

OAE Heron offers more descriptive activity summaries, especially in the area of content creation. These will for example provide a much better overview of the context in which an activity happened.

Next to that, OAE Heron will also ensure that the indicator for the number of unread notifications a user has is always completely accurate.

REST API documentation

OAE Heron continues to build on the REST API documentation that was introduced in OAE Griffin. It makes all possible responses for each of the REST endpoints available through the documentation UI and further improves the quality of the available documentation.

Try it out

OAE Heron can be tried out on the project's QA server at http://oae.oae-qa0.oaeproject.org. It is worth noting that this server is actively used for testing and will be wiped and redeployed every night.

The source code has been tagged with version number 9.0.0 and can be downloaded from the following repositories:

Back-end: https://github.com/oaeproject/Hilary/tree/9.0.0
Front-end: https://github.com/oaeproject/3akai-ux/tree/9.0.0

Documentation on how to install the system can be found at https://github.com/oaeproject/Hilary/blob/9.0.0/README.md.

Instructions on how to upgrade an OAE installation from version 8 to version 9 can be found at https://github.com/oaeproject/Hilary/wiki/OAE-Upgrade-Guide.

The repository containing all deployment scripts can be found at https://github.com/oaeproject/puppet-hilary.

Get in touch

The project website can be found at http://www.oaeproject.org. The project blog will be updated with the latest project news from time to time, and can be found at http://www.oaeproject.org/blog.

The mailing list used for Apereo OAE is oae@apereo.org. You can subscribe to the mailing list at https://groups.google.com/a/apereo.org/d/forum/oae.

Bugs and other issues can be reported in our issue tracker at https://github.com/oaeproject/3akai-ux/issues.

by Nicolaas Matthijs at September 25, 2014 12:22 PM

September 24, 2014

Sakai Project

Sakai Virtual Conference 2014: Bridging Education with Technology November 7, 2014

Sakai Virtual Conference 2014
Bridging Education with Technology
November 7, 2014 - Online

http://virtconf.apereo.org/   #SakaiVC14

Register now to attend the first ever Sakai Virtual Conference on Friday, November 7th!

September 24, 2014 04:43 PM

Apereo October-December 2014 Webinar Program

Webinars will use Big Blue Button. Choose Apereo Room 1, enter your name and the password apereo at -

http://apereo.blindsidenetworks.net/apereo/

Schedule:

September 24, 2014 04:40 PM

2014 Educause Conference - The Open Communities Reception hosted by Apereo Foundation and Open Source Initiative (OSI)

Educause Conference - The Open Communities Reception hosted by Apereo Foundation and Open Source Initiative (OSI)

Tuesday September 30th, 2014
6:30 PM - 8:00 PM Eastern Time
Florida Ballroom A, Convention Level, Hyatt Regency Hotel

September 24, 2014 04:36 PM

September 19, 2014

Jason Shao

Searching for an ideal home whiteboard


I have to admit, a good whiteboard is one of my absolute favorite things in the world. While I spend all kinds of time writing in text editors and other digital media (and have tried just about every tablet/digital/smart-pen replacement for dumb pens and paper), there is something about how easy it is to work at a whiteboard, especially collaboratively. Maybe it’s good memories of doing work at the board in HS Math.

At home, I recently moved into a new apartment that has a slightly > 8′ long wall space right by the entry. While *clearly* too tight for me to want to put furniture on that wall, the space is *screaming* for a large whiteboard. One of my prime criteria is project/bucket-lists though – so I do expect items to stay up on the board for potentially a *loooooong* time. Looking at the options, it seems like I can figure something out:

  • Actually buying a whiteboard – about $300 for a melamine one, and $400-500 for one made out of porcelain, which should last longer (though given I don’t use it all the time, melamine would probably be fine)
  • IdeaPaint – about $100-150, which I have used in offices before, and am a big fan of, but unfortunately requires *really* flat wall surfaces – and not sure it’s worth sanding and re-painting for the small number of blemishes (that absolutely will bother me). There are of course cheaper options – even Sherwin Williams seems to be getting in the game, but those seem to have mixed reviews
  • Mark R Board – the paper guys (Georgia Pacific) – a sample at: http://blog.listia.com/2010/06/08/diy-white-board/
  • Bathtub Reglaze Kit – about $30, plus something for probably a board or the like – seems like this is also a valid refinish strategy – http://wiki.xtronics.com/index.php/Shower_Board_as_a_white_Board
  • IKEA Hacking - about $120 to use a TORSBY glass-top table, picture ledge, and some mirror hangers. Example with pictures at: http://www.ikeahackers.net/2012/01/not-expensive-glass-whiteboard.html
  • White Tile Board – about $20 at Lowes, and even a bunch of comments that it’s a great DIY whiteboard, though some other people have posted notes about it not *quite* being the same, and definitely seeing ghosting if you leave writing on it for more than a few days
  • Decals. http://www.incrediline.com/category-s/1819.htm has some fascinating pre-printed ones – baseball fields, maps, graph paper – that seem really interesting. http://mywhiteboards.com/opti-rite-easy-60.html also has some

Across the different options I have to admit, I think I’m almost definitely going to look into the glass tabletop – I have lusted after that look for a while, and this looks like by far the most reasonable way to get there I’ve seen so far. Will post pics once I get something up.

… and then I can build one of these: http://www.cnet.com/news/3d-printed-plotclock-writes-the-time-on-a-tiny-whiteboard-every-minute/ :)


by jayshao at September 19, 2014 02:45 PM

September 15, 2014

Sakai@UD

Changing Your Display Name in Sakai@UD

If you are a student and your name in Sakai (or in other campus systems) is not what you want it to be, you can change it in UDSIS.

by Mathieu Plourde at September 15, 2014 04:23 PM

September 13, 2014

Alex Balleste

My history with Sakai

Tomorrow, September 13, is the 10th anniversary of Sakai at UdL. We went into production with Sakai 1.0 rc2 at the University of Lleida in 2004. Quite an achievement, and an adventure that has lasted 10 years and hopefully will last many more. Perhaps it was a little rushed, but luckily it worked out fine.

I will not tell you the history of UdL and Sakai; I'll tell you what I know and feel about my own history with Sakai, which is directly related to UdL. To get the full UdL version we would need many people's points of view.

So I will start before Sakai. We have to go back a few months: in January 2004 I applied for a contest for a temporary position at UdL, on a project to provide the university with an open source LMS. The tests were based on knowledge of programming Java servlets and JSP, and knowledge of eLearning. The IT service management was looking for a Java developer profile, as they were evaluating the CourseWork platform from Stanford. They wanted developers to make improvements and adapt it to UdL's needs. At that time, UdL ran WebCT and wanted to replace it with an open source system, in the context of a free software migration across the whole university.


I had coded a little Java for my final degree project, but I didn't know anything about servlets or JSP, so I bought a J2SE book, studied for some days beforehand, and took the test along with many others who wanted that position. I passed the tests, and I was lucky to win the programmer position on the “Virtual Campus” team with 2 other guys. David Barroso was already the analyst programmer of the team, which meant he was my direct boss (a really good one).

We ran a pilot with a few subjects from the Computer Science degree in CourseWork, and it looked to be well adapted to our needs. We were also looking closely at the CHEF LMS. When the founding universities of Sakai announced that they were joining forces to create an LMS based on the work of those systems, the decision was made.

When Sakai started, it lacked many features that we thought necessary, like a gradebook and robust tools for assignments and assessment, but it still seemed a platform with great potential. It had big funding and support from some of the best universities in the world, and that was enough for us to get into the project. UdL's intention with Sakai was to go beyond the capabilities of an LMS and use it as a virtual space for the whole university: to provide, in the future, a set of community sites, and to use it for our intranet as well as a development framework for our applications.

So we started working on it, translating the interface of Sakai into Catalan and adapting the institutional image. We created sites for the subjects of the studies of the Escola Politècnica Superior, a centre of UdL. On September 13, 2004 the platform was in production.

Sakai 1.0rc2 translated into Catalan and customized for UdL





During the process, we realized that the need to translate the whole platform for each version would be very expensive, and internationalization appeared not to be one of the community's imminent efforts, so the IT service manager Carles Mateu and David Barroso decided to offer support to internationalize Sakai. The idea was to provide a mechanism to translate Sakai easily, without having to modify the source code every time Sakai released a new version. It was an essential feature for us, and it had to be done in order to continue with the Sakai project. David contacted the Sakai project chief director Dr. Charles Severance and offered our help to internationalize the whole of Sakai.

Chuck was glad about our offer and the work started soon. Beth Kirschner was the person in charge of managing our work and syncing it with the Sakai code, and I was lucky to have the responsibility of managing the task on our side. The first thing I did was a proof of concept (PoC) with one tool: I extracted all the strings of a VM tool into a properties file, which was then loaded with Java Properties objects. The PoC worked well, but Beth encouraged me to use ResourceBundles instead of the simple Properties class. I wrote another PoC that way and it worked great. From that point began the tedious task of going through all the code to do this. The result was “tlang” and “rb” objects everywhere. That took 3 people between 2 and 3 months. We also used that process to write the Catalan translation. We used a Forge instance installed at UdL to synchronize these efforts: we implemented the changes there for Sakai 1.5, and when a tool was completely internationalized I gave notice so that Beth could apply the changes to the main Sakai branch.

Although we worked on 1.5, the i18n changes were released in Sakai 2.0. For us it was a success, because it ensured that we could continue using the platform for longer. When version 2.0 came out we upgraded from our 1.0rc2. Only one word comes to mind when I remember that upgrade: PAIN. We had very little documentation and had to dig through the code for every error we found. We had to make a preliminary migration to 1.5, running scripts and processes on Sakai startup, and then upgrade to 2.0. The migration process failed on all sides, but with a lot of effort we finally went ahead with it.

Once we had the platform upgraded, we started to organize our virtual campus and university-wide LMS as an intranet, creating sites for specific areas and services and facilitating access to people depending on the profile they had in LDAP. We also created the sites for the rest of the degrees of our university.

From that moment our relationship with Sakai has not been so hard. Everything went better. The next version we ran was 2.2, which we upgraded to in 2006. By then we had been granted a Mellon Foundation award for our internationalization effort in Sakai. It's one of the things I'm proudest of in my career, but it was embittered because the prize money was never actually claimed; I did not find out until a couple of years after it happened. The money from the award had to be spent developing something interesting related to education, so in order to receive it we needed to make a project proposal detailing how we would spend the $50K. UdL's idea was to create a system to translate Sakai's string bundles easily, as some tools did back then (poedit, ...). The IT service direction thought it better that the project not be done by the same team that customized and internationalized Sakai at UdL (I guess they had other priorities in mind for us), but that the development should be done by people from a Computer Science research group of the UdL. I do not know why they never did the project or the proposal to get the award money, but nowadays I no longer mind. [Some light here ... see the comments]

Around that time our team started working on a new project involving a large number of Catalan universities: the Campus project. It initially began as a proposal to create an open source LMS from scratch to be used by them. The project was led by the Open University of Catalonia (UOC). The UdL IT service direction board and David Barroso expressed their disagreement with spending 2M€ to finance such a project when open source LMSs like Moodle and Sakai already existed and that money could be invested in them. The project changed direction and tried to do something involving the existing LMSs, so they decided to create a set of tools that would use an OKI OSID middleware implemented for Moodle and Sakai. Although running external tools in the context of an LMS using standards, and providing a WS bus to interact with the LMSs' APIs, was a good idea, I didn't like how they wanted to use a double-level OKI OSID layer to interact with both LMS APIs. I thought that was too complex and hard to maintain.


OKI BUS

We upgraded Sakai again in 2007, to version 2.4 (that release gave us a lot of headaches). I also won the analyst programmer position on the Virtual Campus team that David Barroso vacated when he won the internal projects manager position. The selection process left me quite exhausted: long rounds of tests drawn out in time, and competition with nearly 30 colleagues, made the effort to get the best grades and win the position that much harder. Around then, the IT service direction board, Carles Mateu and Cesar Fernandez, resigned because of discrepancies with the main university direction board about how to apply the free software migration at UdL. It was a shame, because since then we have experienced a strong rollback in free software policies, and the situation of the entire IT service has worsened.

In September of that year, after the job competition finished and I had been chosen, I went to spend a couple of weeks at the University of Michigan. My mission there was to work on the IMS-TI protocol with Dr. Chuck to see if we could use this standard as part of the Campus project. Those two weeks were very helpful, and we did several examples implementing IMS-TI with OSID. I spent a good time with Chuck and Beth in Ann Arbor during my visit to the United States, but I remember that trip especially fondly because, a few days before I went to Michigan, I got married in Las Vegas and spent our honeymoon in New York.

Once back in Lleida, I insisted several times to the Campus project architects on changing the standard for registering and launching apps to IMS-TI. Although the people leading the Campus project loved the idea, they already had the architecture they wanted to use deep in mind, so we went with the original plan.

Several of the partner universities in the project created tools for that system, and UdL picked up the responsibility of creating the OSID implementations for Sakai, as well as a tool to register and launch remote tools within Sakai as if they were its own. Although implementing OSID was very tedious, it allowed me to gain a fairly deep knowledge of all the systems that later became the Sakai Kernel. Unfortunately, the Campus project ended up unused, but in parallel IMS-LTI could end up winning.

In April 2008, taking advantage of a visit by Dr. Chuck to Barcelona to attend a conference organized by the Ramon Llull University, we had the first meeting of Spanish universities that ran, or were thinking of running, Sakai.

I went with the new director of IT services of UdL, Carles Fornós. That was the first time I saw Sakaigress, Sakai’s furry pink mascot; Dr. Chuck was carrying her. I explained to my boss that these teddies were given as a reward for participation in the community, and the first thing he told me was, “we have to get one.” During the meeting, the representatives of the two universities that already ran Sakai, UPV and ourselves, explained a bit of the experience we had with Sakai and resolved the doubts raised by the other universities. At the end of the meeting, to everyone’s surprise, Dr. Chuck gave the Sakaigress to us (UdL). He did it for two reasons, which he told me later: first, because we had been working hard in the community on internationalization and helping to promote standards like IMS-TI with our work on the Campus project implementation; and second, to silence some voices of doubt in our university’s environment about choosing Sakai instead of Moodle, wanting to reaffirm the community’s commitment to our university.

Sakaigress

During that meeting the idea of holding the first Sakai workshop also came up: a way to show people how to install Sakai, build tools and discuss the platform. When my boss heard it he whispered to me that we should volunteer to organize it, so I offered to do so.

At that meeting I also met the man in charge of implementing Sakai at the Valencian International University (VIU). We had talked with him and his technical staff by mail some days before about the OKI OSID implementation; they were very interested in this use case. Not even a month later, the team that prepared the specifications for the implementation of Sakai at the VIU came to Lleida to visit us. Beforehand, I had tried to convince Carles Fornós to offer our services to the VIU: customizing Sakai for another university would have been very simple for us, and it was an opportunity to bring UdL more funds to keep developers. Carles did not think it a good idea, so I did not even make the offer.
Moreover, when UdL declined to offer services as an institution, I considered doing it at a personal level with the help of some co-workers. At first it seemed like a good idea to the people responsible for the VIU’s technical office, but when the moment arrived to go ahead with the collaboration, the UdL main direction board showed their disapproval (not a prohibition), which made us pull back given the risk of losing our jobs at UdL if anything went wrong. In the end, the work was done by Pentec-Setival (Samoo). They did a great job. Perhaps it was the best outcome for the Spanish Sakai community, because we gained a commercial provider supporting Sakai.

In June 2008 we held the first Sakai workshop. It was a very pleasant experience, where colleagues from UPV, Raul Mengod and David Roldan, along with some staff from UdL's Institute of Education Science (ICE), helped me give talks to other universities that were evaluating Sakai as their LMS.

Soon after, in February 2009, the second Spanish Sakai event was organized in Santiago de Compostela. There, the S2U group was consolidated. By then, UPNA was about to go into production, migrating the contents of its old WebCT LMS. In that meeting I showed how to develop tools in Sakai. At UdL we had upgraded to 2.5, and we also shared opinions: we had suffered a lot of performance issues and crashes with 2.4, but 2.5 seemed to improve things a lot.

Days after that event, UPV invited us to attend a presentation and a meeting with Michael Korcuska, the Sakai Foundation's executive director at the time. In Valencia I saw a preview of Sakai 3 for the first time. It was sold as the new version that would replace Sakai 2; he said the community would perhaps release a 2.7 version but not a 2.8. It was expected for 2010.

Truth be told, I loved it, and I spent much time tinkering with and learning the new technologies behind Sakai 3. I went to the workshops offered at the 2009 conference in Boston, and the truth is that everything suggested the community supported the plan to move to Sakai 3 – or at least it seemed that way to me.

At the 3rd S2U congress in November 2009, I gave a presentation on the benefits and the technology behind Sakai 3, to make people aware of the new road the LMS was facing. Unfortunately, we all know what the real road has been: Sakai 3 passed slowly from “being the replacement” to “something complementary”, and finally to “something totally different”.

We did some proofs of concept with a hybrid system combining Sakai CLE and OAE, Bedework, BBB and Kaltura. The PoC was quite promising, but the shift in architecture, after the poor results obtained with the chosen technology stack, frustrated our plans. OAE currently continues with another stack, but far from the idea we had in mind at first.

By then we owned a big number of tools developed for Sakai with JSF and Spring-Hibernate. For us, that was a problem for the expected future migration between platforms 2 and 3. In late 2009 and early 2010 we started developing our own JS + REST framework on top of Sakai, to have tools implemented in a more neutral manner that would allow us to move between platforms in a less traumatic way. Thanks to all that I learned from the Sakai OAE technologies, I designed what is now our tool development framework for Sakai: DataCollector. It is a framework that allows us to link to multiple sources and types of data and display them as JS apps inside Sakai. It uses Sakai realms as the permission mechanism and lets us create big apps based on templates.
Gradually we have been replacing all the tools created in JSF (poorly maintainable) with tools based on our framework. Although in the end we did not move to the OAE platform, the framework has given us a set of apps far more flexible and maintainable than those written in JSF.

In July 2010 we upgraded to version 2.7. We were still hoping to soon see Sakai OAE as part of our Virtual Campus ecosystem; everything seemed to fit pretty well. At the end of that month my first son was born, and I took a long paternity leave. Although I was not working at UdL during that time, I wanted to attend the IV Spanish Sakai congress in Barcelona in November to show all the work done with the DataCollector. I went with my wife and him, the youngest member of the S2U.

In June 2011 we had another meeting in Madrid, organized to show all the S2U members how to coordinate and use JIRA in a better way, to help our contributions get incorporated into the Sakai trunk. Some time before, we had made an arrangement to implement some functionalities together, and it was proving difficult to get them into the main code. Some universities paid Samoo to get them implemented, but UM and UdL preferred to implement them ourselves. But what I really enjoyed in that meeting was seeing how UM had implemented Hudson in their CI process. I loved the idea, and my task in the following months was to refactor all our processes and automate builds, deployments and tests with Jenkins and Selenium.

Looking backward, I see that during the years 2010 to 2012 our involvement with the S2U and the whole Sakai community dropped considerably. I guess our eyes were on the shift to the new environment, and we concentrated our efforts on having the DataCollector framework as developed as possible, in order to have a valid exit path for all those tools developed since 2004. In addition, the S2U objectives were not in line with what we needed at that moment: the S2U approach focused on internationalization. As I see it, that was a mistake, because there was already a part of the community focused on that, and the S2U should not have focused only on those issues.

In July 2013 we did the sixth and, so far, last upgrade. In the upgrade process to 2.9 we took the chance to spend some time migrating from our script-based user provisioning system to an implementation of Course Management. Mireia Calzada did an excellent job preparing ETLs and helping to build an implementation based on Hibernate.

We took that opportunity to open up the functionality to allow teachers and students to create their own sites to work together: now they have storage space for their own projects, communication tools, etc. That gave us very good results, because people find the virtual campus more useful than in previous years. We also allowed teachers to invite external people and organize their sites as they want. Many of the complaints we had had about the Sakai platform weren't about features not supported by Sakai, but were due to the restrictions imposed by us.

The tasks related to that upgrade allowed me to reconnect with the community: reporting and resolving bugs, participating in QA, and contributing what my colleagues and I have translated into Catalan.

During 2013 I also ventured into a personal project related to Sakai. Together with Juanjo Meroño from Murcia, I created a feature that allows webcam video streaming using Sakai's portal chat. A desire to contribute something personal to free software, and especially to Sakai, motivated me to take on this project. It was a pretty nice experience to work with the community again. The help of Neal Caidin and Adrian Fish was key to integrating it into the main Sakai code.

In November 2013, Juanjo and I presented that functionality at the VI Congress of Sakai in Madrid. The important thing about that congress was that the whole S2U recovered its synergy; I'm convinced that the University of Murcia staff were the key to inspiring the rest of us. If you are interested, you can read my opinion of the event in a previous blog post. Now we have weekly meetings and work as a team, and resources flow gently between the needs of group members. It goes pretty well.

Now I feel again that I'm part of the Sakai community and S2U. I guess that working closely with its members has allowed me to believe that Sakai has a bit of me in it. I'm waiting to hear when the next S2U meeting will be held, and maybe I'll go with my second son, born this August.

And that is a brief summary of how I remember that history; maybe something was different, or happened at a different time. I just want to say thanks to UdL, the Sakai project, and the S2U members for making this experience so amazing.







by Anonymous (noreply@blogger.com) at September 13, 2014 08:18 AM

September 09, 2014

Sakai@UD

New Terminology in Forums

In Sakai 2.9.3, the Forums tool uses different terminology to refer to what used to be called a Thread: it is now called a Conversation. The following diagram presents graphically the new hierarchy of the terms now used in the Forums tool. Once in a topic, you can either start a top-level conversation or […]

by Mathieu Plourde at September 09, 2014 03:27 PM