Planet Sakai

September 16, 2014

Adam Marshall

Major New WebLearn release: WebLearn 10

We are pleased to announce that WebLearn has undergone a significant face-lift with the move to WebLearn 10 (2.10-ox1). The upgrade took place on Tuesday 16th September 2014; we apologise for any disruption that the extended downtime may have caused.


As well as a new look, this release brings a wealth of new features including:

  • Brand new modern, mobile-friendly navigation with the ‘Sites Drawer’ offering quick links to individual tools within a site
  • Peer marking in Assignments tool
  • Brand new Contact Us tool (as recommended by the WebLearn Student Experience study)
  • Sequencing of content and other activities with the new Lessons tool
  • Customisable branding for a department (available upon request, see below)
  • Drag-and-dropping of files into Resources through your web browser
  • Allowing students to submit group Assignments
  • A new Syllabus tool – edit in place, calendar integration, bulk changes
  • Updated Markbook: assign extra credit; new drop highest, drop lowest, and keep highest grade options; hide/show columns in the All Grades page and PDF export
  • Add Mathematical notation to web pages
  • Improved Forums tool offering statistics and grading; an option to require users to post before they can read existing posts; word count for messages; and improved permission settings layout
  • New Web Content and Home tools (see below for important information)
  • Support for IMS Learning Tools Interoperability 2.0 – this allows data to be sent back to WebLearn from the 3rd party tool
  • The ability to locate and link to Forum posts and Assignments within the WYSIWYG editor (via Browse Server)
  • Language preference selectable at site level
  • Support for IMS Common Cartridge import (through the Lessons tool)
  • Brand new Site Members tool (aka Roster)
  • Improved Reading Lists: items can be reordered, list items can be files in Resources, and many import problems have been fixed
  • Updated WYSIWYG HTML editor
  • Users can record audio in Resources
  • Add movies to HTML pages in Resources
  • Broadcast files to all Drop Boxes
  • Tutorial for first-time users
  • Question pools copied during site duplication in Tests tool
  • Introduction of AntiSamy HTML code filtering to enhance security (on all tools except Resources) – you will see a warning if HTML is removed
  • Plus many, many more enhancements

This new version has been subjected to months of testing but, as with other similarly complex systems, there are bound to be some teething problems – issues that our testing did not reveal. Should you encounter any oddities, we would be very grateful if you could report them to the central team (using the new ‘Contact Us’ tool), ideally with screenshots. We will attempt to address any outstanding issues over the next few weeks.

The next few sections outline some of the new features and present a handful of known issues that we simply didn’t have time to address.

New Navigation

There is a new ‘breadcrumb trail’, and navigation to sites is now performed via the “Sites Drawer”.


To access the “Sites Drawer”, click on My Sites and then click on the symbol to the right of the target site to reveal and jump to individual tools present on the site.


New Web Content and Home tools

The Home and Web Content tools no longer use “iFrames”. This has resulted in a change in behaviour (see below), and there is now an icon for configuring the ‘properties’ of the tool, where there used to be a button labelled ‘Options’. (This is the first step towards removing iFrames from the system.)


On the Home page, you may also see a red warning rectangle at the top right of the screen explaining that some HTML tags have been removed from the page. This warning comes from AntiSamy, a new library that filters out potentially dangerous HTML tags. AntiSamy filtering is not applied to files in Resources, so the procedure detailed below will fix this issue as well.
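For the curious, the filtering works roughly like this (a minimal sketch using the OWASP AntiSamy library; the policy file name and sample markup are illustrative, not WebLearn’s actual configuration):

    import org.owasp.validator.html.AntiSamy;
    import org.owasp.validator.html.CleanResults;
    import org.owasp.validator.html.Policy;

    public class AntiSamyDemo {
        public static void main(String[] args) throws Exception {
            // The policy file whitelists the tags and attributes allowed to survive.
            Policy policy = Policy.getInstance("antisamy-policy.xml");
            String dirty = "<p onclick=\"steal()\">Hello</p><script>alert(1)</script>";

            // scan() strips anything the policy does not allow and records what was removed.
            CleanResults results = new AntiSamy().scan(dirty, policy);
            System.out.println(results.getCleanHTML());     // e.g. "<p>Hello</p>"
            System.out.println(results.getErrorMessages()); // explains the removals
        }
    }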


Change in Behaviour – Important

If the text for a Home Page has been entered via the WYSIWYG editor (rather than linking to a page in Resources), then all links will open in a new tab, which is likely to alter the way in which a site’s home page works.

If this causes a problem, then follow this three-step ‘recipe’:

  1. edit the Home Tool, copy all the text from within the editor panel, then click “Cancel” (do not just copy the text directly from the home page – you must open the editor)
  2. paste this text into a new HTML page in Resources and copy the URL of this page; you may want to place this page in a special folder
  3. edit the Home Tool again, paste the URL into the box marked “Site Info URL” (underneath the editor panel) and click “Save”

After thoroughly checking that the (new) home page works, it may be a good idea to remove the text from the editor window to avoid future confusion.

You do not need to do this if your ‘home page’ is currently a page in Resources.

We apologise for the inconvenience that this may cause.

Customisable Branding

It is now possible for departments to have their own top banner on all their pages. The following aspects can be configured:

  • ‘backgroundColour’ – colour of main banner (specified as #rrggbb colour)
  • ‘backgroundImage’ – URL of the background image of the main banner (will tile) – store this in a public folder in the department’s administration site
  • ‘imageSource’ – URL of the image at the right end of the main banner – store this in a public folder in the department’s administration site
  • ‘imageLink’ – URL of a website that the image at the right end of the banner links to
  • ‘message’ – the title; if this is not set, it will always be the name of the current site
  • ‘fontColour’ – the colour of the font of the ‘message’ (specified as #rrggbb colour)

This customisation must be done by the central WebLearn team. If you require this service, please send an email listing the values for the above attributes; ensure that you have stored any images in a public folder in your administration site and include their URLs in the email.
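As an illustration, the body of such an email might simply list the attribute values along the lines of the sketch below (all values and URLs are invented examples; the exact format does not matter, as the central team applies the settings):

    backgroundColour: #002147
    backgroundImage:  https://weblearn.ox.ac.uk/access/content/group/example-admin/public/banner-tile.png
    imageSource:      https://weblearn.ox.ac.uk/access/content/group/example-admin/public/logo.png
    imageLink:        https://www.example.ox.ac.uk/
    message:          Department of Examples
    fontColour:       #ffffff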

Examples of Custom Branding

The Saïd Business School: this has the ‘message’ and ‘imageSource’ attributes set


A site for ice cream lovers: this has the ‘message’, ‘backgroundImage’ and ‘fontColour’ attributes set; the ‘imageSource’ attribute has been blanked out.


Known Issues

Unfortunately, as with any major release, there are a handful of known issues that we simply didn’t have time to address. We will attempt to address these issues over the next few weeks.

  • No WYSIWYG editor in Surveys beta when using Internet Explorer
  • Email attachments are not indexed by ‘Search’
  • Some information is duplicated in the pop-up help
  • The Padlock “lock tool” facility is equivalent to the ‘light bulb’ feature, i.e., it only hides the tool
  • The Site Members tool (mistakenly referred to as ‘Roster’ in Site Info) does not display members if a site contains one or more Participant Groups
  • The ‘Search Library Catalogue’ pop-up is blocked by most browsers and must be specifically allowed to open (there is nothing we can do about this)
  • The ‘Search Library Catalogue’ facility will not work in Internet Explorer 10 and higher
  • The ‘Sakai Resource Picker’ within a reading list appears to fail in certain browsers, but in actuality the resource is added to the reading list just as it should be


by Adam Marshall at September 16, 2014 08:57 AM

September 15, 2014

Michael Feldstein

LMS and Open: The false binary is based on past, not future markets

D’Arcy Norman has an excellent blog post up titled “On the false binary of LMS vs. Open” that captures a false framing issue.

We’re pushed into a false binary position – either you’re on the side of the evil LMS, working to destroy all that is beautiful and good, or you’re on the side of openness, love, and awesomeness. Choose. There is no possible way to teach (or learn) effectively in an LMS! It is EVIL and must be rooted out before it sinks its rotting tendrils into the unsuspecting students who are completely and utterly defenseless against its unnatural power!

While D’Arcy is a proponent of open tools, he rightly calls out the need to understand institutional responsibilities.

But. We can’t just abdicate the responsibility of the institution to provide the facilities that are needed to support the activities of the instructors and students. That doesn’t mean just “hey – there’s the internet. go to it.” It means providing ways for students to register in courses. For their enrolment to be automatically processed to provision access to resources (physical classrooms, online environments, libraries, etc…). For students’ grades and records to be automatically pushed back into the Registrar’s database so they can get credit for completing the course. For integration with library systems, to grant access to online reserve reading materials and other resources needed as part of the course.

This is an important point: the institutional LMS is important and will not, and should not, go away anytime soon. I have pointed out recently that the LMS is one of the very few technologies now used in a majority of courses within an institution, and the institutional responsibility described above helps to explain why.

In our consulting work Michael and I often help survey institutions to discover what technologies are being used within courses, and typically the only technologies that are used by a majority of faculty members or in a majority of courses are the following:

  • AV presentation in the classroom;
  • PowerPoint usage in the classroom (obviously connected with the projectors);
  • Learning Management Systems (LMS);
  • Digital content at lower level than a full textbook (through open Internet, library, publishers, other faculty, or OER); and
  • File sharing applications.

At the same time, the LMS does a very poor job of providing many of the learning technologies desired by faculty and students. There is no way that a monolithic LMS can keep up with the market – it cannot match the functionality of open internet tools, especially without adding feature bloat.

I would add that part of the cause of the “false binary position” that D’Arcy points out is that much of the public commentary focuses on where the LMS has been rather than where it is going. There is a significant movement based on interoperability that is leading, perhaps painfully and slowly, to a world where the LMS can coexist with open educational tools, with even end users (faculty and students) eventually having the ability to select their tools that can share rosters and data with the institutional LMS.

Below is a modified presentation I gave at the Apereo Mexico conference in the spring (with a few changes to explain slides without audio). The key point is that there are subtle changes to the LMS market that are significant, and the coexistence of the LMS with open tools will be central to the market’s future.

Will all LMS vendors move in this direction? In marketing, yes, but in reality, no. There are different approaches to this coexistence issue from the LMS vendors, ranging from lip service to outright support, and several points in between. But the overall trend is clearly moving in this direction, even if some solutions lose out over time.

Download (PDF, 4.21MB)

The post LMS and Open: The false binary is based on past, not future markets appeared first on e-Literate.

by Phil Hill at September 15, 2014 06:46 PM

Sakai@UD

Changing Your Display Name in Sakai@UD

If you are a student and if your name in Sakai (or in other campus systems) is not what you want it to be, you can change it in UDSIS. more >

by Mathieu Plourde at September 15, 2014 04:23 PM

September 13, 2014

Alex Balleste

My history with Sakai

Tomorrow, September 13, is the 10th anniversary of Sakai at UdL. We went into production with Sakai 1.0 rc2 at the University of Lleida in 2004. Quite an achievement, and an adventure that has lasted 10 years and hopefully will last many more. Perhaps it was a little rushed, but luckily it worked out fine.

I will not tell the whole history of the UdL and Sakai; I'll tell you what I know and feel about my own history with Sakai, which is directly tied to the UdL. To get the full UdL version, we would need many people's points of view.

So I will start before Sakai. We have to go back a few months earlier: in January 2004 I applied in an open competition for a temporary position at UdL, on a project to provide the University with an open source LMS. The tests were based on knowledge of programming with Java servlets and JSP, and knowledge of eLearning. The IT service management was looking for a Java developer profile, as they were evaluating the CourseWork platform from Stanford; they wanted developers to make improvements and adapt it to the UdL's needs. At that time, UdL ran WebCT and wanted to replace it with an open source system, in the context of a free software migration across the whole University.

I had coded a little Java for my final degree project, but I didn't know anything about servlets or JSP, so I bought a J2SE book, studied for some days, and took the test along with many others who wanted that position. I passed the tests, and I was lucky to win a programmer position on the “Virtual Campus” team with two other guys. David Barroso was already the team's analyst programmer, which meant he was my direct boss (a really good one).

We ran a pilot with a few subjects from the Computer Science degree in CourseWork, and it seemed to adapt well to our needs. We were also looking closely at the CHEF LMS. When the founding universities of Sakai announced that they were joining forces to create an LMS based on the work of those systems, the decision was made.

When Sakai started it lacked many features that we thought necessary, like a gradebook and robust tools for assignments and assessment, but it still seemed a platform with great potential. It had substantial funding and the support of some of the best universities in the world, and that was enough for us to get into the project. UdL's intention with Sakai was to go beyond the capabilities of an LMS and use it as a virtual space for the whole university: to eventually provide a set of community sites, and to use it for our intranet as well as a development framework for our applications.

So we started working on it, translating the Sakai interface into Catalan and applying the institutional image. We created sites for the subjects taught at the Escola Politècnica Superior of the UdL. On September 13, 2004 the platform went into production.

Sakai 1.0rc2 translated into Catalan and customized for UdL

During the process, we realized that re-translating the whole platform for each version would be very expensive, and internationalization did not appear to be one of the community's imminent efforts, so the IT service manager Carles Mateu and David Barroso decided to offer our support to internationalize Sakai. The idea was to provide a mechanism to translate Sakai easily, without having to modify the source code every time a new version of Sakai was released. It was an essential feature for us, and it had to be done if we were to continue with the Sakai project. David contacted the Sakai project chief director, Dr. Charles Severance, and offered our help to internationalize the whole of Sakai.

Chuck was glad about our offer and the work started soon. Beth Kirschner was the person in charge of managing our work and syncing it with the Sakai code, and I was lucky to have the responsibility of managing the task on our side. The first thing I did was a proof of concept (PoC) with one tool: I extracted all the strings of a Velocity (VM) tool to a properties file, which was then loaded with Java Properties objects. The PoC worked well, but Beth encouraged me to use ResourceBundles instead of the simple Properties class. I wrote another PoC that way and it worked great. From that point began the tedious task of going through all the code to do this; the result was “tlang” and “rb” objects everywhere. That took three people between two and three months, and we used the same process to write the Catalan translation. We used a Forge instance installed at UdL to synchronize these efforts: we implemented the changes there for Sakai 1.5, and when a tool was completely internationalized I notified Beth so she could apply the changes to the main Sakai branch.
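In outline, the pattern looks like the following (a minimal sketch; the bundle name, key and strings are illustrative, not the actual Sakai code):

    import java.util.Locale;
    import java.util.ResourceBundle;

    public class ToolMessages {
        public static void main(String[] args) {
            // UI strings move out of the source into Messages_en.properties
            // ("welcome = Welcome") and Messages_ca.properties ("welcome = Benvingut").
            // ResourceBundle picks the right file for the locale at runtime, so a
            // new translation means adding a properties file, not editing the code.
            ResourceBundle rb = ResourceBundle.getBundle("Messages", new Locale("ca"));
            System.out.println(rb.getString("welcome"));
        }
    }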

Although we worked on 1.5, the i18n changes were released in Sakai 2.0. For us it was a success, because it ensured that we could keep using the platform for longer. When version 2.0 came out we upgraded from our 1.0rc2. Only one word comes to mind when I remember that upgrade: PAIN. We had very little documentation, and we had to dig through the code for every error we found. We had to make a preliminary migration to 1.5, running scripts and processes at Sakai startup, and then upgrade to 2.0. The migration process failed on all sides, but with a lot of effort we finally got through it.

Once we had the platform upgraded, we started to organize our virtual campus as a university-wide LMS and intranet, creating sites for specific areas and services and giving people access depending on the profile they had in LDAP. We also created sites for the rest of the degrees of our University.

From that moment our relationship with Sakai was not so hard; everything went better. The next version we ran was 2.2, which we upgraded to in 2006. By then we had been granted a Mellon Foundation award for our internationalization effort in Sakai. It is one of the things in my career that I am proudest of, but it was embittered because the prize was never actually claimed – something I did not find out until a couple of years later. The award money had to be spent developing something interesting related to education, so to receive it a project proposal was needed detailing how we would spend the $50K. UdL's idea was to create a system for translating Sakai's string bundles easily, as some tools already did back then (poedit, ...). The IT service direction thought it better that the project not be done by the same team that customized and internationalized Sakai at UdL (I guess they had other priorities in mind for us), but by people from a Computer Science research group of the UdL. I do not know why they never made the proposal or did the project to claim the award money, but nowadays I no longer mind. [Some light here ... see the comments]

Around that time our team started working on a new project involving a large number of Catalan universities: the Campus Project. It initially began as a proposal to create an open source LMS from scratch to be used by all of them, led by the Open University of Catalonia (UOC). The UdL IT service direction board and David Barroso disagreed with spending the 2M€ to finance such a project when open source LMSs like Moodle and Sakai already existed and the money could be invested in them. The project changed direction towards something involving the existing LMSs: the decision was to create a set of tools that would use an OKI OSID middleware implemented for both Moodle and Sakai. Although running external tools in the context of an LMS using standards, with a WS bus to interact with the LMS APIs, was a good idea, I didn't like the plan to use a double-level OKI OSID layer to interact with both LMS APIs. I thought that was too complex and hard to maintain.

OKI BUS

We upgraded Sakai again in 2007, to version 2.4 (a release that gave us a lot of headaches). I also won the analyst programmer position on the Virtual Campus team that David Barroso vacated when he won the internal projects manager position. The selection process left me quite exhausted: long rounds of tests, drawn out over time, and competition with nearly 30 colleagues made it hard work to get the best grades and win the position. Around then the IT service direction board, Carles Mateu and Cesar Fernandez, resigned over disagreements with the main university direction board about how to carry out the free software migration at the UdL. It was a shame, because since then we have seen a strong rollback of free software policies, and the situation of the entire IT service has worsened.

In September of that year, after the competition had finished and I had been chosen, I went to spend a couple of weeks at the University of Michigan. My mission there was to work on the IMS-TI protocol with Dr. Chuck, to see if we could use this standard as part of the Campus Project. Those two weeks were very helpful: we built several examples implementing IMS-TI with OSID. I had a good time with Chuck and Beth in Ann Arbor during my visit to the United States, but I remember that trip especially fondly because, a few days before going to Michigan, I got married in Las Vegas, and we spent our honeymoon in New York.

Once back in Lleida, I insisted several times to the Campus Project architects on switching the standard for registering and launching apps to IMS-TI. Although the people leading the Campus Project loved the idea, they already had the architecture they wanted firmly in mind, so we went with the original plan.

Several of the partner universities in the project created tools for that system, and the UdL took on the responsibility of creating the OSID implementations for Sakai, as well as a tool to register and launch remote tools within Sakai as if they were its own. Although implementing OSID was very tedious, it gave me a fairly deep knowledge of all the systems that later became the Sakai Kernel. Unfortunately, the Campus Project ended up unused, but in parallel IMS-LTI would end up winning.

In April 2008, taking advantage of a visit by Dr. Chuck to Barcelona to attend a conference organized by the Ramon Llull University, we held the first meeting of Spanish universities that were running, or thinking of running, Sakai.

I went with the new director of IT services of the UdL, Carles Fornós. That was the first time I saw Sakaigress, Sakai's furry pink mascot; Dr. Chuck was carrying her. I explained to my boss that these teddies were given as a reward for participation in the community, and the first thing he told me was, "we have to get one." During the meeting, the representatives of the two universities already running Sakai, UPV and ourselves, explained a bit about our experience with Sakai and resolved the doubts raised by the other universities. At the end of the meeting, to everyone's surprise, Dr. Chuck presented the Sakaigress to us (UdL). He did it for two reasons, which he told me later: first, because we had been working hard in the community on internationalization and on promoting standards like IMS-TI through our work on the Campus Project implementation; and second, to silence some voices of doubt within our university about choosing Sakai instead of Moodle, reaffirming the community's commitment to our University.

Sakaigress

During that meeting also came the idea of holding the first Sakai workshop: a way to show people how to install Sakai, build tools, and discuss the platform. When my boss heard it, he whispered to me that we should volunteer to organize it, so I offered to do so.

At that meeting I also met the man in charge of implementing Sakai at the Valencian International University (VIU). We had talked with him and his technical staff about the OKI OSID implementation by mail some days before, and they were very interested in this use case. Not even a month later, the team preparing the specifications for the VIU's Sakai implementation came to Lleida to visit us. Beforehand, I had tried to convince Carles Fornós to offer our services to the VIU: customizing Sakai for another university would have been very simple for us, and it was an opportunity to bring the UdL more funds to keep developers. Carles did not think it a good idea, so I never even made the offer.
When the UdL declined to offer services as an institution, I considered doing it personally with the help of some co-workers. At first the people responsible for the VIU's technical office liked the idea, but when the moment arrived to go ahead with the collaboration, the UdL main direction board showed their disapproval (though no outright prohibition), which made us pull back, given the risk of losing our jobs at the UdL if anything went wrong. In the end the work was done by Pentec-Setival (Samoo), who did a great job. Perhaps that was the best outcome for the Spanish Sakai community, because it gave us a commercial provider supporting Sakai.

In June 2008 we held the first Sakai workshop. It was a very pleasant experience, in which the colleagues from UPV, Raul Mengod and David Roldan, along with some staff from the Institute of Education Science of the UdL (ICE), helped me give talks to other universities that were evaluating Sakai as their LMS.

Soon after, in February 2009, the second Sakai event was organized in Santiago de Compostela. There the S2U group was consolidated. By then, UPNA was about to go into production, migrating the contents of its old LMS, WebCT. At that meeting I showed how to develop tools in Sakai. At UdL we had upgraded to 2.5, and we shared opinions as well: we had suffered a lot from performance issues and crashes with 2.4, but 2.5 seemed to improve things considerably.

Days after that event, UPV invited us to attend a presentation and a meeting with Michael Korcuska, then executive director of the Sakai Foundation. In Valencia I saw the preview of Sakai 3 for the first time. It was sold as the new version that would replace Sakai 2; he said that the community would perhaps release a 2.7 version but not a 2.8, and that it was expected for 2010.

Truth be told, I loved it, and I spent much time tinkering with and learning the new technologies behind Sakai 3. I went to the workshops offered at the 2009 conference in Boston, and everything pointed to the community supporting the plan to move to Sakai 3 – or at least so it seemed to me.

At the third S2U congress in November 2009, I gave a presentation on the benefits and the technology behind Sakai 3, to make people aware of the new road facing the LMS. Unfortunately, we all know what the real path turned out to be: Sakai 3 slowly went from “being the replacement” to “something complementary” and finally to “something totally different”.

We did some proofs of concept with a hybrid system combining Sakai CLE and OAE, Bedework, BBB and Kaltura. The PoC was quite promising, but the shift in architecture, after the poor results obtained with the chosen technology stack, frustrated our plans. OAE currently continues with another stack, but far from the idea we had in mind at first.

By then we owned a large number of tools developed for Sakai with JSF and Spring-Hibernate. For us this was a problem for the expected migration between platforms 2 and 3. In late 2009 and early 2010 we started developing our own JS + REST framework on top of Sakai, to have tools implemented in a more neutral manner that would let us move between platforms in a less traumatic way. Thanks to everything I learned from the Sakai OAE technologies, I designed what is now our tool development framework for Sakai, DataCollector. It is a framework that allows us to link to multiple types of data sources and display them as JS apps inside Sakai; it uses Sakai realms as its permission mechanism and lets us create large apps based on templates.
Gradually we have been replacing all the tools created in JSF (poorly maintainable) with ones based on our framework. Although in the end we did not move to the OAE platform, the framework has given us a set of apps that are more flexible and maintainable than those written in JSF.

In July 2010 we upgraded to version 2.7. We were still hoping to soon see Sakai OAE as part of our Virtual Campus ecosystem; everything seemed to fit pretty well. At the end of the month my first son was born, and I took a long paternity leave. I was not working at the UdL at the time, but I wanted to attend the IV Spanish Sakai congress in Barcelona in November to show all the work done with the DataCollector. I went with my wife and my son, the youngest member of the S2U.

In June 2011 we had another meeting in Madrid, organized to show all the S2U members how to coordinate and use JIRA better, so that our contributions would be incorporated into the Sakai trunk. Some time before, we had made an arrangement to implement some functionality together, and it had been difficult to get it into the main code; some universities paid Samoo to implement it, but UM and UdL preferred to implement it ourselves. What I really enjoyed at that meeting, though, was seeing how UM had introduced Hudson into their CI process. I loved the idea, and my task over the following months was to refactor all our processes and automate builds, deployments and tests with Jenkins and Selenium.

Looking back, I see that from 2010 to 2012 our involvement with the S2U and the whole Sakai community dropped considerably. I guess our eyes were on the shift to the new environment. We concentrated our efforts on developing the DataCollector framework as far as possible, in order to have a valid exit path for all the tools developed since 2004. In addition, the S2U objectives were not in line with ours at that moment: the S2U approach focused on internationalization, which I believe was a mistake, because part of the community was already focused on that, and the S2U should not have concentrated only on those issues.

In July 2013 we did our sixth and, so far, last upgrade. In the upgrade to 2.9 we took the chance to spend some time migrating from our script-based user provisioning system to an implementation of Course Management. Mireia Calzada did an excellent job preparing the ETLs and helping to build an implementation based on Hibernate.

We took that opportunity to open up the functionality for teachers and students to create their own sites and work together: now they have storage space for their own projects, communication tools, etc. That gave us very good results, because people find the virtual campus more useful than in previous years. We also allowed teachers to invite external people and to organize their sites as they want. Many of the complaints we had received about the Sakai platform were not about features Sakai did not support, but about restrictions we ourselves had imposed.

The tasks around that upgrade allowed me to reconnect with the community: collaborating by reporting and resolving bugs, participating in QA, and contributing what my colleagues and I had translated into Catalan.

During 2013 I also ventured into a personal project related to Sakai: together with Juanjo Meroño from Murcia, I created a feature that allows webcam streaming in Sakai's portal chat. The desire to contribute something personal to free software, and especially to Sakai, motivated me to do this project. It was a really nice experience working with the community again; the help of Neal Caidin and Adrian Fish was key to getting it integrated into the main Sakai code.

In November 2013, Juanjo and I presented that functionality at the VI Sakai Congress in Madrid. The important thing about that congress was that the whole S2U recovered its synergy; I'm convinced the University of Murcia staff were the key to inspiring the rest of us. If you are interested, you can read my opinion of the event in a previous blog post. Now we have weekly meetings and work as a team, resources flow smoothly according to the needs of group members, and it is going pretty well.

Now I feel again that I am part of the Sakai community and the S2U. I guess that working closely with its members has let me believe that Sakai has a bit of me in it. I am waiting to see when the next S2U meeting will be held, and maybe I will go with my second son, born this August.

And that is a brief summary of how I remember that history; maybe some things were different, or happened at a different time. I just want to say thanks to the UdL, the Sakai project, and the S2U members for making this experience so amazing.

by Alex Ballesté (noreply@blogger.com) at September 13, 2014 08:18 AM

September 11, 2014

Michael Feldstein

Pearson’s Efficacy Listening Tour

Back around New Year, Michael wrote a post examining Pearson’s efficacy initiative and calling on the company to engage in active discussions with various communities within higher education about defining “efficacy” with educators rather than for educators. It turns out that post got a fair bit of attention within the company. It was circulated in a company-wide email from CEO John Fallon, and the blog post and all the comments were required reading for portions of the company leadership. After a series of discussions with the company, we, through our consulting company, have been hired by Pearson to facilitate a few of these conversations. We also asked for and received permission to blog about them. Since this is an exception to our rule that we don’t blog about our paid engagements, we want to tell you a little more about the engagement, our rationale for blogging about it, and the ground rules.

The project itself is fairly straightforward. We’re facilitating conversations with a few different groups of educators in different contexts. The focus of each conversation is how they define and measure educational effectiveness in their respective contexts. There will be some discussion of Pearson’s efficacy efforts at a high level, but mainly for the purpose of trying to map what the educators are telling us about their practices to how Pearson is thinking about efficacy in the current iteration of their approach. After doing a few of these, we’ll bring together the participants along with other educators in a culminating event. At this meeting, the participants will hear a summary of the lessons learned from the earlier conversations, learn a bit more about Pearson’s efficacy work, and then break up into mixed discussion groups to provide more feedback on how to move the efficacy conversation forward and how Pearson’s own efforts can be improved to make them maximally useful to educators.

Since both e-Literate readers and Pearson seemed to get a lot of value from our original post on the topic, we believe there would be value in sharing some of the ongoing conversation here as well. So we asked for and received permission from Pearson to blog about it. Here are the ground rules:

  • We are not getting paid to blog and are under no obligation to blog.
  • Our blog posts do not require prior editorial review by Pearson.
  • Discussions with Pearson during the engagement are considered fair game for blogging unless they are explicitly flagged as otherwise.
  • On the other hand, we will ask Pearson customers for approval prior to writing about their own campus initiatives (and, in fact, will extend that courtesy to all academic participants).

The main focus of these posts, like the engagement itself, is likely to be on how the notion of efficacy resonates (or doesn’t) with various academic communities in various contexts. Defining and measuring the effectiveness of educational experiences—when measurement is possible and sensible—is a subject with much broader applications than Pearson’s product development, which is why we are making an exception to our blogging recusal policy for our consulting engagements and why we appreciate Pearson giving us a free hand to write about what we learn.

The post Pearson’s Efficacy Listening Tour appeared first on e-Literate.

by Michael Feldstein at September 11, 2014 08:06 PM

GAO Report: Yes, student debt is growing problem

In case anyone needed additional information to counter the Brookings-fed meme that “Americans who borrowed to finance their education are no worse off today than they were a generation ago”, the U.S. Government Accountability Office (GAO) released a report yesterday with some significant findings. As reported at Inside Higher Ed by Michael Stratford:

More than 700,000 households headed by Americans 65 or older now carry student debt, according to a report released Wednesday by the U.S. Government Accountability Office. And the amount of debt owed by borrowers 65 and older jumped from $2.8 billion in 2005 to $18.2 billion last year. [snip]

Between 2004 and 2010, for instance, the number of households headed by individuals 65 to 74 with student loan debt more than quadrupled, going from 1 percent to 4 percent of all such families. During that same period, the rate of borrowing among Americans under 44 years old increased between 40 and 80 percent, even though borrowing among that age group is far more prevalent than it is among senior citizens.

I have been highly critical of the Brookings Institution and its report and update. This new information from the GAO goes outside the selective Brookings data set of households headed by people aged 20 – 40, but it should be considered by anyone trying to draw conclusions about student debt holders.

Noting that the Brookings analysis is based on “Americans who borrowed to finance their education” and the GAO report is on student debt holders, it is worth asking if we’re looking at a similar definition. For the most part, yes, as explained at IHE:

While some of the debt reflects loans taken out by parents on behalf of their children, the vast majority — roughly 70 to 80 percent of the outstanding debt — is attributable to the borrowers’ own education. Parent PLUS loans accounted for only about 27 percent of the student debt held by borrowers 50 to 64 years old, and an even smaller share for borrowers over 65.

Go read at least the entire IHE article, if not the entire GAO report.

Student debt is a growing problem in the US, and the Brookings Institution conclusions are misleading at best.

The post GAO Report: Yes, student debt is growing problem appeared first on e-Literate.

by Phil Hill at September 11, 2014 04:57 PM

September 09, 2014

Adam Marshall

WebLearn unavailable on Tuesday 16 September 2014 from Midnight – Noon

It is planned to upgrade WebLearn to version 2.10-ox1 on Tuesday 16 September 2014, from midnight to noon (note the extended outage). There will be no service during this period.

This release will bring major changes in both WebLearn and the supporting infrastructure:

  • New user interface
  • New ‘Lessons’ tool: content sequencing, embedding media, inserting assessments, prerequisites, student and group pages and more
  • New ‘Syllabus’ tool with in-line editing, start/end dates, table of contents view
  • New ‘Contact Us’ (feedback) tool
  • A new audio recording widget for the Sakai rich text editor
  • Drag-and-drop files uploads
  • Peer review and group submissions in the Assignment tool
  • ….. and much more

We apologise for any inconvenience that this essential work may cause.

Here’s a sneak-preview of the new front page.


by Adam Marshall at September 09, 2014 04:14 PM

Sakai@UD

New Terminology in Forums

In Sakai 2.9.3, the Forums tool uses different terminology: what used to be called a Thread is now called a Conversation. The following diagram presents graphically the new hierarchy of the terms now used in the Forums tool. Once in a topic, you can either start a top-level conversation or […] more >

by Mathieu Plourde at September 09, 2014 03:27 PM

Dr. Chuck

How to Achieve Vendor Lock-in with a Legit Open Source License – Affero GPL

Note: In this post I am not speaking for the University of Michigan, IMS, Longsight, or anyone else. I have no inside information on Kuali or Instructure and am basing all of my interpretations and commentary on the public communications from the kuali.org web site and other publicly available materials. The opinions in this post are my own.

Before reading this blog post, please take a quick look at this video about Open Source:

The founding principles of Open Source from the video are as follows:

  1. Access to the source of any given work
  2. Free Remix and Redistribution of Any Given Work
  3. End to Predatory Vendor Lock-In
  4. Higher Degree of Cooperation

A decade ago efforts like Jasig, Sakai, and Kuali were founded to collaboratively build open source software to meet the needs of higher education and achieve all of the above goals. Recently Kuali announced a pivot toward Professional Open Source. Several years ago the Sakai and Jasig communities decided to form a new shared non-profit organization called Apereo to move away from Community Source and toward pure Apache-style open source. So interestingly, at this time, all the projects that coined the term “Community Source” no longer use the term to describe themselves.

In the August 22 Kuali announcement of the pivot from non-profit open source to for-profit open source, there was a theme of how much things have changed in the past decade since Kuali was founded:

…as we celebrate our innovative 2004 start and the progress of the last decade, we also know that we live in a world of change. Technology evolves. Economics evolve. Institutional needs evolve. We need to go faster. We need a path to a full suite of great products for institutions that want a suite. So it is quite natural that a 10-year-old software organization consolidates its insights and adapts to the opportunities ahead.

There were many elements in the August 22 announcement that merit discussion (i.e. here and here) but I will focus on these particular quotes from the FAQ that accompanied the August 22 announcement:

This plan is still under consideration. The current plan is for the Kuali codebase to be forked and re-licensed under Affero General Public License (AGPL).

The Kuali Foundation (.org) will still exist and will be a co-founder of the company. … The Foundation will provide initial capital investment for the company out of its reserves.

In a follow-up post five days later on August 27 they clarified the wording about licensing and capital:

All software that has been released under the current, Open Source Initiative approved Educational Community License (ECL) will and can continue under that license.

The software license for work done by the new entity and from its own capital will be the Open Source Initiative approved Affero GPL3 license (AGPL3).

While the details and overall intent of the August 22 and August 27 announcements from the Kuali Foundation may seem somewhat different, the AGPL3 license remains the central tenet of the Kuali pivot to professional open source.

The availability of the AGPL3 license, and the successful use of AGPL3 to found and fund “open source” companies that can protect their intellectual property and force vendor lock-in, *is* the “change” that has happened in the past decade. It underlies both of these announcements, and it makes a pivot away from open source toward professional open source an investment with the potential for high returns to its shareholders.

Before AGPL3

Before the AGPL3 license was created, there were two main approaches to open source licensing – Apache-style and GPL-style. The Apache-like licenses (including BSD, MIT, and ECL) allow commercial companies to participate fully in both the active development of the code base and the internal commercial use of that code base without regard to mixing of their proprietary code with the open source code.

The GNU Public License (GPL) had a “sticky” copyleft clause that forced any modifications of redistributed code to also be released open source. The GPL license was conceived pre-cloud and so its terms and conditions were all about distribution of software artifacts and not about standing up a cloud service with GPL code that had been modified by a company or mixed with proprietary code.

Many companies chose to keep it simple and avoided making any modifications to GPL software like the Linux kernel. Those companies could participate in Apache projects with gusto but they kept the GPL projects at arms length. Clever companies like IBM that wanted to advance the cause of GPL software like Linux would hire completely separate and isolated staff that would work on Linux. They (and their lawyers) felt they could meet the terms of the GPL license by having one team tweak their cloud offerings based on GPL software and a completely separate team that would work on GPL software and never let the two teams meet (kind of like matter and anti-matter).

So clever companies could work closely with GPL software and the associated projects if they were very careful. In a sense because GPL had this “loophole”, while it was not *easy* for commercial companies to engage in GPL projects when a company tweaked the GPL software for their own production use, it was *possible* for a diverse group of commercial companies to engage constructively in GPL projects. The Moodle project is a wonderful example of a great GPL project (well over a decade of success) with a rich multi-vendor ecosystem.

So back in 1997 the GPL and Apache-like licenses appeared far apart; in practice, as the world moved to the cloud over the following decades, the copyleft clause in GPL became less and less of a problem. GPL licensed code could leverage a rich commercial ecosystem almost as well as Apache licensed code. The copyleft clause in GPL had become much weaker by 2005 because of the shift to the cloud.

AGPL – Fixing the “loophole” in GPL

The original purpose of the GPL license was to insist that over time all software would be open source and its clause to force redistribution was a core element.

For example, if you distribute copies of such a program, whether gratis or for a fee, you must pass on to the recipients the same freedoms that you received. You must make sure that they, too, receive or can get the source code. And you must show them these terms so they know their rights.

The fact that these cloud vendors could “have their cake and eat it too” could be easily fixed by making the AGPL3 license tighter than the GPL license by adding this clause:

The GNU Affero General Public License is designed specifically to ensure that, in such cases, the modified source code becomes available to the community. It requires the operator of a network server to provide the source code of the modified version running there to the users of that server. Therefore, public use of a modified version, on a publicly accessible server, gives the public access to the source code of the modified version.

This seems simple enough. Fix the flaw. The GPL license did not imagine that someday software would not be “distributed” at all and only run in the cloud. The AGPL3 license solves that problem. Job done.

But solving one problem in the GPL pathos causes another in the marketplace. AGPL3 ensures that we can “see” the code that those who would remix and run on servers would develop, but it creates an unfortunate asymmetry that can be exploited to achieve a combination of vendor lock-in and open source.

AGPL3 = Open Source + Vendor Lock-In

The creators of GPL generally imagined that open source software would have a diverse community around it and that the GPL (and AGPL) licenses were a set of rules about how that community interacted with each other and constrain companies working with GPL software to bring their improvements back to the commons. But just like the GPL founders did not imagine the cloud, the AGPL creators did not imagine that open source software could be created in a proprietary organization and that the AGPL license would ensure that a diverse community would never form (or take a really long time to form) around the open source software.

These days in Educational Technology it is pretty easy to talk to someone on your Caltrain commute and get $60 million in venture capital for an educational technology startup. But your VCs want an exit strategy where they make a lot of money. There are likely no examples of VC-funded companies that used an Apache-like license for their core technology, let alone successful ones. That hippie-share-everything crap just does not cut it with VCs. Vendor lock-in is the only way to protect asset value and flip that startup or go public.

Clever company founders figured out how to “have their cake and eat it too”. Here is the strategy. First take VC money and develop some new piece of software. Divide the software into two parts – (a) the part that looks nice but is missing major functionality and (b) the super-awesome add-ons to that software that really rock. You license (a) using the AGPL3 and license (b) as all rights reserved and never release that source code.

You then stand up a cloud instance of the software that combines (a) and (b) and not allow any self-hosted versions of the software which might entail handing your (b) source code to your customers.

Since the (a) portion is incomplete, it poses no threat to their commercial cloud offering. And since the (a) part is AGPL, it is impossible for a multi-vendor commercial ecosystem to emerge. If a small commercial competitor wants to augment the (a) code to compete with the initial vendor that has (a)+(b) running in the cloud, they are bound by the AGPL3 license to publish all of their improvements. This means that if the second company comes up with a better idea than the original company, the original company gets it – and any and all competitors of the second company get the improvement for free as well. But if the original company makes an improvement, they keep it hidden and proprietary, thus extending their advantage over all other commercial participants in the marketplace.

You can see this theme in the August 22 Kuali FAQ where they talk about “What happens to the Kuali Commercial Affiliates (KCAs)?”:

There will be ample and growing opportunities for the KCAs to engage with Kuali clients. The company would love for KCAs to take on 80% or more of the installation projects. The Kuali platform will continue to become more and more of a platform that KCAs can augment with add-ons and plugins. In addition, KCAs will likely be used to augment the company’s development of core code and for software projects for Kuali customers.

Reading this carefully, the role for companies other than Kuali, Inc. is to install the software developed by the new “Kuali, Inc.” company or perhaps develop plugins. With the source code locked into AGPL3, the greatest role that a community of companies can play is to be “Kuali Inc.’s little helpers”. The relationship is not a peer relationship.

When a company builds a proprietary product from scratch and releases a portion of it under AGPL3, there never was a commons – and the AGPL3 license is the best open source license the company can use to ensure that there never will be a true commons.

Revisiting – AGPL – Fixing the “bug” in GPL (oops)

Now, the AGPL3 advocates actually achieve their goals when the original company goes out of business: even though we never see the (b) component of the software, the (a) part is open source, so a truly open ecosystem could emerge around the carcass of the company. But by the time the company failed, it is not likely that its “half-eaten code carcass” would be all that useful.

What is far more likely is that the company using the AGPL strategy would get a few rounds of VC, thrive, and sell itself for a billion dollars or go public for a few billion dollars. After the founders pocket the cash, there would no longer be a need to market the company as “open source”, so they would just change the license on (a) from AGPL3 to a proprietary license and stop redistributing the code. Since the (b) code was always proprietary, after a few months of non-open-source improvements to the (a) code – and given the deep interdependence of the (a) and (b) code – the open copy of (a) has effectively died on the vine. The resulting company has a wonderfully proprietary and closed source product with no competitors, and the VCs have another half-billion dollars to give to some new person on a Caltrain ride. And the “wheel of life” goes on.

Each time open source loses and VCs and corporations win, I am sure somewhere in the world, about ten Teslas get ordered and a puppy cries while struggling to make it to the next level.

Proprietary Code is a Fine Business Model

By this time (if you have read this far) you have probably tagged this post as #tldr and #opensourcerant – it might indeed warrant #tldr – but it is not an open source rant.

I am a big fan of open source, but I am also a big fan of proprietary software development. Well over 90% of the software in the educational technology market is proprietary. Excellent proprietary offerings come from companies like Blackboard, Coursera, Instructure (part b), Piazza, Microsoft, Google, Edmodo, Flat World Knowledge, Pearson, McGraw Hill, Apple and many others. Without them, open source efforts like Sakai and Moodle would not exist. I am not so foolish as to believe that purely open source solutions will be sufficient to meet the needs of this market that I care so much about.

The right combination in a marketplace is a combination of healthy and competitive open source and proprietary products. This kind of healthy competition is great because choices make everyone stronger and keep teams motivated and moving forward:

  • Linux and Microsoft Windows
  • Microsoft Office and LibreOffice
  • Sakai and Blackboard
  • Apache HTTPd and Microsoft IIS
  • ….

The wisest of proprietary companies even see fit to invest in their open source competitors because they know it is a great way to make their own products better.

The reason that the “open source uber alles” strategy fails is that proprietary companies can raise capital far more effectively than open source efforts. This statement from an earlier Kuali blog post captures this nicely:

We need to accelerate completion of our full suite of Kuali software applications, and to do so we need access to substantially more capital than we have secured to date to meet this need of colleges and universities.

This problem is also why it is very rare for an open source product to dominate and push out proprietary competitors. Open source functions best as a healthy alternative and reasonably calm competitor.

AGPL3 + Proprietary + Cloud Strategy in Action

To their credit, Instructure has executed the AGPL3 open/closed hybrid strategy perfectly for their Canvas product. They have structured their software into two interlinked components and only released one of the components. They have shaded their marketing the right way so they sound “open source” to those who don’t know how to listen carefully. They let their fans breathlessly re-tell the story of “Instructure Open Source” and Instructure focuses on their core business of providing a successful cloud-hosted partially open product.

The Kuali pivot of the past few weeks to create Kuali Inc. (actual name TBD) is pretty clearly an attempt to replicate the commercial success of the Instructure AGPL3 strategy, but in the academic business applications area. This particular statement from the August 22 Kuali announcement sums it up nicely:

From where will the founding investment come?

The Foundation will provide initial capital investment for the company out of its reserves. Future investment will come from entities that are aligned with Kuali’s mission and interested in long-term dividends. A first set of investors may be University foundations. There is no plan for an IPO or an acquisition.

Read this carefully. Read it like a lawyer, venture capitalist, or university foundation preparing to invest in Kuali, Inc. would read it. The investors in Kuali, Inc. may be more patient than the average investor, but they are not philanthropic organizations making a grant. The AGPL license strategy is essential to ensuring that an investment in Kuali, Inc. has the potential to repay its patient investors' investments, along with a nice profit.

Is there any action that should be taken at this time? If I were involved in Kuali or on the board of directors of the Kuali Foundation, I would be very suspicious of any attempted change to the license of the code currently in the Kuali repository. A change of license, or a change in who “owns” the code, would be very significant. The good news is that, from the August 27 Kuali post, it appears that at least for now a board-level wholesale copyright change is off the table.

All software that has been released under the current, Open Source Initiative approved Educational Community License (ECL) will and can continue under that license.

I think a second issue is more about the individual Kuali projects. There are lots of Kuali projects, and each is at a different maturity level with its own community and its own leadership. The approach to Kuali, Inc. might therefore differ across the Kuali Foundation projects. In particular, if a project has a rich and diverse community of academic and commercial participants, it might be in that community’s best interest to ignore Kuali, Inc. and just keep working with the ECL-licensed code base, managing its own community using open source principles.

If you are a member of a diverse community working on and using a Kuali project (Coeus and KFS are probably the best examples of this), you should watch closely for any seemingly innocuous board action to switch to AGPL3 in a code base you are working on or depending on (including Rice). Right now, because the code is licensed under the Apache-like Educational Community License, the fact that the Foundation “owns” the code hardly matters: in Apache-like licenses, the owner really has no more right to the code than the contributors. But as soon as the code you are working on or using is switched to AGPL3, all the power shifts to the copyright owner – not the community.

A worrisome scenario would run like this: the license is quietly switched to AGPL3; the community continues to invest in the Kuali Foundation version of the code for a year or so; and then the Kuali Foundation Board transfers ownership of the code to someone else. At that point you would have to scramble, picking through the AGPL3 bits and separating them out, if you really wanted to continue as a community. This is usually so painful after a year of development that no one ever does it.

The Winter of AGPL3 Discontent

If we look back at the four principles of open source that I used to start this article, we quickly can see how AGPL3 has allowed clever commercial companies to subvert the goals of Open Source to their own ends:

  • Access to the source of any given work – By encouraging companies to open source only a subset of their overall software, AGPL3 ensures that we will never see the source of the part (b) of their work, and that we will see the part (a) code only until the company sells itself or goes public.
  • Free Remix and Redistribution of Any Given Work – This is true unless the remixing includes enhancing the AGPL work with proprietary value-add. The owner of the AGPL-licensed software is completely free to mix in proprietary goodness – but no other company is allowed to do so.
  • End to Predatory Vendor Lock-In – Properly used, AGPL3 is the perfect tool to enable predatory vendor lock-in. Clueless consumers think they are purchasing an “open source” product with an exit strategy – but they are not.
  • Higher Degree of Cooperation – AGPL3 ensures that the copyright holder has complete and total control of how a cooperative community builds around software they hold the copyright to. Those who contribute improvements to AGPL3-licensed software line the pockets of the commercial company that owns the copyright.

So AGPL3 is the perfect open source license for a company that thinks open source sounds great but an actual open community is a bad idea. The saddest part is that most of the companies that were using the “loophole” in GPL were doing so precisely so they could participate in and contribute to the open source community.

Conclusion

As I wrote about MySQL back in 2010, a copyright license alone does not protect an open source community:

Why an Open Source Community Should not cede Leadership to a Commercial Entity – MySql/Oracle

Many people think that simply releasing source code under an open license such as Apache or GPL is “good enough” protection to ensure that software will always be open. For me, the license has always been a secondary issue – what matters is the health and vitality of the open community (the richness and depth of the bazaar around the software).

Luckily, the MySQL *community* saw the potential problem and made sure there was a community-owned version of the code, named MariaDB, which they have actively developed from the moment Oracle bought MySQL. I have not yet used MariaDB – but its existence is a reasonable insurance policy against Oracle “going rogue” with MySQL. So far, now over four years later, Oracle has continued to do a reasonable job of managing MySQL for the common good, so I keep using it and teaching classes on it. But if MariaDB had not happened, by now the game would likely be over and MySQL would be a 100% proprietary product.

While I am sure that the creators of the Affero GPL were well intentioned, the short-term effect of the license is to give commercial cloud providers a wonderful tool to destroy open source communities or at least ensure that any significant participation in an open-source community is subject to the approval and controls of the copyright owner.

I have yet to see a situation where the AGPL3 license made the world a better place. I have only seen situations where it was used craftily to advance the ends of for-profit corporations that don’t really believe in open source.

It never bothers me when corporations try to make money – that is their purpose and I am glad they do it. But it does bother me when someone plays a shell game to suppress or eliminate an open source community. Frankly, even with that, corporations will and should take advantage of every trick in the book – and AGPL3 is the “new trick”.

Instead of hating corporations for being clever and maximizing revenue – we members of open source communities must simply be mindful of being led down the wrong path when it comes to software licensing.

Note: The author gratefully acknowledges the insightful comments from the reviewers of this article.

by Charles Severance at September 09, 2014 02:26 AM

September 06, 2014

Adam Marshall

WebLearn and Turnitin courses MT 2014

We have planned another full programme of ITLP courses for the new term. The majority of courses are aimed at staff members, but there is one for students on ‘Awareness and avoidance of plagiarism’, which has proved to be very popular.

The programmes for all courses during MT 2014 are provided below, with links to book a place on those you might like to attend:

* WebLearn courses: https://weblearn.ox.ac.uk/x/7G9PI6
* Plagiarism (Turnitin) courses: https://weblearn.ox.ac.uk/x/HHBNBy

Remember also the WebLearn User Group meeting (Thursday 16 October, 2:00–4:00) and the Turnitin User Group meeting (Friday 10 October, 2:00–4:00). (Bookings are required via the links provided.)

Hope to see you there.

by Jill Fresen at September 06, 2014 09:33 AM

August 28, 2014

Sakai@UD

Managing Announcements and Notifications

As the fall semester begins, IT staff members have received a number of inquiries related to notifications and sending announcements to students, especially with the growing number of instructors opting to use Canvas instead of Sakai. Below are some options and “gotchas” regarding notifications in UD-supported technologies, including Canvas, Sakai, and P.O. Box. I. Verify […] more >

by Mathieu Plourde at August 28, 2014 05:55 PM

Sakai Project

Deadline Extended: Submit your Sakai Virtual Conference proposals by September 7th

Only a week left! The call for proposals for the Sakai Virtual Conference 2014 is open until September 7th!

Don’t wait! Submit your proposal today!

August 28, 2014 04:57 PM

August 27, 2014

Apereo OAE

Apereo OAE Griffin is now available!

The Apereo Open Academic Environment (OAE) project team is excited to announce the next major release of the Apereo Open Academic Environment; OAE Griffin or OAE 8.

OAE Griffin brings a complete overhaul of the collaborative document experience, metadata widgets, fully interactive REST API documentation and improved Office document previews. Beyond that, OAE Griffin also introduces a wide range of incremental usability improvements, technical advances and bug fixes.

Changelog

Collaborative documents

The collaborative document experience in OAE Griffin has been completely overhauled. Whilst OAE's collaborative note taking capabilities have consistently been identified as very useful during usability testing, the actual Etherpad editor user experience has always tested poorly and never felt like an inherent part of the OAE platform.

Therefore, OAE Griffin introduces a fully skinned and customised collaborative document editor. The Etherpad editor has been skinned to make it fit seamlessly into the overall OAE interface and a number of under-utilised features have been removed. The editor and toolbar now also behave a lot better on mobile devices. All of this creates a much cleaner, more integrated and easier to use collaborative document experience.

At the same time, the activities and notifications generated by collaborative documents have also been fine-tuned. OAE Griffin now detects which people have made a change and will generate accurate activities, providing a much better idea of what's been happening inside of a document.

Metadata widgets

It is now possible to see the metadata for all content items, discussions and groups. This includes the full title of the item, the description, who created it and when it was created. For content items and discussions, it is also possible to see the full list of managers and the people and groups it's shared with. All of this provides a lot more context about an item, for example when discovering an interesting content item or when wondering who's involved in a discussion.

At the same time, the long-awaited download button has been provided for all content items, ensuring that the original file can easily be downloaded.

REST API Documentation

OAE Griffin introduces a REST API documentation framework and all of the OAE REST APIs have been fully documented. This work is based on a REST API documentation specification called Swagger, and offers a nice interactive UI where the documentation can be viewed and all of the REST endpoints can be tried.

This documentation is available on every OAE tenant and sits alongside the internal API documentation. All of this should provide sufficient information and documentation for widget development and integration with OAE.

Office documents

The OAE preview processor has been upgraded from LibreOffice 3.5 to LibreOffice 4.3. This brings tremendous improvements to the content previews generated for Office files (Word, Excel and PowerPoint). The display of shapes, pictures and tables in particular has been much improved, and some additional font support has been added as well.

Email improvements

The email notifications have been tweaked to ensure that emails sent out by OAE are as relevant as possible. At the same time, a number of visual improvements have been made to those emails to ensure that they look good on all devices.

Embedding improvements

Browsers have started introducing a set of new cross-protocol embedding restrictions, which were causing some embedded links not to display correctly in the content profile. Therefore, OAE Griffin puts a number of measures in place that improve link embedding and provide a fallback when a link cannot be embedded.

CAS Authentication

It is now possible to pick up and use SAML attributes released by a CAS authentication server. This allows for a user's profile metadata to be available immediately after signing into OAE for the first time, without having to pre-provision the account.

Icons

The icons used in OAE Griffin have been upgraded from FontAwesome 3 to FontAwesome 4.3, allowing for a wider variety of icons to be used in widget development.

Apache Cassandra

OAE Griffin has been upgraded from Apache Cassandra 1.2.15 to Apache Cassandra 2.0.8, bringing a range of performance improvements, as well as the possibility of setting up simple database transactions.
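
As a flavour of the latter, here is a hypothetical sketch using the DataStax Java driver – the keyspace, table and column names are invented for illustration and are not part of OAE's actual schema. A Cassandra 2.0 "lightweight transaction" is expressed directly in CQL with IF NOT EXISTS, and the result carries a special [applied] column:

import com.datastax.driver.core.Cluster;
import com.datastax.driver.core.ResultSet;
import com.datastax.driver.core.Session;

public class LightweightTransactionDemo {
    public static void main(String[] args) {
        Cluster cluster = Cluster.builder().addContactPoint("127.0.0.1").build();
        Session session = cluster.connect("demo");

        // Lightweight transaction: the insert is applied only if the row
        // does not already exist, checked atomically by the cluster.
        ResultSet rs = session.execute(
            "INSERT INTO users (id, email) VALUES ('u123', 'jack@example.org') IF NOT EXISTS");

        // The result set contains a special [applied] column reporting
        // whether the conditional insert actually took effect.
        boolean applied = rs.one().getBool("[applied]");
        System.out.println("Insert applied: " + applied);

        cluster.close();
    }
}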

Try it out

OAE Griffin can be tried out on the project's QA server at http://oae.oae-qa0.oaeproject.org. It is worth noting that this server is actively used for testing and will be wiped and redeployed every night.

The source code has been tagged with version number 8.0.0 and can be downloaded from the following repositories:

Back-end: https://github.com/oaeproject/Hilary/tree/8.0.0
Front-end: https://github.com/oaeproject/3akai-ux/tree/8.0.0

Documentation on how to install the system can be found at https://github.com/oaeproject/Hilary/blob/8.0.0/README.md.

Instructions on how to upgrade an OAE installation from version 7 to version 8 can be found at https://github.com/oaeproject/Hilary/wiki/OAE-Upgrade-Guide.

The repository containing all deployment scripts can be found at https://github.com/oaeproject/puppet-hilary.

Get in touch

The project website can be found at http://www.oaeproject.org. The project blog will be updated with the latest project news from time to time, and can be found at http://www.oaeproject.org/blog.

The mailing list used for Apereo OAE is oae@apereo.org. You can subscribe to the mailing list at https://groups.google.com/a/apereo.org/d/forum/oae.

Bugs and other issues can be reported in our issue tracker at https://github.com/oaeproject/3akai-ux/issues.

by Nicolaas Matthijs at August 27, 2014 01:14 PM

August 25, 2014

Chris Coppola

Kuali 2.0

Last Friday, the Kuali Foundation made an announcement that stirred up quite a buzz. Kuali has formed a new Kuali Commercial entity (referred to below as Kuali-Company) that is a “for profit” enterprise owned by and aligned with the higher ed community through Kuali.org. As a Kuali.org co-founder and community leader, rSmart has naturally been asked for comment, clarity, and information.

Let’s start with some facts…

  • Kuali software will continue to be open source.
  • Kuali will continue to be driven by higher education.
  • Kuali will continue to engage colleges and universities in the way that it always has.

… and we expect that…

  • Kuali-Company will be better at engaging the higher education community from a marketing and sales perspective.
  • Kuali will have a more effective product development capability organized in Kuali-Company.
  • Kuali-Company will deliver Kuali software in the cloud.

The Kuali mission is unwavering: to drive down the cost of administration for colleges and universities so that more money stays focused on the core teaching and research mission. Our (the Kuali community’s) mission hasn’t changed, but the ability to execute on it has improved dramatically. The former structure made it too difficult for colleges and universities to engage with and benefit from Kuali’s work; this new model will simplify how institutions can engage. The former structure also bred a lot of duplicative (and even competitive) work; the new structure will be more efficient.

People have been calling and asking what this means to rSmart. rSmart is made up of people who, like me, are passionate about the mission. We wake up every day and work hard to achieve the mission. rSmart has been involved in the leadership of Kuali since the day it started… actually before that since we are one of Kuali’s founders. This change is no different, and we believe that this change is going to better enable us to fulfill our mission.

We’re excited about Kuali’s future. I’ve had the opportunity to get to know Joel Dehlin a bit and I’m confident he is going to be a great addition; I look forward to working with him. rSmart is committed to our customers, to the Kuali mission, and to supporting this new direction.


Tagged: business, commercial-os, community, education, erp, future, kuali, open source, rSmart

by Chris at August 25, 2014 06:27 PM

August 15, 2014

Zach Thomas

Sakai and Ansible Sittin’ In a Tree

I love data center automation. It’s funny, I know, since I don’t even have a data center. But I started my software development career back when it took weeks to order and procure any servers, and the developers were definitely not allowed to ever touch them. Each one was a finely-crafted jewel, fussed over by an attentive sysadmin. Fast forward to now, and I can spin up a fleet of servers with a single line from a terminal. With good automation in place, you can keep all your configuration under version control, and should any server disappear without warning, you can be confident that you’ll have a new one completely ready in a matter of minutes.

I first got started with puppet, which, together with chef, is one of the giants of IT automation. I created modules for my company’s build environment, staging environment, and a generic development environment for Sakai.

More recently, I discovered ansible, thanks to the ThoughtWorks technology radar. Ansible is different from puppet in some superficial ways (python instead of ruby!), but I also found it much easier to understand and much quicker to get up and running.

One of the things that’s tricky to get right in puppet is which tasks depend on other tasks. For example, if you’re going to download a file using cURL, that task will depend on the one that ensures cURL is installed on the system. With puppet, you say that one task requires one or more other tasks. But if you forget, you’ll find out when it fails.

With ansible, tasks run in the order they appear in your specification. What could be more intuitive than that? That is pretty much what you would expect to happen.
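
To make that concrete, here is a hypothetical playbook fragment (the download URL is made up, and the apt module assumes a Debian-family server). The install task runs first simply because it is written first:

# Tasks execute top to bottom: curl is guaranteed to be present
# before the task that uses it runs – no explicit dependency needed.
- name: ensure curl is installed
  apt: name=curl state=present

- name: download a file using curl
  command: curl -o /tmp/example.tar.gz http://example.com/example.tar.gz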

Another cool thing about ansible is that it doesn’t require an agent to be running on the machine you’re trying to provision. It just needs an ssh daemon and python, and those are standard equipment on (nearly) any server distribution you’re going to find.

I have converted our staging environment to ansible, and it’s working great. If you take the extra step (optional) of installing ansible on the server, you can have the server refresh itself with a command called ansible-pull. You just put that in a cron job, and it can pick up any changes you might want to make from a git repository.

I have been wanting to try ansible for the Sakai developer environment, and I finally did it. You’ll find it here on github. The most frustrating part is still waiting for the Subversion checkout of close to 540MiB of source files, but it’s pretty cool that you can have everything you need (java jdk, maven, tomcat, svn) in a vm without having to worry that you left out a step, or that anything might conflict with versions you already have installed on your own machine.

August 15, 2014 01:28 PM

August 07, 2014

Dr. Chuck

Sakai 11: iFrames are starting to vanish

You have been hearing a bunch about the new responsive Morpheus portal and the removal of the iFrames from Sakai 11. Lots of work has been going on. Last night I committed the first of many changes to the portal code in trunk to start Sakai 11 down the path to being iframe free. The Morpheus effort is already well on the way to making our default portal mobile friendly and responsive.

If you go to the nightly server as of this morning, you will see that there are no more iframes except for the following tools:

Lessons, Samigo, Preferences, Resources, DropBox, and Home

If you want to run a fun test, go to:

http://nightly2.sakaiproject.org:8082/portal

Make an account, make a site, add the Gradebook and a few other tools to the site – then click the four buttons across the top of Gradebook and then click the Back button four times – watching the URL change. No iframes, real REST looking URLs in the location box and flawless back button.

At this point we have not done anything that is irreversible, all we did was change two property defaults in trunk. You can restore yesterday’s behavior by setting these back to their old defaults:

portal.inline.experimental=false
portal.pda.iframesuppress=:all:

If you are playing with the morpheus portal, the next time you do an ‘svn update’ the same settings and behaviors will be in effect. The morpheus portal is inlining all but the above tools as well.

None of this will be put into Sakai 10 – it will remain in trunk for Sakai 11. We know there will be lots of little issues as we completely rewrite how the portal works underneath our feet, and so we really need the next six or so months to collectively test these major UI improvements.

Over the next few weeks, we will be working on tweaking little markup glitches to make it so all tools can be inlined in both the neo and morpheus portals. Lessons, Samigo and Preferences have small issues of markup conflicts, jQuery versions or local CSS bits that should be easily fixed to allow them to be inlined. DropBox and Home use Bootstrap Javascript and CSS and so they have significant markup conflicts between them and the portal – we may need to wait for morpheus to mature some more to get these two tools working in inline mode. Home has four tools on a single page and there is no way to inline more than one tool on a page other than using portlets so the Home page will take some work – or perhaps we just replace it with the Dashboard :).

As we make these changes, every effort will be made to keep the tools working in all variants of the portal (neo with frames, neo with no frames, and morpheus with and without frames). But at some point we will need to change tool markup in a way that no longer works with the neo portal, or works only in a diminished mode in neo. By that point, morpheus will have matured to the point where it will be the default and only portal that we support for Sakai 11. When that happens there will be plenty of communication, announcements, and opportunities for testing and feedback from the community. So make sure to listen carefully to the developer and user lists in Sakai over the next few months as this bit of “evolution in place” happens.

You can track what is happening at this JIRA:

https://jira.sakaiproject.org/browse/SAK-27774

If you find a problem that appears to be a markup conflict between the portal and tool markup, please file a JIRA and link it to SAK-27774 and we will see what we can do.

Let us know what you think of this on the user and developer lists. One of the benefits of being part of the Sakai community is that we make changes like this in the open and invite broad discussion about them. It is one of the hallmarks of an open community of developers and users guiding a product forward together.

This is the first of many steps to a state-of-the-art responsive and iframe-free user experience – the journey of a thousand commits starts with a single commit.

by Charles Severance at August 07, 2014 05:06 AM

July 23, 2014

Dr. Chuck

Sakai 10 Released – The Magic of Open Source

In this post I am not speaking for University of Michigan, Longsight Inc., or the IMS Global Learning Consortium.

It is always a great feeling for an open source community to finish a release. So much work goes into a release and so many volunteers are involved and work hard – so it is a proud moment for a lot of us. I tend to be involved in more of the up front development and working on crazy next gen stuff. So I am doubly grateful to those who put the finishing touches on these releases and then get them out to the public and put them into production.

Here is the official release notice for Sakai 10. There is a long list of great stuff in that link that I won’t replicate here.

As I said in The Post-LMS LMS article in Inside Higher Education, the past year has seen a lot of incremental investment in all five of the major LMS systems in the marketplace. In a sense we were all reacting to changes in the market. As we gain experience with the larger classes that we hope to run at scale (i.e. inspired by MOOCs), a number of MOOC-like features are finding their way into products. Sakai 10’s peer-assessment and improved group-submitted assignments are partially inspired by the MOOCs’ heavy use of peer features. It is not so much that MOOCs were the first to do peer-assessment – more that peer-assessment has gotten a lot of attention in the past two years.

If Sakai end-users feel strongly enough about a feature to bring funding or resources to the table, the features get built and added to the core product and are part of the next release surprisingly quickly. It is that simple – no product marketing layer or sales people to fight through. You find or hire the necessary resources and have a feature in the core product. There is no other product in the LMS marketplace where end-users personally know the core developers of the product on a first name basis.

Another big trend is making sure that LMS systems can function well in cloud environments (i.e. like Amazon). In the past two years, Amazon’s costs have dropped dramatically and their capabilities have grown significantly. The addition of Solid State Disk drives to many of their offerings is a quantum leap forward, making it practical to host “normal” applications in the cloud where it was impractical a while back. Simply put, two years ago you had to be pretty clever to move a large application into the cloud because of the subtle performance tuning that was required – but now Amazon’s cloud resources have very similar performance characteristics to locally-owned hardware, especially if virtualization is used on that hardware.

Just a quick look at Amazon’s EC2 pricing is pretty amazing – especially the one and three year fixed contract pricing. An m3.medium instance with about 4G RAM, 4G SSD and one CPU is $172 for three years – less than $5 per month. A more capable two-CPU, 8G RAM, 32G SSD m3.large server is $343 for three years, or about $9.50 per month. Why would I ever run a server under my desk at work with prices like that?

So there is a pull for both self-hosted schools and commercial companies that host LMSs in their own clouds to take advantage of these prices. This is true for all vendors. Based on my rumors and bar conversations, I think that Canvas is the only major hosting company that is mostly using Amazon – but all the other vendors are likely eyeing hosting new work and new expansion in the cloud and as servers get replaced in a company data center it is likely that there will be an urge to use Amazon instead.

But as we move the hosting of these systems into the Amazon cloud, we still want to spend as little money as possible. And if you look at what you are getting in Amazon, the most expensive element of the cost is the RAM, followed by I/O; CPU is almost an insignificant concern for most LMS applications. So, not surprisingly, if you want to optimize costs in a cloud version of Sakai, you find a way to trade CPU for RAM and database I/O. The solution, of course, for all applications is a shared cache like memcached or Redis.
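
The pattern itself is simple – here is a rough cache-aside sketch in Java (the class and method names are invented for illustration; this is not code from the actual Sakai cloud-tuning work). The point is the trade: a little RAM spent on cached entries saves a database round trip on every repeated read.

import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;

// Cache-aside sketch: spend a little RAM to avoid repeated database I/O.
// In a real multi-app-server deployment the map would be replaced by a
// shared cache such as memcached or Redis, so every node sees the same entries.
public class GradeCache {

    private final ConcurrentMap<String, String> cache = new ConcurrentHashMap<>();

    public String getGrade(String studentId) {
        // The first read for a student costs one database query;
        // every later read is served straight from memory.
        return cache.computeIfAbsent(studentId, this::loadGradeFromDatabase);
    }

    private String loadGradeFromDatabase(String studentId) {
        // Placeholder for the real JDBC or Hibernate lookup.
        return "A";
    }
}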

So not surprisingly, in the above video you see three Sakai Commercial Affiliates (AsahiNet, Longsight, and Unicon) putting a lot of energy into cloud-tuning Sakai, reducing app server memory footprint and database I/O by adding a shared cache and switching to elastic search.

This kind of cloud tuning goes on for all of the LMS systems. For Moodle, Moodlerooms and RemoteLearner separately tune Moodle to scale for the cloud. Of course Instructure tunes Canvas for the cloud all the time – but we never see the source code. Blackboard announced their plans to host Learn on AWS at BBWorld14 – an impressive step; since I was not in Vegas, I had to settle for screen shots of slides in Twitter DMs.

But the essential difference in the Sakai community is that three competitors saw fit to pool their cloud tuning efforts and put their code into the community release. Even while the code was being built and tested, developers from AsahiNet, Longsight, and Unicon were communicating regularly, checking, testing, and fixing code written by one of their “competitors”. And when it was all done the code ends up in the open source trunk of Sakai. There are no secret repositories with the “magic sauce” – you don’t have to go to the bar and get a drawing on a napkin to find out the clever tuning tricks that are being done to make this possible. Just check out the source code and take a look.

Now, to a proprietary competitor it might seem crazy to give away the “crown jewels”. But like many crazy plans, it is just so crazy that it might work. First, everyone is running the same code. Vendors don’t need a vendor fork for performance tweaks. Self-hosted schools can deploy the same solution as the commercial vendors if they like. And if self-hosted schools are a little nervous about switching from the “app server” / “db server” architecture, they can just wait while the commercial Sakai vendors gain experience – knowing that whenever they are ready to start saving hardware costs, they have access to the exact same cloud code that the vendors are using in production.

The second and more important issue is that cloud performance tuning is a moving target. Amazon will continue to tweak their offerings and performance characteristics. You never really know how something will scale until you are running it at scale. Who knows whether Unicon, Longsight, or AsahiNet will be the first to encounter a little bit of code that needs tweaking as they add their millionth user. But as long as we avoid tweaks in vendor branches and keep them in trunk, when the second vendor crosses that million-user barrier the code will be there for them – sitting in trunk and fully tested.

Again it might seem insane for one vendor to commit code that will allow other vendors to match their offerings in the marketplace or allow self-hosted schools to avoid out-sourcing their applications to a vendor because they have 100% access to the same code. But the reality is that it is far less risky to work together than to work separately. There is no single school or commercial vendor in the Sakai ecosystem that can go it alone and ignore everyone else. We are in this together. We sink or swim together.

We will all help each other find our way to the cloud together. That is the power of real open source. That is the magic of real open source.

Even if you run a commercial LMS at your University – you should join us and be part of Apereo. Apereo is not just about Sakai. Apereo is where the next generation of teaching and learning technology will be collectively defined and built. Because what is next will be even more exciting than getting an LMS ready for the cloud.

by Charles Severance at July 23, 2014 01:17 PM

July 16, 2014

Steve Swinsburg

Sakai Quartz example bundle receives an update

Six years ago I wrote a little bundle for Sakai that sets up a Quartz job and registers it with the Sakai Job Scheduler so you can set up triggers for it to run, just like a cron job. It was getting a little long in the tooth, so it has now had a makeover and works with Sakai 11.

All of the bits of code are documented so if you are looking to write Quartz jobs for Sakai, this is what you need. Check it out:

https://source.sakaiproject.org/contrib/swinsburg/quartz-example/
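
For a taste of what is involved, the heart of any such job is a class implementing the Quartz Job interface. Here is a minimal sketch (the class name and body are made up for illustration; the actual registration with the Sakai Job Scheduler is covered in the bundle above):

import org.quartz.Job;
import org.quartz.JobExecutionContext;
import org.quartz.JobExecutionException;

// A bare-bones Quartz job. Once registered with the Sakai Job Scheduler
// (see the bundle's Spring configuration), an administrator can attach
// cron-style triggers to it from the scheduler's admin tool.
public class ExampleJob implements Job {

    @Override
    public void execute(JobExecutionContext context) throws JobExecutionException {
        // Whatever work should happen on each trigger firing goes here.
        System.out.println("ExampleJob fired at " + context.getFireTime());
    }
}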

by steveswinsburg at July 16, 2014 11:48 AM