Planet Sakai

July 02, 2015

Michael Feldstein

D2L Again Misusing Academic Data For Brightspace Marketing Claims

At this point I’d say that we have established a pattern of behavior.

Michael and I have been quite critical of D2L and their pattern of marketing behavior that is misleading and harmful to the ed tech community. Michael put it best:

I can’t remember the last time I read one of D2L’s announcements without rolling my eyes. I used to have respect for the company, but now I have to make a conscious effort not to dismiss any of their pronouncements out-of-hand. Not because I think it’s impossible that they might be doing good work, but because they force me to dive into a mountain of horseshit in the hopes of finding a nugget of gold at the bottom. Every. Single. Time. I’m not sure how much of the problem is that they have decided that they need to be disingenuous because they are under threat from Instructure or under pressure from investors and how much of it is that they are genuinely deluding themselves. Sadly, there have been some signs that at least part of the problem is the latter situation, which is a lot harder to fix. But there is also a fundamental dishonesty in the way that these statistics have been presented.

Well, here’s the latest. John Baker put out a blog post called This Isn’t Your Dad’s Distance Learning Program with this theme:

But rather than talking about products, I think it’s important to talk about principles. I believe that if we’re going to use education technology to close the attainment gap, it has to deliver results. That — as pragmatic as it is — is the main guiding principle.

The link about “deliver results” leads to this page (excerpted as it existed prior to June 30th, for reasons that will become apparent).

Why Brightspace

Why Brightspace? Results.

So the stage is set – use ed tech to deliver results, and Brightspace (D2L’s learning platform, or LMS) delivers results. Now we come to the proof, including these two examples.

[Image: CSULB and UWM results claims]

According to California State University-Long Beach, retention has improved 6% year-over-year since they adopted Brightspace.[snip]

University of Wisconsin-Milwaukee reported an increase in the number of students getting A’s and B’s in Brightspace-powered courses by over 170%

Great results, no? Let’s check the sources. Ah . . . clever marketing folks – no supporting data or even hyperlinks to learn more. Let’s just accept their claims and move along.

. . .

OK, that was a joke.

CSU Long Beach

I contacted CSU Long Beach to learn more, but I could find no one who knew where this data came from or even that D2L was making this claim. I shared the links and context, and they went off to explore. Today I received a message saying that the issue has been resolved, but that CSU Long Beach would make no public statements on the matter. Fair enough – the observations below are my own.

If you look at that Results page now, the CSU Long Beach claim is no longer there – down the memory hole[1] with no explanation, replaced by a new claim about Mohawk College.

[Image: Mohawk College and UWM results claims]

While CSU Long Beach would not comment further on the situation, there are only two plausible explanations for the issue being resolved by D2L taking down the data. Either D2L was using legitimate data that they were not authorized to use (best case scenario) or D2L was using data that doesn’t really exist. I could speculate further, but the onus should be on D2L since they are the ones who made the claim.

UW Milwaukee

I also contacted UW Milwaukee to learn more, and I believe the data in question is from the U-Pace program which has been fully documented.[2][3]

The U-Pace instructional approach combines self-paced, mastery-based learning with instructor-initiated Amplified Assistance in an online environment.

The control group was the traditionally taught version of Intro to Psychology (read that as large lecture classes).

From the EDUCAUSE Quarterly article on U-Pace, for disadvantaged students the number of A’s and B’s increased 163%. This is the closest data I can find to back up D2L’s claim of 170% increase.

[Image: U-Pace results from the EDUCAUSE Quarterly article]

There are three immediate problems here (ignoring the fact that I can’t find improvements of more than 170% – I’ll take 163%).

  1. First, the data claim is missing the context of “for underprepared students” who exhibited much higher gains than prepared students. That’s a great result for the U-Pace program, but it is also important context to include.
  2. The program is an instructional change, moving from large lecture classes to a self-paced, mastery-learning approach. That is the intervention, not the use of the LMS. In fact, D2L was the LMS used in both the control group and the U-Pace treatment group.
  3. The program goes out of its way to call out the minimal technology needed to adopt the approach, and they even list Blackboard, Desire2Learn, and Moodle as examples of LMS’s that work with the following conditions:

[Image: U-Pace LMS requirements]

This is an instructional approach that claims to be LMS neutral with D2L’s Brightspace used in both the control group and treatment group, yet D2L positions the results as proof that Brightspace gets results! It’s wonderful that Brightspace LMS worked during the test and did not get in the way, but that is a far cry from Brightspace “delivering results”.

The Pattern

We have to now add these two cases to the Lone Star College and LeaP examples. In all cases, there is a pattern.

  1. D2L makes a marketing claim implying that its LMS, Brightspace, delivers results, citing academic outcomes data without supporting evidence or references.
  2. I contact the school or research group to learn more.
  3. The data is either misleading (the treatment is not LMS usage but an instructional approach, adaptive learning technology, or student support software) or just plain wrong (and subsequently taken down).
  4. In all cases, the results could have been presented honestly, showing the appropriate context, links for further reading, and explanation of the LMS role. But they were not presented honestly.
  5. e-Literate blog post almost writes itself.
  6. D2L moves on to make their next claim, with no explanations.

I understand that other ed tech vendors make marketing claims that cannot always be tied to reality, but these examples cross a line. They misuse and misrepresent academic outcomes data – whether based on public research or on internal research – and essentially take credit for their technology “delivering results”.

This is the misuse of someone else’s data for corporate gain. Institutional data. Student data. That is far different than using overly-positive descriptions of your own data or subjective observations. That is wrong.

The Offer

For D2L company officials, I have an offer.

  1. If you have answers or even corrections about these issues, please let us know through your own blog post or comments to this blog.
  2. If you find any mistakes in my analysis, I will write a correction post.
  3. We are happy to publish any reply you make here on e-Literate.
  1. Their web page does not allow archiving with the Wayback Machine, but I captured screenshots in anticipation of this move.
  2. Note – While I assume this claim derives from U-Pace, I am not sure. It is the closest example of real data that I could find, thanks to a helpful tip from UW-M staff. I’ll give D2L the benefit of the doubt despite their lack of reference.
  3. And really, D2L marketing staff should learn how to link to external sources. It’s good Internet practice.

The post D2L Again Misusing Academic Data For Brightspace Marketing Claims appeared first on e-Literate.

by Phil Hill at July 02, 2015 11:56 AM

July 01, 2015

Adam Marshall

WISE project on the road

[Image: Learning Technologist Fawei Geng working with the Department of Social Policy and Intervention]

The WISE team have been talking to lecturers, staff and administrators to gather requirements for their new or improved WebLearn sites. The process normally involves:

  • initial conversations about requirements and current usage of WebLearn
  • designs for a prototype site
  • testing the prototype and gathering feedback from users
  • prototype version 2
  • launch to Faculty

Despite the challenges and work involved in reviewing information content and structures, the response from Faculty so far has been very positive and we look forward to working closely with more departments across the University.

by Stephen Burholt at July 01, 2015 10:53 AM

June 30, 2015

Michael Feldstein

U of Phoenix: Losing hundreds of millions of dollars on adaptive-learning LMS bet

It would be interesting to read (or write) a post mortem on this project some day.

Two and a half years ago I wrote a post describing the University of Phoenix’s investment of a billion dollars in new IT infrastructure, including hundreds of millions of dollars spent on a new, adaptive-learning LMS. In another post I described a ridiculous patent awarded to Apollo Group, parent company of U of Phoenix, that claimed ownership of adaptive activity streams. Beyond the patent, Apollo Group also purchased Carnegie Learning for $75 million as part of this effort.

And that’s all going away, as described by this morning’s Chronicle article on the company planning to go down to just 150,000 students (from a high of 460,000 several years ago).

And after spending years and untold millions on developing its own digital course platform that it said would revolutionize online learning, Mr. Cappelli said the university would drop its proprietary learning systems in favor of commercially available products. Many Apollo watchers had long expected that it would try to license its system to other colleges, but that never came to pass.

I wonder what the company will do with the patent and with Carnegie Learning assets now that they’re going with commercial products. I also wonder who is going to hire many of the developers. I don’t know the full story, but it is pretty clear that even with a budget of hundreds of millions of dollars and adjunct faculty with centralized course design, the University of Phoenix did not succeed in building the next generation learning platform.

Update: Here is the full quote from the earnings call:

Fifth. We plan to move away from certain proprietary and legacy IT systems to more efficiently meet student and organizational needs over time. This means transitioning an increased portion of our technology portfolio to commercial software providers, allowing us to focus more of our time and investment on educating and student outcomes. While Apollo was among the first to design an online classroom and supporting system, in today’s world it’s simply not as efficient to continue to support complicated, custom-designed systems particularly with the newer quality systems we have more recently found with [off-the-shelf] providers that now exist within the marketplace. This is expected to reduce costs over the long term, increase operational efficiency and effectiveness while still very much supporting a strong student experience.

The post U of Phoenix: Losing hundreds of millions of dollars on adaptive-learning LMS bet appeared first on e-Literate.

by Phil Hill at June 30, 2015 03:17 PM

June 29, 2015

Michael Feldstein

ASU Is No Longer Using Khan Academy In Developmental Math Program

In these two episodes of e-Literate TV, we shared how Arizona State University (ASU) started using Khan Academy as the software platform for a redesigned developmental math course[1] (MAT 110). The program was designed in Summer 2014 and ran through the Fall 2014 and Spring 2015 terms. Given the public information shared through e-Literate TV, ASU officials recently informed us that they have made a programmatic change and will replace their use of Khan Academy software with McGraw-Hill’s LearnSmart software, which is used in other sections of developmental math.

To put this news in context, here is the first episode’s mention of Khan Academy usage.

Phil Hill: The Khan Academy program that you’re doing, as I understand, it’s for general education math. Could you give just a quick summary of what the program is?

Adrian Sannier: Absolutely. So, for the last three-and-a-half years, maybe four, we have been using a variety of different computer tutor technologies to change the pedagogy that we use in first-year math. Now, first-year math begins with something we call “Math 110.” Math 110 is like if you don’t place into either college algebra, which has been the traditional first-year math course, or into a course we call “college math,” which is your non-STEM major math—if you don’t place into either of those, then that shows you need some remediation, some bolstering of some skills that you didn’t gain in high school.

So, we have a course for that. Our first-year math program encompasses getting you to either the ability to follow a STEM major or the ability to follow majors that don’t require as intense of a math education. What we’ve done is create an online mechanism to coach students. Each student is assigned a trained undergraduate coach under the direction of our instructor who then helps that student understand how to use the Khan Academy and other tools to work on the skills that they show deficit in and work toward being able to satisfy the very same standards and tests that we’ve always used to ascertain whether a student is prepared for the rest of their college work.

Luckily, the episode on MAT 110 focused mostly on the changing roles of faculty members and TAs when using an adaptive software approach, rather than focusing on Khan Academy itself. After reviewing the episode again, I believe that it stands on its own and is relevant even with the change in software platform. Nevertheless, I appreciate that ASU officials were proactive to let me know about this change, so that we can document the change here and in e-Literate TV transmedia.

The Change

Since the change has not been shared outside of this notification (limiting my ability to do research and analysis), I felt the best approach would be to again interview Adrian Sannier, Chief Academic Technology Officer at ASU Online. Below is the result of an email interview, followed by short commentary [emphasis added].

Phil Hill: Thanks for agreeing to this interview to update plans on the MAT 110 course featured in the recent e-Literate TV episode. Could you describe the learning platforms used by ASU in the new math programs (MAT 110 and MAT 117 in particular) as well as describe any changes that have occurred this year?

Adrian Sannier: Over the past four years, ASU has worked with a variety of different commercially available personalized math tutors from Knewton, Pearson, McGraw Hill and the Khan Academy applied to 3 different courses in Freshman Math at ASU – College Algebra, College Math and Developmental Math. Each of these platforms has strengths and weaknesses in practice, and the ASU team has worked closely with the providers to identify ways to drive continuous improvement in their use at ASU.

This past year ASU used a customized version of Pearson’s MyMathLab as the instructional platform for College Algebra and College Math. In Developmental Math, we taught some sections using the Khan Academy Learning Dashboard and others using McGraw Hill’s LearnSmart environment.

This Fall, ASU will be using the McGraw Hill platform for Developmental Math and Pearson’s MyMathLab for College Algebra and College Math. While we also achieved good results with the Khan Academy this past year, we weren’t comfortable with our current ability to integrate the Khan product at the institutional level.

ASU is committed to the personalized adaptive approach to Freshman mathematics instruction, and we are continuously evaluating the product space to identify the tools that we feel will work best for our students.

Phil Hill: I presume this means that ASU’s usage of McGraw Hill’s LearnSmart for Developmental Math will continue and also expand to essentially replace the usage of Khan Academy. Is this correct? If so, what do you see as the impact on faculty and students involved in the course sections that previously used Khan Academy?

Adrian Sannier: That’s right Phil. Based on our experience with the McGraw Hill product we don’t expect any adverse effects.

Phil Hill: Could you further explain the comment “we weren’t comfortable with our current ability to integrate the Khan product at the institutional level”? I believe that Khan Academy’s API approach is more targeted to B2C [business-to-consumer] applications, allowing individual users to access information rather than B2B [business-to-business] enterprise usage, whereas McGraw Hill LearnSmart and others are set up for B2B usage from an API perspective. Is this the general issue you have in mind?

Adrian Sannier: That’s right Phil. We’ve found that the less cognitive load an online environment places on students the better results we see. Clean, tight integrations into the rest of the student experience result in earlier and more significant student engagement, and better student success overall.

Notes

Keep in mind that ASU is quite protective of its relationship with multiple software vendors and that they go out of their way to not publicly complain or put their partners in a bad light, even if a change is required as in MAT 110. Adrian does make it clear, however, that the key issue is the ability to integrate reliably between multiple systems. As noted in the interview, I think a related issue here is a mismatch of business models. ASU wants enterprise software applications where they can deeply integrate with a reliable API to allow a student experience without undue “cognitive load” of navigating between applications. Khan Academy’s core business model relies on people navigating to their portal on their website, and this does not fit the enterprise software model. I have not interviewed Khan Academy, but this is how it looks from the outside.

There is another point to consider here. While I can see Adrian’s argument that “we don’t expect any adverse effects” in the long run, I do think there are switching costs in the short term. As Sue McClure told me via email, as an instructor she spent significantly more time than usual on this course due to course design and ramping up the new model. In addition, ASU added 11 TAs for the course sections using Khan Academy.  These people have likely learned important lessons about supporting students in an adaptive learning setting, but a great deal of their Khan-specific time is now gone. Plus, they will need to spend time learning LearnSmart before getting fully comfortable in that environment.

Unfortunately, with the quick change, we might not see hard data to determine whether the changes were working. I believe ASU’s plans were to analyze and publish the results from this new program after the third term, which now will not happen.

If I find out more information, I’ll share it here.

  1. The terms remedial math and developmental math are interchangeable in this context.

The post ASU Is No Longer Using Khan Academy In Developmental Math Program appeared first on e-Literate.

by Phil Hill at June 29, 2015 11:37 PM

June 27, 2015

Steve Swinsburg

Sakai and MariaDB via Vagrant

Sakai has recently switched over to using the MariaDB connector for MySQL databases, and a number of institutions are running MariaDB in production, so I thought I might as well change my dev machine over to MariaDB.

To ease the transition, I whipped up a Vagrant box so I could run this in a VM and spin it up whenever I needed it.

Clone this:
https://github.com/steveswinsburg/mariadb-vagrant

Run this: vagrant up

Done.
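
If you want to sanity-check the VM from the Java side before pointing Sakai at it, a minimal JDBC probe is enough. The sketch below makes some assumptions that are mine rather than the repository’s: the box forwards MariaDB on localhost:3306, there is a sakai database with sakai/sakai credentials, and the mariadb-java-client jar is on the classpath.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class MariaDbSmokeTest {
        public static void main(String[] args) throws Exception {
            // Hypothetical connection details; adjust to match the box's
            // forwarded port, database name and credentials.
            String url = "jdbc:mariadb://localhost:3306/sakai";
            try (Connection conn = DriverManager.getConnection(url, "sakai", "sakai");
                 Statement st = conn.createStatement();
                 ResultSet rs = st.executeQuery("SELECT VERSION()")) {
                if (rs.next()) {
                    System.out.println("Connected to: " + rs.getString(1));
                }
            }
        }
    }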

by steveswinsburg at June 27, 2015 03:06 PM

June 25, 2015

Dr. Chuck

How is Sakai faring in the face of competition from Canvas?

(This was originally an email sent to the Sakai developer list)

A member of an institution that uses Sakai recently heard an interesting comment from a Canvas LMS representative:

“Sakai is such a cool concept but I do wonder where it will end up in the future as most its founding schools (and the schools putting resources into developing it) have now left and come to Canvas (for example, University of Indiana, University of Michigan, Stanford University).”

I thought this deserved a public reply.

My first observation is that a salesperson spreading FUD (fear, uncertainty and doubt) about Sakai suggests to me that they may not have a strong positive feeling about their own product. Most salespeople will tell you that the best thing to do is focus on what makes your product strong without even talking about other products.

That aside, let me give my response to your question. Each year I do some analytics on the developer list activity:

This chart shows a trend that at this point is about five years old. In the beginning, early adopters such as Michigan, Indiana, Stanford, and Cambridge were pulling a lot of the load as the product was literally being built and rebuilt. Also, in the earliest years new schools were adopting Sakai continuously, so a lot of the e-mail activity was helping new schools.

The early lead schools dropped in activity in 2009. In 2009, Michigan was still the #1 participant on the dev list, but a lot of increased participation was also coming from companies like Longsight and Unicon; participation from the other commercial affiliates (often using gmail.com addresses) was increasing as well.

In some ways, 2009-2011 was Sakai’s period of greatest risk as a community. A lot of things were trending downward and near the end of 2011 there was a very good chance that Sakai 2.9 would never see the light of day and it would be “last one out turn off the lights”.

The future of Sakai was originally planned to be a ground-up rewrite known as Sakai 3; however, this didn’t work out as planned, and instead a brand new product known as the Apereo Open Academic Environment (OAE) was developed. (OAE became a new type of learning platform based on social networking principles: sharing, co-editing, discovery and commenting upon content.)

But in 2012-13, there was a big turnaround with a redesigned Sakai 2.9 which included the brand new Lesson Builder tool.

Following that came consolidation with the tool-rich and innovative Sakai 10. Those who were still in the community put in a lot of effort – Michigan and Longsight were in really strong leadership positions. Other schools like Rutgers, NYU, Columbia, Duke, UNC, and others don’t show up in this dev list graph but they provided much of the money and developer talent to get us through Sakai 2.9 and Sakai 10.

Interestingly, in the 2013-14 timeframe we see a couple of factors at work. First, the 2012-13 sprint was over – we had Sakai 10. Here is a SlideShare I did that celebrated that moment:

The upcoming Sakai 11 release is the most important release for several years. However, aside from the addition of a responsive design, it is unique in that we are not expanding functionality as much as in the past: we are actually removing more code than we are adding and doing a bunch of UI rework in tools like Lessons, Gradebook and Portal. These more design-oriented activities tend not to cause lots of traffic on the dev lists.

Another interesting trend is that we now have weekly developer and teaching-and-learning meetings with up to 20 people regularly attending: the community coordinates verbally and collectively in these meetings, so less email is needed.

As we emerge into 2015, activity and commitment are very strong. The commercial affiliates (large and small) are a very important part of the community. Indiana and Stanford are quite low compared to earlier levels of participation. But something interesting is happening – some of the code that was traditionally the exclusive domain of Stanford or Indiana is now being maintained by the whole community. The interesting result is that the pace of development in those areas of the code base is increasing because now the whole community can move the code forward.

More community members are stepping forward to help because they know that they can no longer assume that Indiana, Stanford and Co. will pick up the slack.

During 2011-2014, as the founding institutions slowly backed away, patches and bug fixes started to pile up. Now that the community has inherited the code base and collective responsibility, the outstanding issues are rapidly being addressed. This is not meant as a criticism of the original partners; they built the core codebase that we all have, and without them we would have nothing. We are very much in their debt.

Looking forward, our community is solid and making lots of progress every single week. We have the luxury of putting a lot of effort into the UI and catching up with applying a backlog of local improvements from places like Oxford, Dayton, Columbia, NYU, Duke, Notre Dame, and UNC. These improvements are enriching our product. In addition, schools like Valencia, Murcia, Rutgers, and UCT are continuing to make strong direct contributions to the code base.

As we see Sakai 11 coming out with its new Morpheus responsive mobile-friendly portal and all of the user interface and performance improvements, I can see why Canvas salespeople might be getting a little nervous and using a bit of FUD to try to scare you into switching now.

Thanks to Adam Marshall of Oxford for his editing help on rewording this from an email to a blog post.

by Charles Severance at June 25, 2015 08:51 PM

Steve Swinsburg

MySQL via Vagrant

Ever needed an instance of MySQL to test something or develop against or just for fun but didn’t want to go through the hassle of installing etc?

Clone this:
https://github.com/steveswinsburg/mysql-vagrant

Run this: vagrant up

Done.

by steveswinsburg at June 25, 2015 12:55 PM

June 19, 2015

Adam Marshall

WebLearn Plans for the Next Few Months

Busy bees. Photo credit: https://www.flickr.com/photos/nomindsvision/2333302342

We thought it might be useful to outline what tasks the WebLearn team are currently working on. The focus at the moment is on 4 projects, but we are also spending time inducting two new developers and two new learning technologists:

  1. WebLearn Improved Student Experience (WISE) project (see: https://blogs.it.ox.ac.uk/adamweblearn/wise-project/)  – The WISE project will support departments, faculties, colleges and units to fast-track the development and improvement of their WebLearn presence in order to deliver an enhanced (and consistent) student digital experience, as per recommendations from previous projects.
  2. Developing an Online Reading List Management System (ORLiMS) at the Bodleian Social Science Library (Innovation Project) – Major improvements to the design, functionality and user interface of WebLearn’s Reading List tool
  3. Researcher Training Tool Improvement Exercise (RATTIE) – Numerous improvements to the user experience plus bug fixes.
  4. Rewriting the WYSIWYG HTML editor ‘item picker’. (This relates to the pop-up window that appears when you opt to ‘Browse Server’ from within the editor.) This work is being undertaken by a student intern.

In the future, WebLearn is poised to switch over to using the IT Services Group Store as the provider of institutional groups (unit and course groups); this will happen in July.

There is some good news in that the Education IT Board has approved the Mobile Learning with WebLearn (MOLE) project brief. (A project brief is a pre-project phase where requirements are fully defined and the project plan is made.) The full project will transform WebLearn into a fully responsive service, meaning a much-improved user experience on a mobile phone. In addition, the project will develop a handful of ‘Learning Apps’, and the next few months will be spent mapping out exactly which Apps will be developed.

WebLearn will also be providing the back-end to the Humanities Division’s  ‘Frameworks: The Oxford Mobile Career Planner’. The project is in its very early stages so details may change but it is currently planned that WebLearn will act as a data store and present anonymised skills audit data to skills training officers who will be able to assess the effectiveness of Research Training at Oxford University. The project will also develop an App for students to record and reflect upon researcher career development in terms of skills accrued.

Another substantial piece of work is the rewrite of the integration code that links WebLearn and Turnitin (the plagiarism awareness service). Turnitin are withdrawing the current interface (API) and moving to an IMS Basic LTI with extensions approach. The new integration should be invisible to the end user, although we may be able to improve the range of options available via WebLearn’s Assignments tool.

by Adam Marshall at June 19, 2015 11:45 AM

June 16, 2015

Dr. Chuck

Announcing the (very early) Sakai – Tsugi Bridge

In my previous post, I announced a Java implementation of my Tsugi library. The goal of Java Tsugi is to allow externally hosted LTI applications to be quickly developed and hosted.

But there is more….

I have long been positing that Tsugi could be a way to build portable applications that run in any Java framework. Take a look at this slide set starting at slide 24 through the end. Pay particular attention to slide 28.

I claim that the same Tsugi application can run standalone in a Tomcat or in a Sakai-provisioned Tomcat as a Sakai tool with zero code changes. I see Tsugi as a great way to build the next generation of Sakai tools like XWiki – we can build an LTI capable XWiki to plug into Sakai or any other LMS.

I have made some very initial steps in this repo:

https://github.com/csev/tsugi-sakai

This contains a Sakai implementation of the Tsugi APIs so that the Tsugi Java Servlet can be provisioned to run in a Sakai Tomcat. The implementation is currently empty – but I have worked out the class loader issues that allow me to provision a Tsugi servlet with a different implementation without changing the servlet.
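
To make the idea concrete, here is a minimal Java sketch of the pattern being described. The interface and provider names are hypothetical, and ServiceLoader is just one common way to get the effect (with the implementation registered via the standard META-INF/services mechanism) – it is not necessarily how Tsugi wires things up. The point is that the servlet codes only against an interface, and whichever environment hosts it – a standalone Tomcat or a Sakai-provisioned Tomcat – supplies the implementation at runtime, so the servlet itself never changes.

    import java.io.IOException;
    import java.util.Iterator;
    import java.util.ServiceLoader;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    // Hypothetical stand-ins for the Tsugi runtime API.
    interface LaunchContext {
        String currentUserDisplayName();
    }

    interface LaunchContextProvider {
        LaunchContext resolve(HttpServletRequest req);
    }

    public class PortableToolServlet extends HttpServlet {
        @Override
        protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
            // The servlet never names a concrete class; whichever implementation jar
            // the container (standalone Tomcat or Sakai) puts on the classpath wins.
            Iterator<LaunchContextProvider> providers =
                    ServiceLoader.load(LaunchContextProvider.class).iterator();
            if (!providers.hasNext()) {
                resp.sendError(500, "No provider implementation on the classpath");
                return;
            }
            LaunchContext ctx = providers.next().resolve(req);
            resp.setContentType("text/plain");
            resp.getWriter().println("Hello " + ctx.currentUserDisplayName());
        }
    }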

So this is very much a start – the README.md has a lot of steps – and at the end all you get is a 500 error as shown below – but it does show how we can eventually connect the Tsugi and Sakai worlds.

[Screenshot: the Tsugi servlet returning a 500 error in Sakai]

by Charles Severance at June 16, 2015 05:16 PM

June 13, 2015

Dr. Chuck

Announcing the Tsugi Java Library for Building Interoperable Learning Applications

I have been focused on laying the technical groundwork for interoperable learning applications for the past ten years. Through my work on Sakai and IMS I have tried to help move the entire industry forward to enable innovative teaching and learning applications. While we have made great progress, there is much to do. My recent “State of Sakai” talk at the Apereo Conference alludes to the kind of work we still need to do.

I have been exploring the space where applications are both portable and interoperable through my Tsugi project. Over the past year I have spent more time on Tsugi than I have on Sakai because I think that exploring future architecture is a very high priority task.

Last week I gave a workshop at the Apereo 2015 conference on Building Applications using IMS Learning Tools Interoperability using my PHP Tsugi. While the workshop went very well, it was clear that the folks that needed to build interoperable applications *right now* were not interested in programming in PHP. Also, I am well into my Summer of Sakai 2015 effort working with a number of students over the summer and it is clear that it is simply too difficult to teach new developers how to write Sakai applications.

All of this was a “perfect storm” that motivated me to drop everything and put in an all-out effort to port my Tsugi library to Java in the past week.

Announcing Tsugi Java 0.0.1

It has been a pretty crazy week – I coded day and night and pretty much ignored my inbox – but I am pretty pleased with the results. While Java Tsugi still needs some work, it is already quite capable of building LTI 1.0 applications in Java – and they feel very clean and elegant. Here are some documents:

This will continue to move and evolve but it is in good enough shape to share with others to start getting broader input.
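
To give a feel for what the library takes off your plate, here is the kind of OAuth 1.0a plumbing that sits underneath every LTI 1.0 Basic Launch. This is an illustrative Java sketch, not Tsugi’s actual API – the class and method names are mine, and it takes shortcuts: duplicate parameter names and query-string parameters on the launch URL are not handled, the URL is assumed to already be in OAuth’s normalized form, and a plain string compare stands in for a constant-time one.

    import javax.crypto.Mac;
    import javax.crypto.spec.SecretKeySpec;
    import java.net.URLEncoder;
    import java.util.Base64;
    import java.util.Map;
    import java.util.TreeMap;

    public class LtiSignatureCheck {

        // RFC 5849 percent-encoding, built on top of URLEncoder.
        static String pctEncode(String s) throws Exception {
            return URLEncoder.encode(s, "UTF-8")
                    .replace("+", "%20").replace("*", "%2A").replace("%7E", "~");
        }

        // Recomputes the oauth_signature for a launch POST and compares it to the
        // one the Learning Management System sent.
        static boolean isValidLaunch(String launchUrl, Map<String, String> post, String secret) throws Exception {
            String presented = post.get("oauth_signature");
            if (presented == null) return false;

            // Normalized parameter string: every form field except oauth_signature,
            // sorted by name, each name and value percent-encoded.
            TreeMap<String, String> sorted = new TreeMap<>(post);
            sorted.remove("oauth_signature");
            StringBuilder norm = new StringBuilder();
            for (Map.Entry<String, String> e : sorted.entrySet()) {
                if (norm.length() > 0) norm.append('&');
                norm.append(pctEncode(e.getKey())).append('=').append(pctEncode(e.getValue()));
            }

            // Signature base string and HMAC-SHA1 key (the token secret is empty in LTI 1.0).
            String base = "POST&" + pctEncode(launchUrl) + "&" + pctEncode(norm.toString());
            Mac mac = Mac.getInstance("HmacSHA1");
            mac.init(new SecretKeySpec((pctEncode(secret) + "&").getBytes("UTF-8"), "HmacSHA1"));
            String expected = Base64.getEncoder().encodeToString(mac.doFinal(base.getBytes("UTF-8")));
            return expected.equals(presented);
        }
    }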

I also recorded a short introductory video about Tsugi Java:

Conclusion

This is a great start and there is much still to do for Java Tsugi. I am hoping that others will help move this effort forwards and contribute to the project. For the next few weeks, I now need to sprint on finishing up a bunch of things for the Sakai 11 code freeze.

Please feel free to let me know if you have any questions or comments.

by Charles Severance at June 13, 2015 09:25 PM

June 10, 2015

Apereo Foundation

June 01, 2015

Steve Swinsburg

Sakai, ditch the custom classloaders

A few years ago I added support to the Sakai Maven Plugin to deploy everything that normally goes into /shared/lib and /common/lib into just /lib, as per the standard Tomcat classloader layout.

To use, add -Dsakai.app.server=tomcat7 to the build command. Everything gets deployed to /lib and Sakai starts up without any modifications (except the standard connector modification in server.xml and the optional performance improvements in catalina.properties).
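
For example – assuming a standard Sakai source checkout and the sakai:deploy goal from the Sakai Maven Plugin, with the Tomcat path being whatever is right for your machine – the full build and deploy might look like:

    mvn clean install sakai:deploy -Dsakai.app.server=tomcat7 -Dmaven.tomcat.home=/path/to/tomcat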

Enjoy the future!

by steveswinsburg at June 01, 2015 11:06 AM

May 28, 2015

Apereo OAE

Apereo OAE cloud hosting partnership

Cloud or above-campus services can provide many benefits for higher education, including management simplicity and cost effectiveness. Such services can also create challenges; from reducing the ability to integrate or innovate, through to legal, ethical and data privacy concerns that cross national boundaries. Apereo exists to help higher education and other institutions meet those challenges. Above all, we believe that cloud based offerings should enable choice, openness, and institutional control, rather than setting up yet another remote, rent-extracting gatekeeper. That's why the Apereo Open Academic Environment is available by a variety of routes to suit the needs of your institution.

One route will be familiar. Your institution can choose to download OAE, and install and run it for your faculty and students - and, if you wish, others. OAE is licensed under an Apache license, allowing you the freedom to customize, tweak, and run OAE in a variety of contexts. OAE is a growing and vibrant community which provides peer-to-peer support in a classic open source manner. Our ESUP colleagues in France have chosen this route to deployment for French higher education.

If, on the other hand, you wish to avoid the complexity of installation, configuration, and maintenance, and take advantage of OAE's strong network effects and its ability to seamlessly collaborate and share across institutional boundaries, other options are open to you. Apereo has partnered with a commercial provider - *Research, a member of the core OAE stakeholder group - to provide a graduated, co-operative hosting agreement. This agreement has three main options:

  • Option 1: Receive an institutional tenant that can be used for free without an SLA/data processing agreement, but under a reasonable use policy. Under this arrangement, individual users will need to accept a Terms and Conditions agreement before using the environment
  • Option 2: Receive an institutional tenant with an SLA and a data processing agreement. The institution will only be charged the full economic costs of providing this service to the institution, plus a 20% contribution towards the further development of the OAE platform.
  • Option 3: Become a strategic OAE project partner and contribute to the strategic direction of the project. In exchange, the project partner receives an option to use Option 2 at no cost for 12 months. The project partner investment goes directly to the Apereo OAE project to support further design, development and maintenance.

All these options allow your institution to retain full control of the look-and-feel of the tenant, and to control which institutions you choose to collaborate and share with. Content migration tools will become available in the next period of OAE development that will allow you to move between options. They will be free and open source. 

You can read the details behind these options and the full partnership agreement below. We believe they provide a path to participation, sustainability and growth that remains 100% open. Join the 383 institutions with OAE tenancies, and begin to explore the next generation of academic collaboration today.

 

by Nicolaas Matthijs at May 28, 2015 03:51 AM

May 26, 2015

Sakai Project

Marist College Concludes its First Sakai-Based MOOC

Marist is concluding their first MOOC on Enterprise Computing

by MHall at May 26, 2015 11:50 PM

Sakai TCC is renamed Sakai Project Management Committee

Apereo Board approved renaming the Sakai CLE TCC as the Sakai Project Management Committee.

by MHall at May 26, 2015 11:45 PM

May 13, 2015

Apereo OAE

Apereo OAE Jack Snipe is now available!

The Apereo Open Academic Environment (OAE) project team is extremely pleased to announce the eleventh major release of the Apereo Open Academic Environment: OAE Jack Snipe, or OAE 11.

OAE Jack Snipe brings a wide range of new features and capabilities, including group profile pages, the ability to delete and archive groups, an editor role for collaborative documents and increased configuration support for landing pages and the footer. In addition, OAE Jack Snipe includes an extraordinary number of usability gains, accessibility improvements and bug fixes.

Changelog

Group Profiles

OAE Jack Snipe brings group profile pages, allowing for groups to be better contextualised and presented to non-members. A group profile page contains a description of the group, ensuring that the subject and goal of the group is clear to the visitor, a public activity feed, showcasing the recent activity that has taken place in the group, and a list of featured members, providing an idea of the people involved in the group.

Group profiles are anticipated to be especially useful when browsing and discovering public or joinable groups, but will also provide convenient additional context when visiting groups you're already a member of.

Delete group

OAE Jack Snipe makes it possible for group managers to delete groups, allowing for inactive or unused groups to be removed from membership libraries. Deleted groups are not removed from the system entirely though, but are archived instead. Therefore, deleted groups can be re-activated by an administrator at any point in time.

Editor role

Recent usage feedback has indicated a need for allowing people to edit a collaborative document without being able to perform other administrative tasks such as deleting the document. Examples include a collaborative writing course where students needed to be able to contribute to a collaborative document without being able to delete it.

Therefore, OAE Jack Snipe introduces an editor role for collaborative documents. Users and groups with the editor role will be able to edit the collaborative document without being able to manage it (delete, manage access, etc.)

List items

Up until Apereo OAE 10, the display of list item titles was limited to a single line of text. As space was limited, this often meant that the title was cut off too quickly, making it difficult to identify an item.

OAE Jack Snipe ensures that all list items (libraries, search, etc.) will display a much larger part of the item's title (up to 2 full lines), making it a lot easier to identify the item you're looking for. We are convinced that this relatively small usability improvement will make a world of difference when using the system.

Mobile login

On mobile devices, there will no longer be a need to sign in every time a user visits their tenant. A session will now be remembered for up to 30 days, ensuring that OAE and its activity feed can be accessed quickly and easily.

REST API improvements

As easy-to-use and well documented REST APIs have always been a critical part of the OAE architecture, OAE Jack Snipe introduces a range of REST API enhancements.

Alongside various REST API usability improvements, OAE Jack Snipe introduces cross-origin resource sharing (CORS) support, making it easier for external applications to integrate with the OAE REST APIs.

The Swagger REST API documentation framework has also been upgraded to the latest version, adding some nifty additional features to the REST API documentation pages.

Google Authentication

The Google Authentication integration that ships with Apereo OAE has been upgraded to work with the latest version of the Google Authentication API. In addition, it is now also possible to configure multiple Google Apps authentication domains per tenant.

Landing page configuration

Apereo OAE tenant landing pages can be fully customised, allowing for an institution to present and contextualise their tenancy with great flexibility. OAE Jack Snipe introduces a number of additional tenant landing page configuration and customisation options, providing even greater control over their look and feel.

Footer configuration

The page footer can now be fully configured and customised on a per installation basis. Amongst other things, this will allow Apereo OAE hosting providers to name the installation, link to a website for the installation and link back to the website for the hosting provider.

Try it out

OAE Jack Snipe can be tried out on the project's QA server at http://oae.oae-qa0.oaeproject.org. It is worth noting that this server is actively used for testing and will be wiped and redeployed every night.

The source code has been tagged with version number 11.0.0 and can be downloaded from the following repositories:

Back-end: https://github.com/oaeproject/Hilary/tree/11.0.0
Front-end: https://github.com/oaeproject/3akai-ux/tree/11.0.0

Documentation on how to install the system can be found at https://github.com/oaeproject/Hilary/blob/11.0.0/README.md.

Instructions on how to upgrade an OAE installation from version 10 to version 11 can be found at https://github.com/oaeproject/Hilary/wiki/OAE-Upgrade-Guide.

The repository containing all deployment scripts can be found at https://github.com/oaeproject/puppet-hilary.

Get in touch

The project website can be found at http://www.oaeproject.org. The project blog will be updated with the latest project news from time to time, and can be found at http://www.oaeproject.org/blog.

The mailing list used for Apereo OAE is oae@apereo.org. You can subscribe to the mailing list at https://groups.google.com/a/apereo.org/d/forum/oae.

Bugs and other issues can be reported in our issue tracker at https://github.com/oaeproject/3akai-ux/issues.

by Nicolaas Matthijs at May 13, 2015 02:56 PM