Cramer, Hilton, Kimpton on Changing Information Systems Landscape in Higher Education at the CNI Fall Meeting Plenary Session

Posted by carol, December 15, 2014

Tom Cramer, Michele Kimpton, and James Hilton discuss the landscape of information systems at the CNI Fall Member Meeting plenary session.

Winchester, MA – The CNI Fall Member Meeting opened with a plenary panel discussion led by CNI Executive Director Clifford Lynch entitled "A Conversation on the Changing Landscape of Information Systems in Higher Education: Community Source, Community Platforms and Systems as Services." Panelists Tom Cramer, Chief Technology Strategist, Stanford University; James Hilton, Dean of Libraries and Vice Provost for Digital Education Initiatives, University of Michigan; and Michele Kimpton, Chief Executive Officer, DuraSpace, highlighted some of the advantages and risks of developing and using community source software, along with associated business models for software as a service (SaaS). The discussion concluded with a sobering look at pressing digital scholarly ecosystem issues in need of focused attention: cyber security and privacy.

Clifford Lynch introduced the panel session with the comment that community source has "served us well for 10-15 years." Now that software is often delivered as an online service, he asked panelists to weigh in on where that leaves us with regard to understanding our collective position in managing the lifecycle of scholarly information.

James Hilton believes that community source software is here to stay, but that it rests on assumptions about how intellectual property rights are managed that may not hold true. "Getting it done" depends on how labor is organized to achieve particular outcomes, and that organization involves a host of non-technical issues such as IP rights management and legal agreements.

"Tuning the community source development model is what DuraSpace has been doing for several years," explained Michele Kimpton. Collaborating to develop code together is at the center of the community source process. She believes that increased transparency has enabled DuraSpace and the open source communities it serves to advance the DSpace and Fedora software together. This deeper, more open form of community engagement has been transformational for the projects DuraSpace stewards.

Hilton reminded the audience that as the scale of institutional investment in community source goes up, the desire to manage the process also increases. This effect often leads to increased participation in community source development.

Software as a service

If you run open source software as a service, do you lose the ability to advance core infrastructure? Tom Cramer suggested that this need not be the case if the software is run at scale with a management focus on building digital workflows.

Kimpton pointed out the difficulty smaller institutions face in upgrading open source software to take advantage of current advances in functionality. "This is why DSpaceDirect (a hosted repository service) works well for smaller and mid-sized institutions," she said, "because upgrades are included in subscription prices." Further, she thinks of the "cloud" as a utility, similar to electricity, that will turn the traditional operation of IT on its head. The same community development and governing process can continue even if the software is run in the cloud, provided the right governance and controls are in place. Community source and SaaS do not need to be mutually exclusive in advancing core infrastructure. In fact, community members may be able to participate more fully in innovative software development if some operations are managed as a service.

Security

There is a fear that system break-ins will happen if IT systems are run externally. The alternative is that institutions commit to investing in and operating everything themselves. In this scenario, institutional value-add is the trade-off for the long-term costs associated with running IT systems in-house.

Cramer pointed out that the real challenge is fostering innovation, development, operations, and security across all layers of library stewardship. Engaging with external collaborators and service providers to achieve institutional strategic goals is part of the big picture.

Kimpton mentioned that increased software standardization allows for economies of scale in making resources both durable for the long term and accessible across communities and institutions. "There has been a lot of customization work in the DSpace and Fedora communities. The beauty of more standardization is that you can look at aggregations like cloud to cloud or one kind of data pipeline to another," she said.

Cramer added that modularization, making services and applications easy to integrate or "plug in" to a technology stack, is an alternative to standardization. Libraries are part of a large ecosystem, which is why native linked data interconnectivity is seen as a possible path to institutional access and value going forward. Standards matter less when the goal is to connect large systems to one another, although standards for content and legal agreements make it easier to work across multiple systems.
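As a rough illustration of what native linked data interconnectivity can look like in practice, the minimal Python sketch below describes a repository object as RDF triples and points its creator at an external authority record rather than a local text string. It assumes the rdflib library and uses hypothetical identifiers along with the PCDM vocabulary familiar from the Fedora community; it is not code from any of the systems discussed.

    # Minimal sketch with hypothetical identifiers; assumes rdflib is installed.
    from rdflib import Graph, Literal, Namespace, URIRef
    from rdflib.namespace import DCTERMS, RDF

    PCDM = Namespace("http://pcdm.org/models#")                   # shared community vocabulary
    item = URIRef("https://repository.example.edu/object/1234")   # hypothetical repository object

    g = Graph()
    g.add((item, RDF.type, PCDM.Object))                          # typed with the shared vocabulary
    g.add((item, DCTERMS.title, Literal("Sample dataset")))
    # The creator is an external authority URI, not a local string, so other
    # systems can connect to the same entity without bespoke mapping code.
    g.add((item, DCTERMS.creator, URIRef("http://viaf.org/viaf/123456789")))

    print(g.serialize(format="turtle"))

Because the description resolves to shared URIs, any linked-data-aware system can follow those links directly, which is the "plug in" quality Cramer described.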

Changing patterns of innovation

Panelists agreed that if the process of innovation is distributed among many institutions interested in solving a problem, the potential for “moving innovation up the stack” increases. Networked services or community source development initiatives will need to determine how to innovate at scale and at current points of greatest need.

Hilton emphasized that it’s still about staying in control of our content.

Campus IT role

Does an in-house process that keeps developer effort focused on building software breed increased innovation? IT groups may instead be called on to manage vendor contracts, but tuning IT focus could also translate into savings as innovative practices become part of workflows.

Costs for cloud storage, an item that impacts many budgets as demand for storage and preservation of more and larger data increases, can be understood by looking at the way the service is used. Kimpton pointed out that paying for what you use means the price tag goes up or down depending on usage. Storage (putting content in the cloud and keeping it there) is a constant cost that grows as more data is added, although institutions do not have to provision large storage systems upfront as they do today on campus; they pay only for the storage they use. Compute time (running software as a service in the cloud) varies depending on how much time and capacity it takes to run a particular service or operation.
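To make the pay-for-what-you-use arithmetic concrete, here is a minimal sketch; the unit prices and usage figures are hypothetical placeholders, not rates quoted by the panel or by any provider.

    # Hypothetical unit prices for illustration only; real providers publish their own rates.
    PRICE_PER_GB_MONTH = 0.02       # storage: charged on every gigabyte held, each month
    PRICE_PER_COMPUTE_HOUR = 0.10   # compute: charged only while services are running

    def monthly_cost(stored_gb: float, compute_hours: float) -> float:
        """Usage-based bill: storage scales with data kept, compute with runtime."""
        return stored_gb * PRICE_PER_GB_MONTH + compute_hours * PRICE_PER_COMPUTE_HOUR

    # Storage climbs as content is added; compute fluctuates from month to month.
    usage = [(500, 20), (650, 35), (900, 15)]   # (GB stored, compute hours) per month
    for month, (gb, hours) in enumerate(usage, start=1):
        print(f"Month {month}: ${monthly_cost(gb, hours):.2f}")

Nothing in this model corresponds to provisioning a large storage system upfront, which is the contrast with campus infrastructure that Kimpton drew.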

Solving security

Some institutional users feel that implementing open source solutions or using external services presents unacceptable security risks.

Cramer suggested that “we” are “they” in terms of who will guarantee community source security for scholarly assets in the future. He pointed out that many of the people and institutions who have the knowledge and resources to solve security problems were “in the room”.

Hilton explained that legal infrastructure needs to advance alongside software and contractual agreements for services. He sees a looming pair of disasters beyond the scholarly ecosystem's control: the loss of net neutrality and cyber terrorism. Community source software development assumes that we will continue to have a clean utility layer, and that assumption could prove to be incorrect.

“It will be messy and not really different for anyone in terms of where the server runs—it’s still going to be messy,” said Cramer. Panelists agreed that diversity in compute cultures and technology across the ecosystem is a strength with regard to protection from security risks.