Vince Kellen, PhD, CIO, University of California San Diego
Complexity bedevils us. And in the world of information technology, we have loads of it. Digitization, especially in universities, has grown at a furious pace, perhaps even outstripping a huffing and puffing Moore’s Law, which for years shielded IT leaders from this thicket of complexity. And within the last few years, the boom in educational technology investment has created both frenetic activity within our vendor community and diverse technology adoption patterns among our students and our faculty.
“Mental inertia causes a discounting of evolving solutions and a continued investment in aging ones”
But complexity also serves as a sieve that separates leaders from laggards. Objects in the future are often more complex than they appear, and not everyone sees the same things. Individuals and institutions have three possible reactions when faced with a complex future: they can react to the future, they can plan for it, or they can shape it. Higher education institutions that choose to shape the future can influence the course of unforeseen events far more readily. These institutions gain a stronger foothold because the solutions they need and the prevailing market conditions are better aligned with their capabilities. After all, they helped create the future conditions they can then exploit.
CIOs and IT leaders inherit a significant chunk of the complexity of this problem. The major trends of the last several years, Web 2.0 and social media, cloud computing, big data and analytics, the Internet of things, and the EdTech boom, are in full swing and have grown into mega-trends reshaping not just the economy but human life. Vendors are also adjusting, and in some cases struggling to keep pace, trying to find a way to reconcile the needs of both the leaders and laggards within their customer base. Like a marathon up a mountain, as the slope gets steeper, the distance between leaders and laggards increases.
For CIOs, the choices we have to make are getting more interesting. The monolithic enterprise resource planning (ERP) system has been gradually decomposed as many vendors now offer software-as-a-service (SaaS) solutions for ERP-like modules, including tools that support advising, student success analytics, customer relationship management (CRM), human capital management, faculty promotion and tenure, and alumni relations and development. The list goes on.
Meanwhile, back in the classroom, the vendor community has erupted as investors poured billions into educational technology start-ups of all sorts, including gradebooks, clickers, active learning tools, course capture and video production, MOOCs, electronic textbooks, online testing and even online testing fraud detection. This megabloom has created many options and choices for students and faculty and, of course, a complexity challenge for IT leaders.
In the research labs, supercomputing is undergoing speciation. Very large scale, high speed, general purpose computing clusters are being outstripped by large scale clusters built from different combinations of technologies chosen to more closely match the algorithms running on them. Cloud computing, graphics processors (GPUs), flash and soon non-volatile memory, neuromorphic computing and neural chips are spawning all sorts of new high performance computing animals. This makes it harder for researchers to be both masters of their research domain and masters of this growing research ecosystem.
In research, the pace of digitization has been particularly acute. Business transaction data can now grow only as fast as business acquires more customers. Even the growth of video is a bounded digitization problem. Video typically grows only as dense as the human retina can detect differences, creating growth rates that eventually will be governed by growth in the number of people served. But research data knows few limits to density growth. As scales of analysis go down to the molecular and atomic level, the time dimension shrinks dramatically, even to the level of the femtosecond (one millionth of one billionth of a second). Research at this scale of analysis now underpins so much of the development of new materials and new drugs. The future is nanotech.
At the same time that analysis is conquering smaller and smaller scales, researchers are aggregating population-scale data sets, such as genomic and other -omic data, multiplying data volumes further. As you might imagine, the speciation of research computing goes along with a bubbling ecosystem of software solutions, most of them developed by and for researchers, creating even more difficult matching problems. The future is multi-scale.
If you can intellectually hover above this jungle of IT within higher education, it becomes clear that the ecosystem has been getting richer, much more complex, and as daunting as ever. For CIOs and IT leaders, this creates a central challenge: when should we design solutions that respect the past versus design solutions that embrace the future?
Too often, I see organizations looking at this evolving ecosystem with mental frames that are wrought from the distant IT past and unaware of the need to examine this question. This mental inertia causes a discounting of evolving solutions and a continued investment in aging ones. To an extent, the vendor IT market appreciates this, as it allows vendors to extract a stable profit from this mental inertia. The vendor can slow down the rate of change in their solutions, keep customers who want to stay with the past happy and redirect their resources elsewhere. We know this as ‘milking the cash cow.’
But on the flip side of this question, chasing ephemeral solutions can create its own problems. Universities have a tendency to do this, especially if they feel stress in particular areas and a forward-looking administrator or faculty member makes an eloquent case for change. After all, when the going gets tough, the tough go shopping. Shamelessly pursuing every new thing creates data integration challenges and often leaves universities in the lurch when a vendor disappears from the market.
But inside this choice between the past and the future lies the great opportunity. The trick lies in knowing, not just for CIOs and IT leaders but for the leadership team of the institution, which decisions need to respect the past, which decisions need to embrace the future and, more importantly, how the organization can move its IT solutions from the past to the future cost-effectively.
What makes this choice between past and future interesting is the state of data and system integration, which has improved significantly in the last decade. The development of robust and now streaming APIs, along with continued virtualization advances including containers, lambdas and other fascinating ‘micro’ services, allows for levels of scale in distributed systems we have not seen before. Networking, both wired and wireless, has advanced, enabling cooperative systems across the globe. These advances in data movement and distributed computing are begging for system modularity.
In this bubbling cauldron of complex solutions, agile, incremental and modular maneuvers are in order. Foresight will be the key skill. This will require CIOs and IT leaders to get out and shape the future by collaborating and innovating with internal and external partners. Since it will be impossible for any institution or IT leader to turn back the clock and slow the pace of technical change, the alternative is to stay ahead of it. This will require a community of people, both inside the institution and across a diverse vendor pool, working collectively to increase their knowledge. By collectively and collaboratively shaping the future, we might be able to tame our portion of the wild IT ecosystem.