Before I get to Frost, I want to absorb the work of some other people, most of whom, unlike Frost, intentionally engage and think about complexity. I want to start with an article called On the Status of Boundaries, both Natural and Organizational: A Complex Systems Perspective (Emergence, ISSN: 1521-3250, 2002, 3(4): 32-49) by Kurt A. Richardson and Michael R. Lissack. I start with them because of how they start their article: "Contemporary science with its strong positivism tends to trivialize the nature of boundaries." I think this says better, certainly more succinctly, what I was trying to say in my previous post: those who manage data security tend to trivialize the nature of boundaries, and this likely leads to most of their problems. I should probably go back and rewrite yesterday's post, but …
Anyway, Richardson and Lissack then go on to say that "Complexity thinking forces us to review our conceptions of what natural boundaries are", and the rest of the article attempts just that. They make some points that shed light on the issue for data security—and for education, by the way. I'll deal with data security in this post. They first establish, to my satisfaction at any rate, that boundaries are the foundation of knowledge—if we can't identify boundaries that distinguish one thing from other things, then we have difficulty saying we know that thing; however, our "boundary assumptions go unquestioned, resulting in flawed understanding and leading to flawed decisions and actions" (33).
Like Snowden's Cynefin Framework, they distinguish complex systems from complicated systems, but their distinction rests more on scientific properties than on the organizational properties Snowden emphasizes. For Richardson and Lissack, complex systems are
comprised of a large number of non-linearly interacting non-decomposable elements. The interactivity must be such that the system cannot be reducible to two or more distinct systems, and must be sufficient (where the determination of “sufficient” is problematic) to allow the system to display the behaviours characteristic of such systems. (34)

While some complicated systems, computers for instance, can contain non-linear interactions, they are not complex. Unlike the prescribed and fixed sub-systems of a complicated system such as a jet airplane, complex systems have emergent and temporary sub-systems. This implies that the boundaries of complex systems are inevitably emergent and temporary, and all human organizations and their sub-organizations are complex systems. Richardson and Lissack say pointedly that "the boundaries describing subsystems in a complicated system are prescribed and fixed, whereas the boundaries delimiting subsystems in a complex system are emergent, critically organized, and temporary. By this definition most organizational working boundaries are those of a complex system" (36).
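Richardson and Lissack don't give a computational example, but a toy one helps me see what "emergent" means here. The sketch below is my own illustration, not anything from the article: it uses the Kuramoto model of coupled oscillators, a standard minimal picture of non-linearly interacting elements, with parameter values chosen arbitrarily. Below a certain coupling strength the population stays incoherent; above it, a synchronized cluster emerges that belongs to no single oscillator and exists only as long as the interaction sustains it.

```python
import numpy as np

def kuramoto_order(K, N=200, dt=0.05, steps=4000, seed=0):
    """Simulate N non-linearly coupled oscillators (mean-field Kuramoto model)
    and return the final order parameter r: 0 means incoherence, values near 1
    mean a synchronized cluster has emerged. The cluster, when it appears, is
    an emergent 'subsystem' that no single oscillator contains on its own."""
    rng = np.random.default_rng(seed)
    omega = rng.normal(0.0, 1.0, N)              # natural frequencies
    theta = rng.uniform(0.0, 2.0 * np.pi, N)     # initial phases
    for _ in range(steps):
        z = np.mean(np.exp(1j * theta))          # complex order parameter
        r, psi = np.abs(z), np.angle(z)
        theta += dt * (omega + K * r * np.sin(psi - theta))  # non-linear coupling
    return np.abs(np.mean(np.exp(1j * theta)))

# Weak coupling: no collective structure emerges. Strong coupling: it does.
print(f"K = 0.5 -> r = {kuramoto_order(0.5):.2f}")   # stays near 0
print(f"K = 3.0 -> r = {kuramoto_order(3.0):.2f}")   # a coherent cluster emerges
```

The point about "sufficient" interactivity shows up directly: the same elements under the same rule produce qualitatively different wholes depending on whether the coupling crosses a threshold, and the boundary of the resulting cluster is something the interaction creates and can just as easily dissolve.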
Herein lies the big problem for data security. Emergent and temporary boundaries don't merely complicate the data security issue; they complexify it. Boundaries emerge and wane, and though many are stable enough for us to rely on over the course of a human lifetime, many more are not. The point for an organization is that all boundaries will shift, wax, and wane. This is a physical fact.
Moreover, organizational boundaries are made still more problematic by scale. Boundaries tend to exist at one scale of reality and not at another. For instance, the very thick, impregnable steel walls of a bank vault become quite porous and pregnable at the atomic scale. Information can leach through the thickest steel. Curiously enough, most modern information works at a very small scale that most of us, data security experts included, simply cannot imagine and at which we are not mentally or physically equipped to function.
Further, boundaries within even the simplest structures are dynamic. Drawing on the work of Sommerer and Ott (1993, A physical system with qualitatively uncertain dynamics, Nature, 365: 138–40), Richardson and Lissack note that "even with qualitatively stable order parameters, qualitatively unstable behavior occurs" (40). When two systems interact (two people, for instance), the boundaries between them cannot remain stable. We know this intuitively, yet we design organizations and data security systems as if the boundaries surrounding our data were fixed and persistent. They are neither. They cannot be.
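I can't reproduce Sommerer and Ott's system here, but the general point, that holding a non-linear system's parameters fixed buys you no stability of behaviour, is easy to see in a toy sketch of my own. The example below uses the logistic map, a standard textbook system unrelated to their paper: the control parameter never changes, yet two trajectories that begin a hair apart end up bearing no relation to one another.

```python
# Toy illustration (not Sommerer and Ott's system): the logistic map with a
# fixed control parameter. The rule stays the same throughout, yet two
# trajectories that start 1e-10 apart diverge until they are uncorrelated.

r = 4.0                      # fixed parameter, qualitatively "stable"
x, y = 0.4, 0.4 + 1e-10      # two almost identical starting points

for step in range(61):
    if step % 10 == 0:
        print(f"step {step:2d}: |x - y| = {abs(x - y):.3e}")
    x = r * x * (1.0 - x)
    y = r * y * (1.0 - y)
```

By around step 40 the difference between the two trajectories is as large as the values themselves. Nothing about the system changed; the apparent boundary between "the same behaviour" and "different behaviour" was never stable to begin with.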
Richardson and Lissack conclude their article with a philosophical position called quasi-critical pluralism, a dynamic, dialogic position between objective realism on one hand and subjective constructivism on the other. That philosophical position deserves its own discussion, but later. As it is, I'm enjoying this line of thought, so I think I'll continue it for a few more posts.