The Internet as Connective Tissue

The Internet–or more specifically the Web–represents for me a networked extension of distributed computing, a trend that began with the advent of the personal computer. The Wikipedia description of “a network of networks” (http://en.wikipedia.org/wiki/Internet) is catchy but needlessly redundant: a network connected to another network is merely a larger network with rules that define that connection. I’m not sure where it will end, but I suspect it will be with something that is like the Web (in terms of connections) but even more ubiquitous.

While the Internet existed prior to PCs and grew out of ARPANET and the need for redundant connections among mainframe-based computers (in order for the communication infrastructure to survive a nuclear attack), the popular (and, for me, profound) implementation of the Internet via the graphical browser allowed average humans, not just military specialists and university researchers who spoke Gopher, to connect their personal computers. While this ability to connect has not necessarily spawned a technological efficiency in terms of shared processing power (think, “Let’s hook all our PCs together and create a super-computer like Skynet in the Terminator movies or Holmes IV in Heinlein’s The Moon Is a Harsh Mistress”), it has certainly spawned a communicative efficacy in terms of an always-on connection. Prior to the Web, most PCs were used for stand-alone applications–word processing, spreadsheets, databases, and desktop publishing; with the release of Mosaic (and Netscape’s subsequent commercialization), PCs increasingly came to be viewed as communication tools. Witness the sale of email machines. Witness the move of software applications to the Web as a service (in fact, with a fast Internet connection and a browser, you don’t actually need much software on your PC anymore–just access to Google Documents).

The key to the explosive growth of the Internet in the mid-1990s was not AOL (although that company certainly helped mask the complexity of access) but DNS, the Domain Name System, which allows a machine address (such as 128.83.40.25) to be reached through a human-readable name (such as the University of Texas website). No longer did we have to remember an arcane series of up to 12 digits in order to reach a remote computer; we simply had to remember a word followed by a period (a dot) and a suffix (such as “com” for commercial or “edu” for education). The foundation for that growth (also known as the “dot-com bubble”) was a well-defined set of basic network communication rules (TCP/IP: Transmission Control Protocol and Internet Protocol) which specified how messages (in the form of small packets) would be sent but not the what or why or when. This lack of definition and central administration allowed organic growth. Much like a living organism, the Internet grew as fast as it could be fed (in this case, the food was additional connected computers and faster connections between those nodes).
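To make the DNS piece concrete, here is a minimal sketch (my own illustration, using Python’s standard socket module); the hostname www.utexas.edu stands in for the example above, and the address returned today will likely differ from 128.83.40.25.

    import socket

    # Forward lookup: translate a memorable name into a machine address
    address = socket.gethostbyname("www.utexas.edu")
    print("address:", address)

    # Reverse lookup: map the address back to a name
    # (not every address has a reverse DNS entry registered)
    try:
        hostname, _, _ = socket.gethostbyaddr(address)
        print("name:", hostname)
    except socket.herror:
        print("no reverse DNS entry for", address)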

While I don’t live on the Web the way my teenagers do, I probably spend 6 hours a day online–communicating, mostly via email and a few social networking tools; building (and occasionally teaching) online courses; and surfing (not just academic topics–movies, news, Dilbert). Despite this time investment (and having started one of those dot-com companies in 1995), I don’t think I have the expertise to evaluate educational uses of the Internet. On the surface, the Web seems to offer three primary affordances:

  • information – the original instructional vision and still the basis for tools like Wikipedia and even Google; this is what I think of when I hear Web 1.0: finding data
  • voice – in the form of blogs and YouTube and flickr; this is what I think of when I hear Web 2.0: creating data
  • connection – IM and Twitter and social networks; this is what I think of when I hear Web 3.0: becoming data

However, the tools are not completely self-contained: Wikipedia offers both information and voice; Twitter connects followers but rewards clever voices with an increased following. The Wikipedia entry for the Internet (http://en.wikipedia.org/wiki/Internet#Services) delineates several discrete services:

  • email
  • web
  • collaboration
  • streaming media
  • telephony
  • file transfer

Any list of uses is inevitably incomplete and dated, and all of the services Wikipedia lists are merely methods for obtaining information (web, streaming media, file transfer) or connecting (email, collaboration, telephony).

I like metaphors, especially visual ones. I’ve always found the web metaphor for the Internet accurate but somewhat lacking (not to mention a little scary); spider webs are symmetrical, and the “real” Web is lumpy in terms of pages (nodes) and connections. For example, a typical image of Internet connections looks something like this:

[Image: data visualization of the Internet]

Similarly, an image (or any social network analysis) of connections looks something like this:
[Image: data visualization of social network connections]

These images make me wonder:

  1. The images of nodes and connections look random but lumpy. Recent descriptions of networks talk about the difference between “small-world” and “scale-free” network diagrams. What is that difference? Are social networks more like small-world or scale-free networks (and does that even matter)? (A rough illustration appears in the sketch after this list.)
  2. What is the advantage of social relationship networks (like Facebook) over social object networks (like flickr)? MySpace was overtaken by Facebook, which itself now seems to be fading; why?
  3. And getting to the title of my post, are data visualizations of connections (not the nodes but the paths among them) better represented by an organic tissue metaphor than by a “spidery” but inevitably linear image? This MRI of brain connections seems to offer both lumpiness and non-linearity:
    [Image: MRI of brain connections]
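To give the first question above something concrete to chew on, here is a minimal sketch (my own illustration, assuming the networkx library; any graph toolkit would do) that builds one small-world graph and one scale-free graph and compares a few summary numbers. The scale-free model typically produces a few heavily connected hubs, while the small-world model combines high clustering with short paths between nodes.

    import networkx as nx

    n = 1000
    # Small-world model (Watts-Strogatz): a ring lattice with a few random rewirings
    small_world = nx.connected_watts_strogatz_graph(n, k=6, p=0.1)
    # Scale-free model (Barabasi-Albert): new nodes prefer to attach to existing hubs
    scale_free = nx.barabasi_albert_graph(n, m=3)

    for name, g in [("small-world", small_world), ("scale-free", scale_free)]:
        degrees = [d for _, d in g.degree()]
        print(name,
              "| max degree:", max(degrees),
              "| avg clustering:", round(nx.average_clustering(g), 3),
              "| avg path length:", round(nx.average_shortest_path_length(g), 2))

Whether real social networks behave more like one model or the other is exactly the open question above.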

And on a personal level, because of my teenagers, I worry about this issue:

  1. Does multitasking lead to superficial learning? Marc Prensky uses the term “twitch speed,” which carries with it the potential for partial attention; am I enabling my kids to develop only superficial learning by allowing them to spend so much time online?

Expertise: a long and winding road

The idea that experts tackle problems that increase their expertise seems supportive of the self-efficacy behind ACT-R:

  • Reinvestment – the motivational aspect
    • conserving resources to have energy to put back into new problems
  • Progressive problem-solving – the cognitive aspect
    • tackling more difficult problems AND tackling more complex representations of recurrent problems
    • represents working at the edge of competence (ZPD)

Pattern learning, which occurs without extensive effort, involves choosing the right patterns and recognizing when no pattern fits. This seems to equate with imagistic memory, which is efficient for spatial and temporal data.

Procedural learning starts as step-by-step problem solving that becomes “chunked” into a single procedure; while this automaticity frees resources, it can become a handicap to the improvement of performance. However, automaticity does not inevitably lead to inflexibility if automated skills serve as building blocks for new skills that are not yet automated.

However, experts engage in pattern and procedural learning too; what distinguishes expertise is the seeking of complexity. Complexity is described as a matter of the number of constraints. Because most real-world problems are not reducible to a step-by-step economy, we use simplified representations (akin to simulations). The class of problems that are endlessly complex are the constitutive problems of a domain.

Experts are motivated by:

  • flow: total absorption and a feeling of control, with a loss of the normal sense of time; it becomes addictive to the point that problems are invented
  • second-order environment: the expert sub-culture where your recognition as an expert matters to you (not useful in fostering early development of expertise)
  • heroism: effort disproportionate to rewards

Competitive environments foster expertise. However, so do expert sub-cultures, which may not always involve competition but always involve recognition of success and help/cooperation leading to success. The expert environment constantly changes as the experts become more expert; the reason experts help each other is to help that environment continue to get more difficult (i.e., inventing problems). Expert teams are one example: everyone more or less knows how to do everything, so the focus is on the goal, not on individual achievement.

Cognitive Apprenticeship

While this article provides useful definitions and distinctions among implementations of cognitive apprenticeship, the emphasis on a survey of articles provides, with rare exception, neither practical advice nor theoretical support. The initial definition of cognitive apprenticeship (“learning that occurs as experts and novices interact socially while focused on completing a task”) is instructive but needlessly includes the social component; all human-human interaction is social. However, the Lave/Wenger concept of legitimate peripheral participation (“a process in which newcomers enter on the periphery and gradually move toward full participation”) was significant, as it accurately describes workplace learning in my experience.

The discussion of scaffolding, fading, intersubjectivity (negotiated shared understanding), modeling, mentoring, and coaching was redundant with previous readings. However, there were practical nuggets:

  • modeling is more efficient than trial and error
  • mentors and coaches help tacit knowledge become explicit
  • coaching focuses on a specific goal
  • expert outlines reduce cognitive load
  • discovery alone is insufficient to ensure learning will take place
  • individuals expect others to share their understanding (myopic as this may seem)

In particular, the productive mentoring practices (structure, regular meetings, and mentor training) point the way to effective support design. And though the explanation of ZPD was also familiar, the description of activities based on the ZPD as just within a learner’s current ability level (the ZPD is just beyond) is eerily reminiscent of video games, which aim to create gameplay levels that are barely doable.

The Perfect Theory

Snelbecker also tackles design theories, but not by contrasting them with learning theories as Reigeluth does; instead, Snelbecker contrasts design theories from theoretical and practitioner points of view. From the theoretical point of view, design theory is meant not to yield rules of practice but to help practitioners “design conditions that facilitate learning.” As such, design theories, while closely related to practice (as opposed to learning theories), remain indirect.

Theories, produced by Snelbecker’s knowledge producers, are expected by knowledge users (instructors and designers) to provide definitive answers; however, because theorists exercise caution in drawing conclusions, theories seldom satisfy users who need immediate answers. I particularly appreciated the perspective that theorists view their work as progress reports designed to help users “consider the merits of alternative approaches.”

The irony is that while theorists do not want practitioners to consider their work as final answers, these same producers adopt dogmatic stances regarding their personal theory. Snelbecker’s solution involves posing three questions to the theorists:

  1. Is any theory perfect?
  2. Does any theory include everything?
  3. Should any theory be the only theory?

After answering, “No” to each of these questions, Snelbecker concludes by recommending that theories identify the added value they provide to our understanding of how instruction can be designed.

The Design Profession

The career recommendations were extremely practical in listing designer, project coordinator, and artist as the primary personnel paths. The industry categories were similarly useful; accrediting bodies (such as associations) also offer job opportunities. The professional development sections were somewhat cursory (writing a blog should be added to the promotional suggestions). However, the two pages that summarized design models with visuals were superb:

  • Dick and Carey provides all the elements (ADDIE) but only a linear process.
  • R2D2 provides an iterative process solution. Assessment is the junction and bridge between Dissemination and Definition; the model can also embrace Nonaka’s spiral between explicit and tacit knowledge.
  • The layers model inserts the real-world constraints of time (budget); higher layers can produce meta-designs such as simulation archetypes.
  • The rapid model marries ADDIE to concurrence.

Demonstration phase

A transition between Design and Develop/Deliver, this phase is often incorporated directly into a single iterative unit-release methodology. The optional deliverables were very helpful:

  • treatment
  • scenarios
  • templates (content as well as layout/technical)
  • requirements spec (especially useful for software)
  • storyboards
  • scripts
  • prototypes

Practical suggestions are prevalent:

  • media asset lists
  • reading levels
  • translations
  • reviewer training

The storyboard form could have been demonstrated as a template–with the navigation (top) annotated as a common element.

Common Vocabulary for ID

Citation

Reigeluth, C. M., & Carr-Chellman, A. A Common Language and Knowledge Base for ID? Retrieved February 6, 2009, from http://it.coe.uga.edu/itforum/paper91/Paper91.html

Summary

Reigeluth proposes that design theory differs from descriptive (learning) theory because it identifies methods rather than cause-effect relationships; he also contends that design theory assists in the creation (versus mere description) of outcomes. His five sets of design theories parallel the ADDIE model. He divides methods into several constructs (which he admits may not be all-inclusive):

  • Scope
  • Generality
  • Precision
  • Power
  • Consistency

These methodological constructs are contained within (and constrained by) the situational constructs of values and conditions (environment).

Response

The argument that the cause-effect relationships of descriptive theory are probabilistic is contradicted by Reigeluth’s earlier work, which argues that the methods of design theories “are probabilistic” (Reigeluth, C. M. Instructional-Design Theories and Models: A New Paradigm of Instructional Theory, Vol. II (1999). Mahwah, NJ: Erlbaum, p. 7); the earlier work seems more accurate. However, the idea of distinguishing (and connecting) design theory and learning theory is very useful. The act of situating methods in community (values) and environment (conditions) embraces constructivism and allows the methods to vary from behavioral to cognitive, depending upon the needs of the learner (developmental stage as well as prior knowledge) and the content (including the goals).