Teacher Professional Development

“Teacher Learning” in How People Learn: Brain, Mind, Experience, and School: Expanded Edition. (2000).  Commission on Behavioral and Social Sciences and Education. pp. 178-193.

The article defines five (actually four) primary ways that practicing teachers learn:

  1. from their own practice (reflection)
  2. from interaction with other teachers (informal apprenticeship, formal in-service workshops, and professional associations)
  3. from graduate courses
  4. from their roles as parents

Bransford then considers the quality of these learning opportunities through the four lenses previously explored in this book:

  • Learner-centered
    • efforts often fall short as professional development is delivered transmissionally
  • Knowledge-centered
    • efforts focus on techniques and methods but fall short on pedagogical content knowledge
  • Assessment-centered
    • most efforts lack feedback
  • Community-centered
    • efforts are most useful when centered on situated discourse around texts and shared data

In response to these failings, Bransford proposes two solutions. The first–action research–is a social constructivist process in which ideas are collaboratively discussed in a community of learners. Despite reported successes, action research faces difficulty because of the disparity “between practitioner and academic research.” The second solution– a revised approach to preservice education–holds greater promise. Current teacher education programs tend to be a disjointed collection of academic theory (both subject matter and methods) and practicum. Instead, Bransford argues for an integration of courses with classroom practice to overcome the belief that the two are unrelated. A possible solution is an epistemic game which provides education students the opportunity to intellectually integrate the profession into their behavioral repertoire.


Merlot – not just a wine

MERLOT, the Multimedia Educational Resource for Learning and Online Teaching, is a learning object repository (a collection of instructional modules which fit varying definitions of “learning object”). However, MERLOT is more than simply a collection of resources; it also includes communities, not only for disciplines but also for specific learning environments. The most helpful (and unusual among learning object repositories) feature is the inclusion of ratings and reviews of contributed materials.

There are a few design issues with the site itself (it would be unfair to evaluate MERLOT on the basis of design issues with the contributed materials):

  • The scrolling banner at the top can be annoying
  • The site is not XHTML compliant (according to the W3C validator at http://validator.w3.org/#validate_by_uri)
  • Not 508 (WCAG Priority I/II/III) compliant (according to the Cynthia validator at http://www.contentquality.com/)
  • The site itself is somewhat busy:
    • search and advanced search at the top
    • tabbed navigation at the top (and text navigation in the footer which also contains additional non-navigation links)
    • 3-column format with primary navigation in the left column, alternate navigation in the middle, and login and multiple organizational links in the right column
    • no use of boxes that open and close (see http://orangoo.com/labs/GreyBox/)
  • The multiple methods for getting to the learning objects may offer increased utility but seem confusing; for example, to get to materials in Biology, you can:
    • enter search criteria (simple or advanced) in the Search header
    • select Learning Materials in the top tab
    • select Learning Materials from the middle column
    • select Biology community from the middle column
    • select Science collection from the left column

That said, MERLOT is a valuable resource and provides access to more than 20,000 resources. Technically, MERLOT is a referatory (it contains only links to the objects on the servers of their original creators) rather than a repository, but that distinction does not detract from its utility. In addition, while only a portion (20%?) of the objects are reviewed or rated, the fact that ANY evaluations are included sets MERLOT apart from other libraries of learning materials and allows users (in the sense of using the objects; contributors are also users) to more quickly search and locate appropriate materials. The advanced search is further enhanced through metadata fields; while the material category is inclusive (if somewhat confusing), the technical category needs work (for example, it includes both Quicktime–a video format–and video) as it mixes specific file types with presentation and distribution technologies.


  • Community
  • Subject
  • Language
  • Material
    • Animation
    • Case Study
    • Collection
    • Drill and Practice
    • Learning Object Repository
    • Lecture/Presentation
    • Online Course
    • Open Journal-Article
    • Open Textbook
    • Reference Material
    • Quiz/Test
    • Simulation
    • Tutorial
    • Workshop and Training Material
  • Technical format
    • Active X
    • Audio
    • Authorware
    • Blog
    • CD-ROM
    • Director
    • Executable
    • Flash
    • HTML/Text
    • Image
    • Java
    • Javascript
    • PDF
    • Podcast
    • Quicktime
    • SCORM
    • Shockwave
    • Video
    • VRML
  • Audience
  • Learning Management System
  • Cost involved
  • Copyrighted
  • Section 508 compliant
  • Source code available
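
The field list above suggests how the advanced search could work in practice. A minimal sketch, where the field names follow MERLOT's categories but the records and the search function are hypothetical:

```python
# Hypothetical records using MERLOT-style metadata fields; the data is invented
# for illustration and does not come from the actual MERLOT catalog.
records = [
    {"title": "Cell Division Animation", "subject": "Biology",
     "material": "Animation", "technical": "Flash"},
    {"title": "Genetics Quiz", "subject": "Biology",
     "material": "Quiz/Test", "technical": "HTML/Text"},
    {"title": "French Verb Drill", "subject": "Languages",
     "material": "Drill and Practice", "technical": "Java"},
]

def advanced_search(records, **criteria):
    """Return every record matching all of the supplied metadata criteria."""
    return [r for r in records
            if all(r.get(field) == value for field, value in criteria.items())]

hits = advanced_search(records, subject="Biology", material="Animation")
print([r["title"] for r in hits])  # ['Cell Division Animation']
```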

The peer reviews and ratings use “star” icons (like Amazon); peer reviews are anonymous while the ratings (which include mandatory comments) are attributed. In addition (and perhaps even more helpful), each object also shows the number of personal collections it belongs to (collection usage) and the number of assignments created around it and contributed to MERLOT (pedagogical usage). Finally, some objects are also distinguished by “Editor’s Choice” and “Classics” awards.

Creating an account is relatively easy (about 5 minutes). While identifying an affiliation is required and is used to build the user community, connecting users seems the least useful aspect of the site. Comments are supplemented with a Technical Remarks box, a Time Spent box, and a checkbox for whether the object was used in the classroom (this should be termed “used in a course” to include online instruction).

Each object links to a material details page with a description and metadata (as well as the link to the object itself). Contributing objects or adding comments or assignments requires creating an account, which is free and non-expiring; at present, MERLOT has over 68,000 members. While OER proponents may decry this requirement, it provides transparency; at the same time, this very strength may be a weakness–peer reviews (even though not attributed) and comments are public, which may inflate ratings and discourage submission. In addition, contributions of instructional materials, even peer-reviewed contributions, may hold little weight in the academic promotion and tenure process (although at least in Texas, Governor Perry has proposed changing this situation; see http://www.texashighered.com/node/6). As a result, the weak incentive to contribute or to evaluate contributions may ultimately inhibit the growth of this valuable resource.

ID/Project Management

McDaniel, K., & Liu, M. (1996). A study of project management techniques for developing interactive multimedia programs: A practitioner’s perspective. Journal of Research on Computing in Education, 29(1), 29–48.

The mesh of ID and project management brought a refreshing practicality, and I especially appreciated the additions from Gentry on adoption (all of her supporting processes are essentially PM skills). Greer’s model didn’t add much; it seemed merely a merge of Dick & Carey with Gentry. However, the proposed 5 components were perfectly clear.

I was surprised not to see a discussion of Gantt charts and critical path in the Team Assembly and Management section. In the section on Evaluation, Marketing and Support, I have successfully used the conceptual differences between project management (the team) and project ownership (the client) to effect this critical transfer. If ongoing support is in the business model as a continuing revenue stream, it should be provided on a time-and-materials (T&M) basis.
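
The critical-path idea is simple to sketch: the longest dependency chain through the task graph sets the minimum project duration. The task names and durations below are invented for illustration:

```python
# Toy task graph for a multimedia project: {task: (duration_in_days, prerequisites)}.
# The tasks and durations are invented; only the algorithm matters here.
tasks = {
    "storyboard": (3, []),
    "graphics":   (5, ["storyboard"]),
    "audio":      (2, ["storyboard"]),
    "authoring":  (4, ["graphics", "audio"]),
    "testing":    (2, ["authoring"]),
}

finish_times = {}  # memo of earliest finish time per task

def earliest_finish(task):
    """Earliest finish = own duration plus the latest finish among prerequisites."""
    if task not in finish_times:
        duration, deps = tasks[task]
        finish_times[task] = duration + max(
            (earliest_finish(d) for d in deps), default=0)
    return finish_times[task]

critical_path_length = max(earliest_finish(t) for t in tasks)
print(critical_path_length)  # 14 days: storyboard -> graphics -> authoring -> testing
```

A Gantt chart would render the same data visually; the critical path is simply the chain of tasks whose finish times sum to this maximum.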
The ID models seemed overly complex and somewhat idiosyncratic. However, the common desire for evaluation and revision is more easily accomplished with networked products (especially using agile and rapid prototyping) than with previous static delivery mechanisms.

As far as responsibilities, the following are key in my experience:

  • Keep big picture (outcome) in mind
  • Motivate team
  • Meet deadlines and budget
  • Free members from day to day
  • Break & sequence projects into manageable tasks
  • Use appropriate personnel (bring members along through smaller projects to develop independent decision-making)

However, my experience cautions that meetings should not necessarily “involve as many key players as needed.” Smaller teams may be more successful at problem-solving (see post on 2-player teams outperforming 4-player teams in online PBL). Also, while open communication channels are always essential, the project lifecycle I have found successful has 4 phases:

  1. Brainstorming – everyone talks, all ideas entertained
  2. Defining – everyone offers dependencies, tasks and times negotiated
  3. Monitoring – often, only key players present/talk
  4. Celebrating – offsite if possible, documentation review and lessons learned

ASTD Standards

The design of the certification was interesting: 7 standards are mandatory, and 7 of 12 other standards are selected, depending on their appropriateness for the course being certified. However, the optional standards are divided into two general topical areas–design and instructional design. The required standards are:

  • Navigation
    • standard features such as start, exit, forward, back, home
    • requirement for save requires login (but absence won’t prevent “passing”)
    • doesn’t address accessibility–text and keyboard equivalents of navigation
  • Launching
    • installation documentation and system requirements in hard copy
    • on-screen guidance for troubleshooting
    • access to a technical support website
  • Legibility
    • for text and graphics on an 800×600 screen (seems outdated)
    • text labels on graphics
  • Objectives
    • specific to skill or knowledge (although the examples provided don’t meet Dick and Carey’s “x – criterion – x” model)
  • Consistency
    • content maps to objectives, covers all objectives sufficiently, shows relationship among objectives (if applicable), and is parallel in detail across multiple objectives
  • Presentation/Demonstration
    • 2 or more methods used to trigger prior experience
    • presentation to describe flow of new information; demonstration to exhibit new information or skills
    • effective media
  • Practice with Feedback
    • consistent with objectives
    • provides complete directions
    • allows incorrect responses
    • provides relevant, corrective and supportive feedback
    • feedback is gradually withdrawn (scaffolded)

The optional standards are:

  • Orientation
    • indicate learner location
  • Tracking
    • similar to Orientation except that the course tracks which portions have been accessed or completed
    • requires a user login
    • may be problematic with non-linear sequence designs
  • Optional Navigation
    • 3 aspects:
      • additional information such as references
      • learner-defined alternate organization (such as topical versus chronological)
      • bookmarking
  • Support
    • support for navigation, technical issues, and any proprietary functions
    • Help function available for any course location
  • Setup
    • captures user-defined demographic information, system configuration, and learning preferences (such as audio versus text; however, this seems to impact instructional design)
    • mandates login
  • Subsequent Launching
    • allows learner to return to previous location (or start over at her preference) and saves progress
    • mandates login (or at the minimum a browser cookie which requires each learner to use the same machine)
  • Uninstalling
    • ability to completely remove course from a machine
    • doesn’t apply to Web-based courses
  • Formatting
    • essentially copy-editing and page design:
      • no spelling or grammar errors
      • cross-referenced graphics
      • headings and sub-headings
      • appropriate margins
  • Purpose
    • outcome, audience, and scope explicitly described
    • both knowledge/skills and task/problem defined
  • Facilitation
    • methods facilitate internalizing and synthesizing
    • varied methods: self-directed (readings, individual problems) and collaborative (group cases, role-plays)
    • clear guidance (guidance can take the form of multiple representations and differing viewpoints)
    • guidance is gradually decreased (scaffolded)
  • Engagement
    • more than one technique (questions, humor, metaphor, narrative, cues, etc.)
    • directly connected to content
    • appropriate for audience
  • Assessment
    • valid (defined as linking to the intent of the objectives and covering the same content in the same way)
    • provide guidance and/or feedback

One aspect that concerns me: for an online course that requires no login for access, four optional standards are not possible, which effectively prevents such a course from being certified. One funny error: the glossaries for standards 10 and 11 define ADA as the American Dental Association instead of the Americans with Disabilities Act.

Planning (actually activity evaluation)

The first thing that struck me about the Wiggins chapter was that the opening Chinese proverb was wrong: if I’m an auditory learner, then I remember what I hear and forget what I see. This called to mind the statistic, “We remember 10% of what we hear, 20% of what we read…” which I’d always taken as gospel until a blog post dispelled this learning myth for me. It makes me wonder how many other myths we need to exorcise.

This chapter gives us yet another acronym, although in this case, I sort of like it with the exception noted below:

  • W – Where from and Where to ties in past knowledge and states objectives
  • H – Hooks their attention (Gagne step one)
  • E – Equips them with tools
  • R – Reflect and Revise (and thus self-evaluate) opportunities
  • E – (same as above)
  • T – Tailor (to individual contexts and learning styles)
  • O – Organize (by providing schema–networks and systems–and patterns/images)

Even better than the WHERETO analytical tool, I liked the characteristics:

  • Performance goals
  • Hands-on (real-world immersion)
  • Feedback (trial and error)
  • Personalized
  • Models and modeling (narrative)
  • Reflection
  • Variety

The discussion amplified these ideas with examples, although many of the practical suggestions were offered in previous original source readings. I did get a great idea: make lectures available in the library but require students to check them out in pairs and discuss them together. I do agree with Wiggins that direct instruction is only one of many learning activities.