Instructional Design is my vocation. And while professionals in the learning industries know what that means and respect the skills of instructional designers, few people outside of education and training know what I mean when I use the term.
So, I added other titles to my business card to explain myself professionally: Knowledge Wrangler, Creative Catalyzer, and Learning Sherpa. They all sound just odd enough to spark interest and additional queries, but the thing is, these “job titles” really do cover a lot of the unexplained ground involved in instructional design. Additional explanations can be helpful, though, so I will tell you more about these three choices in a short series of blog posts.
A Knowledge Wrangler has to root out the sources of information, and, yes, knowledge, about a specific topic. While it is not difficult to get a lot of information on many intended topics, I’m using the term knowledge to refer to the…um, knowledge, that an individual possesses that enhances their understanding and application of any related information. Sources of knowledge might include an individual subject-matter expert or a team of experts, but should include other stakeholders as well, to provide a more fully rounded picture.
These human sources of knowledge are often overlooked or under-utilized, the depths of their extensive knowledge never plumbed. Sometimes they cannot even put what they know into words, so observational analysis often adds to the knowledge. Before I design instruction, I want to wrangle up as much knowledge (which is far richer than information) as can be mustered within the constraints of the project.
Basically, it’s data collection — and beyond.
Let me explain with an example. The MHW Project sought to examine the role of the Mental Health Worker providing direct care to people receiving mental health services. An entry-level job with minimal requirements (18 or older, high school diploma or GED or higher), it’s a job held by thousands of workers across the state of Texas. We wanted to know how to improve their initial training.
We used 3 main data collection techniques: document review, interviews, and what we called “focus groups.” Actually, though, those were rapid-fire answer-collecting sessions more than traditional focus groups. A facilitator and a scribe worked together, asking simple questions in 2 main categories: What is a Mental Health Worker (roles and responsibilities), and how do you learn to become one?
The facilitator asks a question, pushes for an answer, and clarifies the wording so the scribe can write it down on a flip chart, then immediately presses people to offer more answers, accepting all appropriate answers. Writing the answers verbatim (or asking whether a shortened phrase or rewording is accurate) builds trust in the group, promoting further participation.
As each flip chart sheet gets filled, it is posted to a wall so it remains visible. The facilitator must manage the time while ensuring maximum data is collected. The point is to keep the pace moving, involve as many individuals as possible, and accumulate enough input for thematic analysis later. Either during or after a section, note to the group how they have covered entire walls with the information they have supplied. The flip chart sheets can then be rolled up and tabulated for analysis.
We used 3 groups of employees for these sessions. One group had veteran MHWs who had been on the job more than a year; these people knew the job better than the newer workers. The second group, new employees, had been hired and trained within the past 6 months; these workers were more familiar with how they were trained and how they learned the job. A third group of managers was brought in to provide the perspective of what job and training expectations looked like from their viewpoint.
Additionally, I conducted semi-structured interviews with administrators: the head of staff development, to talk about the training program as it stood; the head of nursing services, for an overview of how the MHW role fits into the treatment team; and the hospital administrator, to describe how the MHW fits into the Big Picture of the entire facility.
Document review included policies & procedures, training programs and materials, and any additional communication regarding the role of the MHW and how they are trained. These should align well, or they need to be adjusted.
These are fairly standard data collection activities. “Knowledge wrangling” extends their reach, first by choosing appropriate data collection activities, and then by shining a light on both areas of commonality & alignment and gaps in understanding. Often, the knowledge I am seeking to wrangle exists in that in-between space of correlations between disparate data. Often, it emerges as someone further explains a particular finding or comment, revealing tacit knowledge left unspoken.
A clearer example of knowledge wrangling in action: I once implemented unit-based instructor teams for a risk management training program. The instructor team empowered knowledgeable individuals distributed throughout the workplace to disseminate the required training. That’s nothing special; that’s just training. The knowledge wrangling involved using those distributed instructors to inform the training program with real-life practice, thus working to align training with practice.
Sometimes, knowledge wrangling involves formalizing a setting for knowledgeable people to more informally share their experiences and understanding. That was the case for the J-NET project, a series of audio conference calls among hospital staff development directors held to prepare them for a round of accreditation visits from the Joint Commission for the Accreditation of Healthcare Organizations. JCAHO visits are routinely nerve-wracking, with concerns worrying administrators months in advance. In truth, the Staff Development Directors were each in good shape, but nervous nonetheless. We created a space where they could talk more directly with each other (uncommon in their daily jobs) to share insights and experiences. We only held about 6 of these calls, but they helped assuage the concerns and ease the accreditation process for all.
In each of these instances, the people surfaced more knowledge than others may have been aware of. By sharing what they knew but had not spoken of, they increased the overall knowledge available to the group. These 3 projects are but small examples of the ongoing activity of knowledge wrangling involved in instructional design.
Next: Creative Catalyzer