‘Knowledge and Data’

Samuel Gerstin
Oct 5, 2019

Over the past year, I have stopped using the term ‘data’ on its own. Instead, I prefer to go with ‘knowledge and data.’ The principal reason is that our common understanding of ‘data’ has become a bit warped, in that it tends to imply information processed into manageable form; i.e., numerical, categorical, graphical or otherwise visually symbolic. This understanding of ‘data,’ while expansive, nevertheless glosses over the base idea of ‘data’ as merely information: certainly processed, but also unprocessed.

What is unprocessed information? Most often, I’ve found it to be the words people say (though of course it also includes non-verbal cues: actions, demeanor, expression and tone).

How is unprocessed information most casually and honestly expressed? As individuals reflect and share experiences, observations, comments and questions.

Interpreted this way, ‘unprocessed information’ seems simply to describe people’s general knowledge. Which is precisely the point. To return to the world of MEL (monitoring, evaluation and learning), when we speak of our role in assessing the influence and outcomes of our work, it is incumbent on us to appreciate that the information we use to make this assessment comes in both processed and unprocessed forms. All too often we subdue the latter, implicitly but also explicitly (e.g., by not articulating a strategy for gathering unprocessed programmatic information as part of our formal work plans); the issue, of course, is that we are losing a vast amount of the story. Alternatively, if we begin to plan for ‘knowledge and data collection,’ we pay much closer attention to the day-to-day experiences of our colleagues and ourselves, and correspondingly we assess** what is likely closer to the full gamut of information produced by the program.

Taking this a step further: it may sound as though I am advocating increased emphasis on traditional qualitative assessment, notably purposive interviews and focus group discussions. While true, these scenarios still provide us with relatively processed information, as qualitative coding methodologies compartmentalize people’s words and actions, particularly when there are many people, for (again) more manageable assessment.

I do not discount these methodologies, but they nevertheless distance us from even less processed sources of information: what about team meetings, casual conversations, informal feedback, the daily words and actions of program staff, partners and stakeholders? And more…

These sources, alongside our planned scenarios for qualitative and quantitative information, make up the full universe of program data.

I’ll leave off with a practical example. Our Freedom House Human Rights Support Mechanism (HRSM) program comprises nearly 15 individual activity teams spanning the globe: a diversity of sectors, genders, disciplines, regions, networks and professional experiences. In our MEL role, tasked with articulating ‘knowledge and data’ (or, simply put, the story of our program), the most meaningful method we have envisioned is an online platform and accompanying email group. While my team provides some direction in the form of prompts and thought questions, the true utility of the platform is that it offers an opportunity to gather unprocessed information from a wider contingent of HRSM colleagues than if we were to rely merely on quantitative indicator data (gathered, interpreted and reported on by only certain individuals), remote interviewing and surveying (often accessible only to managers and senior staff) or even in-person visits and activities (infrequent, and liable to discourage openness owing to the sense of being in the ‘spotlight’). Indeed, our platform is a space where HRSM colleagues, on a moment-by-moment basis, as that reflection, observation, comment or question comes to mind, as that knowledge is formed, might share this data. (To be fair, because the platform is online, our ability to pick up non-verbal cues is reduced.)

So hopefully, the term ‘knowledge and data’ serves as a gentle reminder that, from a MEL point of view, we have quite a bit of information to tap into. (This is not a suggestion that we should be wiretapping each other.)

**Doesn’t this sound as though I am describing the research process of a developmental evaluator?
