We begin with the lived reality of communities, frontline workers, institutions, and programme beneficiaries. Our tools and protocols are designed to listen before they measure — producing evidence that reflects people rather than just indicators. Every instrument is piloted in the field before deployment, refined based on respondent comprehension, and adapted for cultural and linguistic context.
Methodology That Produces Evidence Decision-Makers Can Trust
At AUM Research, our approach is not a section of a proposal — it is the foundation of every assignment. Our methodology is grounded in contextual reality, anchored in ethical practice, executed with field-tested systems, and calibrated to produce evidence that is analytically robust and practically useful.
Four Pillars That Govern Every Assignment
These are not aspirational values — they are operational standards embedded in every tool, protocol, field system, and deliverable we produce.
Informed consent, respondent confidentiality, DPDP Act-compliant data handling, and IRB-compatible study design are non-negotiable in every assignment. Our field teams are trained in ethical protocols, our data systems follow strict security standards, and our reporting is transparent about limitations, assumptions, and methodological constraints.
Our portfolio spans agriculture, health, nutrition, WASH, rural development, governance, and education. That depth allows us to interpret field realities within the context of policy frameworks, scheme architecture, and implementation constraints — producing insights that are not just statistically valid but sectorally informed and policy-relevant.
We do not produce research for academic audiences. Every deliverable — from interim reports to final presentations — is structured for decision-makers: government officials, programme managers, and donor representatives who need to act on findings with confidence. We prioritise clarity, actionability, and implementation relevance in every output we produce.
How Evidence Moves From Field to Decision
Context Mapping & Scope Definition
Before designing any study, we invest time in understanding the programme — its architecture, implementation history, stakeholder ecosystem, policy context, and evaluation needs. We review secondary data, conduct scoping interviews, and study existing documentation to ensure our design is grounded in the reality of what is being evaluated.
Study Design & Instrument Development
We develop mixed-method frameworks, sampling plans, structured questionnaires, FGD guides, KII protocols, and observation checklists — all calibrated to answer the actual evaluation questions. Every instrument is piloted, refined, and translated into relevant languages before field deployment.
Field Execution & Quality Assurance
CAPI-based data collection using SurveyCTO or ODK, with GPS verification, audio audit protocols, and real-time quality dashboards. Trained, supervised field teams operating under strict back-check systems. Daily data review and feedback loops to ensure collection quality throughout fieldwork.
Analysis & Triangulation
Quantitative analysis using SPSS, Stata, or R — including descriptive statistics, regression, DiD, PSM, and multilevel modelling as appropriate. Qualitative data coded systematically in NVivo or ATLAS.ti. Findings triangulated across methods to produce conclusions that are both statistically robust and contextually grounded.
Reporting & Dissemination
Decision-ready reports structured for their intended audience — not generic academic documents. Executive summaries, policy briefs, visualised presentations, and dissemination workshops where required. We stay engaged through the dissemination phase to ensure evidence is understood and used.
Frameworks We Apply — And Why
We are fluent in the major international evaluation frameworks and apply them contextually — not mechanically. The right framework is chosen based on the assignment purpose, client requirements, and evidence needs.
REESI+C+E Framework
Relevance, Effectiveness, Efficiency, Sustainability, Impact, Coverage, and Coherence. Applied in our KVK Scheme Evaluation for DMEO, NITI Aayog — the standard framework for Indian government scheme evaluation.
OECD-DAC Criteria
Relevance, Effectiveness, Efficiency, Sustainability, and Impact — the internationally recognised standard for development programme evaluation used by bilateral and multilateral donors.
Theory of Change & LogFrame
We develop and validate Theories of Change for programmes under evaluation — and design LogFrames and Results Frameworks for new programmes being designed for implementation.
Quasi-Experimental & RCT Design
Difference-in-Differences, Propensity Score Matching, Regression Discontinuity, and Randomised Controlled Trials — applied where causal attribution is required by the assignment mandate.
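To make the intuition behind Difference-in-Differences concrete, here is a minimal sketch on entirely hypothetical data: the estimated effect is the change in the treated group minus the change in the comparison group, which nets out any trend common to both. The outcome values below are invented for illustration, not drawn from any AUM assignment.

```python
# Minimal two-period difference-in-differences estimate on synthetic data.

def did_estimate(treat_pre, treat_post, ctrl_pre, ctrl_post):
    """Return the DiD estimate from group means: (treated change) - (control change)."""
    mean = lambda xs: sum(xs) / len(xs)
    return (mean(treat_post) - mean(treat_pre)) - (mean(ctrl_post) - mean(ctrl_pre))

# Hypothetical outcome values (e.g. household yields) for illustration only.
treated_before = [10, 12, 11, 9]
treated_after  = [16, 18, 17, 15]   # rose by 6 on average
control_before = [10, 11, 9, 10]
control_after  = [12, 13, 11, 12]   # rose by 2 on average

effect = did_estimate(treated_before, treated_after, control_before, control_after)
print(effect)  # 4.0 — the treatment effect net of the common trend
```

In practice this comparison is run as a regression with an interaction term (in Stata or R, as noted above), which also yields standard errors and allows covariate adjustment; the subtraction of means shown here is the underlying logic.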
Cost-Effectiveness & SROI
Cost-effectiveness analysis and Social Return on Investment (SROI) frameworks for assignments that require assessment of economic efficiency and financial value of development outcomes.
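The core SROI arithmetic can be sketched in a few lines: discount the monetised social outcomes to present value, net out deadweight (what would have happened anyway), and divide by the investment. All figures, the discount rate, and the deadweight share below are illustrative assumptions, not standard parameters.

```python
# Simple SROI ratio: present value of monetised outcomes / total investment.

def sroi_ratio(annual_benefit, years, discount_rate, investment, deadweight=0.0):
    """Discounted benefits (net of deadweight) per unit of investment."""
    net_annual = annual_benefit * (1 - deadweight)
    present_value = sum(net_annual / (1 + discount_rate) ** t
                        for t in range(1, years + 1))
    return present_value / investment

# Hypothetical: 500,000 in annual monetised outcomes over 5 years,
# 10% discount rate, 20% deadweight, against a 1,000,000 investment.
print(round(sroi_ratio(500_000, 5, 0.10, 1_000_000, deadweight=0.2), 2))  # 1.52
```

A ratio above 1 indicates that discounted social value exceeds the investment; a full SROI study also attributes value across stakeholders and tests sensitivity to these assumptions.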
PRA & Community-Based Methods
Participatory Rural Appraisal (PRA), community mapping, seasonal calendars, problem trees, and participatory ranking — for assignments requiring deep community engagement and co-inquiry.
The Systems That Protect Data Integrity
CAPI-Based Data Collection
All field data collected digitally using SurveyCTO or ODK — eliminating data entry errors, enabling real-time submission, and providing instant visibility into collection progress and quality.
GPS Verification
Every interview geo-tagged with GPS coordinates — verifying that interviews were conducted at the correct location and enabling spatial analysis of data collection patterns.
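The verification step above amounts to a distance check: compare each interview's GPS tag against the expected location and flag anything outside a tolerance radius. A minimal sketch using the haversine formula follows; the coordinates and the 500 m tolerance are illustrative assumptions, not fixed protocol values.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    r = 6_371_000  # mean Earth radius in metres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def within_radius(interview_gps, village_gps, tolerance_m=500):
    """True if the geo-tag falls inside the tolerance radius of the target point."""
    return haversine_m(*interview_gps, *village_gps) <= tolerance_m

# Hypothetical coordinates: two points roughly 100 m apart, well inside 500 m.
print(within_radius((23.3441, 85.3096), (23.3450, 85.3100)))  # True
```

In a live deployment this check typically runs server-side against the submitted form data, so out-of-area interviews surface on the quality dashboard the same day.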
Audio Audit Protocols
Random audio recording of interviews for quality audit — ensuring enumerators follow prescribed protocols, respondents are not coached, and interviews meet duration and completeness standards.
Back-Check System
10-15% of completed interviews re-contacted by supervisors for back-check verification — confirming respondent identity, interview conduct, and data accuracy.
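Drawing the back-check sample is itself a quality decision: stratifying by enumerator ensures every enumerator is checked, not just those who happened to fall into a simple random draw. The sketch below shows one way to do this; the rate, field names, and fixed seed are illustrative assumptions.

```python
import random

def backcheck_sample(interviews, rate=0.10, seed=42):
    """Select at least `rate` of each enumerator's interviews (minimum one each)."""
    rng = random.Random(seed)  # fixed seed gives a reproducible audit trail
    by_enumerator = {}
    for interview in interviews:
        by_enumerator.setdefault(interview["enumerator"], []).append(interview)
    sample = []
    for batch in by_enumerator.values():
        k = max(1, round(len(batch) * rate))
        sample.extend(rng.sample(batch, k))
    return sample

# Hypothetical: 60 completed interviews spread across 3 enumerators.
interviews = [{"id": i, "enumerator": f"E{i % 3}"} for i in range(60)]
picked = backcheck_sample(interviews, rate=0.10)
print(len(picked))  # 6 — two per enumerator at a 10% rate
```

Supervisors then re-contact the sampled respondents to confirm identity, interview conduct, and key data points, as described above.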
Real-Time Dashboards
Field supervisors and project managers monitor data collection progress, completion rates, and quality flags through real-time dashboards — enabling immediate corrective action.
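The quality flags surfaced on such a dashboard reduce to simple rules over each submission. Here is a minimal sketch of two common ones, flagging interviews that are implausibly short or missing a geo-tag; the threshold and field names are illustrative assumptions, not the actual dashboard schema.

```python
def quality_flags(submission, min_duration_s=600):
    """Return a list of quality flags raised by a single form submission."""
    flags = []
    if submission.get("duration_s", 0) < min_duration_s:
        flags.append("too_short")      # interview faster than plausible
    if submission.get("gps") is None:
        flags.append("missing_gps")    # no geo-tag captured
    return flags

# Hypothetical submissions for illustration.
print(quality_flags({"duration_s": 240, "gps": None}))            # ['too_short', 'missing_gps']
print(quality_flags({"duration_s": 1800, "gps": (23.34, 85.31)})) # []
```

Flagged submissions feed the daily review loop, so problems are corrected while the team is still in the field rather than discovered during analysis.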
Data Security & Privacy
DPDP Act-compliant data storage, encrypted transfer protocols, role-based access controls, and secure data deletion procedures — protecting respondent privacy at every stage.
Research becomes evidence only when it is designed to answer real questions, collected with integrity, analysed without bias, and communicated to people who can act on it.
We have delivered over 40 assignments — from villages in Jharkhand to health facilities in Haryana, from panchayat halls in Assam to policy rooms in New Delhi. Each one has reinforced the same conviction: the quality of evidence is determined not by the sophistication of the analysis, but by the integrity of every step that precedes it.
That is why we invest in context mapping before tool design. Why we pilot instruments before field deployment. Why we supervise closely during data collection. And why we stay engaged through dissemination — because evidence without use is not evidence. It is just data.
See How We Apply This →

Want to Know How We Would Approach Your Assignment?
Share your evaluation mandate with us and we will outline an approach — including study design, sampling strategy, and methodology — with no obligation.