
Proposal Structure & Recommendations

Analysis by Cline — February 2026


What I Understand About the Situation

The current Proposal.md is the wrong document. It is the 1995 MCI conferencing proposal with the company name swapped and almost nothing else changed. It talks about ISDN BRI, videoconferencing subscriber bases, and keeping "ARDC competitive in multimedia conferencing." That is not just dated — it is a completely different project. The bones are good (decision-tree objectives, parallel qual+quant workstreams, phased delivery) but the content needs to be replaced, not edited.

What ARDC actually wants (translated from their two documents): They are a grant-making organization trying to decide where grants can move the needle. They need to understand who is in the amateur radio / digital communications ecosystem, why people join and stay (or leave), what messaging works, and what technologies and programs are worth investing in. Their internal planning is in early stages — Chelsea/Merideth have a preliminary outline but nothing close to a full research program.

What Jim envisions: A full professional market research engagement — rigorous, phased, mixed-method — that ARDC could not organize internally. Two primary workstreams (licensed community leveraging FCC ULS; non-licensed community via alternative recruitment). Expert interviews as a third thread. Governance with decision gates. Options and tiers. A longitudinal design that plants seeds for an ongoing data-driven decision-making culture. And Jim's personal credibility as the qualified practitioner who understands both the research discipline and amateur radio deeply.

The ChatGPT review is the strongest editorial guide available. Its core insight: preserve the MCI20 logic (decision-use framing, objectives as a decision tree, mixed-method flow) but make the packaging unmistakably 2026 — artifact-driven, deliverable-first, governed by decision gates, with explicit validity and bias controls.


The Core Argument the Proposal Must Make

ARDC is making multimillion-dollar grantmaking decisions without knowing who the community actually is, why people engage or disengage, or what moves the needle. This research closes that gap — delivering specific, actionable intelligence at each milestone, not just at the end. Jim brings the research expertise AND the ARDC relationship to do this right.

Everything in the proposal should serve that argument.


Home (index.md)
├── Proposal
│   ├── At-a-Glance           ← 5-min read; decision-oriented exec summary
│   └── Full Proposal         ← main narrative document
└── Appendices
    ├── A: Questions & Measurement Map
    ├── B: Sampling & Statistical Accuracy
    ├── C: Bias Management
    ├── D: Longitudinal / Retention Modules
    ├── E: Draft Survey Instruments
    ├── F: Cognitive Testing & Pilot Protocol
    ├── G: Analysis Plan
    ├── H: Reporting Templates
    └── I: Fielding Options & Partner Organizations

Note on modules/: The M0–M8 module files exist on disk but are not in the mkdocs nav. Recommend folding their detail into the appendices rather than surfacing them as a separate nav section. This keeps top navigation clean and the proposal readable.
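A minimal mkdocs.yml nav sketch matching the tree above (the file paths are illustrative assumptions, not the actual filenames on disk):

```yaml
# Sketch of the proposed nav — adjust paths to the real files
nav:
  - Home: index.md
  - Proposal:
      - At-a-Glance: proposal/at-a-glance.md
      - Full Proposal: proposal/full-proposal.md
  - Appendices:
      - "A: Questions & Measurement Map": appendices/a-measurement-map.md
      - "B: Sampling & Statistical Accuracy": appendices/b-sampling.md
      - "C: Bias Management": appendices/c-bias.md
      - "D: Longitudinal / Retention Modules": appendices/d-longitudinal.md
      - "E: Draft Survey Instruments": appendices/e-instruments.md
      - "F: Cognitive Testing & Pilot Protocol": appendices/f-pilot.md
      - "G: Analysis Plan": appendices/g-analysis.md
      - "H: Reporting Templates": appendices/h-reporting.md
      - "I: Fielding Options & Partner Organizations": appendices/i-fielding.md
```

Note the modules/ files simply do not appear here, which is all that is required to keep them out of the nav.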


The narrative arc follows a modern consulting proposal logic: establish the stakes, map the decisions to be made, describe the populations, walk through the research program, prove methodological rigor, then show what ARDC gets and what it will cost.


Section 1 — Executive Summary

"What you'll know. What you'll be able to decide. How we'll get there."

  • ARDC's current information gap and why it matters at the grantmaking level
  • Three outcomes this research delivers: participation mechanics understood, investment levers identified, messaging grounded in data
  • Research will yield findings at measurable milestones — not just at the end
  • Introduce the idea of building a data-driven decision culture (longitudinal value, briefly)
  • Plant the capabilities seed here: one energizing sentence naming the concept — e.g., "Beyond the research findings themselves, this engagement builds three durable organizational capabilities that change how ARDC operates for years to come." No detail at this stage — just the promise, with depth delivered later in the proposal.

This replaces the current introduction entirely. It should be tight — 3 to 4 paragraphs max — and read like a confident executive briefing, not a cover letter.


Section 2 — ARDC's Goals & This Project's Role

Reflect the client's vision back, amplified and contextualized

  • ARDC's strategic framework: Learn → Experiment → Do
  • The grantmaking challenge: how do you allocate wisely without knowing where the leverage is?
  • Specific research questions ARDC has already identified (drawn from their documents, translated into research objectives)
  • The scope question (technology-focused AR vs. broader AR ecosystem) — name it explicitly, commit to resolving it in the Charter phase rather than dodging it
  • Connect the three organizational capabilities here: this is where the brief exec summary promise gets its first real argument. Show how each capability directly addresses a dimension of ARDC's grantmaking challenge — the shift from reactive to proactive grantmaking is especially resonant here, as it speaks directly to how ARDC can evolve its role in the community. Keep this to a paragraph; the full treatment comes later.

Section 3 — Research Objectives: The Decision Map

Modernized version of MCI20 Section II — the strongest structural element to carry forward

Reframe MCI20's "Product Development / Marketing" categories into ARDC's language:

| Decision Domain | Core Research Questions |
| --- | --- |
| Participation Mechanics | Who joins, what drives them in, what causes disengagement at each lifecycle stage? |
| Adoption & Retention Drivers | What accelerates or blocks deeper involvement — as a ham, experimenter, or technologist? |
| Messaging & Perception | What does amateur radio mean to non-hams? What narrative works for which audience? |
| Investment Levers | Where can grants and programs measurably change outcomes for ARDC's strategic priorities? |

This section turns vague goals into specific, measurable parameters — exactly what MCI20 did well with its "information requirements" logic.


Section 4 — The Research Populations

This section does not exist in MCI20 and is critical for ARDC

Five segments, each described with: who they are, why they matter to ARDC, and how they will be reached:

  1. Currently licensed hams → FCC ULS, mail-to-web invitation
  2. Previously licensed / lapsed hams → FCC ULS historical records
  3. Adjacent populations (STEM educators and students, maker community, Scouts, youth STEM orgs) → partner organizations and online panels
  4. General public → online panels for awareness and perception research
  5. Digital communications community (non-ham) → targeted recruitment via tech communities and panels

Introduce the FCC ULS advantage: a unique, publicly available sampling frame for licensed hams that gives this research a structural head start no general research firm would typically have.

Introduce license cohort targeting within the licensed population — the 10-year FCC license term creates natural research sub-populations:

  • Recently licensed (first 6 months)
  • Mid-term active
  • Approaching renewal window
  • In grace period
  • Lapsed post-grace
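As a mechanical sketch of how cohort assignment could work (the date thresholds are illustrative assumptions to be set in the Charter phase, not proposed definitions; the FCC term is 10 years with a 2-year grace period):

```python
from datetime import date

# Illustrative thresholds — actual cohort definitions belong in the Charter
TERM_YEARS = 10        # FCC amateur license term
GRACE_YEARS = 2        # post-expiration grace period
NEW_MONTHS = 6         # "recently licensed" window
RENEWAL_MONTHS = 12    # "approaching renewal" window before expiration

def license_cohort(grant: date, today: date) -> str:
    """Classify a licensee into one of the five lifecycle cohorts."""
    months_held = (today.year - grant.year) * 12 + (today.month - grant.month)
    expiration = grant.replace(year=grant.year + TERM_YEARS)
    grace_end = expiration.replace(year=expiration.year + GRACE_YEARS)
    if today > grace_end:
        return "lapsed post-grace"
    if today > expiration:
        return "in grace period"
    if months_held <= NEW_MONTHS:
        return "recently licensed"
    months_to_expiry = (expiration.year - today.year) * 12 + (expiration.month - today.month)
    if months_to_expiry <= RENEWAL_MONTHS:
        return "approaching renewal window"
    return "mid-term active"

print(license_cohort(date(2025, 10, 1), date(2026, 2, 1)))  # recently licensed
print(license_cohort(date(2016, 3, 1), date(2026, 2, 1)))   # approaching renewal window
```

The point of the sketch: cohort membership is computable directly from ULS grant dates, so cohort-targeted sampling and cohort-specific survey modules need no additional data collection.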

→ Diagram 1: Population Segmentation — five segments with defining characteristics and primary access method

→ Diagram 2: FCC License Cohort Map — showing the 10-year license lifecycle and where cohort targeting applies


Section 5 — Research Program Overview

The "what we're doing" at a glance before diving into each part

Two primary workstreams, sequenced strategically:

  • Workstream 1 (WS1): Licensed & Lapsed Licensees — first priority, FCC ULS-based
  • Workstream 2 (WS2): Adjacent & General Public — staggered to allow WS1 learnings to inform WS2 design

Plus Expert Interviews as a parallel thread throughout the project.

Key operating principles (brief, bulleted):

  • Qual before quant: qualitative research informs survey design and ensures closed-ended questions don't miss important responses
  • Parallel workstreams: survey planning begins during the qual phase to compress the total timeline — the same structure as MCI20, which was right
  • Decision gates: no phase proceeds without stakeholder sign-off
  • Interim deliverables: ARDC receives findings packages at major milestones, not just at the final report

→ Diagram 3: High-Level Program Flow — both workstreams plus expert interviews, flowing to deliverables over time


Section 6 — Phase 0: Planning & Charter Alignment

"Everyone in sync before a dollar is spent on data collection"

This phase is the foundation. Its output (a signed Charter Document) is the gate that starts everything else.

Agenda for the Charter phase:

  • Stakeholder alignment: translate ARDC goals into specific research questions → information requirements (the MCI20 logic, preserved and modernized)
  • Scope clarification: resolve the technology-focus vs. broader AR ecosystem question
  • Sampling plan: target segment sizes, expected response rates, sample requirements
  • Respondent differentiators: demographics, econographics, sociographics, license class — what cross-tabulations will ARDC want?
  • Bias inventory and mitigation commitments
  • Instrument design principles and cognitive testing protocol
  • Longitudinal design commitments: define the "repeatable question bank" before writing the survey
  • Governance agreement: agile cadence, sprint structure, status reporting, change request process
  • Interim deliverable agreement: what ARDC receives at each major milestone

Deliverable: Charter Document (signed by ARDC before proceeding to Phase 1)


Section 7 — Workstream 1: The Licensed Community

The highest-priority workstream; the most detailed section of the proposal

Qualitative Phase

  • Focus groups (remote-first, optional in-person components)
  • Open-ended survey casting a wide net across the range of experiences
  • Purpose: surface the vocabulary, range of responses, and attitudinal landscape before designing closed-ended questions
  • Deliverable: Qualitative Findings Report — standalone, early milestone deliverable

Survey Design (runs in parallel with qual phase)

  • FCC ULS outreach planning: data hygiene, NCOA address updates, deduplication
  • Mail-to-web invitation model with embedded unique participant identifiers (e.g., QR codes with unique IDs)
  • Survey branching logic based on engagement and satisfaction:
      • Path A: Actively involved, enthusiastic
      • Path B: Involved but with waning engagement or mixed experience
      • Path C: Inactive / disengaged
  • Cohort-specific modules for key sub-populations: newly licensed, approaching renewal, in/post grace period
  • Qual findings incorporated into final survey instrument design
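One way the embedded participant identifiers might work mechanically (a sketch, not a committed design): each mailed invitation carries a short random token, printed and encoded in the QR code, while the token-to-address mapping lives in a separate lookup so survey responses never carry the mailing record itself.

```python
import secrets
import string

# Uppercase + digits: easy to type by hand from a printed mailer
ALPHABET = string.ascii_uppercase + string.digits

def mint_tokens(n: int, length: int = 8) -> list[str]:
    """Generate n unique, non-guessable invitation tokens (e.g., for QR codes)."""
    tokens: set[str] = set()
    while len(tokens) < n:
        tokens.add("".join(secrets.choice(ALPHABET) for _ in range(length)))
    return sorted(tokens)

# Token -> mailing record kept separately from survey response data,
# supporting the anonymization-at-ingestion default described later
tokens = mint_tokens(3)
mailing_key = {t: f"uls_record_{i}" for i, t in enumerate(tokens)}
```

The design choice worth noting: random tokens (rather than, say, hashed callsigns) mean a returned survey can be linked to its cohort and mailing wave without the response dataset ever containing identifying information.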

Instrument Validation: Cognitive Testing & Pilot

  • Cognitive interviews: do respondents understand what we are asking?
  • Pilot fielding: small-scale run to test flow, completion rates, and data quality
  • Revisions based on pilot learnings
  • Gate: Instrument Gate — approve revised instrument before proceeding to main fielding

Main Survey Fielding

  • Sample drawn from FCC ULS license pool (10–12 year window)
  • Targeted cohort sub-samples as warranted by goals
  • Response tracking, reminder waves, nonresponse analysis
  • Interim: Topline results report at fielding close

Analysis & Reporting

  • Statistical modeling of licensed population
  • Segment-level analysis and comparison
  • Interpretation and actionable recommendations
  • Deliverable: WS1 Full Report + Decision Kit (WS1 component)

→ Diagram 4: WS1 Detailed Flow (qual → instrument design in parallel → pilot → fielding → analysis → report)


Section 8 — Workstream 2: Beyond the Licensed Community

A parallel but sequenced effort targeting the non-ham world

  • Adjacent populations: STEM programs, maker community, Scouts, youth tech orgs — people with higher propensity to find amateur radio appealing
  • General public: baseline awareness, perception, and interest — what percentage know what AR is, who is "convertible"
  • Digital communications non-hams: people already interested in the technology but not through amateur radio

Recruitment approach (different from FCC ULS — online panels, partner organizations, social channels).

Timeline: staggered to incorporate learnings from WS1. Helpful insights about which adjacent pathways matter are expected to emerge from WS1 qualitative data.

Structure follows the same qual → instrument → pilot → survey → analysis arc, adapted for each sub-segment's access approach.

→ Diagram 5: WS2 Approach — recruitment paths and sequencing


Section 9 — Expert Interviews

Structured interviews with key voices in the amateur radio ecosystem

Who we will talk to:

  • Volunteer Examiner (VE) community — frontline observations of who is getting licensed and why
  • License exam prep educators (Hamcram, HamStudy, Gordon West, etc.)
  • Club leaders with active onboarding and Elmer programs
  • ARRL leadership with responsibility for recruitment, engagement, and retention
  • ARISS program leadership — the pipeline from youth STEM to ham licensure
  • Digital comms success stories: Meshtastic, Winlink, JS8Call, others
  • Technology innovators and ARDC grant recipients — what motivated them, what they needed

What the interviews will produce:

  • Practitioner-level observation of lifecycle dynamics
  • Integration with survey themes for triangulation and richer interpretation
  • Identification of organizations and individuals relevant to ARDC's priority funding areas


Section 10 — Governance & Decision Gates

The modern project management framework the ChatGPT review correctly identified as a gap in MCI20

Five named gates form the spine of the governance model:

| Gate | What Gets Decided | Who Approves |
| --- | --- | --- |
| Charter Gate | Goals, scope, plan, segment targets | ARDC + Jim |
| Instrument Gate | What we are measuring and why; finalized survey instruments | ARDC + Jim |
| Pilot Gate | Results of cognitive testing + pilot; approved revisions | ARDC + Jim |
| Fielding Gate | Go / no-go on main survey launch | ARDC + Jim |
| Readout Gate | Findings accepted; recommended actions and owners assigned | ARDC + Jim |

Between gates: agile sprint cadence, regular status reports, defined change request process.

→ Diagram 6: Decision Gate Model


Section 11 — Methodology Integrity: Validity & Bias Controls

This section builds trust with ARDC's analytically skeptical stakeholders — and differentiates this proposal from a generic market research pitch

Key topics (brief in the proposal; deep treatment in Appendices B and C):

  • Sampling frame for each workstream — what is included, what is excluded, and why
  • Response bias risks: who is most likely to respond vs. ignore an invitation? How do we account for that?
  • Weighting approach for population representativeness
  • Nonresponse analysis: diagnosing who we missed and what it means for interpretation
  • Cognitive testing and attention checks
  • Deduplication and data integrity procedures
  • Transparency about limitations in the final report
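To make the weighting idea concrete, here is a minimal post-stratification sketch (the cell definitions and population shares are placeholders, not real ULS figures; the actual weighting plan belongs in Appendix B). Each respondent in a cell receives weight = population share / sample share, so over-represented respondent types count for less and under-represented ones for more.

```python
from collections import Counter

def poststratify(sample_cells: list[str],
                 population_share: dict[str, float]) -> dict[str, float]:
    """Per-cell weights so the weighted sample matches known population shares."""
    n = len(sample_cells)
    counts = Counter(sample_cells)
    return {cell: population_share[cell] / (counts[cell] / n) for cell in counts}

# Placeholder example: license class shares (illustrative numbers only)
pop = {"Technician": 0.55, "General": 0.25, "Extra": 0.20}
sample = ["Extra"] * 40 + ["General"] * 30 + ["Technician"] * 30  # Extras over-responded
weights = poststratify(sample, pop)
# Extras are down-weighted (weight < 1); Technicians are up-weighted (weight > 1)
```

This is exactly the kind of correction the FCC ULS frame makes possible for WS1: because the true license-class distribution is knowable, enthusiasm-driven response bias can be measured and adjusted for rather than guessed at.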


Section 12 — Deliverables: What ARDC Receives

The "Decision Kit" — artifact-driven and milestone-mapped

| Milestone | Deliverable |
| --- | --- |
| Charter Phase | Signed Charter Document + approved information requirements |
| Qual Phase (WS1) | Qualitative Findings Report (early milestone) |
| WS1 Survey | Topline findings brief; Full WS1 Report; Dataset + codebook |
| Expert Interviews | Expert Interview Synthesis Report |
| WS2 Survey | WS2 Report; Integrated cross-segment analysis |
| Final | Decision Kit (see below) |
| Optional | Interactive data dashboard |

The Decision Kit (final integrated deliverable):

  • Topline findings + recommended actions
  • Segment profiles and personas
  • Messaging guidance by audience
  • Investment lever analysis: where programs and grants can move the needle
  • Dataset + codebook
  • Methods memo
  • Repeatable question bank for future waves
  • Executive readout presentation deck

Longitudinal Design Component: Before the project closes, a formal recommendation for ongoing tracking — cohort following, annual cross-sections, and a survey instrument designed for repeat use — giving ARDC the infrastructure to make this a recurring practice rather than a one-time study.

→ Diagram 7: Deliverables Timeline / Milestone Map


Section 13 — Building a Data-Driven Decision Culture

Brief but important — the vision section that extends the value of this project forward in time

  • Research as ongoing practice, not a one-time event
  • Cohort tracking: follow groups of licensees as they progress through the amateur radio journey over years
  • Annual cross-sections: survey similar populations each year to detect whether and how the community is changing
  • Connecting the research program to ARDC's grants strategy cycle — informing prioritization, evaluating program outcomes, and identifying emerging needs before they become missed opportunities

The point: ARDC does not just get a report. They get the foundation for a data-informed organization.


Section 14 — Scope & Program Options

Tiered structure, giving ARDC meaningful choices without overwhelming them

Core Program — WS1 only (licensed and lapsed community)

  • Qual phase + main survey + analysis + WS1 Decision Kit component
  • The must-do foundation; delivers the highest-value intelligence first

Plus Program — Core + WS2 + Expert Interviews

  • Adds the adjacent population and general public research
  • Adds the structured expert interview program
  • Integrated cross-segment analysis and full Decision Kit

Max Program — Plus + Longitudinal Design + Dashboard + Repeat Wave Planning

  • Adds an interactive data dashboard
  • Adds formal longitudinal design and repeatable question bank documentation
  • Adds planning for a follow-on wave 12–18 months later

Implementation Team Options (a separate dimension of optionality):

  • Jim-led with an external fielding partner for participant outreach and data collection (recommended for WS1 given FCC ULS complexity; firms like NORC, Westat, or Qualtrics panel services)
  • Collaborative model with ARDC team participation (Chelsea/Merideth in appropriate supporting roles)


Section 15 — About Jim Idelson

Not a resume — a "why this person for this project" argument

  • Market research methodology: survey design, sampling, qualitative methods, analysis, reporting
  • Writing, presentation, and synthesis for executive audiences
  • Technology strategy background (the "learn/experiment/do" framing resonates deeply)
  • Amateur radio knowledge and community membership
  • Long ARDC tenure: known, trusted, mission-aligned — ARDC participants are more likely to engage with outreach from someone in the community
  • Specific examples and further detail available as an attached appendix

Section 16 — Timeline & Investment

  • High-level program timeline by option (Gantt or milestone table via Mermaid)
  • Cost structure: what drives cost, what can be controlled
  • Investment summary by program option tier
  • Cost/risk tradeoff framing: what does ARDC risk by not having this information?

Planned Diagrams (8, all via Mermaid)

| # | Diagram | Location |
| --- | --- | --- |
| 0 | From Anecdote to Action: The Research Value Chain | Section 2 (after paragraph 1) |
| 1 | Population Segmentation | Section 4 |
| 2 | FCC License Cohort Map | Section 4 |
| 3 | High-Level Program Flow | Section 5 |
| 4 | WS1 Detailed Flow | Section 7 |
| 5 | WS2 Approach | Section 8 |
| 6 | Decision Gate Model | Section 10 |
| 7 | Deliverables Timeline / Milestone Map | Section 12 |

Diagram 0 — From Anecdote to Action: The Research Value Chain

Purpose: Hook the reader early. Shows the transformation from ARDC's current information state (anecdotal, informal, incomplete) through the research program to the outputs that enable better decisions and higher-impact grantmaking. The message in one line: structured research turns gut feel into grantmaking confidence.

Suggested structure (left to right flow):

  • Left column — Current State: "Anecdotal observations," "Grant applications from those who already know ARDC," "Community insiders' impressions," "No baseline to measure against"
  • Center column — The Research Program: Labeled simply as this project (phased, rigorous, mixed-method) — does not need full detail here
  • Right column — Outputs: "Who the community actually is," "What drives engagement and dropout," "Where grants move the needle," "Baseline for measuring impact over time"
  • Bottom label on the right: "Higher-impact grants. Every dollar working harder."

Design notes: Clean and minimal. Three-column flow diagram or a simple left-to-right arrow progression with grouped nodes. Avoid clutter. The visual should reinforce the argument made in Sections 1 and 2 prose, not repeat it. Mermaid graph LR is the right format.
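A starting-point Mermaid sketch of Diagram 0, following the three-column structure above (labels abbreviated; a draft to iterate on, not final copy):

```mermaid
graph LR
    subgraph CS[Current State]
        A1[Anecdotal observations]
        A2[Applications from those<br/>who already know ARDC]
        A3[Insider impressions]
        A4[No baseline]
    end
    B[This Research Program<br/>phased, rigorous, mixed-method]
    subgraph OUT[Outputs]
        C1[Who the community actually is]
        C2[What drives engagement and dropout]
        C3[Where grants move the needle]
        C4[Baseline for measuring impact over time]
    end
    CS --> B
    B --> C1
    B --> C2
    B --> C3
    B --> C4
    OUT --> D[Higher-impact grants.<br/>Every dollar working harder.]
```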


Beyond the Report: Research as Organizational Capability

The research will produce a set of findings, reports, and a Decision Kit — but its deepest value is what it makes possible for ARDC as an organization. Three durable capabilities emerge from this work that change how ARDC operates long after the final report is delivered. These should be introduced briefly in the Executive Summary (Section 1), connected to ARDC's grantmaking challenge in Section 2, and given their full — but concise — treatment here, after the deliverables are described. Eyes should light up at their mention; deep elaboration is beyond the scope of this proposal.

Capability 1 — Evidence-Based Grant Strategy: Selection Framework and Impact Baseline

The research produces evidence-based criteria for evaluating inbound grant proposals against demonstrated, high-leverage gaps in the amateur radio ecosystem. Equally — and this is what distinguishes it from a one-time study — it establishes a baseline that allows ARDC to measure whether past and future grants actually moved the needle. Did a program designed to engage newly licensed hams improve retention rates? For the first time, that question will be answerable. An early, practical application: testing the framework against recent past grants to see how they score. Note: what this engagement delivers is the baseline and the framework — "Grant Evaluation Criteria Framework: evidence-based scoring dimensions derived from research findings" — as a named Decision Kit output. The actual longitudinal measurement happens over time.

Capability 2 — Proactive Grantmaking: From Reactive to Strategic

This is likely the capability that resonates most immediately with ARDC leadership. Today, grantmaking is largely reactive — evaluating proposals as they arrive. The research will identify specific, addressable gaps that no current grant program serves: underserved segments, lifecycle stages where engagement breaks down, technology areas with demonstrated need and insufficient investment. These become the basis for designing targeted grant initiatives ARDC actively publishes and promotes. The shift is from evaluating what comes in to leading where the field goes — a fundamentally different organizational posture, and one the research makes possible for the first time.

Capability 3 — ARDC as Knowledge Authority on Amateur Radio

Published research — white papers, community-facing findings summaries, presentations at amateur radio conferences and academic forums — positions ARDC as the most credible, data-grounded voice on the state of the amateur radio and digital communications ecosystem. Community trust grows when ARDC is seen as investing not just in grants but in the knowledge base the whole community benefits from. This is also a credibility asset for future grantmaking: applicants and partners take ARDC more seriously when its decisions are visibly grounded in rigorous research.


Things to Flag / Resolve Before Writing

1. The At-a-Glance page should be designed to stand completely alone. Someone who reads only that page should know what ARDC gets, why it matters, and what it costs at a high level. Think of it as the leave-behind for a 10-minute pitch meeting. It is not a table of contents — it is a decision brief. The three organizational capabilities above should appear on this page, concisely, as part of the "what this makes possible" framing.

2. ARDC's technology scope question should be surfaced and named in Section 2, not deferred or softened. The proposal should acknowledge the tension (ARDC's technology focus vs. broader AR ecosystem) and commit to resolving it as the first act of the Charter phase. Naming it is a sign of sophisticated engagement with the client's actual situation.

3. Jim's section (Section 15) should be short and confident — not a resume. The key argument is: unusual combination of research expertise, amateur radio community membership, and established ARDC trust makes this a better match than any outside firm. That is the differentiator.

4. Costs and timeline (Section 16) are the biggest unknown. Before writing this section, Jim needs to define rough numbers: daily rates, expected time investment by phase, external partner cost estimates. The proposal can show ranges and assumptions rather than precise quotes — but it needs something here or the section is a placeholder. This is the one item to discuss before writing begins.

5. Modules (M0–M8) on disk can stay as working notes but should not appear in nav. Their content will be absorbed into the proposal and appendices as we write.

6. Personal Information Handling — RESOLVED. Tiered consent model:

  • Default: Responses anonymized at ingestion; data retained only for project duration; no individual-level data shared with ARDC; respondents informed of this through invitation language and survey introduction.
  • Opt-in tiers: Participants may affirmatively choose to: (a) allow their responses to be shared in identifiable form for specific purposes (e.g., being quoted); (b) be named or attributed in published findings. Opt-in is separate from participation — declining does not affect inclusion.
  • Expert interviewees: Different default. Experts are named participants by nature; default is that their names and professional affiliation are associated with the project and findings, with the option to request specific statements be attributed or off-record.

This tiered approach should be described in Section 11 (Methodology Integrity) and reflected in participant invitation materials. A brief plain-language summary of the default protections belongs in the survey introduction.

7. ARDC's Open Source Stance — RESOLVED. A middle-ground position that aligns with ARDC's values while preserving Jim's ability to reuse tools and techniques in future engagements:

  • Open: Anonymized/aggregated raw data published openly (under Creative Commons or similar); survey instruments and codebook published openly; methodology documentation published openly. These are community assets.
  • Proprietary: Analysis tools, code, and workflows remain proprietary. Jim retains the ability to use similar tooling for future clients without licensing constraints.
  • Framing in the proposal: Position this as open findings, open instruments, open data — emphasizing what ARDC and the community receive openly — without volunteering the tool/code distinction unless asked. The open data commitment is the differentiator that matters to ARDC's constituency; the tool/code question is unlikely to be raised. Connect the open data posture to Capability 3 (ARDC as Knowledge Authority) and ARDC's open source ethos.

8. Role of AI in This Project — RESOLVED. Starting position, to be iterated:

AI assists with analysis, synthesis, draft writing, diagram design, and coding tasks. All AI-assisted outputs are reviewed and verified by Jim before delivery. AI use is disclosed in the methods memo as part of transparency commitments. Human judgment remains primary at every decision point in the research — instrument design, interpretation, recommendations, and quality review.

This statement should appear in one place only in the proposal — the Methodology Integrity section (Section 11) — as a brief, matter-of-fact disclosure, not a lengthy defense. Tone: confident and transparent, not apologetic.

9. The Non-Amateur Radio Digital Communications Pathway — RESOLVED. WS2's framing is clarified significantly:

  • WS1 is AR-focused: Licensed and lapsed licensees, FCC ULS-based. The research question is about the amateur radio lifecycle.
  • WS2 is non-ham focused — with a crucial distinction: The goal is NOT simply to identify potential future hams. DC participation and engagement are valued outcomes on their own terms. Someone who becomes deeply involved in open-source digital communications software, mesh networking, or SDR development without ever pursuing a ham license is a successful outcome for ARDC's mission. WS2 research should be designed to understand what this community needs, what motivates them, and where ARDC can have impact — whether or not that path leads to a license.

Writing implications: Section 8 (WS2) description needs to be reframed away from "path to licensing" and toward "understanding the DC community on its own terms." Section 2 should name both AR and DC as co-equal strategic pathways. Section 4 Segment 5 needs fuller, more specific treatment. The Research Objectives table (Section 3) should reflect DC as fully as AR.

10. Geographic Scope — Acknowledging US Centricity and Creating a Path to Globalization. The current proposal is inherently US-centric: FCC ULS is a US federal database, the mail-to-web approach assumes US postal addresses, and the "adjacent populations" framing (STEM programs, scouts, maker community) is largely a US construct. ARDC, however, is global by intent — they fund projects internationally, their board is increasingly internationally diverse, and amateur radio exists in every country, each with its own licensing authority and callsign database. Before writing, we need to address this tension honestly rather than papering over it. Proposed approach:

  • Frame WS1 explicitly as "US Licensed Community Research" — be accurate about the geographic scope rather than implying comprehensiveness we cannot deliver.
  • Propose a clear path to globalization: Phase 1 (this proposal) builds the US foundation and the reusable research framework; Phase 2 (future, optional) extends to international populations through partnerships with national amateur radio societies (the equivalents of the ARRL in other countries), IARU coordination, and international digital comms community recruitment.
  • For the expert interview program, include international voices from the outset — IARU representatives and leaders of major national societies in regions important to ARDC's global strategy cost relatively little to add to the interview roster.
  • WS2 and DC community research is more globally accessible than WS1 — online panels and DC community forums have no geographic barriers in the same way FCC ULS does — so WS2 can have a more genuinely global flavor even in Phase 1.
  • The research frameworks produced (segmentation model, survey instruments, grant evaluation criteria) should be designed to be portable and replicable in international contexts from the start, not retrofitted later.

11. Jim Idelson Background Bundle — Example Projects Appendix — A credibility-building appendix (or set of appendices) presenting selected prior engagements that directly illustrate the methodological competencies being proposed for ARDC. Candidate projects:

  • Mixed-method market research for a major global telecommunications provider: focus groups + survey research to understand user requirements and translate them into product and service development decisions. Conducted in the context of significant business growth ($90M→$500M). Direct methodological parallel to WS1 + WS2 of the current proposal.
  • Eight-year longitudinal salary and satisfaction survey for the voice, video, and data conferencing industry: annual study conducted in collaboration with a major industry user group, with global scope and data captured in conjunction with an annual conference. The strongest single credential for ARDC's longitudinal tracking ambition.
  • Mixed-methods study of videoconferencing utility and satisfaction in large global corporations: hundreds of 1:1 video interviews with primary stakeholders and thousands of user-level respondents worldwide. Used to improve user experience, operational efficiency, and strategic value assessment. Demonstrates scale, global execution, and mixed-method rigor.
  • Ongoing market temperature monitoring via regular surveys of a major SaaS Customer Data Platform: continuous feedback loop design, demonstrating the "research as practice" model ARDC is being invited to adopt.

The commercial vs. non-profit tension: This concern is real but manageable. The risk is that some board members react to corporate-context credentials as incongruous with ARDC's public benefit charter. The response: frame these as methodology demonstrations, not sector credentials. Lead each example with the research question and approach, not the company name or financial outcome. Add a brief framing sentence in the appendix introduction — something like: "These engagements were conducted in commercial settings; the research discipline and decision-support orientation they demonstrate are directly applicable to ARDC's needs, now applied to a public benefit mission." The $90M→$500M growth metric should be replaced by a framing that parallels ARDC's context: supporting high-stakes, high-investment decisions under uncertainty. The longitudinal study is the most compelling credential for ARDC specifically — it proves exactly the long-term tracking capability being proposed. The appendix title should be neutral and method-forward: "Research Methods in Practice: Selected Engagements" or similar.


End of initial analysis. Writing can proceed. See additional section below.


ARDC Website Intelligence: Strategic Alignment Notes

Added February 26, 2026 — after direct review of ardc.net (strategy, vision, values, and funding priorities pages)

This section records what we learned from reading ARDC's published strategic materials and how each finding should shape proposal language. The goal is alignment that ARDC feels rather than notices — the proposal should resonate naturally with their thinking, not mechanically echo their vocabulary.


What ARDC Publishes About Itself

Mission (verbatim):

"The mission of Amateur Radio Digital Communications (ARDC) is to support, promote, and enhance digital communication and broader communication science and technology, to promote Amateur Radio, scientific research, experimentation, education, development, open access, and innovation in information and communication technology."

Vision (verbatim):

"A thriving global community of learners, experimenters, and contributors advancing freedom with open source communication and information technology."

Strategic Framework: Promote Learning → Promote Experimenting → Promote Doing (the three-box arrow diagram)

Focus boundary (Venn diagram): ARDC's focus = intersection of AR and DC. DC without connection to AR is explicitly labeled "Not ARDC Focus."

Values (all eight): Curiosity, Experimentalism, Respect, Accountability, Openness & Transparency, Inclusiveness, Fairness, Generosity & Gratitude.

Funding Priorities (published April 2025): Research & Development, Space-Based Communications, Open Source Education. Note: explicitly not exclusive — open call for all projects; these areas receive priority.

Funding Priorities notable details:

  • R&D: open hardware and software, low-cost hackable designs, broad adoption
  • Space-Based Communications: spectrum utilization, connecting distant communities, new skills development
  • Open Source Education: filling gaps in existing open source materials; radio clubs and maker organizations as "effective vehicles for reaching and inspiring people"
  • Specific callout: "projects that support analog-oriented hams and experimenters moving to a digital and networked communications environment"


Findings and Writing Guidance

1. The Mission/Vision are asset-focused, not outcome-focused

Observation (Jim's): ARDC's mission and vision describe the asset they want to build — a thriving global community — but do not name who benefits or what good that community does in the world. The societal value of amateur radio (emergency communications, spectrum innovation, open-source technology development, STEM pipeline, international goodwill) is implied but not stated. This is a noted gap; Jim has tried to communicate this to ARDC without full success.

Writing implication: We cannot and should not point out this gap directly. Instead, the proposal should quietly model the outcome orientation that ARDC's mission statement lacks — grounding the research in societal outcomes, not just community growth metrics. When we describe why this research matters, reference outcomes like: enabling better emergency communications, sustaining spectrum access for public benefit, building an open-source technology ecosystem, and widening access to the kind of tinkering and innovation that amateur radio enables. The proposal will demonstrate what mission-with-outcomes looks like without ever pointing out that ARDC's isn't quite there yet. This is a tone and framing choice, woven throughout Sections 1–3 in particular.


2. Inclusiveness — handle with care

ARDC's stated value: "Invite in the kinds of people who have not historically been a part of the fields' mainstream demographics."

Constraint: There is a documented board-level tension on DEI. Do not use DEI language or frame this in ways that trigger that political rift.

Writing implication: Favor language like "broad and inclusive cross-section," "perspectives that cut across age, background, and experience level," "ensuring that underrepresented voices are captured," and "reaching beyond the traditional active-ham profile." Apply this language in Section 4 (Research Populations) when describing why we want diverse respondents, in Section 7 (WS1) when explaining recruitment strategy, and in Section 3 (Decision Map) where relevant. Frame it as research quality (broader cross-section = better data) rather than social policy. The research serves ARDC's stated value without raising flags.


3. Spectrum "use it or lose it" — a compelling argument to add

Source: From ARDC's strategy page — "getting more people regularly using AR and DC does something else important: it helps to use — and maintain access to — the AR spectrum. We must use it or risk losing it."

Writing implication: Add one to two sentences in Section 1 or Section 2 connecting participation growth to spectrum access sustainability. Something like: "A deeper understanding of what drives participation has implications beyond community health — spectrum rights exist on a use-it-or-lose-it basis. A vibrant, actively transmitting community is itself an argument for sustained access." This resonates with the technically-minded ARDC audience and adds a regulatory/strategic dimension to why this research matters that goes beyond "it would be nice to grow." This argument belongs in ARDC's mouth, not ours — we reflect it back, citing their own strategic language.


4. WS2 / DC scope: name the fuzziness, do not resolve it unilaterally

Observation: ARDC's Venn diagram says DC-without-AR is "Not ARDC Focus." But in practice, the line between in-scope and out-of-scope DC has been unclear even to ARDC itself (Jim has observed this in grant review contexts). The boundary is genuinely fuzzy.

Writing implication (updated from earlier guidance): Section 8 (WS2) should:

  • Describe the WS2 population as "digital communications practitioners and communities at or near the intersection of DC and amateur radio — those active in open-source radio software, mesh networking, SDR, and similar spaces where the AR/DC overlap is relevant"
  • Explicitly acknowledge that defining the precise in-scope DC population is a nuanced question
  • Commit to resolving it in the Charter phase in dialogue with ARDC — "scope boundaries for the DC community research will be confirmed with ARDC leadership during Charter Phase 0"
  • NOT attempt to draw the line ourselves in the proposal — doing so would stake out a position on a fuzzy question that ARDC itself hasn't resolved and that is not ours to resolve


5. Radio clubs and maker organizations — connection to a named funding priority

Source: Funding Priorities (April 2025) — "scalable, hands-on projects to help radio clubs and maker organizations engage their communities and attract new members. This approach is based on the notion that these groups are effective vehicles for reaching and inspiring people."

Writing implication: ARDC explicitly views radio clubs and maker orgs as a high-leverage channel. Our WS1 survey instrument should include question(s) about club membership and involvement. The analysis should surface whether active club participation correlates with retention and deeper engagement. If it does, that finding directly advises on a named funding priority. This is a low-cost addition to the survey that could yield directly actionable grant guidance. Mention this briefly in Section 7 as a specific area of analysis interest.
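If the WS1 instrument captures club involvement alongside a retention indicator, the correlation check is a straightforward group comparison. A minimal sketch with invented field names and toy data — nothing here reflects any real WS1 instrument or ARDC data:

```python
# Hypothetical sketch: does club involvement correlate with retention?
# The tuple shape (club_member, still_active) is an assumption for illustration.
responses = [
    (True, True), (True, True), (True, False), (True, True),
    (False, False), (False, True), (False, False), (False, False),
    (True, True), (False, True),
]

def retention_rate(rows, club_member):
    """Share of respondents in one group who report still being active."""
    group = [active for member, active in rows if member == club_member]
    return sum(group) / len(group) if group else 0.0

club_rate = retention_rate(responses, True)
non_club_rate = retention_rate(responses, False)
print(f"club members retained: {club_rate:.0%}")      # 80% on this toy data
print(f"non-members retained:  {non_club_rate:.0%}")  # 40% on this toy data
```

On real data this comparison would of course need a significance test and controls for confounders (e.g. tenure in the hobby), but even the raw split is directly legible to a board audience.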


6. Analog → digital transition — include lightly in instruments

Source: ARDC Funding Priorities — "projects that support analog-oriented hams and experimenters moving to a digital and networked communications environment."

Jim's assessment: The finding itself may be obvious. But including it is low-cost and connects the research to a named funding area.

Writing implication: Add digital mode use (yes/no; which modes; barriers to adoption) to the WS1 survey instrument. Include in Section 7 as one of several specific analysis areas. Do not headline it in the proposal — it should appear in a list of analytical angles alongside more distinctive ones. One or two questions in WS1 cover it without making it a centerpiece.


7. Open data as an explicit grantee requirement — not just a philosophy

Source: ARDC Values, Openness & Transparency — "We also expect that our grantees make their work available under open source licenses or otherwise freely available."

Writing implication: Already resolved in the recommendations (see item 7 above on open source stance). Additional note: our open data commitment is not a concession or a courtesy — it is a requirement. In Section 12 (Deliverables), the publicly available deliverables (anonymized dataset, codebook, instruments, methodology memo) should be framed this way: they fulfill ARDC's own expectation. In informal conversations with ARDC, this framing shows we've read their materials carefully. Connects to Capability 3 (ARDC as Knowledge Authority) — the published research and data become community assets.


8. Geographic scope — US-centric research in a globally-minded foundation

Already covered in item 10 of this document. Supplemental note from website review: ARDC's Vision explicitly names a "thriving global community" and Inclusiveness notes "both in the United States and internationally." This is directional; they are a global funder and are thinking globally. Our US-centric Phase 1 design should be acknowledged honestly, and the path to international extension should be briefly indicated in Section 14 (Scope & Options) as a potential future phase. WS2 (DC community) is more globally accessible by design — open online communities and forums don't have geographic boundaries — and should be framed as such.


ARDC's Language to Mirror (Naturally, Not Mechanically)

When writing, reach for ARDC's vocabulary where it fits organically:

| ARDC's phrase | Where it fits naturally |
| --- | --- |
| "Learn, experiment, do" | Section 2 (framing ARDC's goals) |
| "Freedom to tinker and build" | Section 1 or 2 (describing why AR/DC communities matter) |
| "Curiosity, asking questions before making assumptions" | Section 2 (framing this research as ARDC living its values) |
| "Community-driven development" | Section 13 (Building a Data-Driven Culture) |
| "Accessible, open, unrestricted innovation" | Section 13 or Section 12 (Open data framing) |
| "Makers, hackers, and developers" | Section 8 (WS2 population description) |
| "We must use it or risk losing it" | Section 1 or 2 (spectrum argument) |
| "Effective vehicles for reaching and inspiring people" | Section 7 (club/maker finding) |

Use these phrases where they arise naturally. If a sentence feels like it's reaching to include ARDC's vocabulary, cut the phrase — the idea matters more than the word.


End of ARDC alignment notes.