
Appendix L — Track 1 in Detail

7. Track 1: Engagement and Retention (W1)

7.1 Why this track exists

Track 1 delivers decision-grade metrics for engagement, retention, and reactivation, enabling sharper focus and higher-impact grantmaking.

ARDC needs a reliable baseline for the licensed population now: where participation is strong, where it is fragile, and where retention or reactivation initiatives are most likely to move outcomes. Without that baseline, portfolio choices rely too heavily on partial signals and cannot be evaluated with confidence from one grant cycle to the next.

This track is designed to produce three practical outputs: a community snapshot, actionable segmentation, and repeatable measures that ARDC can refresh over time. Together, these outputs support both prioritization (what to fund next) and evaluation readiness (how to tell whether the portfolio is moving the intended outcomes).

In practical terms, Track 1 establishes a defensible baseline—a “starting point” that ARDC can compare against after new initiatives launch and as the portfolio evolves.

To make smart retention and reactivation investments, ARDC needs to understand the whole licensed population, not just the most visible, highly engaged operators. In most communities, reaching that population would require building a list from scratch, and even then the quiet majority would be largely missed.

Exhibit 7.1. Track 1 focus view (post-license): engagement, attrition, and renewal pathways, with muted pre-license context.

7.2 The FCC ULS advantage

In the US, the FCC Universal Licensing System (ULS) changes the picture. It gives ARDC uniform access to the full amateur radio population—including the “silent” licensees whose experiences and barriers are essential to understanding drift, non-engagement, and return.

Track 1 uses ULS for one purpose: to source research participants from a defined population roster, so we can reach beyond the self-selected participants who show up in clubs, forums, contests, and other high-engagement channels. In most communities, building that kind of reach is the hardest part of the work—and it is where studies most often over-represent the active core and under-represent the people who are drifting or inactive.

ULS makes a practical class of questions answerable in a defensible way. For example, Track 1 can estimate what fraction of licensed operators are essentially inactive (or at high risk of drift) based on a sample drawn from the full population—not inferred from lists that systematically miss the least-engaged participants. That same population-scale reach also supports straightforward sampling cuts (for example by geography, license class, and license status) so the study covers key segments while still reflecting the broader population.
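The sampling cuts described above can be sketched as a simple stratified draw from a population roster. This is an illustrative sketch only: the field names (callsign, state, license_class, status) and the records are invented for the example and do not reflect the actual ULS export schema or the study's real sample design.

```python
# Illustrative stratified sampling from a hypothetical ULS-derived roster.
# Field names and records are invented; they are not the real ULS schema.
import random
from collections import defaultdict

def stratified_sample(roster, keys, per_stratum, seed=0):
    """Draw up to `per_stratum` records from each stratum defined by `keys`."""
    rng = random.Random(seed)  # fixed seed so draws are reproducible
    strata = defaultdict(list)
    for rec in roster:
        strata[tuple(rec[k] for k in keys)].append(rec)
    sample = []
    for group in strata.values():
        rng.shuffle(group)
        sample.extend(group[:per_stratum])
    return sample

roster = [
    {"callsign": "W1AAA", "state": "MA", "license_class": "Extra", "status": "Active"},
    {"callsign": "K2BBB", "state": "NY", "license_class": "General", "status": "Active"},
    {"callsign": "N3CCC", "state": "NY", "license_class": "General", "status": "Expired"},
    {"callsign": "W4DDD", "state": "GA", "license_class": "Technician", "status": "Active"},
]

# One draw per (license_class, status) stratum, so each segment is covered.
picks = stratified_sample(roster, keys=("license_class", "status"), per_stratum=1)
```

In practice the per-stratum allocation would be weighted to population sizes rather than fixed, but the mechanism—partition the roster by the cut variables, then sample within each cell—is the same.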

We also recognize the privacy concerns that can arise when using any public record source. Our use of ULS data follows a clear privacy-protection policy: we use only what is necessary for approved research operations, we protect individual-level information in handling and reporting, and we publish findings only in aggregated form—consistent with Appendix D - Privacy, Ethics and Data Stewardship.

7.3 Major Track 1 deliverables

Track 1 produces two complementary lenses on the licensed population: a clear picture of “where things are today” and a practical view of “how people got there.” Together, they let ARDC prioritize initiatives with more precision and place bets earlier—before drift becomes loss.

  • Cross-sectional snapshots (decision-grade baseline). A clear picture of the licensed population today, with particular attention to engagement distribution and segmentation (who is thriving, stable, drifting, or inactive). The snapshot is designed to support faceted exploration—so ARDC can examine how engagement and barriers vary across meaningful cuts (for example: age band, years since first license, license class, operating context, geography, and other measured traits). This enables questions such as: Where is drift concentrated? What barriers dominate for that segment? Which constraints look most “moveable” with well-designed initiatives?

  • Trajectory patterns (early signals and pathway drivers). Insight into not only the current state, but how different segments arrived there: typical pathways into deeper participation, stable maintenance, drift, and reactivation. Trajectory patterns surface the early signals and turning points that separate thriving journeys from drifting ones, helping ARDC identify where interventions can strengthen early momentum, remove friction, and address root causes before disengagement becomes permanent. This work is organized around a Trajectory Map (Exhibit 7.3) that links engagement patterns to renewal outcomes and highlights likely turning points in the licensed journey.

Together, these deliverables provide: (1) a baseline ARDC can reuse and refresh over time, (2) segment-level guidance for where interventions are most likely to work, and (3) clearer expectations for what portfolio impact should look like in measurable terms.

7.4 Fielding model

Track 1 uses a mail-to-web model anchored in the ULS frame: a physical invitation mailed to the licensee's ULS address of record, a personalized survey link, and staged reminders. This approach is designed to extend coverage beyond digitally visible participants and preserve representativeness.
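The staged-contact idea can be sketched as a schedule keyed off the initial invitation date. The wave names and day offsets below are illustrative assumptions, not the study's actual fielding calendar.

```python
# Illustrative mail-to-web contact schedule. Wave names and offsets are
# assumptions for the sketch, not the study's real fielding plan.
from datetime import date, timedelta

WAVES = [("invitation", 0), ("reminder_postcard", 14), ("final_reminder", 28)]

def contact_schedule(invite_date):
    """Return (wave name, mail date) pairs for one sampled licensee."""
    return [(name, invite_date + timedelta(days=offset)) for name, offset in WAVES]

schedule = contact_schedule(date(2025, 3, 3))
```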

For former licensees, additional data hygiene is applied (for example, National Change of Address (NCOA) processing and deduplication across active and expired records). These steps are operational details in service of one objective: stronger decision-grade outputs for ARDC.
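The deduplication step can be sketched as collapsing multiple roster rows for the same person—say, an expired record alongside a later active one—down to a single row. Keying on the FRN and keeping the latest grant date is one plausible rule; the field names and records here are invented for illustration.

```python
# Illustrative dedup across active/expired records: keep one row per FRN,
# preferring the record with the latest grant date. Fields are invented.
from datetime import date

def dedupe_by_frn(records):
    """Collapse roster rows to the most recently granted record per FRN."""
    best = {}
    for rec in records:
        frn = rec["frn"]
        if frn not in best or rec["grant_date"] > best[frn]["grant_date"]:
            best[frn] = rec
    return list(best.values())

rows = [
    {"frn": "0001", "callsign": "KD2XYZ", "status": "Expired", "grant_date": date(2013, 5, 1)},
    {"frn": "0001", "callsign": "KD2XYZ", "status": "Active", "grant_date": date(2023, 5, 1)},
    {"frn": "0002", "callsign": "N0ABC", "status": "Active", "grant_date": date(2021, 9, 14)},
]

clean = dedupe_by_frn(rows)  # FRN 0001 keeps only its Active record
```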

7.5 Approach and measurement design

Track 1 starts with qualitative discovery to improve instrument quality before full fielding. Focus groups and depth interviews inform vocabulary, pathway framing, and battery design so the survey captures decision-relevant signal rather than assumptions.

The measurement design is organized into four batteries tied to ARDC decision needs:

  • current engagement state (community snapshot and segment sizing)
  • lifecycle stage and trajectory (pathway reconstruction and risk points)
  • adoption and retention drivers (what sustains or erodes participation)
  • perception and messaging (how participants describe amateur radio and adjacent AR/DC relevance)

Exhibit 7.2 summarizes Track 1 as a vertically sequenced flow from Charter through decision outputs, including quality gates before full launch.

Track 1 Workflow — Current and Former Licensees

1. Charter and Research Design Lock

Finalize scope, sample design assumptions, quality thresholds, and gate criteria for Track 1 execution.

2. Qualitative Discovery and Instrument Build

Run focus groups/IDIs, translate findings into measurement batteries, and draft the survey instrument.

Output to Track 2 setup: segment and message signals used in Stage 2 alignment.

3. Cognitive Testing and Pilot Gate

Validate comprehension and operational flow, then approve full field launch through a formal gate decision.

4. Full Survey Execution (ULS Mail-to-Web)

Field invitations and reminders, monitor response quality, and manage bias checks during data collection.

5. Analysis, Synthesis, and Track 1 Reporting

Produce the community snapshot, trajectory insights, and investment lever guidance for licensed populations.

Output to integration: Decision Kit inputs, deliverable schedule, and board briefing materials.

Exhibit 7.2. Track 1 vertical methodology flow, including quality gates and labeled output handoffs to adjacent tracks and decision integration.

7.6 Trajectory analysis: limits and value

Trajectory analysis is explicitly bounded: retrospective recall is imperfect. The community snapshot is guaranteed; trajectory reconstruction is a complementary analytical layer built where data support is sufficient.

Exhibit 7.3 is the trajectory mental model for Track 1: engagement signals are measured across the post-grant lifecycle and related to renewal and lapse outcomes.

%%{init: {
  "flowchart": { "nodeSpacing": 55, "rankSpacing": 70, "curve": "basis" },
  "themeVariables": { "fontSize": "18px", "nodePadding": 14 }
}}%%
flowchart TB
  A["Initial License"] --> B["Renewal Window"] --> C["Expiration"] --> D["End of Grace"]

  T1["Thriving"] -.-> O1["On-time Renewal"]
  T2["Steady"] -.-> O1
  T2 -.-> O2["Grace Renewal"]
  T3["Drifting"] -.-> O2
  T3 -.-> O3["No Renewal"]
  T4["Early Exit"] -.-> O3
  T5["Immediate Exit"] -.-> O3

  A -.-> T1
  A -.-> T2
  A -.-> T3
  A -.-> T4
  A -.-> T5

  %% Styling (matched to long-form exhibit palette)
  classDef anchor fill:#ffffff,stroke:#111111,stroke-width:2px;
  classDef red fill:#fff5f5,stroke:#c53030,stroke-width:2px;
  classDef orange fill:#fffaf0,stroke:#dd6b20,stroke-width:2px;
  classDef green fill:#f0fff4,stroke:#2f855a,stroke-width:2px;
  class B,C,D anchor;
  class A,T1,O1 green;
  class T2,T3,O2 orange;
  class T4,T5,O3 red;

Exhibit 7.3. FCC lifecycle and engagement trajectory frame linking cross-sectional snapshot findings to common journey patterns.
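The linkage Exhibit 7.3 describes—survey-derived trajectory classes related to observed renewal outcomes—reduces, at its simplest, to an aggregate cross-tabulation. The sketch below uses the exhibit's labels; the respondent records are invented for illustration, and real reporting would apply the aggregation-only rules from Appendix D.

```python
# Illustrative cross-tab of trajectory class vs. renewal outcome, using the
# Exhibit 7.3 labels. Respondent records are invented; reporting would be
# aggregate-only per the program's privacy policy.
from collections import Counter

def trajectory_outcome_table(respondents):
    """Count (trajectory class, renewal outcome) pairs for aggregate reporting."""
    return Counter((r["trajectory"], r["outcome"]) for r in respondents)

respondents = [
    {"trajectory": "Thriving", "outcome": "On-time Renewal"},
    {"trajectory": "Steady", "outcome": "On-time Renewal"},
    {"trajectory": "Steady", "outcome": "Grace Renewal"},
    {"trajectory": "Drifting", "outcome": "No Renewal"},
    {"trajectory": "Early Exit", "outcome": "No Renewal"},
]

table = trajectory_outcome_table(respondents)
```

Cell counts like these are the raw material for the exhibit's dotted links: they show which trajectory classes concentrate in grace renewals and non-renewals, which is where retention interventions are most likely to pay off.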