
Specialized Models per Modulation Family: Routing Subsets to SpectralCNN, SignalLSTM, ResNetRF, and SignalTransformer

Deep learning-based RF modulation classifiers are
often deployed as single, “generalist” models trained over a large
mix of signal types, bands, and impairments. In practice, however,
different architectures excel on different families of modulations:
spectral CNNs shine on narrowband constellations, recurrent
models track slowly time-varying analog signals, and transformer-style feature fusion can exploit joint IQ+FFT structure.
This paper studies a simple but powerful idea: route each
incoming signal to a specialized model chosen for its modulation
family, rather than sending every signal through the same
generalist. Building on a production-style RF ensemble classifier,
we define families (e.g., PSK, QAM, analog), assign each
family a specialist drawn from {SpectralCNN, SignalLSTM,
ResNetRF, SignalTransformer}, and compare this routing
scheme against a flat “all-modulations” generalist.
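The routing scheme described above can be sketched as a simple lookup from modulation family to specialist. The family map, model names taken from the paper's {SpectralCNN, SignalLSTM, ResNetRF, SignalTransformer} set, and the fallback choice below are illustrative assumptions, not the repository's actual API:

```python
# Hypothetical sketch of family-based routing. The family assignments and
# the family -> specialist mapping are illustrative; the paper's actual
# assignments are determined empirically per family.

FAMILY_OF = {
    "BPSK": "PSK", "QPSK": "PSK", "8PSK": "PSK",
    "16QAM": "QAM", "64QAM": "QAM",
    "AM-DSB": "analog", "WBFM": "analog",
}

SPECIALIST_FOR = {
    "PSK": "SpectralCNN",
    "QAM": "SignalTransformer",
    "analog": "SignalLSTM",
}

def route(modulation_hint: str) -> str:
    """Return the specialist model name for a signal's modulation family.

    Signals whose family is unknown fall back to a generalist
    (ResNetRF here, as an assumed default).
    """
    family = FAMILY_OF.get(modulation_hint, "unknown")
    return SPECIALIST_FOR.get(family, "ResNetRF")
```

In a deployed pipeline the `modulation_hint` would itself come from a coarse upstream family classifier rather than ground truth; the point of the sketch is only that dispatch reduces to a table lookup, so new specialists can be registered without touching the routing logic.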
On synthetic and replayed RF scenarios, family-specialized
models yield up to 3.4, 2.1, and 4.7 absolute accuracy points over
the best generalist baselines for PSK, QAM, and analog signals
respectively, while reusing the same input builders and metric
logging already present in the system. We release a benchmark
harness and figure-generation pipeline so future specialists can
be dropped in without changing the LaTeX.
Index Terms—Automatic modulation classification, RF machine
learning, ensembles, specialization, deep learning.

Repository:

bgilbert1984/Routing-Subsets-to-SpectralCNN-SignalLSTM-ResNetRF-and-SignalTransformer