Explore the engineering principles, signal processing architectures, and calibration methodologies behind our instrument platform. We publish this because we believe informed engineers make better purchasing decisions.
Our proprietary ASIC-based digitizer architecture achieves 800 MHz instantaneous bandwidth with 14-bit resolution and real-time FFT processing at rates exceeding 500,000 spectra per second. For events longer than the minimum capture duration, this delivers 100% probability of intercept for transient and frequency-agile signals that swept-tuned analyzers systematically miss.
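The link between FFT processing rate and guaranteed intercept can be sketched with a short calculation. The sample rate, FFT length, and overlap below are illustrative assumptions for the sketch, not the instrument's actual internals:

```python
# Sketch: relating gap-free FFT rate to probability of intercept (POI).
# All parameters here are illustrative assumptions.
fs = 1.0e9        # complex sample rate covering the analysis bandwidth
n_fft = 1024      # FFT length
overlap = 0.5     # 50% frame overlap

frame_step = n_fft * (1 - overlap) / fs       # time advance per FFT
spectra_per_second = 1 / frame_step
# A signal is guaranteed to be fully captured (100% POI) only if it
# spans at least one complete FFT frame plus one frame step:
min_duration_s = n_fft / fs + frame_step
print(f"{spectra_per_second:,.0f} spectra/s, "
      f"min fully-captured event ≈ {min_duration_s * 1e6:.2f} µs")
```

Shorter events are still detected with high probability, but amplitude accuracy is only guaranteed above the minimum duration.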
The FPGA-based signal processing pipeline performs time-domain, frequency-domain, and modulation-domain analysis simultaneously and without data loss, so multiple measurements run in parallel on the same acquired signal data.
Signal generators employ a fractional-N PLL architecture with an integrated VCO achieving -152 dBc/Hz phase noise at a 1 GHz carrier and 20 kHz offset. Synthesizer settling in under 200 microseconds enables high-throughput production test applications without sacrificing spectral purity.
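A single-offset phase-noise figure becomes more tangible when integrated into RMS jitter. The profile points below are hypothetical, anchored only to the quoted -152 dBc/Hz at 20 kHz offset from a 1 GHz carrier:

```python
import numpy as np

# Sketch: integrating an SSB phase-noise profile L(f) into RMS jitter.
# These profile points are hypothetical illustration values.
f = np.array([1e2, 1e3, 1e4, 2e4, 1e5, 1e6, 1e7])        # offset, Hz
L_dBc = np.array([-100.0, -120.0, -145.0, -152.0, -155.0, -158.0, -160.0])

L_lin = 10.0 ** (L_dBc / 10.0)
# Trapezoidal integration of 2*L(f) (both sidebands) gives phase variance.
phase_var = 2.0 * np.sum((L_lin[1:] + L_lin[:-1]) / 2.0 * np.diff(f))  # rad^2
rms_jitter = np.sqrt(phase_var) / (2.0 * np.pi * 1e9)    # seconds
print(f"integrated RMS jitter ≈ {rms_jitter * 1e15:.1f} fs")
```

The integration limits matter as much as the spot values: widening the upper offset bound pulls in more broadband noise floor and raises the jitter figure accordingly.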
A proprietary DDS (Direct Digital Synthesis) core generates arbitrary modulation waveforms with 16-bit DAC resolution at 2 GSa/s, supporting complex multi-carrier scenarios including 5G NR, LTE-Advanced with carrier aggregation, and custom OFDM schemes.
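The playback path of such a generator can be approximated in a few lines: build an OFDM symbol, then quantize I/Q to a 16-bit DAC grid. The subcarrier count, QPSK mapping, and cyclic-prefix length are illustrative choices, not the instrument's actual scheme:

```python
import numpy as np

# Sketch: one OFDM symbol quantized to signed 16-bit DAC samples.
# All numerology below is an illustrative assumption.
rng = np.random.default_rng(0)
n_sc, n_fft, cp = 600, 1024, 72     # active subcarriers, FFT size, CP length

bits = rng.integers(0, 2, size=(n_sc, 2))
qpsk = ((1 - 2 * bits[:, 0]) + 1j * (1 - 2 * bits[:, 1])) / np.sqrt(2)

grid = np.zeros(n_fft, dtype=complex)
grid[1:n_sc // 2 + 1] = qpsk[:n_sc // 2]     # positive-frequency bins
grid[-(n_sc // 2):] = qpsk[n_sc // 2:]       # negative-frequency bins

sym = np.fft.ifft(grid)
sym = np.concatenate([sym[-cp:], sym])       # prepend cyclic prefix

# Scale the peak I/Q component to full scale and round to int16 for the DAC.
scale = 32767 / np.max(np.abs([sym.real, sym.imag]))
i16 = np.round(sym.real * scale).astype(np.int16)
q16 = np.round(sym.imag * scale).astype(np.int16)
print(len(i16), i16.dtype)
```

In practice the crest factor of multi-carrier signals forces a back-off from DAC full scale, which is one reason 16-bit resolution matters for OFDM waveform quality.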
EMI test receivers implement time-domain scanning that captures the complete spectral environment in a single measurement sweep, reducing test times by up to 90% compared to traditional stepped-frequency receivers while maintaining full CISPR 16-1-1 compliance.
Simultaneous quasi-peak, CISPR-average, RMS-average, and peak detector operation on the same signal acquisition eliminates the need for sequential detector sweeps, directly accelerating EMC compliance test campaigns.
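The parallel-detector idea can be sketched by running several detector models over one envelope record. The quasi-peak time constants below are the CISPR 16-1-1 Band B values; the envelope itself is a synthetic pulse train for illustration:

```python
import numpy as np

# Sketch: peak, average, RMS, and quasi-peak detectors on one acquisition.
# The envelope is synthetic; QP time constants follow CISPR Band B.
fs = 100e3                                  # envelope sample rate (assumption)
t = np.arange(int(0.1 * fs)) / fs
env = np.where((t * 100) % 1 < 0.01, 1.0, 0.001)   # 100 Hz impulsive signal

tau_c, tau_d = 1e-3, 160e-3                 # QP charge/discharge constants
a_c = 1 - np.exp(-1 / (fs * tau_c))
a_d = 1 - np.exp(-1 / (fs * tau_d))

qp = 0.0
for x in env:                               # one-pole charge/discharge model
    qp += (a_c if x > qp else a_d) * (x - qp)

peak, avg, rms = env.max(), env.mean(), np.sqrt(np.mean(env ** 2))
print(f"peak={peak:.3f} avg={avg:.3f} rms={rms:.3f} qp={qp:.3f}")
```

For impulsive signals the detectors spread apart (peak ≥ quasi-peak ≥ RMS ≥ average), which is exactly why sequential sweeps with one detector at a time are so costly to replace with parallel evaluation.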
Four-port VNA architecture with integrated bias tees and built-in pulse generators supports complete active device characterization including S-parameters, noise figure, compression, and harmonic distortion in a single connection setup.
Calibration algorithms based on 12-term error model with automatic electronic calibration units reduce setup time from 45 minutes to under 3 minutes while maintaining residual directivity of 46 dB and effective source match of 42 dB.
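The one-port slice of the 12-term model (three terms per port: directivity, source match, reflection tracking) can be sketched with ideal open/short/load standards, which is a simplifying assumption relative to characterized standards:

```python
# Sketch: 3-term one-port error correction, the per-port building block
# of the 12-term model. Standards are idealized (+1, -1, 0) here.
def solve_error_terms(m_open, m_short, m_load):
    # Model: measured = e00 + e10e01 * G / (1 - e11 * G)
    e00 = m_load                           # load: G = 0
    a, b = m_open - e00, m_short - e00     # open: G = +1, short: G = -1
    e11 = (a + b) / (a - b)                # source match
    e10e01 = a * (1 - e11)                 # reflection tracking
    return e00, e11, e10e01

def correct(m, e00, e11, e10e01):
    # Invert the error model to recover the DUT reflection coefficient.
    return (m - e00) / (e10e01 + e11 * (m - e00))

# Round trip: forward-model a known DUT, then recover it.
e00, e11, e10e01 = 0.02 + 0.01j, 0.05 - 0.02j, 0.98 + 0.0j
fwd = lambda G: e00 + e10e01 * G / (1 - e11 * G)
est = solve_error_terms(fwd(1), fwd(-1), fwd(0))
G_dut = 0.3 - 0.4j
residual = abs(correct(fwd(G_dut), *est) - G_dut)
print(residual)
```

An electronic calibration unit automates exactly this acquisition: it switches known states at each port while the instrument solves for the error terms, which is where the setup-time reduction comes from.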
Engineering decisions in test and measurement involve real trade-offs. We believe transparency about these trade-offs helps you select the right instrument architecture for your specific application constraints.
Operators deploying 5G face a fundamental infrastructure trade-off that directly impacts measurement requirements. mmWave deployments (24-43 GHz) offer massive bandwidth — up to 800 MHz channels — enabling ultra-low latency for industrial IoT and high-density venue coverage. However, mmWave requires significantly denser infrastructure and more complex over-the-air testing due to propagation challenges.
Sub-6 GHz bands provide superior coverage and building penetration, requiring lower infrastructure density and offering more cost-effective nationwide rollout. The measurement implications differ substantially: mmWave testing demands specialized antenna measurement chambers with higher path loss calibration accuracy, while sub-6 GHz testing can leverage existing conducted test infrastructure with antenna coupler adaptations.
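The propagation gap behind this trade-off follows directly from free-space path loss. The carrier frequencies and distance below are representative examples, not deployment recommendations:

```python
import math

# Sketch: free-space path loss (Friis) at a representative sub-6 GHz
# and mmWave carrier. Frequencies and distance are illustrative.
def fspl_db(freq_hz, dist_m):
    # FSPL = 20*log10(4*pi*d*f/c)
    return 20 * math.log10(4 * math.pi * dist_m * freq_hz / 3e8)

d = 100.0                                   # metres
delta = fspl_db(28e9, d) - fspl_db(3.5e9, d)
print(f"28 GHz vs 3.5 GHz at {d:.0f} m: +{delta:.1f} dB path loss")
```

That ~18 dB deficit, before any building-penetration loss, is what mmWave must recover through beamforming gain and site density, and why its OTA test links leave so little path-loss calibration margin.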
Our position: We provide instruments covering both domains because the optimal deployment strategy depends on population density, spectrum licensing, and use-case requirements. Neither approach is universally superior — most operators will deploy both in a heterogeneous network architecture.
The migration from physical core network equipment to virtualized, cloud-native network functions (NFV/CNF) creates new measurement challenges. On-premises hardware offers full control over data sovereignty, deterministic latency, and proven reliability for mission-critical voice and data services. Traditional T&M instruments are well-matched to this architecture.
Cloud-native cores enable elastic scaling, faster feature deployment, and reduced CAPEX through shared infrastructure. However, they introduce variable latency, distributed failure modes, and the need for service-level monitoring rather than physical-layer testing.
Practical consideration: Test strategies must evolve from pure RF/physical-layer measurement toward a hybrid approach combining protocol-level conformance testing with real-time service quality monitoring. Instruments that bridge both worlds — correlating physical-layer impairments to service-layer KPIs — deliver the most actionable insights during migration.
Collaborative research programs with leading universities and national laboratories drive our next-generation measurement capabilities.
Joint research on sub-THz channel sounding and measurement methodologies for 6G candidate frequency bands. Developing over-the-air test capabilities for massive MIMO antenna systems with 256+ elements.
Developing cryogenic-compatible signal generation and analysis solutions for qubit control and readout at millikelvin temperatures. Addressing the unique measurement challenges of quantum-classical interfaces.
Machine learning models trained on millions of measurement datasets to automate anomaly detection, predict measurement drift, and optimize calibration intervals based on instrument usage patterns and environmental conditions.
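A minimal version of such automated checking is a rolling-baseline outlier test; production systems use learned models, and the data and threshold below are synthetic illustrations:

```python
import numpy as np

# Sketch: flagging a measurement anomaly against a rolling baseline.
# Synthetic readings with an injected step at sample 400; the window
# size and z-score threshold are illustrative assumptions.
rng = np.random.default_rng(1)
readings = 10.0 + 0.05 * rng.standard_normal(500)
readings[400:] += 0.5                      # injected anomaly

win, thresh = 50, 5.0
flags = []
for i in range(win, len(readings)):
    base = readings[i - win:i]             # trailing baseline window
    z = (readings[i] - base.mean()) / base.std()
    if abs(z) > thresh:
        flags.append(i)
print(f"first anomaly flag at sample {flags[0] if flags else None}")
```

Learned models extend this idea by conditioning the baseline on usage patterns and environmental sensors, so drift is predicted before a fixed threshold would trip.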
Request a white paper or schedule a technical webinar with our measurement scientists.
Request Technical Resources