RF Software Engineer, Senior - 1647

Global InfoTek, Inc.
$100,000.00 to $200,000.00 (USD) annually
United States, Virginia, Reston
Apr 22, 2026

Clearance Level: Public Trust

US Citizenship: Required

Job Classification: Full Time

Location: Remote

Years of Experience: 5-7 years of relevant experience

Education Level: BS Degree - experience may be considered in place of education requirement.

Briefly Describe the Work:

GITI is seeking a Senior RF Software Engineer to support Cyber Operations Research and Development on passive RF emitter identification and network analysis from real-time sensor data streams. The candidate will implement, test, and maintain components of a production software pipeline: a stream ingestion, rollup, and post-processing system operating on NDF (Network Description File) data produced by TDMA network sensors in dense, contested RF environments. Working under the direction of the Principal Engineer and the Technical Lead, the Senior RF Software Engineer supports Cyber Operations by contributing to pipeline development across a range of functional areas, including stream processing, database integration, display and reporting tools, simulation infrastructure, and CI/CD tooling. The role requires strong Python skills, comfort with air-gapped Linux environments, and the ability to work independently on well-defined components with minimal supervision in support of real-world cyber operations.

Responsibilities:

  • Implement, test, and maintain assigned pipeline components, including stream ingestion, rollup processing, database writes, and batch post-processing modules, in support of real-world cyber operations
  • Develop and maintain browser-based visualization and reporting tools (track plots, waterfall displays, SmartBook report generation) that consume pipeline database output
  • Implement and maintain stream simulation infrastructure, including TDMA network mission log replay and stream generation at controllable rates for pipeline testing
  • Develop lightweight TNS simulator components: emitter and receiver models capable of following track plots and emitting in accordance with a network description
  • Contribute to database integration work on tactical-box-spec hardware, including MySQL schema design, query optimization, and performance benchmarking
  • Write comprehensive unit and integration tests for assigned components; implement and maintain CI/CD pipelines using GitLab to ensure functionality on hardware or in cloud environments
  • Identify and report performance bottlenecks in Python pipeline components; assist with porting mature components to Rust or C as directed
  • Perform basic Linux system administration on remote servers including package management, user configuration, and environment setup
  • Manage source code using GitLab; follow disciplined versioning, branching, and code review practices as established by the Principal Engineer
  • Produce clear technical documentation for implemented components including interface specifications, configuration guides, and test procedures
  • Participate in periodic technical check-ins with the program technical lead; share findings and flag blockers promptly

Career level with a complete understanding and wide application of technical principles, theories, and concepts. Working under general direction from the Principal Engineer, provides technical solutions to a wide range of well-defined problems and independently executes on assigned components. Bachelor's (or equivalent) with 5-7 years of experience, or a Master's with 3-5 years of experience.

Required Skills:

  • Strong proficiency in Python, with demonstrated experience in data processing pipelines, stream ingestion, or ETL development
  • Proficiency with Python data science libraries including NumPy, Pandas (or Polars), and scikit-learn
  • Experience with relational database development using MySQL, PostgreSQL, or SQLite, including schema design and query optimization
  • Experience parsing or generating binary serialization formats (FlatBuffers, Protocol Buffers, or equivalent)
  • Ability to develop, test, and debug on remote Linux servers via SSH using command-line tools and a modern IDE
  • Solid Linux operating system fundamentals including file system management, process control, and basic security hardening (Ubuntu)
  • Proficient in software engineering practices including Git/GitLab version control, unit testing, and CI/CD pipeline usage
  • Experience developing browser-based data visualization or reporting tools, or demonstrated ability to learn React/D3-based tooling on the job
  • Strong written and oral communication skills; ability to produce clear technical documentation for engineering audiences
  • Ability to work independently on assigned components with minimal supervision in a small, distributed team

Desired Skills:

  • Experience with TNS (Target Network System) sensor data formats and NDF ICD specifications
  • Familiarity with TDMA network protocols, time-division access architectures, and passive RF signal processing concepts
  • Experience with lightweight stream or message queue architectures (ZeroMQ, RabbitMQ, or equivalent)
  • Experience with Rust or Go for systems-level or performance-critical development on Linux
  • Experience with Polars or DuckDB for high-performance analytical workloads
  • Experience with performance profiling and optimization of Python pipelines on resource-constrained x86 hardware
  • Experience with LLM-assisted software development tools (e.g., Claude Code, GitHub Copilot, JetBrains AI Assistant, or equivalent); demonstrated ability to use AI tools productively for code generation, refactoring, and test case development while maintaining engineering judgment and code quality standards
  • Familiarity with AI/ML libraries (PyTorch, TensorFlow); ability to integrate trained model inference into a pipeline without requiring deep ML expertise
  • Experience with Jupyter Notebooks and research enclave environments; ability to read and adapt research prototype code
  • Experience with simulation or synthetic data generation for pipeline testing purposes
  • Familiarity with Apache data science tools such as Spark or Dask for large-scale data processing

Relevant Certifications:

  • Certifications in software engineering, computer science, or related fields (e.g., Certified Software Development Professional (CSDP); Certified Scrum Developer (CSD); Red Hat Certified Enterprise Application Developer; Certified Secure Software Lifecycle Professional (CSSLP); C++ Certified Associate Programmer (CPA); Professional Software Developer Certification (PSD); etc.)

Global InfoTek, Inc. is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability.

About Global InfoTek, Inc.:

Global InfoTek, Inc. has an award-winning track record of designing, developing, and deploying best-of-breed technologies that address the nation's pressing cyber and advanced technology needs. GITI has rapidly merged pioneering technologies, operational effectiveness, and best business practices for over two decades.
