Data Integration Software Developer
Leidos
Description
The Leidos Digital Modernization Group seeks a Data Integration Software Developer for the Global Solution Management – Operations II (GSM-O II) contract, under the Global Automations & Intelligent Network Solutions (GAINS) Team. This is an exciting opportunity to work in a fast-paced environment where innovation, collaboration, and growth are at the forefront. The candidate will be responsible for integrating various data sources into the Confluent (Kafka) and Elastic platforms, developing robust integration solutions, and contributing to data governance practices. The role requires senior-level experience integrating Kafka, Elastic, Databricks, and cloud platforms, along with full software lifecycle automation, including experience deploying and managing systems in a multi-site, multi-cluster environment.
The candidate must be within driving distance of Fort Meade, Scott AFB, or Hill AFB. At a minimum, a Secret clearance is required upon the start of employment, and the employee must obtain Security+ certification within 14 days of the start date.
As a key scrum team member, you will work as part of a fast-paced, Agile development and implementation team to develop integration solutions that ensure a unified data platform, expanding the foundational Integrated Data Architecture platform (Confluent, ELK, and Databricks). You will work alongside others to support operational end-users and client requirements.
Primary Responsibilities:
- Integration Solutions: Develop and implement integration solutions for the GAINS project using Kafka and Elastic as the primary data architecture platforms, with expanded integration to other technologies, including but not limited to Databricks.
- Data Integration: Integrate data sources into the Confluent (Kafka), Elastic, and Databricks platforms. Develop Kafka system integrations between Elasticsearch/Logstash and other systems.
- Kafka Integration & Development: Develop Kafka system integrations, custom connectors, and work with ksqlDB and Kafka Streams for data processing based on the design solution.
- Kafka Cluster Management: Deploy and manage Kafka clusters on Kubernetes in multi-site environments (both on-premises and cloud).
- Software Lifecycle Automation: Automate the full software lifecycle, from design and development to testing and deployment, including production environments.
- DevOps Pipelines: Design and build application deployment pipelines, including containerized environments using Kubernetes and Docker, and automated testing pipelines.
Basic Qualifications:
- Education: Bachelor’s degree in Computer Science, Mathematics, Physics, Electrical Engineering, Computer Engineering, or a related discipline, with 4 years of prior relevant experience; or a master’s degree with 2 years of prior relevant experience.
- Experience: 4+ years of combined experience with Kafka, Java, RESTful services, AWS, and full-stack development.
- Programming Background: Software development experience with Python, Java, and SQL; working knowledge of HTML and JavaScript.
- Search & Analytics Applications: Experience with BI tools such as Kibana and with technologies such as Elasticsearch, Logstash, Kafka, NiFi, and Databricks.
- Event Streaming & Integration: Advanced understanding of event streaming and Kafka integration.
- Application Integration: Experience in application integration design and strong communication skills for collaborating with virtual teams.
- Software Design: Experience developing detailed software designs, particularly with ksqlDB or Kafka Streams.
- Software Development Lifecycle:
  - Proficiency in following a software development lifecycle and maintaining production-quality code.
  - Experience with distributed version control software such as Git and Bitbucket.
  - Knowledge of and ability to apply principles, theories, and concepts of Software Engineering.
  - Experience developing software on a UNIX command-line platform.
- Software Documentation & Requirements: Develop DoD requirements, traceability, and detailed plans/schedules; write software systems engineering documents and interface documents (IDDs/ICDs).
- Security Clearance: Active Secret, Interim Secret, or higher DoD security clearance.
- Certifications: Ability to obtain Security+ certification or an equivalent DoD 8570 IAT II certification within 30 days of the start date.
- Work Location: Must be able to report to a work site (preferred: Fort Meade, with Scott AFB as a possibility). This role may have remote work potential.
Preferred Qualifications:
- Text Mining & ELK Stack: Experience with text mining tools and techniques, including the ELK Stack for summarization, search, and entity extraction.
- CI/CD & DevOps: Familiarity with CI/CD techniques, containerized pipelines, and DevOps practices.
- Kubernetes & Agile: Familiarity with Kubernetes deployment and with Agile methodologies and tools.
- Cloud Expertise: Familiarity with AWS GovCloud and cloud infrastructure, including networking and security policies.
- Cloud Platform Expertise: Expert knowledge of cloud-integrated platforms for integration and deployment tasks.
- Cross-Team Collaboration: Ability to work within a matrixed organization, collaborating with project leadership and core GMS teams to combine software and integration practices with data engineering.
- System Architecture & Operational Stability: Knowledge of system architecture, networks, and centralized logging (ELK) to support data transformation initiatives.
- Cloud & DoD Environments: Experience developing and deploying software in a DoD environment (DISA experience is a plus), including building and deploying applications that meet DoD security standards, remediating security scan findings, and complying with security implementation guidelines (e.g., STIGs).
- Agile Processes: Experience with Agile methodologies and related tools, including Atlassian tools such as JIRA and Confluence.
- Certifications: Confluent Certified Developer and Elastic Certified Engineer.
- Remote Teamwork: Experience working remotely with a geographically dispersed team.
- Agile: Strong understanding of Agile methodologies, including Scrum and SAFe.
- Communication: Excellent communication and collaboration skills, with the ability to engage with customers, stakeholders, and team members.
This role offers the opportunity to work on advanced integration projects, combining software engineering with data integration in a dynamic environment. If you meet the qualifications and are passionate about integration engineering, we encourage you to apply.
Original Posting: April 15, 2026
For U.S. Positions: While subject to change based on business needs, Leidos reasonably anticipates that this job requisition will remain open for at least 3 days, with an anticipated close date of no earlier than 3 days after the original posting date listed above.
