Full-Time Employee Contract
Become an integral part of a diverse team that leads the world in Mission, Cyber, and Intelligence Solutions. At ManTech International Corporation, you'll help protect our national security while working on innovative projects that offer opportunities for advancement.
Currently, ManTech is seeking a motivated, career- and customer-oriented Big Data Analytics Engineer to join our Advanced Network Defense Team in the Springfield, VA area, provide unparalleled support to our customer, and begin an exciting and rewarding career within ManTech.
As a Big Data Analytics Engineer, you will analyze, visualize, and model data in order to build and implement data models. You'll work on code that improves our operations and our customers' experiences, using the latest advances in large-scale data processing and data visualization to uncover insights in data. You will merge state-of-the-art optimization algorithms with distributed systems engineering to build systems that drive efficiencies in workflows, and you will contribute to building lasting, scalable data infrastructure. You'll work with other engineers to implement ETL processes; to build, train, update, evaluate, and deploy machine learning models; and to design processes and data models that ensure data is easy to find and use.
Responsibilities include but are not limited to:
• Design, implement, or operate comprehensive data warehouse systems to balance optimization of data access with batch loading and resource utilization factors, according to customer requirements
• Develop data warehouse data pipelines, including sourcing, loading, transformation, and extraction
• Create or implement metadata processes and frameworks
• Create plans, test files, and scripts for data warehouse testing, ranging from unit to integration testing
• Create supporting documentation, such as metadata and diagrams of entity relationships, business processes, and process flow
• Design and implement warehouse database structures
• Develop and implement data extraction procedures
• Develop or maintain standards for the design of data warehouse elements, such as data architectures, models, tools, and databases
• Implement business rules via stored procedures, middleware, or other technologies
• Map data between systems
• Design and develop data graphs, charts, reports, and visualizations
Basic Qualifications:
• BS in Computer Science or a similar field and 10 or more years of experience
• Deep knowledge of computational complexity theory and the ability to write time- and space-efficient algorithms that scale well; familiarity with computer science fundamentals, algorithms, and data structures is required to succeed in this role
• Experience with Linux and AWS
• Ability to understand and mathematically model practical problems
• Proficiency with programming or scripting languages such as Python, Java, or C++
• Experience with software integration or testing, including analyzing and implementing test plans and scripts
• Experience with virtualized or cloud computing and storage environments
• Experience developing solutions and integrating and extending open-source software or COTS products
• Strong analytical and problem-solving skills
• Excellent interpersonal and communication skills necessary to function within a team environment
• DoD 8570 IAT Level III certification
• Experience processing large amounts of data in a cloud environment
• Solid knowledge of state-of-the-art algorithms
• Experience building operations research and machine learning algorithms and productionizing them at scale in a distributed computing environment
An Active TS/SCI clearance is required.