Description:
As a Senior Data Pipeline Engineer, you will play a crucial role in designing, building, and maintaining robust data pipelines for our customers. Your expertise will drive the efficient collection, storage, processing, and transformation of large-scale data sets. Here are the key responsibilities and qualifications for this role:
Responsibilities:
Pipeline Development:
- Design, develop, and optimize end-to-end data pipelines for efficient data extraction, transformation, and loading (ETL) processes.
- Collaborate with cross-functional teams to understand data requirements and translate them into scalable pipeline solutions.
- Implement best practices for data integration, ensuring high performance, reliability, and scalability.
Data Transformation and Quality:
- Transform raw data into usable formats, ensuring data quality, consistency, and accuracy.
- Handle data validation, cleansing, and error handling to maintain data integrity.
- Monitor and proactively maintain data pipelines to ensure high service availability.
Performance Optimization:
- Continuously improve pipeline performance by identifying bottlenecks and implementing optimizations.
- Work with cloud-based technologies (e.g., AWS, GCP, Azure) to enhance scalability and efficiency.
Collaboration and Leadership:
- Partner with Data Scientists, Analysts, and other stakeholders to understand their data needs.
- Lead discussions on system enhancements, process improvements, and data governance.
- Mentor junior engineers and contribute to the growth of the data engineering team.
This role will be performed 80% remote with 20% onsite.
Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 5+ years of experience in data engineering, with a focus on building and maintaining data pipelines.
- 1-2 years of experience building Apache NiFi data flows (or similar) for Kafka and Hadoop-based NoSQL databases.
- Proficiency in ETL tools, SQL, and scripting languages (e.g., Python, Scala).
- Experience with data catalogs and Apache Accumulo indexes for information retrieval and discovery.
- Experience with API-led design.
- Experience with the ELK stack (Elasticsearch, Logstash, Kibana) is a plus.
- Experience with cloud-based data platforms (e.g., AWS S3, Redshift, Google BigQuery).
- Strong problem-solving skills and attention to detail.
- Excellent communication and collaboration abilities.
- CompTIA Security+ certification.
- Active US Government security clearance at the Secret level or higher.
- Ability to sit for extended periods of time.
- Ability to regularly lift at least 25 pounds.
- Ability to commute to the designated onsite work location as required.
QBE is an equal opportunity/affirmative action employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender, gender identity and/or expression, age, disability, Veteran status, genetic information, pregnancy (including childbirth, lactation, or other related medical conditions), marital status, neurodivergence, ethnicity, ancestry, caste, military/uniformed service-member status, or any other characteristic protected by applicable federal, state, local, or international law.