Kansas Works Jobs

Job Information

Kansas Employer Sr Data Engineer (Remote Eligible) in Wichita, Kansas

This job was posted by https://www.kansasworks.com. For more information, please see: https://www.kansasworks.com/jobs/12957684

Are you ready to showcase your advanced technical abilities? Dream, Innovate, Inspire and Empower the next generation to transform humanity through technology and imagination! As a Sr Data Engineer, you'll design, develop, document, test, and debug new and existing cloud-based data pipelines and transformations. You will participate in and lead design meetings, and consult with business clients to develop processes and structures that ingest data from multiple sources into our cloud-based Enterprise Data Warehouse (EDW). Within the EDW, you will use a variety of tools to implement and continuously improve data integration, master data management, data lifecycle management, data security, data quality management, metadata management, and reporting and analytics. You will also perform defect corrections (analysis, design, and code). The Sr Data Engineer also identifies and creates data management practices and processes to be included in the technical architecture standard, and collaborates with teams across the organization to define and improve the data architecture supporting the overall enterprise strategy. Finally, the Sr Data Engineer serves as an escalation point for other Data Engineers on the team.

As SNC's corporate team, we provide the company and its business areas with strategic direction and business support spanning executive management, finance and accounting, operations, human resources, legal, IT, information security, facilities, marketing, and communications.

Responsibilities:

  • Develop, maintain, and optimize data pipelines and workflows on AWS using tools such as AWS Glue, AWS Lambda, and AWS Step Functions.
  • Design and implement data models that ensure data accuracy, completeness, and consistency.
  • Collaborate with stakeholders and analysts to identify data requirements and develop solutions that meet their needs.
  • Troubleshoot and resolve issues related to data pipelines, data quality, and data modeling.
  • Develop and maintain documentation of data pipelines, data models, and other data-related processes.
  • Implement security and compliance measures to ensure data is protected and meets regulatory requirements.
  • Continuously monitor and optimize production data pipelines and data models to improve performance and efficiency.
  • Keep up-to-date with industry trends and advancements in data engineering and AWS services.
  • Communicate effectively with stakeholders to provide updates on project progress and escalate issues when necessary.

Must-haves:

  • Bachelor's Degree in a related field with at least 10 years of relevant experience
  • Higher education may substitute for relevant experience
  • Relevant experience may be considered in lieu of required education
  • Advanced working SQL knowledge and experience with relational databases, including query authoring and familiarity with a variety of database platforms
  • Experience with AWS cloud services: S3, EC2, RDS, Redshift, Glue, Lambda, Step Functions, Athena, CloudWatch, ECS, IAM
  • Experience with AWS security: best practices, AWS KMS, AWS Secrets Manager
  • Experience building and optimizing data pipelines, architectures and data sets
  • Experience with operational responsibilities (schedules, monitoring, logging, alerting, error handling, etc.)
  • Experience performing root cause analysis on data to answer specific business questions or issues and identify opportunities for improvement
  • Experience with object-oriented scripting languages and frameworks: Python (BOTO3), PySpark
  • Experience with source system integration patterns (SQL, APIs)
  • Experience with infrastructure as code using Terraform
  • DevOps experience

Preferred:

  • Experience building and using custom GitHub Actions and Workflows to support internal DevOps use cases
  • Experience with complex infrastructure-as-code concepts such as Terraform module composition and centralization
  • Experience working with dbt to model and build Data Warehouse objects
  • Comfortable working with Agile tools to plan and organize large engineering efforts into work items
  • Familiarity with containerization concepts and tools such as Docker or Podman