Job #: 2521
Title: DevOps/Airflow Engineer – Ann Arbor, MI
Join our Data Platform Support team in the Ann Arbor office as a DevOps/Airflow Engineer. In this role you’ll work directly with our Computer Science team, which focuses on Business Intelligence solution development, to build pipelines for the applications they develop.
This will require research, analysis, and documentation. Demonstrate and build on your current knowledge of Jenkins, Airflow, Python, SQL/NoSQL, and the ELK Stack while working on various AI-related tasks. Analyze problems, then recommend and implement solutions.
Work with development teams to understand requirements and develop prototypes. Work within established procedures to develop, test, implement, and maintain deployment pipelines and Airflow DAGs.
Demonstrate your knowledge and experience by troubleshooting and solving critical issues that may span many systems and platforms within a complex infrastructure. You can expect to work on small to large projects and maintenance efforts.
– Follow and help create processes for both production and post-production deployment.
– Perform deployment-oriented tasks of moderate to high complexity.
– Perform platform design, specification, testing, debugging, and documentation for both release and troubleshooting platforms.
– Work independently and with supervisory review on moderately to highly complex tasks. Provide analysis of problems and recommend solutions.
– Participate in client interactions, possibly alongside more senior team members, to develop system solutions to business problems.
– Work within established procedures to develop, test, implement, and maintain pipelines and Apache Airflow DAGs.
– Come to work ready to participate in an interactive team and learn new skill sets.
– Bring a positive attitude and a willingness to work in an openly collaborative team.
Required Skills, Education, Experience:
– Minimum of 2 years of experience with Apache Airflow
o Should understand large-scale data processing requirements and data ingestion
o Experience working with data pipelines and creating Apache Airflow DAGs
– Minimum of 2 years of experience with SQL/NoSQL and shell scripting
– Experience working with the ELK Stack
– Experience building and deploying production analytics solutions leveraging Python, SQL, and JSON
– Experience working in an Agile environment
– Ability to hold yourself and team members accountable for work product in a professional manner
Preferred Skills, Education, Experience:
– Must have understanding of and experience with Git, Jenkins, Airflow, Python, and SQL
– Nice to have: some experience with NoSQL, Unix, and the ELK Stack