DevOps Engineer, Data Platform bei AB InBev

About AB InBev

AB InBev is the leading global brewer and one of the world’s top five consumer product companies. With over 500 beer brands, we’re number one or two in many of the world’s top beer markets: North America, Latin America, Europe, Asia, and Africa.

About BEES

BEES, a part of the AB InBev family, is a digital organization within ABI building a platform to improve the ways retailers run their businesses and interact with the world’s largest brewer & other suppliers. We provide transactional and educational resources to SMB retailers across the world to help reduce the overhead of their day-to-day operations and make their businesses more profitable. Today more than 1 million SMB retailers across 10+ countries use the BEES platform to transact with AB InBev and manage their business.
Our team is searching for a DevOps Engineer to be a leader in our BEES Data Platform organization and help develop our platform as a service.

About the Team

The BEES Data Platform team is responsible for designing, building, and maintaining the overall data infrastructure for BEES. The team works closely with other data engineering and data science teams to identify, develop, and maintain reusable, state-of-the-art infrastructure.

What You’ll Do

  • Build and maintain reusable data architectures and services that can be leveraged by agile Data Engineering and Data Science teams to improve development velocity.
  • "Lead by example" by designing, building, and executing on DevOps best practices; ensuring adherence to governance and control policies; and advocating better cost management, including recommendations for simplifying the landscape and insights into overall cloud spend.
  • Design and administer container deployments, cluster scaling, and management activities via automation, promoting serverless and domain-distributed architectures.

Our Technology

  • Azure (Active Directory, Data Factory, Data Share, DevOps, Event Hub, Key Vault, Storage accounts/Blob containers/ADLS Gen2)
  • Terraform
  • Snowflake
  • Databricks
  • Spark/PySpark
  • Mode
  • PowerBI
  • New Relic

Required Qualifications

  • BA/BS degree (Computer Science, Software/Computer Engineering, Information Systems, Statistics, or similar technical field)
  • Experience building and supporting compute/storage/network infrastructure systems and solutions with a focus on data pipelines and resources.
  • Experience building deployment pipelines and infrastructure as code in Azure Pipelines.
  • Experience developing software in Python and Bash.
  • Experience with Linux operating system internals, networking, filesystems, and storage technologies.
  • Experience with application and infrastructure monitoring and alerting systems.
  • Experience capturing the entire infrastructure as code using tools like Terraform and Packer.
  • Experience working in a full Data Engineering team: QA, Data Engineers, Data Analysts, DBAs, etc.
  • Understanding of CI/CD and DevOps best practices

Desired Qualifications

  • Experience in writing software in one or more languages, such as Java, Python, Go, JavaScript, C++ or similar.
  • Experience with Unix/Linux operating systems internals (e.g., filesystems, system calls), and with networking or cloud systems
  • Experience with building and supporting data infrastructure used by data engineering and data science teams.
  • Familiarity with Data Governance and related concepts, e.g., lineage, quality, integrity, security
  • Experience in automating infrastructure provisioning, DevOps, and/or continuous integration/delivery.
  • Expertise in capturing the entire infrastructure as code using tools like Terraform and Packer.

Don’t forget to mention that you found this position on Graduateland.