About Us:
At Lusha, we are dedicated to revolutionizing the business data and intelligence industry. Our innovative solutions enable businesses to identify, engage, and close their ideal prospects. Join our fast-growing DevOps team and play a key role in scaling our infrastructure to support a rapidly expanding user base.
What will you be doing?
As a Senior Data DevOps Engineer, you will take full ownership of building and maintaining our infrastructure, constantly enhancing our development cycle, and improving developer experience and velocity. You will:
- Design & Maintain: Build scalable, resilient, and secure cloud-based infrastructure on AWS.
- Optimize Databricks: Manage Databricks environments for data processing, analytics, and machine learning.
- Lead Best Practices: Drive DevOps best practices within our big data and machine learning teams.
- Monitor & Alert: Implement monitoring, logging, and alerting solutions using Elastic, Apache Airflow, and more.
- CI/CD Pipelines: Develop and maintain smooth and reliable CI/CD pipelines using Jenkins.
- Automate: Use Infrastructure as Code (IaC) tools like Terraform for infrastructure provisioning and management.
- Collaborate: Work with cross-functional teams to define infrastructure needs and provide technical guidance.
- Manage Containers: Orchestrate containerized applications using Kubernetes.
- Serverless Applications: Develop and manage serverless applications using AWS Lambda.
- Troubleshoot: Resolve infrastructure-related issues, ensuring high availability and performance.
- Innovate: Stay updated with industry best practices and emerging technologies to drive continuous improvement.
You are:
- Ambitious: Eager to make an impact and lead by example in a dynamic growth environment.
- Independent: Quick to pick up new concepts, and able to plan and execute from ideation to implementation on your own.
- Results-Driven: Passionate about operational optimization with a drive to exceed goals.
- Team Player: Collaborative within the DevOps team and across all departments at Lusha, working with a range of disciplines to deliver your work.
Requirements:
- 3+ years of experience in a DevOps or similar role, focusing on cloud infrastructure and automation.
- Deep expertise in AWS, including EKS, S3, RDS, Lambda, and VPC.
- Strong knowledge of Databricks and its ecosystem (Spark, Delta Lake, MLflow).
- Expertise in containerization and orchestration tools such as Docker and Kubernetes.
- Experience with Big Data technologies like Spark, Presto, Kafka, Airflow.
- Proven track record managing large-scale SaaS production environments.
- Proficient in scripting languages such as Python or Bash.
- Experience building machine learning pipelines (big advantage).
- Skilled in writing and deploying code in Python.
- Strong background in building CI/CD processes and infrastructure.
- Experience with Terraform or similar IaC tools.
- Excellent problem-solving skills and ability to work under pressure.
- Strong communication and collaboration skills, capable of working effectively in a team.
- Understanding of data engineering and data science workflows.