ManoMano is a leading online marketplace specializing in DIY, home improvement, and gardening. We are committed to delivering exceptional customer experiences and innovative solutions. Our Data Platform Team is pivotal in ensuring our infrastructure and data pipelines are robust, scalable, and secure. We are looking for a skilled and motivated DevOps Engineer to join our team and contribute to our mission.
As a DevOps Engineer on the Data Platform Team, you will be responsible for designing, implementing, and maintaining our infrastructure. You will leverage your expertise in networking, automation, cloud services, and CI/CD pipelines to support and enhance our data platform. Your work will ensure seamless integration, reliability, and scalability of our data processes, enabling us to meet our business objectives.
Key Responsibilities
Networking and Cloud Services
- Design, configure, and manage networking components to ensure optimal performance and security.
- Troubleshoot network issues and implement effective solutions to maintain connectivity and efficiency.
- Set up and manage AWS databases, including RDS.
- Manage S3 buckets, policies, and lifecycle rules for data storage and backup.
- Oversee identity management using IAM roles, policies, and SSO integrations.
- Build and maintain Amazon Machine Images (AMIs) for consistent environment setups.
- Manage EKS (Elastic Kubernetes Service) clusters and deploy applications using Kubernetes.
- Utilize Azure cloud services for specific project requirements and hybrid cloud solutions.
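As a sketch of the S3 lifecycle work described above, a policy document can be built in Python and applied with boto3's `put_bucket_lifecycle_configuration`. The prefix, storage class, and retention windows below are illustrative assumptions, not an actual ManoMano configuration:

```python
def lifecycle_config(transition_days=30, expire_days=365):
    """Build an S3 lifecycle configuration that transitions objects under
    a 'backups/' prefix to Glacier, then expires them (values are examples).
    """
    return {
        "Rules": [
            {
                "ID": "archive-then-expire",
                "Status": "Enabled",
                "Filter": {"Prefix": "backups/"},
                "Transitions": [
                    {"Days": transition_days, "StorageClass": "GLACIER"}
                ],
                "Expiration": {"Days": expire_days},
            }
        ]
    }

# The dict above would be applied with a boto3 S3 client, e.g.:
#   s3.put_bucket_lifecycle_configuration(
#       Bucket="my-bucket",  # hypothetical bucket name
#       LifecycleConfiguration=lifecycle_config(),
#   )
```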
Infrastructure Automation
- Use Terraform for infrastructure provisioning and management.
- Develop and maintain Ansible playbooks for configuration management and application deployment.
- Implement and manage infrastructure as code (IaC) using CDKTF (Cloud Development Kit for Terraform).
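To illustrate the configuration-management side of this work, a minimal Ansible playbook might look like the following; the host group, package, and template paths are placeholders, not real ManoMano infrastructure:

```yaml
# Illustrative playbook: install and configure a monitoring agent
# on data-platform hosts (all names below are placeholders).
- name: Configure data platform hosts
  hosts: data_platform
  become: true
  tasks:
    - name: Install monitoring agent
      ansible.builtin.package:
        name: monitoring-agent
        state: present

    - name: Deploy agent configuration
      ansible.builtin.template:
        src: templates/agent.conf.j2
        dest: /etc/monitoring/agent.conf
        mode: "0644"
      notify: Restart agent

  handlers:
    - name: Restart agent
      ansible.builtin.service:
        name: monitoring-agent
        state: restarted
```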
CI/CD
- Develop, maintain, and optimize CI/CD pipelines using GitLab.
- Ensure smooth integration and delivery of code changes across multiple environments.
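A minimal `.gitlab-ci.yml` for this kind of pipeline might be structured as below; the stage names, images, and scripts are illustrative assumptions only:

```yaml
# Illustrative GitLab CI pipeline: validate IaC, run tests, deploy.
# Images, paths, and scripts are placeholders.
stages:
  - validate
  - test
  - deploy

validate_terraform:
  stage: validate
  image: hashicorp/terraform:1.7
  script:
    - terraform init -backend=false
    - terraform validate

unit_tests:
  stage: test
  image: python:3.12
  script:
    - pip install -r requirements.txt
    - pytest tests/

deploy_staging:
  stage: deploy
  environment: staging
  script:
    - ./scripts/deploy.sh staging  # hypothetical deploy script
  rules:
    - if: $CI_COMMIT_BRANCH == "main"
```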
Programming and Scripting
- Develop automation scripts and tools using Bash, Python, and/or Java.
- Collaborate with data teams to integrate applications and streamline processes.
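Automation scripts of the kind described above often need to tolerate transient failures (flaky APIs, network blips). As a small Python sketch, here is a generic retry helper; the function name and defaults are illustrative:

```python
import time


def retry(fn, attempts=3, delay=0.1):
    """Call fn(), retrying on any exception up to `attempts` times.

    Waits `delay` seconds between attempts and re-raises the last
    exception if every attempt fails.
    """
    for i in range(attempts):
        try:
            return fn()
        except Exception:
            if i == attempts - 1:
                raise
            time.sleep(delay)


# Usage sketch: wrap a flaky call, e.g.
#   result = retry(lambda: client.describe_cluster(name="data-eks"))
# where `client` is a hypothetical API client.
```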
What you will need:
Nice to have:
Data and Database Management experience
- Experience implementing and maintaining data pipelines using Apache Airflow.
- Experience managing Snowflake data warehouse solutions for large-scale data storage and analytics.
- Proficiency in SQL for database queries and management.
- Experience monitoring database performance, implementing tuning, and ensuring data integrity and security.
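As a sketch of the data-integrity checks this involves, the SQL below finds orders that reference a nonexistent customer. The schema and table names are hypothetical, and SQLite stands in here for a warehouse like Snowflake so the example is runnable:

```python
import sqlite3


def orphaned_order_count(conn):
    """Return the number of orders whose customer_id has no matching
    row in customers (a basic referential-integrity check;
    schema is hypothetical)."""
    row = conn.execute(
        """
        SELECT COUNT(*)
        FROM orders o
        LEFT JOIN customers c ON c.id = o.customer_id
        WHERE c.id IS NULL
        """
    ).fetchone()
    return row[0]
```

In practice a check like this would run on a schedule (e.g. as an Airflow task) and alert when the count is nonzero.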