Who we are
At Badi, we make city living affordable for everyone by unlocking available rooms inside crowded cities. How? With the most usable marketplace that intelligently matches verified profiles without intermediaries.
Our mission:
- Create a product that people love;
- Reshape the Real Estate Market;
- Apply New Technologies.

Our 3 rules:
- Make it mobile;
- Emphasize User Experience;
- Enable self-service solutions.
Senior DataOps/DevOps Engineer
We are looking for a talented Senior DataOps Engineer who is excited to help a crack team of Data Scientists adopt DevOps and CI/CD principles across every step of application and AI algorithm development, all in service of our grand goal: building the most amazing app for rental properties worldwide!
If you are up for developing highly available, scalable analytics microservices on AWS infrastructure, then this is the job for you.

Responsibilities
- Coach a team of Data Scientists in DataOps (DevOps) principles and software development best practices
- Train and support data analytics professionals in implementing CI/CD workflows and writing production-grade software
- Build and maintain automated testing and validation pipelines
- Take data products from R&D to production
- Deploy and maintain custom machine learning models as highly available, scalable microservices
- Advise on system architecture and cloud management
- Serve as a liaison between Operations and Data
- Help guide our strategy for selecting and developing storage, compute, and analytics tools
- Provision and maintain development and production architecture stacks
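To give a flavor of what "deploying a model as a highly available microservice" can look like in this role, here is a minimal, stdlib-only Python sketch. The `predict` function, feature names, and endpoints are hypothetical stand-ins; a real service would load a serialized model artifact at startup and run containerized behind a load balancer.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def predict(features):
    # Hypothetical stand-in for a trained model: a toy linear score
    # over two made-up listing features.
    return 0.5 * features["rooms"] + 0.1 * features["size_m2"]

class PredictHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Health endpoint, e.g. for a Kubernetes/ECS liveness probe.
        if self.path == "/health":
            self._reply(200, {"status": "ok"})
        else:
            self._reply(404, {"error": "not found"})

    def do_POST(self):
        if self.path != "/predict":
            self._reply(404, {"error": "not found"})
            return
        length = int(self.headers.get("Content-Length", 0))
        features = json.loads(self.rfile.read(length))
        self._reply(200, {"score": predict(features)})

    def _reply(self, code, payload):
        body = json.dumps(payload).encode()
        self.send_response(code)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        # Silence per-request logging; production would ship
        # structured logs instead.
        pass

def serve(port=8080):
    """Run the service (blocking); in production this process would
    run inside a container, scaled horizontally."""
    HTTPServer(("0.0.0.0", port), PredictHandler).serve_forever()
```

In practice the interesting work is everything around this sketch: packaging it into an image, wiring it into a CI/CD pipeline, and making it observable and autoscalable.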
Skills and Expertise
- Technical University degree (e.g. BSc in computer science) or equivalent experience
- 4+ years of production-engineering experience
- Solid knowledge of DevOps principles and experience with CI/CD tools:
  - Source control software such as Git
  - CI/CD tools such as Jenkins, AWS CodeDeploy, or equivalent
  - Configuration management software such as Puppet, Chef, or equivalent
- Deep knowledge of databases (both SQL and NoSQL)
- Experience with distributed computing tools such as Hadoop, Kafka, and Spark preferred
- Experience with AWS services such as CloudFormation, EC2, S3, Lambda, API Gateway, RDS, CodePipeline, etc.
- Experience working with Docker and container orchestration, preferably Kubernetes and/or ECS
- AWS certification is a plus
- A solid *nix background is a must
- Security is deeply embedded in everything you do
- Experience working with Data Scientists and other data professionals or data products preferred
- Strong scripting experience with Python and shell scripting preferred
- Experience building high-performance systems that scale
- Performance testing and test automation
- Excellent teamwork skills, flexibility, and ability to handle multiple tasks
- You must be able to communicate with your teammates with full professional proficiency in English; Spanish is a plus
- You contribute actively to the open-source community
- You organize or promote meetups and/or coding events and are involved with a local community of developers
Does it look like the job for you? What are you waiting for? Apply now!
Applications must be submitted in English. Applicants must have a valid European work visa.