At Cerby we believe security is everyone’s business. Collaborating across your apps doesn’t need to be chaos. We are a mission-critical cybersecurity company that empowers your teams to operate securely and control their apps completely. We’ve built our product on the idea that teams deserve autonomy over their work apps. It turns out that when they are guaranteed a choice, security comes naturally.
More than 50% of all technology spending happens outside centralized IT organizations. Individual business units are taking their technology destiny into their own hands, and we enable that. Applications onboarded by end users are behind more than one third of all cybersecurity breaches. We provide the solution: users select their own technology, and we automatically protect those applications.
About the role
We are seeking a highly skilled and experienced Senior Data Engineer to join our engineering team. In this role you will identify and execute key data initiatives that help internal and external stakeholders, build on and extend our existing data platform, and support the engineering team's technical data needs, including data warehouse design, dashboard creation, and the architecture and implementation of new technologies for the data platform. You will communicate across engineering and non-technical teams to help them leverage data, and you will help shape our strategies around data governance and compliance.
What you'll do
- Architecture and Design: Design data pipelines, the data warehouse, and specialized datasets.
- Data Services Management: Own our current data platform and make decisions about improvements and the cloud technologies it uses.
- Data Quality and Compliance: Own data quality, the data dictionary, and the classification and management of sensitive information.
- Success Metrics: Enable data users' success with valuable metrics, dashboards, and decision-making tools.
- Data Solutions Delivery: Select and implement data tools that provide insights to internal and external stakeholders.
Qualifications
- Programming: Strong proficiency in SQL for relational databases, Python, and Python libraries such as NumPy, Pandas, and PySpark. Scala and TypeScript are desired but not required.
- Code Quality: A strong mindset around testing and quality assurance: unit testing, integration testing, and short feedback loops.
- Data Technologies: Segment, Apache Flink, Spark, Kinesis, Delta Lake on S3, AWS Athena, dbt, data orchestrators (Airflow, Dagster), data quality tools such as Pandera, and observability tools such as Datadog and Sentry.
- AWS Expertise: Good understanding of AWS services and configuration management.
- Infrastructure as Code (IaC): Experience managing cloud infrastructure across multiple regions. We use Terraform.
- Automation & CI/CD: Ability to automate repetitive tasks, reduce toil, and implement CI/CD strategies for our data services.
- Collaboration: Strong communication and collaboration skills, with experience working in an Agile environment.
- Problem-Solving: Strong analytical skills with the ability to troubleshoot complex issues in distributed systems.