TheGuarantors is a cutting-edge fintech company setting the standard in rent coverage with unrivaled insurance products. With a deep understanding of owner, operator, and renter needs, we believe renters deserve better access to the home of their dreams and operators deserve greater protection and growth opportunities. That’s why we’re leveraging our expertise in real estate and using AI-based technology to help operators qualify renters faster while mitigating the risk of rental income loss. With $4B+ in rent and deposits guaranteed, we work with 9 of the country’s top 10 operators and have been named one of Inc. 5000’s fastest-growing companies, one of Forbes’ Best Startup Employers, and one of Deloitte’s Technology Fast 500.
As a Data Engineer at TheGuarantors, you will architect and maintain scalable data pipelines, ensuring data reliability and integrity. You will be a key contributor in data ingestion, transformation, and warehousing initiatives.
We are seeking a candidate with 3+ years of data engineering experience and demonstrated proficiency in Python, SQL, and data-related technologies on cloud platforms.
You will collaborate closely with the rest of the data engineering team, assist colleagues, and partner with business stakeholders to drive the success of assigned projects.
Experience working with tools like Stitch, Snowflake, dbt, Dagster, and the AWS ecosystem is preferred. Experience with machine learning and data governance is a plus.
Key Responsibilities
- Data Ingestion and ETL/ELT: Design and implement data pipelines to collect, ingest, and transform data from various sources into a structured format.
- Data Quality: Collaborate with the team to monitor and maintain data quality standards, perform data validation, and resolve data quality issues.
- Data Warehousing: Design and maintain data warehousing solutions to store and manage large volumes of data efficiently.
- Automation and Orchestration: Work on automating routine data engineering tasks to improve efficiency and reliability.
- Data Apps: Develop and maintain lightweight data applications for ML interfaces, simple CRUD operations, and quick interactive analysis.
- Performance Optimization: Identify and implement optimizations across the data stack to improve pipeline performance and reduce processing times.
- Documentation: Create and maintain documentation for data pipelines, processes, and data flows.
- Collaboration: Collaborate with cross-functional teams including data analysts, data scientists, other engineering teams, and business stakeholders to understand data requirements and deliver data solutions.
- Continuous Learning: Stay up to date with industry best practices and emerging technologies in data engineering.
- Team Participation: Collaborate with other data engineers on solution designs, and contribute to continuous team improvement and excellence by sharing learnings and acting on suggestions and feedback from others.
- Task Ownership: Interact effectively with leadership, team members, other engineering and technical teams, and business stakeholders to drive assigned tasks to effective and timely resolution.
Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 3+ years of applicable experience with data engineering concepts and practice, over the full data lifecycle.
- Demonstrated proficiency with SQL and Python in data engineering workflows.
- Proficiency in utilizing and managing data integration and ETL tools (e.g., Stitch/Talend, Airbyte, Fivetran, etc.).
- Experience with data orchestration and workflow management tools (e.g., Dagster, Airflow).
- Knowledge of solution deployment within cloud computing platforms in a data engineering context, particularly in AWS.
- Excellent problem-solving skills and attention to detail.
- Strong communication and teamwork skills, with a demonstrated competency for collaboration.
- Ability to work in a fast-paced, collaborative environment.
- Aptitude for continuous improvement through learning and sharing, along with an ability to quickly and effectively adopt and adapt to new technologies and tools.
Nice To Have
- Master’s degree in Computer Science, Information Technology, or a related field.
- Understanding of and prior work with machine learning models.
- Applied knowledge of data warehousing concepts and technologies (e.g., Snowflake, AWS Redshift, Google BigQuery, etc.).
- Familiarity and experience with data governance and security best practices.
Benefits
- Opportunities to make an impact within a fast-growing company
- Medical, dental, & vision insurance, beginning day one
- Health savings account with employer contribution
- Generous PTO and paid holidays
- Flexible working hours
- 401(k)
- Paid parental leave
- Company-sponsored short- and long-term disability
- Flexible spending accounts (healthcare, dependent care, commuter)
- Competitive salary
Base Salary
The base salary range is between $100,000 and $140,000 annually.
Base salary does not include other forms of compensation or benefits. Final offer amounts are determined by multiple factors, including prior experience, expertise, location, and current market data, and may vary from the range above.
Stay in Touch
Does this role not quite match your skills, but you’re still interested in what we’re doing? Stay in touch and apply to our Dream Job to be one of the first to hear about future opportunities!
TheGuarantors is an Equal Opportunity Employer. We celebrate diversity and are committed to an inclusive environment for all.