About Bronson AI LLC
Bronson uses technology and collective action to protect people from health and environmental harm, working toward a future in which everyone impacted by corporate lawlessness can seek justice.
About the role
- At Bronson, we leverage data to revolutionize the legal industry. Join our team and help us build data-driven solutions that enhance legal services and improve client outcomes.
- We are seeking a skilled Data Engineer to design, develop, and optimize our data infrastructure. In this role, you will ensure our data pipelines are efficient and scalable, supporting our mission to innovate in the legal space.
Key Responsibilities
- Design and build scalable ETL/ELT pipelines using dbt, Python, and Snowflake to support analytics, reporting, and machine learning workloads.
- Develop data modeling and transformation frameworks to standardize operational workflows and improve platform performance.
- Transform inherited database systems into well-architected, maintainable solutions using modern data engineering principles, dimensional modeling techniques, and cloud-native technologies.
- Contribute to a robust data governance framework—including quality checks, monitoring, versioning, and lineage—to enhance trust in critical datasets.
- Automate workflows and infrastructure to improve deployment velocity, reduce operational overhead, and ensure reproducibility across environments.
- Participate in architectural decisions and technical strategy, shaping the evolution of a modern data platform designed to scale with the business.
Qualifications
- 3+ years of professional experience in software or data engineering roles.
- Proficiency in Python (data and backend scripting), SQL, and modern ETL/ELT practices.
- Hands-on experience with Snowflake or similar cloud data warehouses.
- Experience assessing and remediating technical debt in existing data systems by rearchitecting databases, normalizing schemas, and implementing modern data warehousing patterns and standards.
- Strong understanding of relational and NoSQL data modeling, schema design, and data lifecycle management.
- Proven track record of shipping production-quality systems in fast-paced, startup-like environments.
- Excellent communication and collaboration skills—comfortable working with product managers, engineers, and data scientists.
Preferred Skills
- Data modeling and legacy system migration expertise.
- Experience with data ingestion/ELT tools (e.g., dbt, Glue, Snowpipe, Fivetran, Airflow).
- Familiarity with Python, TypeScript, and infrastructure-as-code (IaC) principles for codifying backend systems.
- Background in data governance, quality frameworks, and observability tooling (e.g., Monte Carlo, Great Expectations, dbt tests).
- Solid experience with AWS services (Lambda, RDS, DynamoDB, S3, EventBridge, etc.) and infrastructure as code with AWS CDK.
- Comfort contributing to architecture decisions.
We are an equal opportunity employer and value diversity at our firm. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.