Careers at Torus

Senior Data Engineer

About Torus

Torus is headquartered in Utah and is expanding manufacturing at GigaOne, our 540,000-square-foot facility in Salt Lake City. Our mission is to build the world's first mesh energy infrastructure, designed to unite people and communities through resilient, secure, and intelligent power. We design, engineer, manufacture, install, and support our systems end-to-end, standing behind them throughout their lifecycle. Torus systems help reduce costs, lower emissions, and protect facilities from outages, while strengthening the security and reliability of the broader utility grid. Torus is committed to American manufacturing, engineering excellence, and building energy systems that last.


At Torus, you will be part of something larger than a single product or technology. Your work will help build energy infrastructure that supports critical systems, industry, and communities for decades to come. We value accountability, collaboration, and clear thinking. We are looking for people who want to solve hard problems and build things that matter.

About the Role

We are looking for a talented Senior Data Engineer to build and scale the data infrastructure that powers Torus's mission. As a core member of our data team, you'll build and maintain our modern data stack (dbt, Redshift, Airflow, Fivetran, Metabase, and Streamlit), designing and implementing scalable data pipelines that ingest, transform, and serve data from our complex ecosystem of IoT devices, grid systems, and business applications.


In this role, you'll collaborate on and support the infrastructure that enables our data team to operate efficiently. You'll build real-time data pipelines that process telemetry from thousands of energy storage devices, create robust ETL workflows that ensure data quality and reliability, and develop tools that make data accessible to both technical and non-technical stakeholders. Your work will directly enable machine learning models, analytics dashboards, and business intelligence that drive strategic decisions.


Our products operate within complex ecosystems including IoT devices, the electrical grid, commercial and industrial buildings, and smart home systems. You'll work with high-volume streaming data, time-series sensor data, and diverse data sources to build the infrastructure that makes Torus a truly data-driven organization.

Who You Are

  • Autonomous and ownership-oriented: You thrive with autonomy and end-to-end ownership, taking pride in owning your code, pipelines, and services from conception to production. You have strong technical judgment, solid DevOps fundamentals, and the ability to span the data stack, from pipeline development to data modeling to tooling.
  • Collaborative architect: The team is small, so you'll be collaborating on architectural design and contributing to technical direction, not just implementing tickets. You work effectively both independently and as part of a team, actively collaborating with your data team colleagues and cross-functional partners, sharing knowledge and giving/receiving candid feedback.
  • Builder mindset: You strive to write elegant, maintainable code and are comfortable independently picking up new technologies to solve problems efficiently. You're passionate about building infrastructure and tools that empower data scientists, analysts, and business users to move faster and make better decisions.
  • Quality-focused: You have an eye for detail, good data intuition, and a passion for data quality and reliability.
  • Adaptable startup operator: You thrive in a startup environment with ambiguous requirements and rapidly changing priorities. If you prefer clearly defined requirements and established processes, this role may not be the right fit. If you're energized by scaling data systems and shaping how a growing company uses data, you'll excel here. You understand that our existing systems were built to solve real problems under real constraints, and you approach improvements with curiosity and respect rather than judgment.
  • Continuous learner: You're genuinely excited about learning new technologies and tackling unfamiliar problems. You may not check every box in our requirements, but you're confident in your ability to learn quickly and contribute meaningfully. You keep your ear to the ground for opportunities to improve data flows and aren't afraid to propose innovative solutions.
  • Mission-driven: You're passionate about using technology to combat climate change and transform how people consume energy.

If you love to learn, support your teammates, stay curious about where data processes can be improved, and are ready to roll up your sleeves to build better systems together, we want to hear from you, even if you don't meet every single requirement listed below.

What You'll Own

Data Infrastructure & Pipelines

  • Design, build, and maintain scalable batch and streaming data pipelines that handle high-volume IoT telemetry and business data
  • Develop robust ELT workflows that extract, load, and transform data from diverse sources including APIs, databases, IoT devices, and third-party systems
  • Build and optimize our data warehouse using Redshift, implementing dimensional models that support analytics and machine learning use cases
  • Implement real-time data processing systems that enable immediate insights and rapid response to system events
  • Develop incremental SQL patterns in dbt for efficient data transformation

Data Quality & Reliability

  • Build tools, processes, and pipelines to enforce, check, and manage data quality at scale
  • Develop monitoring and alerting systems to ensure pipeline reliability and data freshness
  • Create data validation frameworks and automated testing for data pipelines
  • Establish best practices for data governance, documentation, and lineage tracking

Platform & Tools Development

  • Build frameworks that enable data scientists to deploy models to production efficiently
  • Develop self-service analytics capabilities and data access patterns for non-technical stakeholders
  • Create and enhance analytics tools to facilitate intuitive data consumption

Infrastructure & Operations

  • Own the full software development lifecycle for data services, focusing on automation, testing, monitoring, and documentation
  • Develop and maintain infrastructure using AWS CDK and Terraform
  • Build and maintain CI/CD pipelines for data operations
  • Manage cloud infrastructure on AWS (ECS, Redshift, Lambda, S3)
  • Support ad hoc data requests and maintain core pipeline operations

Required Experience

  • Typically requires a bachelor's degree in Computer Science, Engineering, Information Technology, Data Science, or a related technical field and 5+ years of experience building scalable data pipelines, but we value diverse learning paths and welcome candidates who demonstrate equivalent expertise.
  • Strong experience building batch and streaming data pipelines using distributed processing frameworks
  • 3+ years of experience designing and implementing ELT pipelines that extract, load, and transform data from diverse sources
  • Expert proficiency in Python with strong software engineering fundamentals
  • Advanced SQL skills and experience with relational databases and data warehousing
  • Hands-on experience with data warehouse modeling, including dimensional modeling and schema design
  • Experience with cloud platforms (AWS preferred) and infrastructure-as-code tools
  • Practical experience owning production data systems with DevOps fundamentals
  • Experience with containerization (Docker) and orchestration concepts
  • Passion for data quality, monitoring, and building reliable systems

Preferred Experience

  • Experience with modern data stack tools: dbt, Redshift, Airflow, Fivetran, Streamlit, or similar
  • Hands-on experience with AWS services (ECS, Lambda, Athena, S3, EC2, VPC)
  • Experience with Terraform and/or AWS CDK for infrastructure as code
  • Experience working with IoT data, time-series data, or sensor data at scale
  • Familiarity with data observability and lineage tools (OpenMetadata, Monte Carlo, Great Expectations, etc.)
  • Experience with monitoring platforms like Datadog
  • Experience with CI/CD pipelines (GitHub Actions or similar)
  • Knowledge of Kubernetes and container orchestration
  • Experience with authentication systems (AWS SSO, Okta, etc.)
  • Experience in energy systems, industrial IoT, or utilities domain
  • Background supporting machine learning infrastructure and MLOps practices
  • Experience in a high-growth startup environment
  • Familiarity with AI-assisted development tools (Cursor, Windsurf, etc.)

Additional Details

  • Background Check: All candidates are subject to a background check
  • Location + Travel: The role is remote, based in the US, with occasional travel to our South Salt Lake headquarters
  • Schedule: Full-Time, Salaried
  • Compensation: $130,000 - $170,000 (Note: We have the flexibility to hire at different levels, which may impact the corresponding pay range)
  • Work Authorization: Applicants must already have the legal authorization to work in the US without requiring any employer sponsorship

Physical Requirements

  • Constantly operates a computer and other peripheral office equipment such as a printer or mouse
  • Must be able to communicate information clearly and exchange accurate information with others
  • Must report to work reliably and with the ability to use full and unimpaired skills and judgment to safely execute your job
  • Proficiency in reading, writing, and speaking English required
  • When on the production floor, required to don personal protective equipment including, but not limited to, ear protection, gloves, eye protection, and/or a safety helmet
  • When on the production floor, ability to observe, detect and respond to audible and visual machine malfunction warnings.

Our Benefits and Perks

Benefits eligibility is based on employment status.

  • Employee Rewards Package including Equity
  • 401(k) Retirement Savings Plan
  • Health Benefits Package: Choice between traditional PPO or HSA eligible medical plans; Dental insurance; and Vision insurance
  • Human-centered Paid Time Off including Unlimited Discretionary PTO or 10 days of accrued PTO; 10 paid company holidays; 100% paid parental leave with no waiting period
  • Torus paid Life and AD&D Insurance with option to purchase additional coverage
  • Voluntary Short- and Long-Term Disability Insurance
  • Peer Recognition Program

Torus is proud to be an Equal Opportunity Employer. All applicants will be considered for employment without attention to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran, or disability status.

