Column Software

Business Intelligence Engineer

Column is the AI-powered software platform for public information. We build automated tools and workflow solutions that help governments, law firms, and media organizations notify the public. Thousands of publishers and tens of thousands of agencies, firms, and individuals rely on Column every day to keep their communities informed. We’re reimagining how public information connects people, institutions, and media.


We're a remote-first team of engineers, designers, and builders who are committed to creating software that supports an informed public.

Overview

We are seeking a skilled Business Intelligence Engineer with 2–3 years of analytical experience to own, maintain, and evolve our analytics and reporting infrastructure. This role is critical to ensuring financial reporting accuracy, company KPI visibility, and operational trust in data, with a strong emphasis on recurring reporting, metric validation, and ad-hoc financial and operational analysis.

The BI Engineer will operate across analytics, light analytics engineering, and ongoing data operations to support our business teams in a fully remote environment.

About the Role

Data Pipeline Ownership

  • Own, maintain, and improve core data systems including dbt, Stitch, and the BigQuery data warehouse sourced from NoSQL datasets (e.g., Firestore).
  • Develop reliable, well-documented data models, transformations, and ELT pipelines supporting revenue, operational, and financial reporting.
  • Implement and maintain data quality and observability frameworks, including dbt tests, schema-change monitoring, and pipeline alerts.
  • Serve as the in-house BigQuery expert, applying version control, documentation, naming conventions, and workflow best practices for long-term maintainability.

Dashboards & Data Visualization

  • Build, maintain, and optimize Google Looker Studio dashboards, ensuring accuracy and usability.
  • Perform ongoing dashboard maintenance, enhancements, and metric updates as business needs evolve.
  • Apply visualization best practices to help business teams quickly interpret performance, trends, and operational bottlenecks.
  • Support ad-hoc custom internal and external (partner-facing) reporting requirements.

Revenue & Operational Reporting

  • Build and maintain recurring operational reports, including those used for enterprise customer audits and sales pipelines.
  • Own the accuracy and maintenance of reports and data pipelines used for financial decision-making, such as revenue, margin, capture rate, and fee-based metrics.
  • Ensure consistency and QA across datasets/models, dashboards, and downstream financial models or outputs.

Cross-Functional Collaboration

  • Partner closely with Finance, Operations, and Customer-facing teams to translate business needs into actionable analytics and data workflows.
  • Act as a steward of core business KPIs, owning overall accuracy and maintainability.
  • Proactively identify discrepancies and communicate changes that could impact key reporting or financial outcomes.
  • Communicate insights, changes, and recommendations clearly to non-technical audiences.

About You

Required Qualifications

  • 2–3 years of hands-on experience in data analysis or analytics engineering.
  • Strong SQL skills and proficiency with dbt and BigQuery.
  • Experience with ETL tools (Stitch or similar) and NoSQL datastores (e.g., Firestore).
  • Demonstrated experience building and maintaining dashboards in Google Looker Studio or comparable BI tools.
  • Experience supporting revenue, financial, or compensation-impacting analytics.
  • Ability to own analytics work end-to-end, from raw data to stakeholder-facing outputs.
  • Strong attention to detail, data accuracy, and QA.
  • Comfortable managing and communicating on multiple priorities.

Preferred Qualifications

  • Experience working with CRM data (e.g., HubSpot) and with financial and sales pipeline analytics.
  • Familiarity with operational, marketplace, or workflow efficiency metrics.
  • Experience with scheduling/orchestration tools (e.g., dbt Cloud, cron).
  • Exposure to data governance concepts such as access controls and retention policies, and prior success rolling out standardized naming conventions.

Compensation & Benefits

Base salary range: $90K–$110K

Benefits: Comprehensive health coverage, flexible PTO, remote-first culture.

Product and Engineering

Remote (United States)
