Integrate Careers

Associate Technical Lead

About Integrate

Integrate is a leading B2B SaaS platform that helps enterprise marketing teams drive pipeline and revenue by generating and governing high-quality, compliant leads across channels.

Responsibilities:  

## Snowflake Development
- Design, develop, and maintain SQL-based ETL pipelines in Snowflake
- Create and manage Snowpipes for real-time data ingestion from AWS S3
- Build and orchestrate Snowflake Tasks for data processing workflows (a minimal Snowpipe and Task sketch follows this list)
- Develop and optimize data models, including dimensional models and rollup tables for BI consumption
- Manage multiple Snowflake databases, schemas, and warehouses across different environments (staging, production)
- Optimize Snowflake queries and data warehouse performance for cost and efficiency
- Implement data quality checks and monitoring within Snowflake pipelines
- Maintain documentation for data dictionaries and pipeline workflows
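
By way of illustration, a minimal Snowflake SQL sketch of this ingestion pattern; every object name here (the `s3_int` storage integration, the `raw` and `analytics` objects, the `etl_wh` warehouse) is a hypothetical placeholder, not part of this posting:

```sql
-- Hypothetical external stage over the S3 drop location
CREATE OR REPLACE STAGE raw.s3_leads_stage
  URL = 's3://example-bucket/leads/'
  STORAGE_INTEGRATION = s3_int            -- assumes a pre-configured integration
  FILE_FORMAT = (TYPE = 'JSON');

-- Landing table: one VARIANT column per record plus a load timestamp
CREATE OR REPLACE TABLE raw.leads_raw (
  payload   VARIANT,
  loaded_at TIMESTAMP_NTZ
);

-- Snowpipe with AUTO_INGEST: S3 event notifications trigger the COPY
CREATE OR REPLACE PIPE raw.leads_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO raw.leads_raw
  FROM (SELECT $1, CURRENT_TIMESTAMP() FROM @raw.s3_leads_stage);

-- Scheduled Task: hourly rollup table for BI consumption
CREATE OR REPLACE TASK analytics.leads_hourly_rollup
  WAREHOUSE = etl_wh
  SCHEDULE  = '60 MINUTE'
AS
  CREATE OR REPLACE TABLE analytics.leads_by_source AS
  SELECT payload:source::STRING AS source, COUNT(*) AS lead_count
  FROM raw.leads_raw
  GROUP BY 1;

-- Tasks are created suspended and must be resumed explicitly
ALTER TASK analytics.leads_hourly_rollup RESUME;
```

The `AUTO_INGEST = TRUE` flag is what ties the pipe to S3: the bucket's event notifications must be routed to the SQS queue Snowflake provisions for the pipe (visible via `DESC PIPE`).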

## Data Integration & Infrastructure
- Integrate AWS services (S3, SNS, Event Notifications) with Snowflake for automated data ingestion
- Build and maintain Kafka consumers for real-time streaming data processing
- Develop data pipelines that handle multiple data sources and formats (JSON, CSV, Parquet)
- Implement data deduplication, flattening, and transformation logic (see the transformation sketch after this list)
- Ensure pipeline reliability through robust error handling and recovery mechanisms
- Collaborate with cross-functional teams to understand data requirements
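
As a sketch of typical transformation logic, again with hypothetical table and column names: the statements below flatten a nested JSON payload into one row per lead, then keep only the most recently loaded row per `lead_id`:

```sql
-- Flatten: one output row per element of the nested leads array
CREATE OR REPLACE TABLE staging.leads AS
SELECT
  l.value:id::STRING       AS lead_id,
  l.value:email::STRING    AS email,
  r.payload:source::STRING AS source,
  r.loaded_at
FROM raw.leads_raw r,
     LATERAL FLATTEN(INPUT => r.payload:leads) l;

-- Deduplicate: QUALIFY filters directly on the window function
CREATE OR REPLACE TABLE staging.leads_deduped AS
SELECT *
FROM staging.leads
QUALIFY ROW_NUMBER() OVER (PARTITION BY lead_id ORDER BY loaded_at DESC) = 1;
```

An incremental variant of the same logic would typically pair a Stream on `raw.leads_raw` with a `MERGE` run from a Task, so that only newly ingested rows are reprocessed.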

## Software Development
- Develop and maintain .NET/C# services that interact with Snowflake using the Snowflake.Data client
- Work with version control (Git) and follow software development best practices
- Participate in code reviews and contribute to improving code quality
- Write and maintain clear technical documentation for data pipelines

Basic Qualifications: 6+ years of relevant hands-on experience

## Technical Skills

- **Snowflake**:
  - Snowflake SQL and advanced query optimization
  - Snowpipes for automated data ingestion
  - Tasks for workflow orchestration
  - Snowflake Stages, File Formats, and data loading best practices
  - Data modeling (dimensional modeling, star/snowflake schemas)
  - Snowflake warehouses and compute optimization
- **Programming Languages**:
  - Strong proficiency in Python (pandas, NumPy, PySpark)
  - SQL expertise (advanced queries, window functions, CTEs)
- **Cloud & Infrastructure**:
  - Experience with AWS (S3, SNS, IAM, Event Notifications)
  - Understanding of cloud data architecture patterns
- **Data Technologies**:
  - Experience with Kafka or other streaming platforms
  - Knowledge of data warehousing concepts and best practices
  - Understanding of ETL/ELT patterns

## Soft Skills

- Strong problem-solving and analytical thinking
- Excellent communication skills for collaborating with stakeholders
- Attention to detail and commitment to data quality
- Ability to work independently and as part of a team
- Proactive approach to identifying and solving issues

## Preferred Qualifications

- Experience with PySpark for large-scale data processing
- Knowledge of Salesforce, Jira, or other SaaS platform APIs
- Experience with BI tools (Looker, Tableau, Power BI)
- Understanding of data governance and data quality frameworks
- Experience working in agile development environments
- Experience with .NET/C#

Department: Data Science

Location: Remote (India)
