Ascentt is building cutting-edge data analytics & AI/ML solutions for global automotive and manufacturing leaders. We turn enterprise data into real-time decisions using advanced machine learning and GenAI. Our team solves hard engineering problems at scale, with real-world industry impact. We’re hiring passionate builders to shape the future of industrial intelligence.
Position Overview
We are seeking an experienced Data Solutions Architect to design, implement, and optimize enterprise-scale data architectures across cloud platforms. This role requires deep expertise in modern data stack technologies including Databricks, Snowflake, and AWS/GCP cloud services. The ideal candidate will drive data strategy, ensure robust governance frameworks, and enable advanced analytics capabilities that support business-critical decision making.
Key Responsibilities
Architecture & Design
- Design scalable, secure data architectures on AWS and GCP cloud platforms
- Develop data architecture blueprints for data lakes, warehouses, and streaming solutions
- Create technical specifications and establish architectural standards and best practices
- Evaluate and recommend emerging technologies to enhance the data ecosystem
Data Platform Management
- Lead Databricks implementation for advanced analytics and machine learning workloads
- Design and optimize Snowflake data warehouse architectures for performance and cost
- Architect multi-cloud solutions using AWS (S3, Redshift, Glue, EMR) and GCP (BigQuery, Dataflow, Dataproc) services
- Implement containerized data processing solutions
Data Integration & Pipeline Development
- Design robust ETL/ELT pipelines for batch and real-time data processing
- Architect data replication strategies across heterogeneous systems
- Develop data ingestion frameworks for structured and unstructured data sources
- Implement change data capture (CDC) solutions and event-driven architectures for real-time analytics
Data Modeling & Analytics
- Develop logical and physical data models optimized for analytical workloads
- Design dimensional modeling and data vault solutions for enterprise data warehouses
- Architect self-service analytics platforms for business users
- Create domain-specific data marts and apply advanced modeling techniques
Business Intelligence & Data Governance
- Architect BI solutions that integrate with platforms such as Tableau, Power BI, and Looker
- Implement comprehensive data governance frameworks ensuring quality, lineage, and metadata management
- Design data privacy solutions compliant with GDPR, CCPA, and other regulatory requirements
- Establish role-based access controls, data classification, and security frameworks
Leadership & Collaboration
- Partner with stakeholders to translate business requirements into technical solutions
- Provide technical leadership and mentorship to development teams
- Drive architectural reviews and present designs to executive leadership
- Collaborate with data engineers, analysts, and data scientists to ensure optimal platform performance
Required Qualifications
Technical Expertise
- Cloud Platforms: 5+ years of hands-on experience with AWS and/or GCP data services
- Databricks: 3+ years designing and implementing Databricks solutions for analytics and ML workloads
- Snowflake: 3+ years architecting Snowflake data warehouse solutions
- Programming: Proficiency in Python, SQL, Scala, and/or Java for data processing applications
- Big Data Technologies: Experience with Apache Spark, Kafka, Airflow, and related distributed processing and orchestration frameworks
- Infrastructure as Code: Proficiency with Terraform, CloudFormation, or similar tools
Data Management
- Data Integration: Extensive experience with ETL/ELT tools and frameworks
- Data Modeling: Strong background in dimensional modeling, data vault, and modern data architecture patterns
- Data Governance: Proven experience implementing data quality, lineage, and metadata management solutions
- Analytics: Experience architecting solutions for advanced analytics, machine learning, and AI workloads
Professional Background
- Bachelor's or Master's degree in Computer Science, Engineering, Information Systems, or a related field
- 7+ years of experience in data architecture, engineering, or related roles
- Strong project management skills with experience leading complex, multi-stakeholder initiatives
- Excellent communication skills with the ability to present technical concepts to non-technical audiences
Preferred Qualifications
- Cloud certifications (AWS Solutions Architect, GCP Professional Data Engineer, or similar)
- Experience with additional cloud platforms such as Azure
- Knowledge of machine learning operations (MLOps) and AI/ML lifecycle management
- Experience with real-time analytics and streaming technologies
- Background in financial services, healthcare, or other highly regulated industries
- Familiarity with open-source data technologies and frameworks
- Experience with agile development methodologies and DevOps practices