Cloud Data Engineer


A little bit about us

Bespin Global is a top global cloud MSP, recognized in the Gartner Magic Quadrant for 8 consecutive years. We have also won AWS MSP Partner of the Year globally and multiple Google Partner of the Year awards!


We have 1,300+ “Bespineers” across 16 offices in 10 countries, including the U.S., South Korea, Singapore, the UAE, Indonesia, China, and Japan, serving more than 4,500 customers worldwide.


If you want a fun and exciting role at a fast-growing company with lots of opportunities, this is the place for you.

We are looking for a highly skilled and motivated Cloud Data Engineer with hands-on experience in building and maintaining ETL pipelines, working in cloud-based environments, and integrating systems. The ideal candidate will have a strong background in Apache Airflow, Python programming, and cloud technologies. The successful candidate will be passionate about data engineering, solving complex problems, and driving automation in the data pipeline process.

Key Responsibilities:

  • Design, develop, and maintain cloud-based ETL pipelines to extract, transform, and load data across various systems.
  • Utilize Apache Airflow to automate and orchestrate data workflows and tasks.
  • Collaborate with cross-functional teams to gather data integration requirements and define pipeline architectures.
  • Write Python scripts to build data processing jobs, API integrations, and custom workflows.
  • Develop and manage APIs and integrations for seamless data exchange between internal and external systems.
  • Ensure data quality, consistency, and reliability across the data pipeline.
  • Implement best practices for data engineering, adhering to scalability, performance, and security standards.
  • Monitor data pipeline performance and troubleshoot issues as they arise.
  • Create and maintain comprehensive documentation for all pipelines, integrations, and systems.
  • Continuously evaluate new tools and technologies to improve pipeline efficiency and data processing capabilities.
  • Contribute to the design and deployment of data architecture on cloud platforms like AWS, GCP, or Azure.

Qualifications:

  • Bachelor’s degree in Computer Science, Engineering, Mathematics, or a related field, or equivalent practical experience.
  • 5+ years of proven experience as a Cloud Data Engineer or similar role.
  • Solid experience with Apache Airflow for orchestrating and automating data workflows.
  • Strong proficiency in Python for scripting, data processing, and API development.
  • Experience in building and managing ETL pipelines and understanding data modeling and transformation.
  • Hands-on experience with cloud-based data platforms (AWS, GCP, Azure).
  • Experience with RESTful APIs and API integrations for seamless data flow across systems.
  • Familiarity with SQL for querying databases and managing large datasets.
  • Ability to troubleshoot, optimize, and scale data pipelines.
  • Strong communication skills and the ability to work effectively with cross-functional teams.

Preferred Skills:

  • Experience with big data technologies such as Spark, Hive, or Hadoop.
  • Knowledge of data warehousing solutions like Redshift, BigQuery, or Snowflake.
  • Experience with version control tools like Git.


Cloud Services

Remote (United States)
