Data Engineer

About Us

Chess.com is one of the largest gaming sites in the world and the #1 platform for playing, learning, and enjoying chess.


We are a team of 600+ fully remote people in 60+ countries working hard to serve the global chess community. We support 185M+ chess players worldwide with the best possible product, content, and tools!


We are a tech company. A gaming company. A content company. And we do it all with passion and commitment to the game. Above all we prize our mission-driven, flat, life-celebrating, no-corporate culture, and we look forward to meeting you and learning more about what you can bring to the team.


About You


You are a data nerd with strong engineering skills. You have built and maintained production data pipelines at large scale and can make technology decisions about tooling and operations. You are humble, have a sense of humor, and don't take yourself too seriously. You have worked in, or dreamed of working in, the gaming industry and are ready to turn your talents toward chess!


Big Challenges

  • We have a MASSIVE amount of games to store.
  • We have a MASSIVE amount of analysis being done on games.
  • We have a MASSIVE amount of activity to be logged and parsed for social feeds and product optimization.

What you'll do

  • Proactively collaborate with teammates to design and implement our world-class data platform.
  • Develop and maintain a Python service/library to serve requests related to our core metrics and cohorts.
  • Write and maintain reliable and efficient Spark/Flink jobs to clean and enrich our data before it reaches our data warehouse.
  • Use BigQuery and Airflow to perform data aggregations supporting insight generation.
  • Protect our players' PII and help support data compliance across our data stack.
  • Build and support tools for our internal teams to use to run experiments and access data.
  • Support query design and review, and enable team members to share data.

Preferred Skills

  • Strong collaboration and communication skills working in a fully distributed team.
  • Experience building and supporting data pipelines.
  • Experience running cloud data workloads on GCP or AWS.
  • Experience with Dataflow & BigQuery or similar technology.
  • Experience with SQL and with multiple databases (MySQL, MongoDB, etc.).
  • Experience deploying infrastructure as code through Terraform.
  • Sense of ownership and responsibility.
  • Chess player.
  • Lifelong learner.

About the Opportunity

  • This is a full-time opportunity.
  • We are 100% remote (work from anywhere!).

---


Engineering

  • Remote (Spain)
  • Remote (Serbia)
  • Remote (United States)
  • Remote (United Kingdom)
  • Remote (Canada)
