Employment Opportunity

Job Code: GW-04212210164159
Salary Range: Up to $100K
Job Location: TX
City: Austin    State: TX


What’s the position?

We are looking for a Data Engineer to join our quickly growing technology team in Austin, 
TX. As a data engineer, you will be a technical contributor on our team supporting cutting 
edge technology, operations, and business needs. The role requires an individual who is 
technically talented, can manage massively distributed datasets, can architect and 
implement big data infrastructure, and is passionate about data and sports.


You will:

·       Be responsible for the quality, scope, and timeliness of all your deliverables.

·       Design, build, and operate your platform components so that others can rely on 
your great ideas (common streaming architectures, data storage methods, proper processing)

·       Work with engineering and business stakeholders to understand data requirements

·       Perform data cleansing and enhance data quality

·       Take action with quality, performance, scalability, and maintainability in mind

·       Work in a collaborative, fast-paced Agile team environment

What we are looking for:

We are looking for an engineer who is willing to tackle problems with innovative ideas and 
quality technical implementations. We believe the ideal candidate isn't interested only in 
what we're building right now, but wants to understand where we're going and how it 
impacts the customer, so that everything we create moves us closer to our goal: 
delivering the best customer experience. Because we operate with a startup mindset, we 
rely not only on your technical skill but also on your passion for, and ownership of, all 
aspects of your work.


The ideal candidate will have:


·       Extensive experience with Python (Java/Scala a plus) and SQL, used on a daily 
basis over a number of years

·       Experience with Big Data on the cloud (Snowflake, BigQuery, S3/Athena, Redshift, Delta Lake)

·       Strong communication skills (both written and verbal)


·       Degree in Computer Science, Engineering, Management Information Systems, 
Mathematics, a related field, or equivalent work experience (3+ years)

Experience in:

·       Data orchestration technologies, specifically Airflow and/or dbt

·       Streaming data architectures, specifically Kafka

·       Knowledge of semi-structured data formats: Parquet, Avro, JSON

·       A deep understanding of AWS cloud data technologies (RDS, DynamoDB, Aurora)

·       Knowledge and experience with PostgreSQL, MS SQL Server, or MySQL

·       Big Data batch and streaming technologies such as Spark, Kafka, 
Flink, Beam, and Kinesis

·       SnowPro Certification or equivalent from Databricks or AWS

·       Comfort working within an agile development cycle and exposure to:

o   Linux development

o   Git and versioning software

o   Jupyter or Databricks Notebooks

What’s in it for you?

We offer our employees more than just competitive compensation. Our team benefits include:

·       Competitive pay and benefits

·       Flexible vacation allowance

·       Flexible work from home or office hours

·       Startup culture backed by a secure, global brand

·       Opportunity to build products enjoyed by millions as part of a passionate team