Job Title: Data Engineer
Location: Austin, TX
Job Type: Contract (W2)
Expected hours: 40 per week
Schedule: Hybrid
Pay Range: $75–$85 per hour
Job Description:
***W2 Only, No C2C***
We are looking for a highly experienced Data Engineer with a strong background in building production-grade data pipelines, working with complex automotive or time-series data formats (especially MF4/MDF4), and deploying large-scale solutions in Databricks.
The ideal candidate is fluent in Python and data storage formats and is comfortable working at the intersection of data engineering and data analytics. This is a senior technical role requiring deep expertise, independence, and the ability to drive end-to-end delivery of data solutions.
________________________________________
Key Responsibilities
• Design and build robust data pipelines in Python to extract, transform, and load data from MF4/MDF4 files (e.g., automotive telemetry, sensor logs).
• Architect scalable ETL/ELT workflows in Databricks, leveraging Delta Lake and cloud-native storage.
• Optimize performance and ensure reliability of pipelines handling large-scale, high-frequency time-series datasets.
• Mentor junior engineers and contribute to technical design reviews, architecture discussions, and code quality.
• Stay current with industry trends, data lakehouse architecture, and data workflows.
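To give candidates a feel for the work, here is a minimal, hypothetical sketch of one step such pipelines routinely perform: downsampling a high-frequency time-series signal before loading it downstream. It uses only the Python standard library; real pipelines on this team would use PySpark on Databricks and libraries such as asammdf for MF4/MDF4 parsing, and the function name and data shape below are illustrative assumptions, not project code.

```python
from statistics import mean

def downsample(samples, factor):
    """Reduce a high-frequency signal by averaging fixed-size windows.

    samples: list of (timestamp, value) pairs, assumed sorted by time.
    factor: number of raw samples collapsed into each output sample.
    """
    out = []
    # Step through the signal one full window at a time; a trailing
    # partial window is dropped, mirroring common resampling behavior.
    for i in range(0, len(samples) - factor + 1, factor):
        window = samples[i:i + factor]
        ts = window[0][0]  # keep each window's first timestamp
        out.append((ts, mean(v for _, v in window)))
    return out

# Example: a 10 Hz signal downsampled by a factor of 5 (to 2 Hz)
raw = [(t / 10, float(t)) for t in range(10)]
print(downsample(raw, 5))  # → [(0.0, 2.0), (0.5, 7.0)]
```

In production the same windowing logic would typically be expressed as a Spark aggregation over Delta Lake tables rather than a Python loop, so it scales to the data volumes described above.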
________________________________________
Required Qualifications
• 5+ years of experience in data engineering and software development.
• Advanced proficiency in Python, with experience in performance tuning and large-scale data processing.
• Strong experience with Databricks, Delta Lake, and Spark (PySpark or Scala).
• Demonstrated ability to design and implement high-throughput, fault-tolerant pipelines in production environments.
• Familiarity with cloud platforms (AWS, Azure, or GCP), including data storage, compute, and security best practices.
________________________________________
Preferred Qualifications
• Experience in automotive, IoT, or telemetry-heavy industries.
• Hands-on experience parsing and processing MF4/MDF4 files (ASAM standards).
• Prior exposure to Databricks Unity Catalog, Delta Live Tables.
• Understanding of GitOps and CI/CD in software engineering.
Benefits: 80 hours of paid time off, paid holidays, medical insurance contributions, dental and vision coverage, and our 401(k) retirement savings plan.
