Job Description
As the Tech Lead for Big Data, you will spearhead the architecture and design of solutions built on Azure and Databricks. Your primary responsibilities include leading a team of data engineers, driving large-scale data migration and modernization projects, and serving as the resident expert in Big Data analytics.
Roles & Responsibilities
- Architect and design innovative solutions leveraging Azure and Databricks platforms, with a focus on Big Data analytics.
- Lead and mentor a team of data engineers, providing technical guidance and oversight throughout project lifecycles.
- Drive the execution of large-scale data migration and modernization projects, ensuring a seamless transition and optimal performance.
- Develop and optimize Spark code in Python/Scala to support a range of data processing requirements.
- Design and implement data structures optimized for storage and efficient query patterns, utilizing technologies such as Parquet and Delta Lake.
- Facilitate platform migration to Azure cloud and Databricks, with a focus on data modeling and extraction strategies.
- Utilize expertise in database technologies, including traditional RDBMS (e.g., MS SQL Server, Oracle, PostgreSQL) and NoSQL databases (e.g., MongoDB, Cassandra, Neo4j, Azure Cosmos DB, including its Gremlin API).
- Collaborate with cross-functional teams to integrate traditional data warehousing and ETL tools, such as Azure Data Factory and Informatica, into overall solution architectures.
- Stay updated on emerging trends and best practices in Big Data analytics, contributing insights to enhance team capabilities.
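To make the storage-layout responsibility above concrete: Parquet and Delta Lake tables are commonly partitioned by a low-cardinality column (such as a date) so that query engines can skip files outside the requested range. A minimal, library-free sketch of that partition-pruning idea follows; the table path and column name are hypothetical, and a real implementation would rely on Spark or Delta Lake rather than hand-built paths.

```python
from datetime import date

# Hive-style partition layout used by Parquet/Delta tables:
#   <table_root>/event_date=YYYY-MM-DD/part-*.parquet
def partition_path(table_root: str, event_date: date) -> str:
    """Build the directory path for one date partition."""
    return f"{table_root}/event_date={event_date.isoformat()}"

def prune_partitions(all_dates, start: date, end: date):
    """Keep only partitions whose date falls in [start, end] --
    the file-skipping a query engine performs at plan time."""
    return [d for d in all_dates if start <= d <= end]

# Ten daily partitions; a query over Jan 3-5 touches only three of them.
dates = [date(2024, 1, d) for d in range(1, 11)]
kept = prune_partitions(dates, date(2024, 1, 3), date(2024, 1, 5))
paths = [partition_path("/mnt/lake/events", d) for d in kept]
```

Choosing the partition column well (frequently filtered, not too many distinct values) is what makes this pruning effective in practice.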
Skills
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 7+ years of experience in distributed data processing, with a focus on Databricks or Apache Spark, in a customer-facing technical or consulting role.
- Proven expertise in writing efficient code in Python/Scala and optimizing Spark jobs for performance.
- Experience with platform migration to Azure cloud and Databricks, including data modeling and extraction techniques.
- Strong understanding of traditional data warehousing concepts and ETL tools, with hands-on experience in Azure Data Factory and/or Informatica.
- Databricks Certified Data Engineer Associate/Professional Certification (preferred).
- Excellent communication and leadership skills, with the ability to effectively collaborate with cross-functional teams and stakeholders.
Experience
7-10 Years