Job Title: Data Engineer
Location: REMOTE
Duration: 6/12 Months
Required Technical Skills:
8+ years of experience implementing data solutions
Strong proficiency in Java, Python, and Kotlin, with an emphasis on building data pipelines
Experience with big data technologies such as HDFS, YARN, MapReduce, Kafka, Spark, Airflow, etc.
Experience working with Google internal tools such as PLX, CiderV, Buganizer, etc.
Experience with Apache Airflow, Google Cloud Composer, and Dataflow
Experience with cloud platforms such as GCP and Azure
Ability to write complex SQL to perform common types of analysis and aggregations
Experience with data visualization applications; Looker is a plus
Ability to collaborate with cross-functional teams such as developers, analysts, and operations to execute deliverables
Detail-oriented, with thorough documentation of all work
5+ years professional experience as a data or software engineer
BS in Computer Science; MS in Computer Science preferred
GCP Solution Architect certification is a plus
Responsibilities / Expectations from the Role:
Data modeling, data warehouse modernization, and cloud-based data lakes
Create and maintain optimal data pipeline architecture
Define the technical roadmap for data platform modernization, and analyze and assess the existing data platforms
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and Google Cloud 'big data' technologies.
Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
Define KPIs for successful modernization of data platform and measure the improvement against the KPIs.
Share your resume with noman.m@falconpls.com