Designation – Data Engineer
Experience – 3 – 5 years
Notice period – max 15 days
Salary Range – 10 – 15 LPA
Job Location – Remote (Mohali once the situation subsides)
Skillset –
1. Experience with the Azure Databricks platform: Delta Lake, time-travel design patterns, Lambda data architecture
2. Experience with the Azure cloud platform (ADLS, Blob Storage, ADF)
3. Understanding of prevalent big data file formats such as JSON, Avro, and Parquet
4. Experience with Python, PySpark, Scala, and Spark for writing data pipelines and data-processing layers (a minimal sketch follows this list)
5. Demonstrated expertise in writing complex, highly optimized SQL queries across large data sets
6. Experience with Data Governance (Data Quality, Metadata Management, Security, etc.)
7. Strong SQL skills and proven experience in working with large datasets and relational databases.
8. Understanding of data warehouse design principles (tabular and multidimensional models)
9. Experience working with big data tools/environments to automate production pipelines
10. Familiarity with Azure Synapse and Snowflake
11. Experience with visual data analysis and BI tools (Matplotlib and Power BI)
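To make the Databricks and PySpark expectations above concrete, here is a minimal illustrative sketch, assuming a Delta-enabled Spark session (standard on Azure Databricks); the storage paths, table locations, and column names are hypothetical, not part of this posting.

```python
# Illustrative only: a small PySpark ingestion layer writing to Delta Lake.
# The storage paths and columns (order_ts, amount) are hypothetical.
from pyspark.sql import SparkSession, functions as F

# On Azure Databricks a Delta-enabled `spark` session is provided; running this
# locally would also require the delta-spark package and its Spark configs.
spark = SparkSession.builder.appName("orders-ingest").getOrCreate()

# Read raw JSON landed in the data lake (hypothetical ADLS Gen2 container).
raw = spark.read.json("abfss://landing@examplelake.dfs.core.windows.net/orders/")

# Light cleansing/typing layer before persisting.
cleaned = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount") > 0)
)

# Each write to a Delta table creates a new version of the table.
cleaned.write.format("delta").mode("append").save("/mnt/delta/orders")

# Delta time travel: read the table as of an earlier version for audit or rollback.
v0 = spark.read.format("delta").option("versionAsOf", 0).load("/mnt/delta/orders")
v0.show()
```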
Responsibilities –
1. As a Data Engineer on our team, demonstrate a proven ability to deliver high-quality work on time and
with minimal supervision.
2. Develop new data pipelines and ETL jobs that process millions of records and scale with
growth.
3. Optimise pipelines to handle real-time data, batch updates, and historical data (see the sketch after this list).
4. Establish scalable, efficient, automated processes for complex, large-scale data analysis.
5. Write high-quality code to gather and manage large data sets (both real-time and batch) from multiple
sources, perform ETL, and store the results in a data warehouse or data lake.
6. Manipulate and analyse complex, high-volume, high-dimensional data from varying sources using a variety
of tools and data analysis techniques.
7. Participate in data pipeline health monitoring and performance optimisation, as well as quality
documentation.
8. Adopt best practices and industry standards, ensuring top-quality deliverables and playing an integral role in
cross-functional system integration.
9. Perform data integration using enterprise development tool sets (e.g. ETL, MDM, data quality, CDC, data
masking).
10. Design and implement formal data warehouse testing strategies and plans including unit testing, functional
testing, integration testing, performance testing, and validation testing.
11. Interact with end users/clients and translate business language into technical requirements.
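As referenced in responsibility 3, the sketch below illustrates one common pattern for serving both batch updates and real-time data from a single Delta table: a MERGE upsert for the batch path and a Structured Streaming append for the incoming stream. The paths, the order_id join key, and the file-based streaming source are assumptions for illustration, not a prescribed design.

```python
# Illustrative only: batch upsert plus streaming append into one Delta table.
# Paths, the order_id key, and the JSON file stream source are hypothetical.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("orders-upsert").getOrCreate()

# Batch path: apply periodic update files idempotently with a Delta MERGE.
updates = spark.read.parquet("/mnt/landing/orders_updates/")
target = DeltaTable.forPath(spark, "/mnt/delta/orders")
(
    target.alias("t")
    .merge(updates.alias("u"), "t.order_id = u.order_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)

# Real-time path: continuously append newly arriving JSON events.
stream = spark.readStream.schema(updates.schema).json("/mnt/landing/orders_stream/")
(
    stream.writeStream.format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/orders_stream")
    .outputMode("append")
    .start("/mnt/delta/orders")
)
```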
If you are an active job seeker, please apply with your resume at hr5@tasolutions.in. Alternatively, you can call us on 9041249494. You may also share this opening with your friends or colleagues.