Online applications are invited for the Azure Data Engineer role at Deloitte, Bengaluru, requiring a minimum of 6 years of experience in the relevant field. Check the details below!

About Deloitte

Deloitte is a leading global provider of audit and assurance, consulting, financial advisory, risk advisory, tax, and related services. With more than 150 years of hard work and commitment to making a real difference, our organization has grown in scale and diversity, with approximately 286,000 people providing these services across 150 countries and territories, yet our shared culture remains the same. Our organization serves four out of five Fortune Global 500 companies.

Job Description

  • You will join a team delivering a transformative cloud-hosted data platform for some of the world’s biggest organizations.
  • The ideal candidate will have a proven track record of implementing data ingestion and transformation pipelines on Microsoft Azure.
  • Deep technical skills and experience working on Azure Databricks, familiarity with data modelling concepts, and exposure to Synapse are required.
    You will also be required to participate in stakeholder management, highlight risks, propose delivery plans, and estimate time and team size based on requirements.
  • Strong communication skills and relevant experience in handling such situations are therefore desired.

Responsibilities

  • Designing and implementing highly performant data ingestion pipelines from multiple sources using Apache Spark and/or Azure Databricks.
  • Delivering and presenting proofs of concept of key technology components to project stakeholders.
  • Developing scalable and reusable frameworks for ingesting and enriching datasets.
  • Integrating the end-to-end data pipeline to take data from source systems to target data repositories, ensuring that data quality and consistency are maintained at all times.
  • Working with event-based / streaming technologies to ingest and process data.
  • Working with other members of the project team to support the delivery of additional project components (API interfaces, Search).
  • Evaluating the performance and applicability of multiple tools against customer requirements.
  • Working within an Agile delivery / DevOps methodology to deliver proof of concept and production implementation in iterative sprints.

Qualifications

  • Strong knowledge of Data Management principles.
  • 6 to 10 years of total experience.
  • Experience in building ETL / data warehouse transformation processes.
  • Direct experience in building data pipelines using Azure Data Factory and Apache Spark (preferably Databricks).
  • Experience using Apache Spark and associated design and development patterns.
  • Microsoft Azure Big Data Architecture certification is an advantage.
  • Hands-on experience designing and delivering solutions using Azure Storage, Azure SQL Data Warehouse, Azure Data Lake, Azure Cosmos DB, and Azure Stream Analytics.
  • Experience with Apache Kafka / NiFi for streaming or event-based data (nice to have, but not mandatory).
  • Experience with other open-source big data products, such as Hadoop (incl. Hive, Pig, Impala).
  • Experience with open-source non-relational / NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j).
  • Experience working in a DevOps environment with tools such as Microsoft Visual Studio Team Services, Terraform, etc.

How to Apply?

Interested candidates can apply through this link.

Location

Bengaluru, Karnataka.

Click here to view the official notification of Azure Data Engineer at Deloitte, Bengaluru.