KPMG Hiring - Data Engineer


   About the company

KPMG entities in India are professional services firms. These Indian member firms are affiliated with KPMG International Limited. KPMG was established in India in August 1993. Our professionals leverage the global network of firms, and are conversant with local laws, regulations, markets and competition. KPMG has offices across India in Ahmedabad, Bengaluru, Chandigarh, Chennai, Gurugram, Hyderabad, Jaipur, Kochi, Kolkata, Mumbai, Noida, Pune, Vadodara and Vijayawada. KPMG entities in India offer services to national and international clients in India across sectors. We strive to provide rapid, performance-based, industry-focused and technology-enabled services, which reflect a shared knowledge of global and local industries and our experience of the Indian business environment. 




Job Profile: Data Engineer

Qualifications: Bachelor's in CS or a related field

Experience: Freshers

Salary: 9 LPA (expected)

Job Location: Bengaluru, Karnataka



Responsibilities

  • Experience in Data and Analytics, including overseeing end-to-end implementation of data pipelines on cloud-based data platforms. 
  • Strong programming skills in Python and PySpark; Java and Scala are good to have. 
  • Experience writing SQL, structuring data, and applying sound data storage practices. 
  • Experience using PySpark for data processing and transformation. 
  • Experience building stream-processing applications (Spark Streaming, Apache Flink, Kafka, etc.). 
  • Maintain and develop CI/CD pipelines based on GitLab. 
  • Experience assembling large, complex structured and unstructured datasets that meet functional and non-functional business requirements. 
  • Experience working with cloud data platforms and services. 
  • Conduct code reviews, maintain code quality, and ensure best practices are followed.
  • Debug and upgrade existing systems. Some knowledge of DevOps is nice to have.

Requirements

  • Bachelor's degree in computer science or a related field. 
  • Experience with Snowflake and knowledge of transforming data using dbt (data build tool). 
  • Strong programming skills in Python and PySpark; Java and Scala are good to have. 
  • Experience with AWS and API integration in general, with knowledge of data warehousing concepts. 
  • Excellent communication and team collaboration skills.

Apply before the link expires!


    Apply Link: Click here


    Join our Telegram group: Click here
    Join our WhatsApp Channel: Click here
    Join us on Instagram: Click here
