Delivery

Senior Data Engineer - 2889

Chennai
Work Type: Full Time
CES has 26+ years of experience delivering Software Product Development, Quality Engineering, and Digital Transformation Consulting Services to global SMEs and large enterprises. CES serves some of the leading Fortune 500 companies across Automotive, AgTech, Bio Science, EdTech, FinTech, Manufacturing, Online Retail, and Investment Banking. These are long-term relationships of more than 10 years, nurtured not only by our commitment to the timely delivery of quality services but also by our investments and innovations in our customers' technology roadmaps. As an organization, we are in an exponential growth phase, with a consistent focus on continuous improvement, a process-oriented culture, and a true partnership mindset with our customers. We are looking for qualified and committed individuals to play an exceptional role and to support our accelerated growth.

You can learn more about us at: http://www.cesltd.com/

About the Role:
You will be responsible for the end-to-end development of data pipelines, CDC ingestion, SSIS workflows, data models, and integrations with multiple source systems, including Oracle, cloud APIs, and file-based feeds, and for ensuring secure data delivery into the enterprise Data Warehouse.

Key Responsibilities:
• Design and build scalable data pipelines leveraging Snowflake for data warehousing and analytics.
• Apply core data engineering principles such as data modeling, partitioning, indexing, and performance tuning irrespective of the tech stack.
• Develop and optimize ELT processes using Snowflake SQL, Tasks, and Streams.
• Perform data ingestion and transformation using Snowpipe, External Stages, and File Formats.
• Write efficient code using Python, PySpark, and SQL for data processing and orchestration.
• Integrate with ETL/ELT and orchestration tools such as dbt and Apache Airflow, and manage Git repositories.
• Implement query optimization techniques, resource management, and cost-control strategies.
• Troubleshoot issues and work independently with minimal supervision.
• Leverage cloud platforms such as AWS, Azure, or GCP for data services and infrastructure.

Required Skills & Qualifications:
• 5–8 years of experience in Data Engineering
• Proven hands-on expertise with Snowflake for data warehousing and analytics
• Strong proficiency in SQL, including query optimization and performance tuning
• Practical experience with Python and PySpark for data processing and transformation
• Excellent analytical, problem-solving, and debugging skills
• Strong communication skills and ability to collaborate effectively in team environments

Nice to Have:
• Experience building and managing Delta Lake implementations using Apache Spark and Python for ACID transactions and reliable data storage.
• Familiarity with distributed data processing frameworks like Spark for large-scale transformations.

Submit Your Application
