CES has 26+ years of experience delivering Software Product Development, Quality Engineering, and Digital Transformation Consulting Services to global SMEs and large enterprises. CES serves some of the leading Fortune 500 companies across industries including automotive, AgTech, bioscience, EdTech, FinTech, manufacturing, online retail, and investment banking. These long-term relationships, many spanning more than 10 years, are nurtured not only by our commitment to the timely delivery of quality services but also by our investments and innovations in our customers' technology roadmaps. As an organization, we are in an exponential growth phase, with a consistent focus on continuous improvement, a process-oriented culture, and a true partnership mindset with our customers. We are looking for qualified, committed individuals to play an exceptional role in supporting our accelerated growth.
You can learn more about us at: http://www.cesltd.com/
About the Role:
We are looking for a Data Engineer to join our team. This role requires a motivated professional with a foundational understanding of data engineering concepts and a willingness to learn advanced techniques. You will build and optimize data pipelines, support analytics, and collaborate with senior engineers to deliver high-quality data solutions.
Key Responsibilities:
• Assist in designing and building data pipelines leveraging Snowflake for data warehousing and analytics.
• Apply basic data engineering principles such as data modeling and query optimization under guidance.
• Support development and optimization of ELT processes using Snowflake SQL.
• Perform data ingestion and transformation using Snowpipe, External Stages, and File Formats.
• Write and maintain scripts using Python, PySpark, and SQL for data processing.
• Collaborate on integrating ETL/ELT and orchestration tools such as dbt and Apache Airflow.
• Participate in troubleshooting and resolving data pipeline issues.
• Gain exposure to cloud platforms such as AWS, Azure, or GCP for data services.
Required Skills & Qualifications:
• 2–5 years of experience in Data Engineering or related roles.
• Hands-on experience with Snowflake.
• Good knowledge of SQL and ability to write optimized queries.
• Practical experience with Python; exposure to PySpark preferred.
• Strong analytical and problem-solving skills.
• Good communication skills and ability to work in a collaborative team environment.
Nice to Have:
• Exposure to Delta Lake and Apache Spark for data reliability and large-scale transformations.
• Familiarity with ETL/ELT tools (dbt, Airflow) and version control systems like Git.
• Understanding of data governance, data quality, and CI/CD pipelines.