
Data Engineer Jobs in Gurugram

Last Updated : 26 Apr, 2024

Data engineer jobs are in high demand across India, and Gurugram is no exception. Data engineering is a rapidly growing career in this digital world: data engineers collect, manage, and convert raw data into usable information that data scientists and business analysts can work with. In this article, we have compiled a list of companies hiring data engineers in Gurugram, along with the average salary and the experience-wise salary trend for data engineers.

Companies Hiring Data Engineers

USEReady

Requirements:

  • Demonstrated experience with MS SQL Server and Azure Data Factory; top-notch programming skills in SQL and Python
  • Hands-on knowledge of Azure DevOps is preferred
  • Working experience with Azure/SQL Database
  • Expertise in working with multiple data source types, such as JD Edwards
  • Azure certifications are a plus

Apply here: Careers

Trajector

Requirements:

  • Bachelor’s degree in Computer Science, Computer Engineering, or a relevant technical field (or equivalent practical experience)
  • Experience writing complex SQL statements
  • Experience with Python or another server-side programming language
  • Hands-on experience with ER diagrams and dimensional models
  • Understanding of data warehousing concepts
  • Preferred qualifications: Snowflake, Redshift, Vertica, etc.
  • Working experience with modern pipeline orchestration engines such as Airflow, Luigi, and Prefect (a hedged Airflow sketch follows this listing)
  • Experience with modern BI tools such as Tableau, Apache Superset, Looker, etc. is preferred
  • Expertise with the Linux command line
  • Knowledge of AWS services and tools

Apply here: Careers
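
Requirements like the pipeline-orchestration item above usually mean being able to express a workflow as a DAG. Below is a minimal, hypothetical Airflow sketch (Airflow 2.x assumed; the DAG id, task names, and placeholder callables are illustrative, not taken from any listing) showing the extract-then-load dependency pattern such roles expect.

```python
# Minimal, hypothetical Airflow 2.x DAG: two Python tasks wired so that
# "extract" must finish before "load". All names here are illustrative only.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders():
    # Placeholder: pull raw rows from a source system (API, OLTP database, files).
    print("extracting raw orders")


def load_to_warehouse():
    # Placeholder: write transformed rows into the warehouse.
    print("loading into the warehouse")


with DAG(
    dag_id="daily_orders_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",  # run once per day
    catchup=False,               # skip backfilling past runs
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_orders)
    load = PythonOperator(task_id="load", python_callable=load_to_warehouse)

    extract >> load  # dependency: extract runs before load
```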

Treyas Infotech and Consulting Pvt Ltd

Requirements:

  • At least 8 years of IT experience, with a minimum of 3+ years in data-related technologies, 1+ years with data-related GCP Cloud services, and at least 1 project delivered as an architect
  • Mandatory knowledge of Big Data Architecture Patterns and the delivery of end-to-end Big Data solutions on GCP Cloud
  • Expertise in programming languages such as Java/Scala and strong Python skills
  • Good command of at least one distributed data processing framework, including Spark (Core, Streaming, SQL), Storm, or Flink
  • Good command of the Hadoop ecosystem with a GCP cloud distribution, knowledge of at least one or more big data ingestion tools (such as Sqoop, Flume, NiFi), distributed messaging and ingestion frameworks (such as Kafka, Pulsar, Pub/Sub), and solid knowledge of traditional tools like Informatica, Talend, etc.
  • Solid knowledge of NoSQL solutions such as MongoDB, Cassandra, HBase, etc., or cloud-based NoSQL offerings like DynamoDB, Bigtable, etc.
  • Expertise in development with CI/CD pipelines; an understanding of containerization, orchestration, and Kubernetes Engine would be a plus
  • Certification in the GCP cloud platform or big data technologies is preferred
  • Outstanding analytical and problem-solving skills
  • Solid understanding of data technologies landscape/ecosystem

Apply here: Careers

TEKsystems

Requirements:

  • Solid knowledge of Python
  • Spark framework and streaming knowledge is required
  • Knowledge of Machine Learning Lifecycle is mandatory

Apply here: Careers

Recur Club

Requirements:

  • Overall 2-4 years of working experience as a Data Engineer, with a solid record of designing and implementing complex data solutions
  • Expertise in data warehousing concepts, ETL processes, and data modeling
  • Strong programming skills in languages such as Python, Java, or Scala
  • Experience with Big Data technologies including Hadoop, Spark, or Hive
  • Expertise in relational databases (such as MySQL, PostgreSQL) and NoSQL databases (such as MongoDB, Cassandra)
  • Knowledge of cloud platforms (including AWS, Azure, GCP) and related services (including AWS Glue, Azure Data Factory)
  • Outstanding problem-solving skills and solid attention to detail
  • Strong communication and collaboration skills

Apply here: Careers

EDGE Executive Search

Requirements:

  • Knowledge of PySpark, Python, and ETL processes (see the hedged PySpark sketch after this listing)
  • Hands-on experience with Java programming, including multithreading, concurrency, and object-oriented design principles, is a plus
  • Hands on experience with frameworks for data processing and ETL workflows
  • Proven experience with RDBMS and solid experience with performance tuning
  • Good software design skills
  • Strong experience implementing automated unit testing
  • Solid experience with AWS cloud (including Glue, Lambda, S3)
  • Knowledge of Agile working practices
  • Expertise with Git, GitLab, and CI/CD pipelines

Apply here: Careers
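
As a rough illustration of the PySpark/AWS ETL pattern listed above, here is a short, hypothetical sketch. The bucket paths, column names, and app name are invented, and it assumes a Spark environment already configured with S3 credentials (on EMR/Glue the `s3://` scheme works; open-source Spark typically uses `s3a://`).

```python
# Hypothetical PySpark ETL job: read raw JSON from S3, clean it,
# and write partitioned Parquet back to a curated bucket.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Hypothetical raw landing zone
raw = spark.read.json("s3://example-raw-bucket/orders/")

cleaned = (
    raw.dropDuplicates(["order_id"])                 # remove duplicate events
       .withColumn("order_date", F.to_date("order_ts"))
       .filter(F.col("amount") > 0)                  # drop invalid amounts
)

# Hypothetical curated zone, partitioned for efficient downstream queries
cleaned.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-curated-bucket/orders/"
)
```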

Cars24

Requirements:

  • Knowledge of optimising code and writing modular data pipelines for custom data integration jobs (a toy sketch of this structure follows this listing)
  • Understanding of version control and proficiency in Python
  • Understanding of Snowflake is a plus; the ability to use Snowflake as an advanced user will be considered an added plus
  • Knowledge of Google Sheets, Linux, and orchestration tools like Airflow
  • Experience with GCS, S3, and other cloud technologies
  • Knowledge of building custom data integration pipelines that work reliably at scale
  • Knowledge of query/code profiling for efficiency and bugs

Apply here: Careers
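
The "modular data pipelines" requirement above generally means splitting a custom integration job into small, independently testable extract/transform/load steps. A toy, dependency-free sketch of that structure (the function names and sample records are hypothetical) might look like this:

```python
# Toy modular pipeline: each stage is a small, testable function, and
# run_pipeline() only wires them together.
from typing import Iterable


def extract() -> list[dict]:
    # Placeholder source: a real job might call an API or read from GCS/S3.
    return [{"id": 1, "amount": "120.5"}, {"id": 2, "amount": "-3"}]


def transform(rows: Iterable[dict]) -> list[dict]:
    # Cast types and drop invalid records; pure logic is easy to unit test.
    cleaned = []
    for row in rows:
        amount = float(row["amount"])
        if amount > 0:
            cleaned.append({"id": row["id"], "amount": amount})
    return cleaned


def load(rows: list[dict]) -> None:
    # Placeholder sink: a real job might write to Snowflake or BigQuery.
    for row in rows:
        print("loading", row)


def run_pipeline() -> None:
    load(transform(extract()))


if __name__ == "__main__":
    run_pipeline()
```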

E I DuPont India Pvt Ltd

Requirements:

  • Expert in the English language (both written and oral)
  • Bachelor’s degree in math, science, engineering, IT, or a related field
  • Solid knowledge of data concepts and tools
  • Strong communication skills
  • Comfortable working with remote colleagues spread across the world
  • Expert in SQL and Python
  • Experience with cloud computing, mostly in Azure (incl. ADF and ADLS)
  • Experience with distributed systems (including Apache Spark or Databricks)
  • Experience with version control tools like Git
  • Candidates skilled in Delta Lake and Lakehouse architecture, Pandas and PySpark, DevOps workflows, Power BI, and ML pipelines (e.g., scikit-learn) will get preference

Apply here: Careers

Publicis Sapient

Requirements:

  • Bachelor’s/Master’s degree in Computer Science or Engineering is preferred
  • Good experience in designing and architecting large-scale distributed data management & analytics applications

Apply here: Careers

Jio Platforms Limited

Requirements:

  • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related area
  • Minimum 5+ years of working experience in data engineering roles, with a focus on building data pipelines and data infrastructure
  • Hands-on experience with Kafka, Hadoop, Hive, PySpark, and Cloudera
  • Expert in the Python programming language for data manipulation, scripting, and automation
  • Good understanding of distributed systems, data modelling, and database concepts
  • Working experience with on-premises data infrastructure and cloud platforms (such as AWS, Azure, GCP)
  • Hands-on experience with Spark Streaming and Flink
  • Hands-on coding experience in Java/Scala/Python
  • Good experience with Linux and scripting
  • Solid understanding of data modelling
  • Outstanding problem-solving skills and ability to troubleshoot complex data issues
  • Strong communication and collaboration skills with the ability to work effectively in cross-functional teams.
  • Proven track record of delivering high-quality solutions on time and within budget.

Apply here: Careers

Oliver Wyman

Requirements:

  • At least 3+ years of experience in data engineering and 1+ years implementing cloud architectures on AWS or Azure
  • In-depth experience designing, productionizing, maintaining, and documenting reliable and scalable data infrastructure and data products in complex environments across all data lifecycle stages
  • Demonstrated experience in one or more object-oriented programming languages and advanced knowledge of PySpark and Spark SQL
  • Knowledge in designing and implementing large-scale distributed systems
  • Experience with continuous integration and continuous delivery technologies
  • Knowledge of monitoring tools and their integration into other platforms, including AWS/Azure, etc.
  • Ability to learn and pick up a new tool/platform quickly
  • Outstanding analytical and solid problem-solving skills
  • Excellent written and verbal communication skills with a demonstrated ability to interact effectively with all levels of management

Apply here: Careers

Benovymed Healthcare Private Limited

Requirements:

  • At least 3-6 years of experience as a hardcore Sr. Data Engineer / Data Engineer / Big Data Engineer, with the most relevant experience in the above skill set gained at a reputed data-science-driven software product startup, having worked on multiple projects independently
  • Strong statistical analysis and modeling skills
  • Knowledge of database design and data architectures
  • Experience in Hadoop-based technologies (such as MapReduce, Hive, and Pig) and big data technologies such as Spark to forecast and recommend improvements based on how the data is consumed
  • SQL-based technologies (such as PostgreSQL and MySQL)
  • NoSQL technologies (such as Cassandra and MongoDB)
  • Data modeling tools (such as ERWin, Enterprise Architect and Visio)
  • Knowledge of Python, C/C++ and Java

Apply here: Careers

Transorg Analytics

Requirements:

  • At least 2-6 years of experience in data engineering
  • Knowledge of the latest data engineering ecosystem platforms such as AWS, Azure, GCP, Cloudera, and Databricks
  • Good knowledge of Python/Scala/Java
  • Sound knowledge of SQL/NoSQL databases and data warehouse concepts
  • Demonstrated experience of working on databases including SQL Server, PostgreSQL, cloud infrastructure, etc.
  • Good knowledge of data backup, recovery, security, and integrity
  • Strong knowledge of Spark, HDFS/Hive/HBase, shell scripting, and Spark Streaming
  • Excellent communication skills
  • Expertise with data ingestion tools like Sqoop, Flume, Talend, and Kafka

Apply here: Careers

Air India Limited

Requirements:

  • Bachelor’s degree in Computer Science, Engineering, Mathematics, or any related areas
  • Minimum 6+ years of experience as a data engineer or in a similar field
  • Expert in Python, SQL, and other programming languages for data engineering
  • Experience with data pipeline tools and frameworks like Spark, Kafka, Airflow, etc
  • Experience with cloud services and platforms including AWS, GCP, or Azure.
  • Experience with data warehouse and data lake technologies including Redshift, BigQuery, Snowflake, etc
  • Hands on experience with data modeling, data quality, and data governance concepts and practices
  • Outstanding analytical and problem-solving skills
  • Expert in communication and collaboration skills.
  • Ability to work independently and in a team

Apply here: Careers

Planet Spark

Requirements:

  • Skill requirements: SQL, Ruby or Python, Apache Hadoop-based analytics, data warehousing, data architecture, schema design, and ML skills are required
  • At least 2 to 5 years of previous experience as a Data Engineer
  • Experience managing and communicating data warehouse plans to internal teams
  • Hands on experience designing, building, and maintaining data processing systems
  • Understanding of root cause analysis on external and internal processes and data to identify opportunities for improvement and answer questions
  • Outstanding analytic skills associated with working on unstructured datasets.

Apply here: Careers

Crescendo Global

Requirements:

  • Bachelor’s degree in Computer Science, Information Technology, or any related area
  • At least 6+ years of experience in managing data engineering projects, with a solid track record of success
  • Expertise in SQL, data environments (including BigQuery, Google Analytics, Snowflake), and data transformation tools (Python)
  • Solid understanding of ETL data pipelines, including integration with APIs and databases
  • Good experience with Google Analytics tools and cloud-based data warehousing solutions (including Snowflake and/or Google Cloud)
  • Knowledge of the Cron job scheduler and Data Vault modeling is an advantage
  • Experience collaborating with clients’ Google Analytics partner agencies
  • Excellent communication and interpersonal skills

Apply here: Careers

S&P Global

Requirements:

  • Bachelor’s or Master’s degree in Computer Science, Engineering, or any related areas
  • Overall 7+ years of working experience in data engineering, with a proven track record of designing and implementing complex data solutions
  • Expertise in programming languages including Python, Java, or Scala; experience with SQL, ETL processes, and data warehousing concepts
  • Good expertise in big data technologies such as Kafka/OLAP or similar frameworks
  • Experience with SQL and NoSQL databases
  • Experience with cloud platforms (such as AWS, Azure, GCP) and understanding of containerization and orchestration tools (such as Docker, Kubernetes)
  • Leadership and mentoring skills, with a strong ability to lead a team, prioritize tasks, and drive projects to completion
  • Outstanding problem-solving abilities, attention to detail, and a collaborative mindset
  • Excellent communication skills with the ability to convey complex technical concepts to non-technical stakeholders
  • Ability to optimize slow-running database queries and data pipelines
  • Certifications in relevant technologies (AWS/Azure/GCP, etc.) are preferred
  • Experience with machine learning frameworks or data science concepts is preferred

Apply here: Careers

Sampoorna Consultants Private Limited

Requirements:

  • Overall 5-10 years of experience
  • Excellent communication skills
  • Flexibility: able to work 24x7 in rotational shifts
  • Management basics: work experience with Incident Management, Change Management, and Problem Management
  • Routing (OSPF/BGP), Switching (VLAN/Spanning Tree), Cisco Wireless (AP and WLC), Nexus (7K/9K & FEX)
  • Mandatory certification: CCNA

Apply here: Careers

Linkage IT Private Limited

Requirements:

  • Bachelor’s degree in Computer Science, Engineering, or any related areas
  • Working experience as a Data Engineer or in a similar role
  • Expertise in SQL Server or Postgres, with good experience writing complex SQL queries and PL/SQL procedures
  • Programming skills in Scala, with the ability to write efficient Spark jobs
  • Experience with Informatica ETL tools is an advantage
  • Understanding of database performance optimization techniques
  • Knowledge of AWS technologies is a plus
  • Outstanding problem-solving skills and strong attention to detail.
  • Expert in communication and collaboration skills

Apply here: Careers

Impetus

Requirements:

  • Overall 3.5-7 years of IT experience
  • Extensive production experience of at least 2-3 years with GCP; other cloud experience would be a strong bonus
  • Good knowledge of data engineering, with at least 3-6 years of experience in big data technologies such as Hadoop, NoSQL, Spark, Kafka, etc.
  • Knowledge of enterprise application development is a must.

Apply here: Careers

AbsolutData Research & Analytics (P) Ltd

Requirements:

  • Experience as an Azure Data Engineer
  • At least 4-8 years of experience (this role is posted by Career Progress Consultants in Gurgaon on TimesJobs.com)

Apply here: Careers

Xebia

Requirements:

  • Very strong in SQL queries
  • Very strong in Databricks (including Workflows, etc.)
  • Strong in Python
  • Good at PySpark (which comes with Databricks)
  • Knowledge of data pipeline optimization, ETL, and the process of optimizing queries (see the sketch after this listing)
  • Banking domain experience is a plus

Apply here: Careers
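
For the query-optimization point above, a typical first pass is pruning columns, filtering early, and broadcasting the small side of a join. The snippet below is a hypothetical PySpark fragment (the table and column names are invented) showing those three habits; on Databricks the `spark` session already exists, so `getOrCreate()` simply reuses it.

```python
# Hypothetical optimization pass: select only needed columns, filter before
# the join, and broadcast the small dimension table to avoid a large shuffle.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("query_optimization_demo").getOrCreate()

transactions = spark.table("raw.transactions")  # large fact table (hypothetical)
branches = spark.table("raw.branches")          # small dimension table (hypothetical)

optimized = (
    transactions
    .select("txn_id", "branch_id", "amount", "txn_date")   # column pruning
    .filter(F.col("txn_date") >= "2024-01-01")              # filter early
    .join(F.broadcast(branches.select("branch_id", "city")), "branch_id")
)

optimized.explain()  # inspect the physical plan to confirm the broadcast join
```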

Eucloid Data Solutions

Requirements:

  • Overall 1-2 years of working experience in Data Analysis
  • Expert in SQL and good with data visualization tools such as Tableau
  • Outstanding problem-solving and conflict-resolution ability
  • Excellent communication and interpersonal skills
  • Good understanding of the technology stack, dependencies between applications, and how the data flows between them
  • Excellent team player with the strong ability to influence and negotiate

Apply here: Careers

Deutsche Telekom Digital Labs

Requirements:

  • At least 3-5 years of strong software/data engineering experience
  • Experience in building large scalable and reliable enterprise technology platforms using Big Data open-source technologies including Java, Hadoop, Spark, Kafka
  • Experience in at least one programming language: Java, Scala, or Python
  • Experience with data storage and a good understanding of SQL and NoSQL databases such as MySQL and MongoDB
  • Knowledge of cloud solutions for data infrastructure; AWS is a plus
  • Knowledge of data handling frameworks including Spark, Apache Beam, Apache Flink, etc.
  • Knowledge of data formats including Apache Parquet, Avro, ORC, etc. (a small Parquet example follows this listing)
  • Strong grasp of distributed file system and computation concepts

Apply here: Careers
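
As a small illustration of the columnar formats mentioned above, the following hypothetical snippet writes and reads back a Parquet file with pyarrow (the column names and values are made up):

```python
# Hypothetical example: build a small in-memory table, persist it as Parquet,
# then read it back. Parquet stores the data column-by-column and compressed.
import pyarrow as pa
import pyarrow.parquet as pq

table = pa.table({"user_id": [1, 2, 3], "country": ["IN", "DE", "IN"]})
pq.write_table(table, "users.parquet")

print(pq.read_table("users.parquet"))  # prints the schema and column data
```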

Job Portals

The following are the job portals where you can apply for data engineer positions in Gurugram:

Salary of Data Engineer

The average annual salary of a data engineer in Gurugram is around ₹10 lakhs.

Experience-Wise Salary Trend

Experience                     Average Annual Salary
Entry level (0 – 5 years)      ₹6.30 lakhs
Mid level (6 – 10 years)       ₹12.50 lakhs
High level (10 – 15 years)     ₹13.70 lakhs
Senior level (15+ years)       ₹22 lakhs

Data Engineer Salary: Designation Wise

Designation                      Average Annual Salary
System Engineer                  ₹4.40 lakhs
Project Engineer                 ₹4 lakhs
Software Engineer                ₹5.50 lakhs
Associate Software Engineer      ₹4.5 lakhs
Software Development Engineer    ₹9 lakhs
Lead Engineer                    ₹21 lakhs

Data Engineer Jobs in Gurugram – FAQs

Are data engineers in high demand?

Yes, data engineers are in very high demand in this digital world, and the demand is growing rapidly.

What is the salary of a data engineer with 1 year of experience?

Data engineers earn a healthy annual salary. The average salary of a data engineer is approximately ₹10.30 lakhs, ranging from ₹3.5 lakhs to ₹22 lakhs.

Which field is best for a data engineer?

A graduate can start their career as a data engineer. The well-known fields for data engineers are as follows:

  • Data Science
  • Software Engineering
  • Math, or
  • A business-related field.


