This job listing has expired and the position may no longer be open for hire.

Big Data Engineer in AWS - Farmington, CT at Cognizant

Posted in Information Technology 30+ days ago.

This job brought to you by eQuest

Type: Full-Time
Location: Farmington, Connecticut





Job Description:

Title – Big Data Engineer in AWS

Location – Farmington, CT

Travel – No travel will be needed for this engagement, as work will be 100% onsite.

Desired Skillset – AWS data-related services (DMS, Glue, Athena, Lambda, Redshift, DynamoDB, SageMaker); Scala, Java, Hadoop, Kafka, Python, and PySpark


 

Key Responsibilities


  • Contribute to coding, testing, implementing, debugging, and documenting complex programs.

  • Contribute to creating proper technical documentation for work assignments.

  • Understand business needs, design programs and systems that meet complex business requirements, and record all specifications involved in the development and coding process.

  • Assist in ensuring that all standard requirements are met, and participate in technical analysis.

  • Assist the project manager by compiling information from current systems, analyzing program requirements, and ensuring that deliverables meet the specified time requirements.

  • Resolve moderately complex problems associated with the designed programs and provide technical guidance on complex programming.


  • Build end-to-end big data pipelines on AWS, including:

    • Ingestion/replication via DMS from traditional on-prem RDBMS (e.g. Oracle, MS SQL Server, IBM DB2, MySQL, Postgres)

    • Real-time ingestion and processing with Kinesis Streams, Kinesis Firehose, and Kinesis Analytics

    • CDC, ETL and Analytics via AWS Glue, EMR, Spark, Presto, Athena, Flink, Python/PySpark, Scala, Zeppelin

    • Refactoring of existing RDBMS scripts (e.g. PL/SQL, T-SQL, PL/pgSQL) to PySpark or Scala

    • Buildout of data warehouses and published data sets using Redshift, Aurora, RDS, Elasticsearch

    • Scripting with AWS Lambda
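To give a flavor of the Lambda-scripting item above, here is a minimal sketch of a handler that consumes records from a Kinesis stream. The event shape follows the standard Kinesis-to-Lambda integration (base64-encoded data under `Records[i].kinesis.data`); the JSON payload field (`event_type`) is a hypothetical example, not something specified in this listing.

```python
import base64
import json


def lambda_handler(event, context):
    """Decode Kinesis records and tally them by event type.

    Kinesis delivers records to Lambda base64-encoded under
    event['Records'][i]['kinesis']['data']. The payload's
    'event_type' key is an illustrative assumption.
    """
    counts = {}
    for record in event.get("Records", []):
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
        key = payload.get("event_type", "unknown")
        counts[key] = counts.get(key, 0) + 1
    return {"processed": len(event.get("Records", [])), "counts": counts}
```

In a real deployment the function would be wired to a Kinesis stream as an event source; locally it can be exercised with a hand-built event dictionary.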



 

Critical Experience:


  • Strong experience in software development with Python or Scala. Experience with PySpark is a big plus.

  • Strong hands-on database engineering experience in AWS

  • Database development experience with relational database management systems (RDBMS)

  • Other AWS experience or AWS certifications are not required, but are a plus.

  • Similar database pipeline experience in other cloud technologies such as Azure is also a plus.


 

Technical Skills



SNo  Primary Skill              Proficiency Level *  Rqrd./Dsrd.
1    AWS Data Related Services  PL3                  Required

 

* Proficiency Legends

Proficiency Level – Generic Reference
PL1 – The associate has basic awareness and comprehension of the skill and is in the process of acquiring it through various channels.
PL2 – The associate possesses working knowledge of the skill and can actively and independently apply it in engagements and projects.
PL3 – The associate has comprehensive, in-depth, and specialized knowledge of the skill, and has extensively demonstrated its successful application in engagements or projects.
PL4 – The associate can function as a subject matter expert for this skill, and is capable of analyzing, evaluating, and synthesizing solutions using it.