This job listing has expired and the position may no longer be open for hire.

BigData Tech Lead at Cognizant in Charlotte, North Carolina

Posted in Information Technology 30+ days ago.

Type: Full-Time





Job Description:

Role: Big Data Tech Lead

Cognizant (NASDAQ: CTSH) is a leading provider of information technology, consulting, and business process outsourcing services, dedicated to helping the world's leading companies build stronger businesses. Headquartered in Teaneck, New Jersey (U.S.), Cognizant is a member of the NASDAQ-100, the S&P 500, the Forbes Global 1000, and the Fortune 500, and we are among the top-performing and fastest-growing companies in the world.

*Please note: this role cannot offer visa transfer or sponsorship now or in the future.*

Practice - AIA - Artificial Intelligence and Analytics

About AI & Analytics: Artificial intelligence (AI) and the data it collects and analyzes will soon sit at the core of all intelligent, human-centric businesses. By decoding customer needs, preferences, and behaviors, our clients can understand exactly what services, products, and experiences their consumers need. Within AI & Analytics, we work to design the future—a future in which trial-and-error business decisions have been replaced by informed choices and data-supported strategies.

By applying AI and data science, we help leading companies to prototype, refine, validate, and scale their AI and analytics products and delivery models. Cognizant’s AIA practice takes insights that are buried in data, and provides businesses a clear way to transform how they source, interpret and consume their information. Our clients need flexible data structures and a streamlined data architecture that quickly turns data resources into informative, meaningful intelligence.

Big Data Tech Lead

Must have:
• 10+ years of success in consultative/complex deployment projects, including the architecture, design, implementation, and/or support of data and analytics solutions across distributed systems and cloud services.

• Deep understanding of the database, analytics, and migration services within AWS, and the trade-offs involved in deciding which services to use.

• Capacity planning for Hadoop clusters, including estimating the requirements for increasing or lowering cluster capacity.

• In-depth understanding of big data tools (Hadoop, Apache Spark, Kafka, Sqoop, Flume, Storm, Ambari, etc.) on Hadoop distribution platforms such as Hortonworks or Cloudera Data Platform.

• Experience with data modeling in distributed systems (Hadoop/HBase/Cassandra) for different workloads (batch/streaming/Lambda).

• Experience with on-premises-to-cloud migrations or IT transformations.

• Experience migrating client workloads into AWS from other clouds (public or private) as well as from traditional on-premises solutions.

• Hands-on experience with an ETL tool such as Informatica or Talend.

• Experience with stream-processing systems such as Storm and Spark Streaming.

• Strong technical experience with AWS cloud services: S3, IAM, EC2, EFS, EBS, EMR, RDS, Redshift, Athena, CloudWatch, SNS, SQS, SES, Glacier, and Kinesis.

• DevOps engineering exposure to building CI/CD pipelines with GitHub, Jenkins, Docker, Kubernetes, and AWS developer tools (CodeCommit, CodeBuild, CodeDeploy, CodeArtifact).

• Experience creating tables, partitioning, bucketing, loading, and aggregating data using Spark SQL/Scala (an illustrative sketch follows the requirements below).

• Experience with relational SQL databases, Snowflake, and NoSQL databases, including Postgres, Cassandra, and Elasticsearch.

• Strong understanding of data modeling, including defining conceptual, logical, and physical data models.

• Proven track record of building deep sales/technical relationships with CXOs and practice-building executives within highly strategic system integrators.

• Experience managing client relationships and building consensus on solutions.

Nice to have:

• Programming experience with object-oriented/functional scripting languages: Python, Spark, Scala, shell scripting.

• Experience working in multi-vendor, cloud-based application environments.

• Understanding of the Agile framework and the ability to work in sprints.

• Functional knowledge of the insurance domain is preferred.

• At least one AWS Cloud certification (Associate or Professional); Databricks Certified Professional Data Engineer is preferred.



Candidates should possess excellent communication skills and the ability to architect and design solutions per requirements.
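For illustration only, the short Spark SQL/Scala sketch below shows the kind of table creation, partitioning, bucketing, and aggregation work referenced in the requirements above. The dataset, S3 path, and table and column names are hypothetical examples, not part of this role's actual environment.

// Illustrative sketch only; the path, table, and column names are hypothetical.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object PartitionedAggregateExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("partitioned-aggregate-example")
      .enableHiveSupport()
      .getOrCreate()

    // Read a raw dataset from a placeholder S3 location.
    val claims = spark.read.parquet("s3://example-bucket/raw/claims/")

    // Aggregate daily claim amounts per policy.
    val daily = claims
      .groupBy(col("policy_id"), col("claim_date"))
      .agg(sum(col("amount")).as("daily_amount"))

    // Persist as a managed table, partitioned by date and bucketed by policy_id.
    daily.write
      .mode("overwrite")
      .partitionBy("claim_date")
      .bucketBy(8, "policy_id")
      .sortBy("policy_id")
      .format("parquet")
      .saveAsTable("daily_claims")

    spark.stop()
  }
}

Note that Spark only supports bucketBy when writing with saveAsTable, which is why the sketch persists a managed table rather than writing files directly.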

Cognizant is an Equal Opportunity Employer M/F/D/V. Cognizant is committed to ensuring that all current and prospective associates are afforded equal opportunities and treatment and a work environment free of harassment.

Cognizant is recognized as a Military Friendly Employer and is a coalition member of the Veteran Jobs Mission. Our Cognizant Veterans Network assists Veterans in building and growing a career at Cognizant that allows them to leverage the leadership, loyalty, integrity, and commitment to excellence instilled in them through participation in military service.
