This job listing has expired and the position may no longer be open for hire.

Azure Architect at Cognizant in Chicago, Illinois

Posted in Information Technology 30+ days ago.

Type: Full-Time

Job Description:

Must have:

·         Strong knowledge of and working experience with SQL and Databricks

·         Strong understanding of the underlying architecture of data components (access control, configurations, performance blockers)

·         Working experience with Azure Data Factory, Azure Data Lake Store, Azure Synapse, and Databricks

·         Hands-on experience with PySpark, the DataFrame API, and the SQL API

·         Very good understanding of Spark and the Hadoop MapReduce framework

·         Hands-on experience with data warehousing

·         Prior experience in performance tuning for big data workloads (Spark or MapReduce)

·         Prior experience handling structured and unstructured data

·         Strong knowledge of release management for Azure components

·         Must appreciate process control and change control

·         Must be able to provide platform solutions and guidance on various cloud services, database technologies, and pipeline structures from the ingestion to the consumption layer

 

Nice To Have:

·         Awareness of data security, DMZ, encryption mechanisms, VPC, etc.

·         Hands-on experience building DevOps pipelines

·         Awareness of DataOps



·         Business Understanding

a.       Understand and define the data vision (strategic data requirements) based on business requirements, and translate business requirements into technology requirements

b.       Provide high-level integrated designs to meet business requirements

·         ELT (Data Processing)

a.       Understand the various data sources and source data structures

b.       Understand data processing requirements – e.g., real time, near real time, batch

c.       Understand read patterns, write patterns, data usage, and dataset sizes to select the right data processing tools

d.       Understand scalability, reliability, maintainability, and recoverability requirements

e.       Define the source-to-target dataflow and ensure data security in the dataflow diagram: data must be secure at rest, in motion, and in use

f.        Evaluate the schemas of the various data sources and select the right target data format (for analytical or transaction processing) to enable vectorized processing

g.       Select the right compute and storage infrastructure to process the data; perform a POC to evaluate tools if necessary

h.       Define frameworks, standards, policies, and best practices for data processing

i.         Collaborate with the various stakeholders to get feedback on data processing
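The selection logic in bullets b–d can be sketched as a small helper. The latency cutoffs and tool suggestions below are assumptions for illustration, not a prescribed standard.

```python
# Hypothetical decision helper mirroring bullets b-d: classify a pipeline's
# processing style from its freshness requirement. The cutoffs and the tool
# suggestions are illustrative assumptions, not a fixed standard.

def select_processing_mode(max_latency_seconds: float) -> str:
    """Map the consumers' freshness requirement to a processing style."""
    if max_latency_seconds < 1:
        # Sub-second freshness: continuous streaming into a low-latency sink.
        return "real-time (e.g. Spark Structured Streaming)"
    if max_latency_seconds < 300:
        # Minutes of tolerance: micro-batch streaming is usually enough.
        return "near-real-time (micro-batch streaming)"
    # Anything slower can run as a scheduled batch pipeline.
    return "batch (scheduled Spark or Azure Data Factory pipeline)"

print(select_processing_mode(0.5))
print(select_processing_mode(60))
print(select_processing_mode(3600))
```

In practice this decision also weighs dataset size, read/write patterns, and recoverability needs (bullets c and d), which a real assessment would fold into the same framework.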

·         Data Lake

a.       Define the folder structure based on the various subject areas and underlying modules

b.       Define a data archival strategy (data lifecycle) based on business requirements

c.       Classify data according to its sensitivity and define access control

d.       Define standards, policies, and best practices for storing and organizing data in the data lake

e.       Get feedback from the various stakeholders/users of the data lake store
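A folder convention and sensitivity-to-access mapping like the ones in bullets a–c can be made concrete with a small sketch; the zone names, group names, and classifications below are invented for the example, not part of the role's actual standards.

```python
# Hypothetical lake-store convention (all names are assumptions):
# zone / subject area / module / load date, plus a sensitivity-driven
# mapping from data classification to the groups allowed to read it.
from datetime import date
from posixpath import join

SENSITIVITY_ACCESS = {
    "public":       ["all-analysts"],
    "internal":     ["data-engineering", "analytics"],
    "confidential": ["data-engineering"],  # restricted: engineering only
}

def lake_path(zone: str, subject_area: str, module: str, load_date: date) -> str:
    """Build a zone/subject/module/date folder path, e.g. for ADLS Gen2."""
    return join(zone, subject_area, module,
                f"year={load_date.year}", f"month={load_date.month:02d}",
                f"day={load_date.day:02d}")

def allowed_groups(sensitivity: str) -> list[str]:
    """Map a data classification to the groups allowed to read it."""
    return SENSITIVITY_ACCESS[sensitivity]

print(lake_path("raw", "sales", "orders", date(2023, 1, 5)))
# raw/sales/orders/year=2023/month=01/day=05
print(allowed_groups("confidential"))
```

Date-partitioned paths of this shape also make the archival strategy in bullet b straightforward, since lifecycle rules can match on the year/month/day prefixes.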

 

·         EDW

a.       Understand how data should be organized and managed

b.       Work with the data modeler to define data models that meet the organization's data vision

c.       Identify EDW solutions that match the scalability, reliability, recoverability, and maintainability needs; perform a POC if necessary to select the right tools

d.       Classify data according to its sensitivity and define access control

e.       Define data mapping specifications and data lineage

f.        Define standards, policies, and best practices

g.       Get continuous feedback from the various stakeholders/users of the EDW solutions to make sure they match user expectations

·         Analytical

a.       Understand the business use cases for analytics; analyze and prioritize use cases based on data availability, schedules, and the current environment

b.       Understand usability, security, and stability requirements to select the right tools

c.       Select technologies and tools for analytics considering current and future needs; perform a POC if required to select the right tools

 


Technical Skills

SNo  Primary Skill                Proficiency Level *  Rqrd./Dsrd.
1    Azure Cloud Native Security  PL1                  Required
2    Azure                        PL1                  Required

 

* Proficiency Legends

Proficiency Level – Generic Reference
PL1 The associate has basic awareness and comprehension of the skill and is in the process of acquiring it through various channels.
PL2 The associate possesses working knowledge of the skill and can actively and independently apply it in engagements and projects.
PL3 The associate has comprehensive, in-depth, and specialized knowledge of the skill and has extensively demonstrated its successful application in engagements or projects.
PL4 The associate can function as a subject matter expert for this skill and is capable of analyzing, evaluating, and synthesizing solutions using it.




