Digital technologies, including analytics and AI, give companies a once-in-a-generation opportunity to perform orders of magnitude better than ever before. But capturing that opportunity requires new business models built on analyzing customers and business operations from every angle.
With the power to apply artificial intelligence and data science to business decisions via enterprise data management solutions, we help leading companies prototype, refine, validate, and scale the most desirable products and delivery models within weeks.
*You must be legally authorized to work in the United States without the need for employer sponsorship, now or at any time in the future.*
Job Title – Kafka Developer
Location – Louisville, KY (Remote until COVID)
Roles/Responsibilities:
Overall 6+ years of experience, with a minimum of 4 years of experience in Kafka programming.
Advanced working knowledge of Kafka and experience working with a variety of streaming sources and databases.
Experience building and optimizing ‘big data’ data pipelines, architectures, and data sets using Kafka. Experience in Core Java/OOP.
Experience with Confluent Kafka components (Connect, Schema Registry, KSQL, Control Center).
Build processes supporting data transformation, data structures, metadata, dependency and workload management.
A successful history of manipulating, processing and extracting value from large disconnected datasets. Working knowledge of message queuing, stream processing and highly scalable ‘big data’ data stores.
Strong project management and organizational skills.
Experience supporting and working with cross-functional teams in a dynamic environment.
Lead the team and, wherever required, chip in and code alongside them.
Able to independently design and estimate for a given requirement.
Able to independently debug issues and provide support to the production support team as and when required.
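The responsibilities above center on message queuing and stream processing. As a loose, JDK-only sketch of the producer/consumer decoupling involved (this is not the Kafka client API; the class and method names here are invented for illustration, and a real pipeline would use kafka-clients):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Toy, JDK-only illustration of the decoupled producer/consumer pattern
// that underpins Kafka-style messaging. NOT the Kafka client API.
public class ToyPipeline {
    public static List<String> runPipeline() throws InterruptedException {
        // The queue stands in for a topic: producer and consumer never
        // reference each other directly, only the shared buffer.
        BlockingQueue<String> topic = new ArrayBlockingQueue<>(16);

        Thread producer = new Thread(() -> {
            for (int i = 1; i <= 3; i++) {
                try {
                    topic.put("event-" + i); // publish without knowing the consumer
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            }
        });
        producer.start();
        producer.join();

        // Consumer drains the buffered records in publish order.
        List<String> consumed = new ArrayList<>();
        topic.drainTo(consumed);
        return consumed;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(runPipeline()); // [event-1, event-2, event-3]
    }
}
```

The point of the sketch is only the decoupling: the producer publishes records to a shared buffer and the consumer reads them later, which is the pattern Kafka scales out across brokers, partitions, and consumer groups.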
Required Qualifications:
Bachelor’s Degree in Engineering or a relevant field
6+ years of success in consultative/complex deployment projects, architecture, design, implementation, and/or support of data and analytics solutions
Hands-on development: design and develop applications and data flows using Kafka.
At least 4 years of strong hands-on experience in implementation of Kafka (Apache/Confluent).
At least 4 years of development experience in building Kafka distributed messaging and integration ecosystem.
Experience in optimization and tuning based on performance metrics.
Experience with big data technologies (e.g., Hadoop) will be an added advantage.
Good hands-on experience with streaming in Kafka.
Experience with Git and Bitbucket
At least 4 years of experience in software development life cycle.
At least 4 years of experience in Project life cycle activities on development and maintenance projects.
Set up best practices, standards, and development patterns, and automate the onboarding process.
Willing to work in application/production support
Experience in the banking domain will be an added advantage.
Strong communication and analytical skills.
Ability to work in a team in a diverse, multi-stakeholder environment.
Experience in and desire to work in a global delivery environment.
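One of the qualifications above is optimization and tuning based on performance metrics. As a minimal sketch, the snippet below collects a few standard Kafka producer configuration keys (acks, enable.idempotence, linger.ms, batch.size, compression.type) into a java.util.Properties object; the values are illustrative starting points only, not recommendations for any particular workload, and the class and method names are invented for illustration:

```java
import java.util.Properties;

// Sketch of common Kafka producer throughput/latency knobs.
// The keys are standard Kafka producer config names; the values are
// illustrative starting points to be tuned against real metrics.
public class ProducerTuning {
    public static Properties tunedProducerConfig() {
        Properties props = new Properties();
        props.setProperty("acks", "all");                // wait for all in-sync replicas
        props.setProperty("enable.idempotence", "true"); // avoid duplicates on retry
        props.setProperty("linger.ms", "20");            // brief wait so batches fill up
        props.setProperty("batch.size", "65536");        // 64 KiB batches
        props.setProperty("compression.type", "lz4");    // cheap compression for throughput
        return props;
    }

    public static void main(String[] args) {
        System.out.println(tunedProducerConfig().getProperty("compression.type")); // lz4
    }
}
```

In practice these knobs trade latency against throughput and durability (for example, a larger linger.ms raises batching efficiency but delays individual sends), which is why tuning is driven by observed performance metrics rather than fixed values.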
Technical Skills
SNo   Primary Skill   Proficiency Level *   Rqrd./Dsrd.
1     Streamsets      PL1                   Required
2     Kafka           PL1                   Required
* Proficiency Legends
Proficiency Level – Generic Reference
PL1 – The associate has basic awareness and comprehension of the skill and is in the process of acquiring this skill through various channels.
PL2 – The associate possesses working knowledge of the skill, and can actively and independently apply this skill in engagements and projects.
PL3 – The associate has comprehensive, in-depth and specialized knowledge of the skill. He/she has extensively demonstrated successful application of the skill in engagements or projects.
PL4 – The associate can function as a subject matter expert for this skill. The associate is capable of analyzing, evaluating and synthesizing solutions using the skill.