Digital technologies, including analytics and AI, give companies a once-in-a-generation opportunity to perform orders of magnitude better than ever before. But clients need new business models, built by analyzing customers and business operations from every angle, to truly understand them.
With the power to apply artificial intelligence and data science to business decisions via enterprise data management solutions, we help leading companies prototype, refine, validate, and scale the most desirable products and delivery models to enterprise scale within weeks.
*You must be legally authorized to work in the United States without the need for employer sponsorship, now or at any time in the future.*
Job Title – Tech Lead
Location – Minneapolis, MN (remote during COVID)
Primary / Secondary Skillset:
Java
Spark
Roles/Responsibilities:
Responsible for designing, developing, modifying, debugging and/or maintaining software applications.
Ensure code is maintainable, scalable and supportable.
Present demos of the software products to stakeholders, using knowledge of the product/solution and technologies.
Investigate issues by reviewing and debugging code, providing fixes and workarounds for defects; review changes for operability to maintain existing software solutions; highlight technical risks and help mitigate them.
Provide support to customer and testing teams.
Collaborate with customers to understand their business requirements, data processes and analyze their systems’ data, in order to design the data migration and transformation processes.
Develop all the ETL process components required to migrate data from the customers’ source databases into the big data platform.
Load large datasets and improve the efficiency of ETL processes.
Perform data cleansing to guarantee high quality of loaded data.
Provide training and knowledge transfer to enable clients to utilize and manage the deployed solutions.
Provide ongoing support for data-related issues.
Create data migration project schedules and track projects against the plan using project schedules and status reports.
Required Qualifications:
Bachelor’s degree in relevant technical field, (Information Technology, Computer Science, Software, Engineering, or related technical discipline), or equivalent combination of training and work experience.
Minimum of 7-10 years' experience with Java.
Experience with one or more ETL tools: Talend, SSIS, Informatica, DataStage, or ODI (Talend experience is an advantage).
Minimum of 7 years' experience with software delivery implementations and working with customers.
Will be responsible for developing new jobs using Spark and Scala.
Excellent communication and presentation skills.
Excellent analytical capabilities.
Experience with Big Data technologies is an advantage.
Ability to work on multiple projects concurrently.
Must be able to work effectively with project managers and team members, providing support as both an independent operator and a team player.
Technology Experience:
Big Data – Hadoop, Hive, Impala, HBase, Apache Solr
ETL – Talend, SSIS, Informatica, DataStage, ODI
SQL – PostgreSQL, MS SQL, Oracle, other
Programming Languages – Java, Python
Technical Skills

SNo  Primary Skill  Proficiency Level *  Rqrd./Dsrd.
1    Spark          PL1                  Required
2    Scala          PL1                  Required
3    Java           PL1                  Required
* Proficiency Legend

PL1 – The associate has basic awareness and comprehension of the skill and is in the process of acquiring it through various channels.
PL2 – The associate possesses working knowledge of the skill and can actively and independently apply it in engagements and projects.
PL3 – The associate has comprehensive, in-depth, and specialized knowledge of the skill and has extensively demonstrated its successful application in engagements or projects.
PL4 – The associate can function as a subject matter expert for this skill and is capable of analyzing, evaluating, and synthesizing solutions using it.