ETL BigData Developer
Company Description
At Kelly Services, we work with the best. Our clients include 99 of the Fortune 100™ companies, and more than 70,000 hiring managers rely on Kelly annually to access the best talent to drive their business forward. If you only make one career connection today, connect with Kelly.
Job Description
Key Info:
1. Candidates must have at least 5 years of professional experience
2. Talend is a MUST
3. Big Data Technologies such as Hadoop, Hive, HBase, Pig, Spark, Sqoop are a MUST
4. DataStage and Ab Initio experience is preferred
Primary Job Duties & Responsibilities:
- Responsible for data ingestion, data quality, metadata management, ETL/ELT/ETLT development, production code management, and testing/deployment strategy in Big Data development (Talend/Hadoop)
- Act as a lead in identifying and troubleshooting processing issues that impact the timely availability of data in the Big Data platform or the delivery of critical reporting within established SLAs. Provide mentoring to the production support team
- Identify and recommend technical improvements to production application solutions and operational processes in support of the Big Data platform and information delivery assets
- Focus on the overall stability and availability of the Big Data platform and its associated interfaces and transport protocols
- Research, manage, and coordinate resolution of complex issues through root cause analysis as appropriate
- Establish and maintain productive relationships and effective communication with technical leads of the key operational source systems providing data to the Big Data platform, as well as infrastructure support groups
- Ensure adherence to established problem/incident management, change management, and other internal IT processes
- Responsible for communication related to day-to-day issues and problem resolutions; ensure timely and accurate escalation of issues
Education, Work Experience & Knowledge:
- Bachelor's degree or equivalent work experience preferred; minimum of 5 years of work experience in a related field. Typically has 7 or more years of experience.
- High level of sophistication using Talend for data integration, data mapping, data quality, and metadata management.
- Experience with ETL tools for Big Data and Big Data technologies (Talend, HBase, Hive, Sqoop, MapR, Pig, Spark, Splunk).
- Programming experience or working knowledge of the Ab Initio and DataStage tools.
- Core Java programming; Oozie, MapR, and/or YARN knowledge is a plus.
- Experience with Agile methodologies.
- Experience working with onshore/offshore teams.
- Strong communication and collaboration skills.
Additional Information
Why is this a great opportunity? The answer is simple: working at our client is more than a job; it's a career. The opportunities are diverse. Whether you are right at the start of your career or looking for new challenges, this is the job for you, so be quick and apply now!