VP-Software Engineering Team Lead

Experience: 1 to 10 Years
Posted: 02 Dec, 2022
Job Location: Hyderabad, Pakistan
Education: Not Mentioned
Salary: Not Mentioned
Industry: Other Business Support Services
Functional Area: Not Mentioned

Job Description

Core Banking provides technology support for applications across all lines of business (Auto/Education, Business Banking, Card Services, Centralized Transactions Operations, Commercial Banking, Consumer Bank, Consumer Internet Group, Credit Risk, Finance, Home Lending, Investment Bank, Treasury & Securities Services, Wealth Management). ESDD is the centralized data hub for Core Banking, providing high-volume, high-performance enterprise data integration (ETL) capabilities. These capabilities include extracting, transforming, and distributing data from and across multiple databases, applications, and platforms, both on-premises and in the cloud.

A data integration developer is responsible for the following key areas:

  • Create ETL data pipeline processes to validate, transform, enrich, and integrate data
  • Adopt cutting-edge technologies such as cloud and containers as part of application evolution and modernization
  • Understand business requirements and collaborate with the architecture team to translate them into technical designs
  • Participate in end-to-end development lifecycle activities of the application, including design, coding, testing, and deployment
  • Produce comprehensive tests for all developed code, and support and participate in system and integrated testing across sub-systems as the need arises
  • Provide technical support for the application on a rotational basis, including meeting service-level and performance requirements and diagnosing and remediating inefficient processes and code
Required Skills:
  • Bachelor's degree or equivalent in Computer Science, Engineering (any discipline), or a related field
  • 12+ years of experience in the design and delivery of ETL solutions using Java (preferred) and Spark (basic)
  • 1+ years of experience with cloud deployment (AWS, Kubernetes), Kafka, CI/CD, and automation
  • Knowledge of application, design-pattern, data, and infrastructure architecture disciplines
  • Experience in UNIX and/or AIX operating environments and UNIX shell scripting is required
  • Ability to understand, modify, and write SQL queries
  • Working experience as an Agile developer and a good understanding of SDLC methodologies and guidelines
  • Knowledge of big data technologies such as Hadoop, Hive, and Spark
Ideal Candidate Attributes:
  • Self-starter with a strong work ethic
  • Exhibits leadership
  • Possesses analytical and problem-solving skills
  • Strong organizational and prioritizing skills
  • Ability to multi-task and handle multiple priorities
  • Embraces team-based approach
  • Proactive, with strong, clear communication skills
  • Strives for continuous improvement
Ideal candidates should possess:
  • Basic concepts of data integration architecture
  • Basic understanding of data modeling concepts
  • A good grasp of the various components of standard ETL data pipeline tools (preferably Ab Initio)
  • Design and development experience with data integration solutions using an ETL tool (preferably Ab Initio)
  • Prior experience in working with data pipelines in a cloud or Kubernetes environment
  • Working knowledge of Oracle, especially relating to PL/SQL
  • Understanding of Big data technologies including but not limited to Spark, Hadoop, Hive, Impala and HBase
  • Hands-on experience with Spark processing, including debugging execution graphs and logs
  • Demonstrable knowledge of UNIX operating systems and shell scripting
  • Working knowledge of scheduling tools such as Control-M and Autosys
  • Strong grasp of SDLC concepts, including development best practices, code migration procedures, and test automation
  • Ability to work in an Agile development environment against challenging project timelines
  • BS/BA degree or equivalent experience
Additionally, an understanding of at least some of these technologies and practices is beneficial:
  • Knowledge of test automation frameworks such as Cucumber
  • Working knowledge of Kafka as a distributed publish-subscribe system
  • Experience with modern big data consumption technologies such as Dremio
  • Working knowledge of Python, Java, and Scala
  • Modern web technologies such as Node.js, AngularJS, React, Redux, or TypeScript
  • Understanding of modern cyber security practices such as Kerberos authentication
JPMorgan Chase & Co., one of the oldest financial institutions, offers innovative financial solutions to millions of consumers, small businesses and many of the world's most prominent corporate, institutional and government clients under the J.P. Morgan and Chase brands. Our history spans over 200 years and today we are a leader in investment banking, consumer and small business banking, commercial banking, financial transaction processing and asset management.

We recognize that our people are our strength and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. In accordance with applicable law, we make reasonable accommodations for applicants' and employees' religious practices and beliefs, as well as any mental health or physical disability needs.
