Want to work in the USA on an E3 visa as a Big Data Developer - Hobart

Wednesday, 12 July 2017

Item details

City: Hobart, Tasmania

Contacts

Contact name Sahitya Reddy
Phone +17349282181

Item description

Job Description:


The Risk IT Group is in the middle of building out the new enterprise risk system. It will have the ability to aggregate risk measures across the firm, in support of regulatory as well as internal risk management requirements. While the initial business focus is counterparty credit risk, the system will also provide measures for all other risk disciplines, including market, liquidity and operational risk.


This developer will use BNYM components to ingest external data into Hadoop, process the data using Impala, Hive, HBase and Spark, and present the results in BI/notebook tools.
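
For illustration only, a minimal PySpark sketch of the ingest-process-present flow described above (this is not BNYM's proprietary tooling; the paths, database and table names, and schema are hypothetical):

    # Sketch: ingest an external feed, land it in HDFS as Parquet,
    # and expose it to Hive/Impala and notebook tools.
    from pyspark.sql import SparkSession

    spark = (SparkSession.builder
             .appName("risk-ingest-sketch")
             .enableHiveSupport()   # read/write tables in the Hive metastore
             .getOrCreate())

    # Ingest: read a raw external feed (here, CSV with a header row).
    trades = spark.read.csv("hdfs:///landing/trades/",
                            header=True, inferSchema=True)

    # Process: persist as a Parquet-backed Hive table (assumes a Hive
    # database named "risk" already exists) so Impala, Hive, and
    # BI/notebook tools such as Hue or Zeppelin can query it.
    trades.write.mode("overwrite").saveAsTable("risk.trades_parquet",
                                               format="parquet")

    # Present: an aggregate a notebook or BI tool could chart.
    spark.sql("""
        SELECT counterparty, SUM(exposure) AS total_exposure
        FROM risk.trades_parquet
        GROUP BY counterparty
    """).show()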


· Have a general understanding of the entire Hadoop ecosystem, Cloudera DataHub 5.9 in particular

· Understand the characteristics of each HDFS file format, especially Parquet

· Have knowledge/experience of Impala/Hive file organization and performance tuning (see the sketch after this list)

· Understand and apply HBase data model design principles

· Master HiveQL on Impala and Hive

· Have knowledge/experience of programming Spark and SparkSQL

· Have some knowledge/experience of Hue, Zeppelin and Kylin


· Consults with internal business groups to provide appropriate application software development services or technical support.

· Analyzes, defines and documents requirements for data, workflow, logical processes, hardware and operating system environment, interfaces with other systems, internal and external checks, controls, and outputs, using BNY Mellon's standard development methodology.

· Works with internal business groups on implementation opportunities, challenges, and requirements of various applications.

· Analyzes information and provides recommendations to address and resolve business issues for a specific business group.

· Contributes to defining timetables and project plans.

· Analyzes and estimates feasibility, cost, time, and compatibility with hardware and other programs.

· Takes the lead in establishing, implementing and monitoring 'best practices' for technical development methodologies and tools.

· Proposes innovative, creative technology solutions.

· Contributes to the achievement of area objectives.
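
To ground the Impala/Hive file-organization and HiveQL points above, here is a hedged sketch of the standard tuning pattern: a Parquet table partitioned on a common query predicate, so the engine prunes partitions instead of scanning the whole dataset. It reuses the hypothetical names from the earlier sketch and issues HiveQL through Spark's Hive support:

    # Sketch: partitioned Parquet layout for partition pruning.
    # Table and column names are hypothetical.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.enableHiveSupport().getOrCreate()

    spark.sql("""
        CREATE TABLE IF NOT EXISTS risk.exposures (
            counterparty STRING,
            exposure     DOUBLE
        )
        PARTITIONED BY (as_of_date STRING)
        STORED AS PARQUET
    """)

    # Load one business date as a static partition.
    spark.sql("""
        INSERT OVERWRITE TABLE risk.exposures
        PARTITION (as_of_date = '2017-07-12')
        SELECT counterparty, exposure FROM risk.trades_parquet
    """)

    # The filter on the partition column confines the scan to one
    # HDFS directory rather than the full table.
    spark.sql("""
        SELECT counterparty, SUM(exposure) AS total
        FROM risk.exposures
        WHERE as_of_date = '2017-07-12'
        GROUP BY counterparty
    """).show()

Because Impala shares the same metastore, it would see the table too once its metadata is refreshed (e.g., REFRESH risk.exposures).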
Qualifications




· Bachelor's degree in computer science, engineering or a related discipline, or equivalent work experience, required.

· 8-10 years of experience in software development required; experience in the securities or financial services industry is a plus.

· 3 to 7 years' experience with the Hadoop ecosystem.

· 3 to 7 years' experience writing HiveQL on Impala and Hive.

· At least 2 years of experience with the ETL tool Pentaho (Kettle).

· Proficiency in SQL required; experience with Oracle PL/SQL is a plus.

· Data warehousing experience preferred.

· Experience developing complex, large-scale systems required.

· Ability to work efficiently with our offshore team.

· Strong communication and interpersonal skills; excellent team player.

· Superb ownership mindset, strong work ethic, and a habit of excellence.