Hadoop Solutions Architect-Principal Developer

BNY Mellon

(Palo Alto, California)
Full Time
Job Posting Details
About BNY Mellon
BNY Mellon is an investments company. We provide investment management, investment services and wealth management that help institutions and individuals succeed in markets all over the world.
Summary
BNY Mellon’s Innovation Center Silicon Valley is in search of a seasoned Hadoop System Admin/Architect for our Palo Alto office. The architect will have a huge impact on the future of computing in financial services by helping BNY Mellon streamline analytics and business insights using Big Data and by pioneering new data science techniques. You will work closely and directly with business partners, developers, and other development teams to understand line-of-business requirements and design efficient, effective solutions on Hadoop. You will be responsible for planning and operationalizing Hadoop clusters and for mentoring a team of engineers who support that work.
Responsibilities
* Provide infrastructure recommendations and capacity planning
* Develop utilities to monitor clusters more effectively
* Manage large clusters with huge volumes of data
* Help architect and develop big data solutions (streaming and batch) using Hadoop technologies
* Evangelize Hadoop and help other groups take advantage of the Hadoop ecosystem
Ideal Candidate
* BS or MS in Computer Science or equivalent, along with hands-on experience with large data sets and distributed computing in data warehousing and business intelligence systems using Hadoop
* Proficiency in Java and in writing software for distributed systems
* Experience writing software with Hadoop distributions: Hortonworks
* Experience developing code for large clusters with huge volumes of data, both streaming and batch
* Strong knowledge of Linux
* Experience designing and implementing security for Hadoop clusters: Kerberos, Ranger, Knox
* Conceptual/working knowledge of basic data management concepts such as ETL, data quality, and RDBMS
* At least 3 years of experience with, and a strong understanding of, big data technologies in the Hadoop ecosystem: Hive, HDFS, MapReduce, YARN, Kafka, Pig, Oozie, HBase, Sqoop, Spark, etc.
Compensation and Working Conditions
Benefits: included
