Sr Manager, Big Data Engineering

PayPal

(San Jose, California)
Full Time
Job Posting Details
About PayPal
At PayPal, we’re laser-focused on creating better ways for consumers and merchants to pay and get paid. Putting customers’ needs first, coupled with a “challenger” mindset, we’re redefining the payments category through product, marketing, and service delivery innovation. PayPal is not for everyone.
Summary
PayPal’s Data Finance Transformation (DFIT) team creates and manages global risk forecasting models and Finance data systems. The DFIT team partners with Global Risk Management, Global Payments, and Global Finance to analyze business performance, design and develop insights, assess the financial outcomes of business investments, and set the organization’s financial goals. This role is responsible for delivering detailed data diagnostics, algorithms for merging disparate data sets, scalable and reliable data systems, ETL design, and end-to-end ownership of the Finance data system.
Responsibilities
* Provide technical leadership and contribute to the definition, development, integration, testing, documentation, and support across multiple platforms (Unix, Perl, Hadoop, Teradata utilities)
* Analyze, assimilate, and integrate multi-technology data systems by building a unified back-end platform and a web-enabled front end
* Develop back-end and front-end architecture and software to deliver web-based solutions
* Design and develop “Self Serve” analytics infrastructure; a key component of this task is to build and publish value-added data sets that various stakeholders will use to monitor the tactical performance of Risk initiatives (a minimal illustrative sketch follows this list)
* Instill best practices for software development and documentation, ensure designs meet requirements, and deliver high-quality work on tight schedules
* Be involved in all phases of software development, from review of the functional specification through assisting with test plans and the final QA cycle; actively participate in monitoring and troubleshooting of production platform issues
* Perform day-to-day tasks that keep the technology platform stable and available to users; work closely with cross-functional teams to enhance systems, troubleshoot data issues, etc.
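As a rough, purely illustrative sketch of the “build and publish value-added data sets” work above, the snippet below merges two hypothetical flat-file extracts on a shared key and writes a unified output. The file names, columns, and join key are assumptions for illustration only, not details from the posting; in practice this step would more likely run against Teradata or Hadoop via the utilities listed under Ideal Candidate (FastExport, BTEQ, FastLoad, MLoad) rather than local CSV files.

    import csv

    # Hypothetical inputs: two disparate extracts keyed by a shared transaction id.
    # (File names, columns, and the join key are illustrative assumptions only.)
    PAYMENTS_FILE = "payments_extract.csv"   # e.g. txn_id, merchant_id, amount_usd
    RISK_FILE = "risk_scores_extract.csv"    # e.g. txn_id, risk_score
    OUTPUT_FILE = "unified_risk_payments.csv"

    def load_by_key(path, key):
        """Read a CSV extract into a dict keyed by the join column."""
        with open(path, newline="") as f:
            return {row[key]: row for row in csv.DictReader(f)}

    def main():
        payments = load_by_key(PAYMENTS_FILE, "txn_id")
        risk = load_by_key(RISK_FILE, "txn_id")

        # Merge the two sources on txn_id; keep only rows present in both extracts.
        merged = [
            {**pay_row, **risk[txn_id]}
            for txn_id, pay_row in payments.items()
            if txn_id in risk
        ]

        # Publish the unified, value-added data set for downstream consumers.
        if merged:
            with open(OUTPUT_FILE, "w", newline="") as f:
                writer = csv.DictWriter(f, fieldnames=list(merged[0].keys()))
                writer.writeheader()
                writer.writerows(merged)

    if __name__ == "__main__":
        main()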
Ideal Candidate
* Bachelor’s degree in Computer Engineering or equivalent is desired
* 12+ years of post-college work experience as a developer and architect in an Engineering or Data Mining organization
* 10+ years of experience with full life-cycle development on Data Warehousing and Data Integration projects using Teradata and Big Data; Teradata SQL masters certification is required
* 6+ years of Linux/Unix/Perl experience, including scripting and version control, with working knowledge of Teradata utilities: FastExport, BTEQ, FastLoad, and MLoad
* 5+ years of Python development experience is required
* 5+ years of Hadoop experience using HTML, Java/JavaScript, and CSS
* Expertise in data research/analysis with a focus on data quality and consistency is required
