Lead Data Engineer

Wallaby

(Pasadena, California)
Full Time
About Wallaby
Wallaby Financial helps consumers get the most out of their credit cards by optimizing usage based on individual preferences. Through our products, you can optimize your credit card usage for rewards, credit limit, statement due date, linked bonuses, and other factors. Wallaby is available for your smartphone, your web browser, and your wearable devices.
Summary
Wallaby Financial is seeking an experienced big data engineer to act as a lead for the team. We help people get the most out of their money through smart credit card recommendations, and we are looking for someone to enhance that experience by creating new reporting systems, data models, and analytics tools across large data sets. Our rapid growth requires an experienced engineer to help us continue to deliver award-winning financial services technology solutions at scale. You'll be a key team member developing solutions that utilize the growing scale of financial data we manage in our platform. We expect you to be capable of acting as a full project owner, collaborating with our platform, mobile, and product teams as well as with business intelligence stakeholders across our partners.
Responsibilities
* Own and drive complex technical projects from the planning stage through execution
* Work as a hands-on data modeler and ad-hoc data analyst when needed
* Work on data access APIs and server-to-server batch transmissions
* Produce regular automated reports on complex and disparate data sets
* Work with DBAs and DevOps to ensure proper scaling and monitoring of various data stores
Ideal Candidate
* At least 4 years' experience building complex back-end data systems, with at least two years focused on modern open-source NoSQL technologies (Redis and Hadoop a plus)
* At least 4 years' experience with Java or a similar object-oriented language
* Extensive use of open-source software
* Experience with schema modeling and large-scale database migrations
* Experience with APIs and caching paradigms for CRUD queries at scale, abstracting a service layer from its underlying DAO/DB (Spring, Hibernate, Memcache a plus)
* Experience with large-scale log file aggregation and analysis for trends, errors, and data insights
* Experience with data pipelines from raw logs through continuous processing to visual reports
* Experience with virtual cloud automation (AWS, Chef, Docker, etc.)
* Experience collaborating with operations teams to ship large-scale data solutions
* Solid understanding of DevOps philosophy
* Solid understanding of the data and operational complexities around consistency, job monitoring, failover, and archive retention
* Roots in technology, working on development teams
* Experience working within an agile environment
* Strong written and verbal communication skills
Compensation and Working Conditions
Benefits included

Additional Notes on Compensation

Flexible PTO. Full medical, dental, vision, disability, and life insurance. Bonuses. 401(k) with employer contributions of 3%, fully vested immediately. Fully stocked kitchen (with snacks and drinks), foosball, and more.

