Big Data Architect

Epsilon

(Irving, Texas)
Full Time
Job Posting Details
About Epsilon
Epsilon is a global leader in creating connections between people and brands. An all-encompassing global marketing company, we harness the power of rich data, groundbreaking technologies, engaging creative and transformative ideas to get the results our clients require. Recognized by Ad Age as the world's #1 CRM/Direct Marketing Network, the #1 U.S. Agency across All Disciplines and the #1 U.S. Mobile Marketing Agency, Epsilon employs over 7,000 associates in 70 offices worldwide.
Responsibilities
* Validate the reference architecture and detail the architecture of platforms, leading and working hands-on on implementation and delivery to production for the Hadoop, Cassandra and Kafka platforms.
* Help lead operational strategy, ensuring rapid delivery while taking responsibility for applying standards, principles, theories and concepts.
* Tune and optimize clusters.
* Define and help enforce data governance and security policies in conjunction with delivery teams when setting up new Hadoop users; this includes setting up Kerberos principals and testing HDFS, Hive and MapReduce access for the new users (see the sketch after this list).
* Screen cluster job performance and plan capacity.
* Team diligently with the infrastructure, network, database, application and business intelligence teams to guarantee high data quality and availability.
* Collaborate with application teams to install operating system and big data platform (Hadoop, Cassandra and Kafka) updates, patches and version upgrades as required.
* Work with platform partners to create trouble tickets and incorporate their fixes into the environment.
* Automate deployment, customization, upgrades and monitoring through DevOps tools.
* Mentor admin and development team members.
* Proactively help resolve difficult technical issues and provide technical knowledge to the team.
* Keep management informed of work activities and schedules.
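As a hedged sketch of the user-onboarding responsibility above (not Epsilon's actual tooling), the Python script below shows how creating a Kerberos principal and smoke-testing HDFS access for a new user might be automated. The realm name, keytab path and test user are illustrative assumptions.

```python
#!/usr/bin/env python3
"""Hypothetical sketch: onboard a new Hadoop user by creating a Kerberos
principal, then smoke-testing HDFS access. The realm, keytab path and
choice of the kadmin.local/kinit/hdfs CLIs are assumptions for
illustration, not a prescribed tool."""
import subprocess

REALM = "EXAMPLE.COM"  # assumed Kerberos realm; substitute the cluster's realm


def run(cmd):
    """Echo a command, run it, and raise on a non-zero exit code."""
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)


def create_principal(user, keytab):
    # Create the principal with a random key and export it to a keytab
    # (requires kadmin access on the KDC host).
    run(["kadmin.local", "-q", f"addprinc -randkey {user}@{REALM}"])
    run(["kadmin.local", "-q", f"ktadd -k {keytab} {user}@{REALM}"])


def smoke_test_hdfs(user, keytab):
    # Authenticate as the new user, then verify the HDFS home directory
    # is readable and writable.
    run(["kinit", "-kt", keytab, f"{user}@{REALM}"])
    home = f"/user/{user}"
    run(["hdfs", "dfs", "-ls", home])
    run(["hdfs", "dfs", "-touchz", f"{home}/_access_check"])
    run(["hdfs", "dfs", "-rm", f"{home}/_access_check"])


if __name__ == "__main__":
    # "jdoe" and the keytab location are placeholder values.
    create_principal("jdoe", "/etc/security/keytabs/jdoe.keytab")
    smoke_test_hdfs("jdoe", "/etc/security/keytabs/jdoe.keytab")
```

The same pattern extends to the Hive and MapReduce checks mentioned above, for example by running a trivial query through `beeline` or submitting a sample `hadoop jar` job while authenticated as the new principal.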
Ideal Candidate
* Computer Science degree
* 5+ years of experience in big data administration/architecture, specifically with Hadoop, Cassandra and Kafka
* Prior experience migrating big data platforms from earlier to the latest versions
* Proven track record of incorporating the latest operational tools into big data platforms
* Certification in Hadoop Operations or Cassandra is desired
* Strong knowledge of configuration management processes using software such as Ansible, Puppet or Chef
* Experience with monitoring tools such as Nagios, Munin or Zenoss
* Knowledge of SCM concepts and tools such as Git or SVN
* Fundamental knowledge of server/computer hardware and software
* Fundamental knowledge of load balancers, firewalls and TCP/IP protocols
* Experience with programming languages such as Python, C, C++, Java, Perl or PHP, as well as UNIX scripting
* Experience with performance tuning (JVM, JMX, connection pooling) using JConsole or similar profiling tools
* Excellent written and verbal communication skills, with the ability to communicate technical issues to technical and nontechnical audiences
