Data Architect

Hagerty

(Traverse City, Michigan)
Full Time · Fully Remote
Job Posting Details
About Hagerty
It's all about passion. We have grown to be the global leader in collector car and boat insurance, but we're still just a family business built on a love for the hobby. Our passion drives us to keep improving our product and to give our clients the best service imaginable.
Summary
Hagerty, the leading provider of classic car insurance, valuation tools, and roadside service for people who love cars, has an opportunity for a Data Architect. This individual will be responsible for creating design patterns, best practices, and standards for data and the integration of data throughout the enterprise. The role is based in Traverse City, MI, with the option to work fully remotely.
Responsibilities
* Develop data architecture standards and a delivery roadmap
* Develop application data architecture strategies for the migration of legacy applications
* Articulate current, transition, and future-state data architectures, focused on clear benefits to both IT and business users
* Align solutions to roadmaps for existing data mart and data management solutions across multiple business lines
* Provide architecture oversight and governance to ensure alignment with standards and vision
* Promote data reference architecture standards across the organization
* Create design patterns, best practices, and standards for data and the integration of data throughout the enterprise
* Create, manage, and maintain data models and data architectures across the Hagerty enterprise
* Evaluate and assess existing legacy data assets with an eye toward modernization and simplification
* Lead application data architecture software evaluations, including RFP development, capabilities assessment, formal scoring models, and delivery of executive presentations supporting a final directional recommendation
* Understand and mitigate architecture risk and compromise
* Ensure solution architectures include data models, standard interfaces for data access and value-added data services, and metadata (including an inventory of data), and follow best practices for data creation, usage, and message exchange
* Assist with reviewing data designs, data models, standard interfaces for data access and value-added data services, and metadata, including an inventory of data
Ideal Candidate
* Expert knowledge and demonstrated ability working in large data architecture programs
* Ability to communicate authentically and effectively, in writing and verbally, with stakeholders at all levels across the business and IT
* An additional 3-5 years in another role within an IT delivery team, such as developer, business/data analyst, quality assurance analyst, ETL developer, or DBA
* Experience with a major cloud service provider (e.g., Amazon Web Services, Google Cloud, or Microsoft Azure)
* 7-10+ years in a similar role, with at least 5 years of solution design experience covering data modeling, message modeling, and integration
* Experience in architecture governance, controls, and peer reviews with regard to data, security, and cloud
* Well versed in the data domains (analytics, BI, big data, operational data store, metadata, master data, unstructured data)
* Ability to design and develop business process management workflows, spanning both orchestration and human-task use cases
* Demonstrated expertise in message exchange patterns (synchronous request-reply, asynchronous fire-and-forget, broadcast/multicast, and publish-subscribe)
* Experienced in SOA, with a deep understanding of web services (XSDs, WSDLs) used within an enterprise service bus
* Demonstrated expertise with transformation techniques, including but not limited to transformation, aggregation, decomposition, abstraction, and canonicalization
* Demonstrated expertise in defining and implementing distributed, enterprise component-based architectures and frameworks to support abstraction
* Hands-on ability to build and manage enterprise data models, leveraging both industry and application-specific service models
* Experience with unstructured or semi-structured data design and implementation
* Hands-on experience with data profiling tools and processes, and expert knowledge of normalized and dimensional modeling techniques
* Experience with both on-premises and cloud integration frameworks (e.g., Oracle ESB, Message Broker, DataPower, Cast Iron, Boomi)
* Certifications in enterprise architecture frameworks (e.g., TOGAF, Zachman, EABOK, EACOE)
* Insurance data model experience (e.g., IAA and ACORD)
* Experience with canonical management tools (e.g., IgniteXML)
* Experience as a traditional data warehouse analyst, ETL developer, or BI/information delivery developer
* Experience with SQL Server
* Experience with data modeling, reverse-engineering, and profiling tools (e.g., ERwin, Embarcadero, Data Explorer, DataFlux)
