Senior Data Engineer (AWS, FiveTran, dbt, Snowflake)

BOK Financial
United States, Texas, Richardson
Apr 29, 2026

Req ID: 78001

Location: Richardson - RCHRD

Areas of Interest: Data

Pay Transparency Salary Range: Not Available

Application Deadline: 06/30/2026

BOK Financial Corporation Group includes BOKF, NA; BOK Financial Securities, Inc. and BOK Financial Private Wealth, Inc. BOKF, NA operates TransFund and Cavanal Hill Investment Management, Inc. BOKF, NA operates banking divisions: Bank of Albuquerque; Bank of Oklahoma; Bank of Texas and BOK Financial.

Bonus Type: Discretionary
Summary

If you're looking for a position that blends a passion for technological innovation, opportunities for career progression, and a collaborative work environment, you've come to the right place. We have an enticing role ready for a tech-savvy individual like you! At BOK Financial, we're fostering a workplace where extraordinary talent can display their skills, aim for the highest standards, and contribute to top-tier projects.

Job Description

The Data Solutions Engineer IV works in close collaboration with others to design, develop, implement, and support various types of data solutions across all lines of business at BOK Financial. The position requires close interaction, influence, and collaboration with other engineers, architects, analytics partners, vendor partners, and functional leaders. The Data Solutions Engineer IV defines enterprise-wide best practices, methodologies, governance, and standards and serves as a data solutions subject matter expert. This position mentors and coaches lower-level Data Solutions Engineers and leads meetings as needed.

Team Culture

BOK Financial is a place where your passion for technological innovation is valued and career development is encouraged. The company fosters an environment where unique talents can thrive, achieve high standards, and contribute to prestigious projects. It's an ideal platform to advance your IT career within a vibrant and supportive culture.

How You'll Spend Your Time
  • Lead the design, implementation, and governance of enterprise ETL / ELT pipelines, leveraging Fivetran for managed ingestion and CDC, Snowflake as the central cloud data platform, and dbt for transformation, modeling, and analytics engineering.
  • Architect and review end-to-end data flows from source systems to curated, consumption-ready datasets, including:
        • Source-aligned raw ingestion
        • Conformed and reusable transformation layers
        • Business-ready data products optimized for analytics and reporting
  • Establish and enforce dbt engineering standards, including:
        • Development of modular dbt models (staging, intermediate, mart layers)
        • Creation and reuse of dbt macros for standardization, automation, and consistency
        • Implementation of dbt tests, documentation, and lineage to ensure data quality and transparency
        • Version-controlled dbt projects aligned to enterprise release processes
  • Design and optimize Snowflake data architecture, including:
        • Warehouse sizing and workload isolation strategies
        • Partitioning, clustering, and performance tuning
        • Secure data sharing and access control models
        • Cost optimization through usage patterns and resource governance
  • Define and drive Python, PySpark, and SQL standards for data engineering workloads, supporting advanced transformations, large-scale processing, and streaming or hybrid use cases where appropriate.
  • Implement Infrastructure as Code (IaC) using Terraform to provision and manage Snowflake objects, cloud infrastructure, and supporting platform components in a repeatable and auditable manner.
  • Design and integrate CI/CD pipelines for data engineering assets, including:
        • Automated deployment of dbt models, macros, and tests
        • Environment promotion (dev → test → prod)
        • Code quality checks, linting, and automated validation
        • Controlled and traceable releases aligned with enterprise SDLC practices
  • Oversee and validate Fivetran connector configurations, schema evolution handling, and ingestion SLAs to ensure reliability and trust in source-to-target pipelines.
  • Evaluate, approve, and govern open-source and vendor data engineering tools (Fivetran, dbt, Snowflake, Kafka ecosystem, AWS services) with a focus on scalability, security, maintainability, and cost efficiency.
  • Lead proofs of concept and technical evaluations for new data engineering technologies and patterns, ensuring alignment with Snowflake-centric, SQL-first, and automation-driven architecture principles.
  • Establish best practices for pipeline observability, data quality, and operational monitoring, ensuring pipelines are robust, traceable, and productionready.
  • Partner with platform, security, and compliance teams to ensure data pipelines, infrastructure, and deployments meet enterprise security, access control, and regulatory requirements without compromising developer productivity.
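The dbt layering and testing standards described above can be sketched with a minimal, hypothetical example (the source, table, and column names below are illustrative, not BOKF's actual models). A staging model renames and casts raw columns one source table at a time:

```sql
-- models/staging/stg_accounts.sql (hypothetical staging model)
-- Staging layer: standardize names and types from the raw source;
-- intermediate and mart models would build on top of this.
select
    account_id::varchar    as account_id,
    opened_at::timestamp   as opened_at,
    balance::number(18, 2) as balance
from {{ source('core_banking', 'accounts') }}
```

A companion schema file attaches tests and documentation so quality checks and lineage ship with the model:

```yaml
# models/staging/stg_accounts.yml (hypothetical schema file)
version: 2
models:
  - name: stg_accounts
    description: "One row per account, standardized from the raw source."
    columns:
      - name: account_id
        tests:
          - unique
          - not_null
```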
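Provisioning Snowflake objects with Terraform, as the IaC bullet describes, might look like the following sketch. It assumes the community Snowflake provider; the warehouse and database names are hypothetical:

```terraform
# Hypothetical sketch using the community Snowflake Terraform provider.
terraform {
  required_providers {
    snowflake = {
      source = "Snowflake-Labs/snowflake"
    }
  }
}

# A right-sized warehouse with auto-suspend as a cost-governance measure.
resource "snowflake_warehouse" "transform_wh" {
  name           = "TRANSFORM_WH"
  warehouse_size = "XSMALL"
  auto_suspend   = 60 # seconds of inactivity before suspending
  auto_resume    = true
}

resource "snowflake_database" "analytics" {
  name = "ANALYTICS"
}
```

Because the configuration is version-controlled, warehouse sizing and access changes become reviewable, repeatable, and auditable rather than ad hoc console edits.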
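The Python/PySpark standards bullet is about shared, tested conventions rather than any specific function; as one illustrative sketch (the naming rule and helper names here are hypothetical), a team might centralize column-name normalization so every pipeline emits the same snake_case schema:

```python
import re


def standardize_column_name(name: str) -> str:
    """Normalize a raw column name to snake_case.

    Illustrative only: shared helpers like this are one way to enforce
    consistent naming across pipelines; the rule itself is hypothetical.
    """
    # Insert an underscore at camelCase boundaries, e.g. "openedAt" -> "opened_At".
    name = re.sub(r"(?<=[a-z0-9])(?=[A-Z])", "_", name)
    # Collapse any run of non-alphanumeric characters into a single underscore.
    name = re.sub(r"[^0-9a-zA-Z]+", "_", name)
    return name.strip("_").lower()


def standardize_columns(columns: list[str]) -> list[str]:
    """Apply the naming rule to a whole header row."""
    return [standardize_column_name(c) for c in columns]
```

Plain Python like this stays unit-testable without a Spark cluster; the same function can then be registered as a UDF or applied to DataFrame schemas in PySpark jobs.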
Education & Experience Requirements

This level of knowledge is normally acquired through completion of a Bachelor's Degree in a data-centric field (Computer Science, Economics, Information Systems, Data Analytics, etc.) and 7+ years of experience with a demonstrated track record of successful technical leadership in the execution of large-scale data projects, or an equivalent combination of education and experience.

  • Proven experience successfully building, automating, and supporting solutions in a large-scale production data ecosystem, on-premises or in Azure/AWS cloud services.
  • Proven experience in successfully building highly complex and scalable data pipelines.
  • Strong understanding of BOKF business information systems and strong Enterprise Data Warehouse concepts.
  • Strong hands-on experience independently designing, developing, and testing business intelligence & analytics solutions using proven or emerging technologies in a variety of environments.
  • Excellent oral and written communication skills to effectively represent self and BOKF, as well as ability to present complex information and issues in a clear and concise manner.
  • Expert conceptual thinking and analytical skills with the ability to analyze complex problems that include interrelationships and dependencies in order to identify common themes and solutions.
  • Ability to understand a goal and build out a work plan to accomplish it, adjusting to accommodate other priorities as needed.
  • Confidently represent the Enterprise Data Solutions team in various meetings where business needs and use cases are presented. Provide guidance and next steps, with little oversight, that align with established best practices, patterns, and architectural principles.
  • Strong sense of accountability, taking ownership of projects and responsibilities, and resolving issues proactively.

BOK Financial Corporation Group is a stable and financially strong organization that provides excellent training and development to support building the long-term careers of employees. With passion, skill, and partnership you can make an impact on the success of the bank, customers, and your own career!
Apply today and take the first step towards your next career opportunity!


The companies in BOK Financial Corporation Group are equal opportunity employers. We are committed to providing equal employment opportunities for training, compensation, transfer, promotion and other aspects of employment for all qualified applicants and employees without regard to sex, race, color, religion, national origin, age, disability, pregnancy status, sexual orientation, genetic information or veteran status.

Please contact recruiting_coordinators@bokf.com with any questions.
