
Senior Principal Data Software Engineer

WEX, Inc.
United States, California, San Francisco
Nov 27, 2024

This is a remote position; however, the candidate must reside within 30 miles of one of the following locations: Portland, ME; Boston, MA; Chicago, IL; or the San Francisco Bay Area, CA.

About the Team/Role

We are looking for a highly motivated, high-potential Senior Principal Data Software Engineer to join our Data team, where you will make a significant business impact and grow your career.

This is an exciting time to be part of the Data team at WEX. WEX offers sophisticated business solutions that empower diverse customers. The data generated from these systems, applications, and platforms is rich and complex. As one of the most valuable assets of WEX, this data holds immense potential to drive value for our customers and the business.

The Data team's mission is to build big data technologies, platforms, systems, and tools that clean, process, enrich, and optimize core company data, making it easy and efficient to use. This enables both our customers and internal teams to unlock business value. We also create value-added data products for WEX customers. Leveraging modern big data and AI technologies, we employ agile development practices, a combined engineering approach, and the product operating model to drive innovation and efficiency.

We provide challenging problems that have significant business impact, offering you opportunities to learn and grow. Our team consists of highly skilled engineers and leaders who will support, guide, and coach you throughout your journey.

If you're driven to become a strong engineer capable of solving complex problems, delivering impactful solutions, and growing quickly, this is the ideal opportunity for you!

How you'll make an impact

  • Collaborate with partners and stakeholders to understand customers' business challenges and key requirements.

  • Design, test, code, and instrument complex data products, systems, platforms, and pipelines. Ensure high-quality, scalable, reliable, secure, cost-effective, and user-friendly solutions.

  • Utilize data to drive decisions by effectively measuring and analyzing outcomes.

  • Develop and maintain CI/CD automation using tools like GitHub Actions.

  • Implement Infrastructure as Code (IaC) using tools like Terraform, including provisioning and managing cloud-based data infrastructure.

  • Apply software development methodologies such as TDD, BDD, and Microservice/Event-Oriented Architectures to ensure efficiency, reliability, quality, and scalability.

  • Support live data products, systems, and platforms through proactive monitoring, high data quality, rapid incident response, and continuous improvement.

  • Analyze data, systems, and processes independently to identify bottlenecks and opportunities for improvement. Lead complex problem diagnostics and drive timely resolutions.

  • Mentor peers and foster continuous learning of new technologies within the team and the broader organization, consistently upholding high technical standards.

  • Attract top industry talent; contribute to interviews and provide timely, high-quality feedback.

  • Serve as a role model by adhering to team processes and best practices, ensuring your solutions effectively solve customer and business problems in a reliable and sustainable way.

  • Collaborate with or lead peers in completing complex tasks, ensuring timely and effective execution.

  • Lead a Scrum team with hands-on involvement, ensuring high-quality and timely development and delivery aligned with agile best practices.

  • Own large, complex systems, platforms, and products, driving future developments and ensuring they deliver measurable business value.

  • Lead and actively participate in technical discussions, ensuring the team stays at the forefront of industry advancements.

  • Design and build high-performance, reliable systems with attention to detail and craftsmanship.

  • Complete large, complex tasks independently, seeking feedback from senior peers to maintain high quality.

  • Proactively identify and communicate project dependencies.

  • Review peer work, providing constructive feedback to promote continuous improvement.

  • Build scalable, secure, and high-quality big data platforms and tools to support data transfer, ingestion, processing, serving, delivery, and data governance needs.

  • Design and build efficient systems, platforms, pipelines, and tools for the entire data lifecycle, including ingestion, cleaning, processing, enrichment, optimization, and serving, leveraging the data platform. Develop systems for high-quality, user-friendly data delivery for internal and external use.

  • Develop data quality measurement and monitoring techniques, metadata management, data catalogs, and Master Data Management (MDM) systems.

  • Use data modeling techniques to design and implement efficient, easy-to-use data models and structures.

  • Become a deep subject matter expert in your functional area, applying best practices.

  • Apply creative problem-solving techniques to assess unique circumstances and suggest or implement solutions.

  • Leverage data and AI technologies to enhance productivity and solution quality, influencing peers to adopt these practices.

  • Lead team initiatives by applying your extensive experience and technical expertise to drive decisions on methods and approaches to complex issues.

  • Hold yourself and your team accountable for delivering high-quality results aligned with defined OKRs (Objectives and Key Results).

  • Provide strategic advice to senior leadership on highly complex situations, leading teams through initiatives that achieve excellent results.

  • Offer thought leadership on business initiatives by applying deep technical and industry expertise.

Experience you'll bring

  • Bachelor's degree in Computer Science, Software Engineering, or a related field, OR demonstrable equivalent deep understanding, experience, and capability. A Master's or PhD degree in Computer Science (or related field) is a plus.

  • 10+ years of experience in large-scale software engineering.

  • A technically deep, innovative, empathetic, and passionate leader capable of delivering on business needs.

  • Strong problem-solving skills, with excellent communication and collaboration abilities.

  • Highly self-motivated and eager to learn, consistently adopting new technologies to improve productivity and the quality of deliverables. Proficient in leveraging GenAI technologies to enhance work productivity and build innovative products/systems for customers.

  • Extensive experience in architecture design, creating simple, high-quality, performant, and efficient solutions for large, complex problems.

  • Deep expertise in CI/CD automation.

  • Rich experience in combined engineering practices and Agile development, with a track record of leading teams to adopt these methods effectively.

  • Extensive experience and strong implementation skills in programming languages such as Java, C#, Golang, and Python, including coding, automated testing, measurement, and monitoring, ensuring high productivity.

  • Expertise in data processing techniques, including data pipeline/platform development, SQL, and database management.

  • Extensive experience in data ingestion, cleaning, processing, enrichment, storage, and serving, using techniques and tools such as ELT, SQL, relational algebra, and relational databases.

  • Experience with cloud technologies, including AWS and Azure.

  • Strong understanding of data warehousing and dimensional modeling techniques.

  • Passionate about solving customer and business problems through innovative solutions.

  • Understanding of data governance principles.

Preferred Qualifications:

  • Proven expertise in designing and implementing scalable, reliable, and cost-effective data architectures, including data lakes, lakehouses, and data warehouses, to support analytics, real-time processing, and AI/ML applications.

  • Extensive experience building and optimizing high-throughput data ingestion frameworks for diverse data types (structured and unstructured) using tools like Kafka, Spark, AWS Glue, and Apache NiFi, with strong ETL/ELT proficiency.

  • Hands-on experience with AWS, Azure, or GCP managed services for data storage, compute, and orchestration, along with Infrastructure as Code (IaC) for scalable provisioning.

  • Expertise in efficient data modeling and schema design for analytical and transactional data, focusing on optimal data retrieval and storage practices.

  • Deep knowledge of event-driven and streaming architectures to enable real-time data processing and responsive data products.

The base pay range represents the anticipated low and high end of the pay range for this position. Actual pay rates will vary and will be based on various factors, such as your qualifications, skills, competencies, and proficiency for the role. Base pay is one component of WEX's total compensation package. Most sales positions are eligible for commission under the terms of an applicable plan. Non-sales roles are typically eligible for a quarterly or annual bonus based on their role and applicable plan. WEX's comprehensive and market-competitive benefits are designed to support your personal and professional well-being. Benefits include health, dental, and vision insurance, a retirement savings plan, paid time off, a health savings account, flexible spending accounts, life insurance, disability insurance, tuition reimbursement, and more. For more information, check out the "About Us" section.

Pay Range: $156,000.00 - $208,000.00