Senior Data Architect - Dallas, United States - Trinity Industries

    Description

    Trinity Industries is searching for a Senior Data Architect at our Dallas, TX headquarters.

    Trinity Industries is at the forefront of expanding data infrastructure and analytics capabilities. As Senior Data Architect, you will be instrumental in architecting, implementing, and optimizing our Databricks-based data processing and analytics platform for our customer-facing Trinity Logistics Platform (TLP). This role requires a collaborative mindset: working with cross-functional teams, understanding business requirements, and ensuring the seamless integration of Databricks within our technology stack.

    Join our team today and be a part of Delivering Goods for the Good of All

    What you'll do:

    • Develop and oversee a comprehensive data architecture, aligning with business goals and integrating technologies such as Azure, Databricks, and Palantir to craft a forward-looking data management and analytics landscape
    • Lead the design of enterprise-grade data platforms addressing needs across Data Engineering, Data Science, and Data Analysis, capitalizing on the capabilities of Azure Databricks
    • Architect, develop, and document scalable data architecture patterns, ETL frameworks, and governance policies, adhering to Databricks best practices to support future and unknown use cases with minimal rework
    • Define cloud data standards and DevOps and Continuous Integration / Continuous Delivery (CI/CD) processes, and contribute to the growth of the corporate metadata repository
    • Offer hands-on technical guidance and leadership across teams, driving the development of KPIs for effective platform cost management and the creation of repeatable data patterns for data integrity and governance
    • Direct the strategic implementation of Databricks-based solutions, aligning them with business objectives and data governance standards while optimizing performance and efficiency
    • Promote a culture of teamwork, leading evaluations of design, code, data assets, and security features, and working with key technology partners such as Databricks and Microsoft to follow best practices
    • Create and deliver training materials, such as data flow diagrams, conceptual diagrams, UML diagrams, and ER diagrams, to explain data model meaning and usage clearly to a diverse audience of technical and non-technical users
    • Communicate complex technical concepts effectively to both technical and non-technical stakeholders, ensuring clear understanding and alignment across the organization
    • Implement robust audit and monitoring solutions, design effective security controls, and collaborate closely with operations teams to ensure data platform stability and reliability

    What you'll need:

    • Bachelor's or Master's degree in Computer Science, Information Technology, or a related field
    • 8+ years of experience in technical roles with expertise in Software/Data Engineering, Development Tools, and Data Applications Engineering
    • Proficiency in SQL, Python, Scala, or Java. Experience with big data technologies (e.g., Spark, Hadoop, Kafka), MPP databases, and cloud infrastructure
    • Strong background in data modeling, ETL/ELT workloads, and enterprise data architecture on platforms like Azure Databricks
    • Experience with data governance tools, BI tools (Tableau, Power BI), version control systems, and CI/CD tools
    • Relevant certifications in Databricks, cloud technologies (AWS or Azure), or related fields are a plus

    EOE