Job Description
We are seeking an experienced Analytics Architect to join our enterprise data consulting practice, helping clients design, implement, and optimize modern analytics platforms that deliver measurable business value. This role focuses on building scalable, governed, and AI-ready architectures that support advanced reporting, machine learning, and operational analytics. You will work closely with client stakeholders, ranging from technical teams to executives, to translate business objectives into robust cloud-native data architectures on Databricks, Snowflake, and/or Microsoft Fabric. You'll guide solution design across data engineering, data modeling, governance/MDM, BI/visualization, and AI/ML integration, ensuring adoption of best practices for performance, cost optimization, and scalability. Exceptional verbal and written communication skills are required, with the ability to clearly articulate complex technical ideas and concepts to non-technical stakeholders.

NOTE: This job can be done remotely. Relocation to central Ohio is not required.

Key Responsibilities

Client Engagement & Strategy
• Partner with business and technical leaders to understand goals, pain points, and desired outcomes.
• Lead architecture workshops, discovery sessions, and proofs of concept to align technology strategy with business priorities.
• Develop enterprise analytics roadmaps that connect ingestion, transformation, modeling, and consumption layers.
• Support pre-sales with technical solutioning, estimates, and value articulation (ROI/TCO).
• Deliver training, documentation, and ongoing support to empower users and enhance organizational data literacy.

Architecture & Design
• Architect Databricks, Snowflake, and/or Microsoft Fabric solutions leveraging medallion architectures, data mesh principles, and EDW/star schema modeling for governed, analytics-ready datasets.
• Define CDC, streaming, and batch ingestion patterns using cloud-native and third-party tools.
• Establish metadata-driven pipeline designs for efficient, scalable data integration.
• Design governance frameworks, data quality rules, and standardized reporting processes.
• Develop data models and schemas (dimensional, 3NF, data vault) that ensure data integrity, consistency, and security.
• Architect semantic and reporting layers for BI tools such as Tableau, Power BI, and Looker.
• Design feature stores and AI-ready datasets to accelerate predictive, prescriptive, and generative AI use cases.
• Collaborate with cloud platform teams to design optimal data storage and retrieval strategies across environments such as AWS S3, Azure Data Lake, and Delta Lake.

Build, Development & Engineering
• Implement and optimize ETL/ELT workflows, ensuring efficient data movement between systems.
• Translate complex data processes (e.g., Alteryx, Tableau workflows) into optimized Databricks workflows using PySpark and SQL.
• Develop reusable automation scripts to streamline workflow migrations and improve operational efficiency.
• Establish CI/CD practices for data assets, testing, and deployment automation.
• Monitor, troubleshoot, and optimize query performance to ensure scalability and efficiency.
• Provide hands-on development and troubleshooting support during implementation.
• Partner with data science teams to operationalize ML models and integrate them into business workflows, supporting feature engineering, model training, and MLOps.

Requirements

Required Qualifications
• 5+ years in data/analytics architecture or technical consulting.
• Proven expertise designing and implementing Databricks, Snowflake, and/or Microsoft Fabric solutions.
• Strong experience with cloud platforms (Azure, AWS, or GCP).
• Deep understanding of data modeling (dimensional, 3NF, data vault) and data lakehouse concepts.
• Proficiency in SQL and at least one programming language (Python preferred).
• Experience with data governance, data quality, and MDM solutions.
• Skilled in ingestion, transformation, and orchestration patterns.
• Strong communication skills with the ability to engage effectively with C-level stakeholders.

Preferred Qualifications
• Experience with business intelligence tools such as Power BI, Tableau, or similar visualization platforms.
• Knowledge of big data processing technologies (Apache Spark, Delta Lake, etc.).
• Experience with AI/ML implementation and integration with data platforms.
• Familiarity with workflow orchestration tools for reliable data pipelines.
• Experience with infrastructure automation using Terraform or similar tools.
• Cloud certifications (e.g., AWS Certified Solutions Architect, Microsoft Azure Solutions Architect Expert).
• Background in multiple industries (e.g., retail, healthcare, financial services).
• Experience mentoring junior architects and engineers.

Benefits

To read more about working at Infoverity, please visit: Employee Benefits - USA - Infoverity