Job Description
Experience: 5+ years
Description:
The Data Business Intelligence (BI) Architect role is a hybrid of data architecture, engineering, and business strategy, bridging the gap between technical data solutions and business objectives. The architect designs, develops, and maintains the overall data strategy, ensuring that the County data in scope is accessible, reliable, and secure for analysis and decision-making. The right candidate has experience architecting data solutions that support descriptive, diagnostic, predictive, and prescriptive analytics.
KEY RESPONSIBILITIES:
Stakeholder Collaboration: Work closely with business and IT stakeholders to gather requirements and translate business needs into technical specifications, including identification of data sources.
Data Architecture Design & Data Modeling: Architect and implement scalable, secure, and efficient data solutions, including data warehouses, data lakes, and/or data marts. Design conceptual, logical, and physical data models.
Tool and Platform Selection: Evaluate, recommend, and implement tools aligned with the recommended architecture, including visualization tools aligned with business needs.
ETL/ELT Pipeline Management: Design, develop, and test data pipelines, integrations with source systems, and ETL/ELT processes that move data from various sources into the data warehouse.
Data Catalog & Metadata Management: Design, create, and maintain an enterprise-wide data catalog, automating metadata ingestion, establishing data dictionaries, and ensuring that all data assets are properly documented and tagged.
Data Governance and Discovery: Enforce data governance policies through the data catalog, ensuring data quality, security, and compliance. Enable self-service data discovery by curating and organizing data assets in an intuitive way.
Performance Optimization: Monitor and optimize BI systems and data pipelines to ensure high performance, reliability, and cost-effectiveness.
Technical Leadership: Provide technical guidance and mentorship across the organization, establishing best practices for data management and BI development.
Environment:
The role includes defining the data platform technology stack. Example technologies are listed below; candidates are not required to have experience in all of them.
Data Platforms: Data warehouse and data lake concepts, including dimensional modeling, and cloud services (S3, AWS Redshift, RDS, Azure Data Lake Storage, Synapse Analytics, BigQuery, Databricks, Snowflake, Informatica)
Databases: SQL and relational/non-relational databases (SQL Server, Oracle, PostgreSQL, MongoDB)
BI Tools: Power BI, Business Objects, Tableau, Crystal, Looker
ETL/ELT: Cloud-native services (AWS Glue, Azure Data Factory, Google Cloud Dataflow) and in-warehouse transformation tools (Fivetran, Talend, dbt)
Big Data Technologies: Hadoop, Spark, Kafka
Programming/API: Python, Keras, Scikit-learn, R, XML
ML/DL/Analytic Engines: TensorFlow, PyTorch, Trillium, Apache Spark
Modeling Tools: MS Visio, ER/Studio, PowerDesigner
Source systems include on-premises, cloud, and SaaS systems.
Must be able to be on-site at the OCIT Building a MINIMUM of two days per week.
Please DO NOT re-submit candidates previously considered for Requisition #462.