The Senior Data Analyst defines and manages the data structures required to
support the enterprise data warehouse (EDW). The data analyst is a focal
point for understanding data from the corporate perspective and representing
the data so it can be understood uniformly throughout the company. This role
supports multiple projects in building new data sets and data structures as
required by the business. This person will utilize a range of new design
techniques optimized for BI data structures and must be capable of teaching
these techniques to project-level ETL developers. The BI data
analyst must manage the balance of current and future needs in both design
and content. The analyst must also resolve semantic discrepancies in data
definitions that arise among multiple sources and projects. While logical data
modeling is a preferred skill, this role requires advanced skills in data design
and physical implementation of databases.
The analyst must be able to manage data projects end to end, from scoping
through QA.
Essential Job Functions:
1. Collaborate with Business Analysts, Analysts, and Senior Developers to
establish well-defined and flexible data models for the EDW and for projects
2. Develop logical architecture solutions based on conceptual architectures or
designs
3. Provide advice and guidance to team leads regarding technical and
functional design decisions along with industry best practices
4. Curate Guiding Principles and Standards in alignment with the Enterprise
Architecture Team's Principles and Standards
5. Responsible for end-to-end solution design, architecture, and security for the
EDW
6. Ensure that all activities are in compliance with rules, regulations, policies,
and procedures as defined by law or FordDirect policies
7. Responsible for Hadoop platform master data management (MDM)
8. Responsible for defining Data Governance practices and Data Profiling
9. Design and architect multidimensional databases for data warehouse and
analytics processing applications
10. Manage the scope of data needs across projects
Required:
1. Hadoop and its ecosystem components, e.g., file storage, workflows,
relational data stores, and NoSQL databases
2. Strong SQL and RDBMS Design skills
3. Strong grasp of dimensional modeling concepts
4. Agile software development, JIRA, Wiki, Crucible
5. Project management
Good to Have:
1. Google Analytics, Adobe Analytics
2. Scala, Perl, Python
3. NoSQL – HBase
4. Stream processing
Other Responsibilities:
1. Document and maintain project artifacts.
2. Suggest best practices and implementation strategies using Hadoop and
relational databases.
3. Maintain comprehensive knowledge of industry standards, methodologies,
processes, and best practices.
4. Mentor and coach junior team members.
5. Other duties as assigned.
Minimum Qualifications and Job Requirements:
• Must have a Bachelor's degree (Master's preferred) in Computer Science or a
related IT discipline
Experience:
• Must have at least 10 years of IT development experience.
• Must have 10+ years of relevant professional experience in enterprise data
warehousing.
• Must have 3+ years of relevant professional experience working with Hadoop
(HBase, Hive, MapReduce, Sqoop, Flume), Java, JavaScript, SQL, and
NoSQL.
• Must have 5+ years of experience working in an Agile software development
environment.
• Must be willing to flex work hours as needed to support application
launches and manage production outages.
• Strong communication skills
Specific Knowledge, Skills and Abilities:
• Ability to manage multiple projects and responsibilities simultaneously
• Experience working with JIRA and wiki tools
• Must have experience working in a fast-paced dynamic environment.
• Must have strong analytical and problem solving skills.
• Must have excellent verbal and written communication skills
• Must be able and willing to participate as an individual contributor as needed.
• Must have ability to work the time necessary to complete projects and/or
meet deadlines.