Job Description
Key Responsibilities
- Perform in-depth data analysis to extract insights and support strategic business decisions.
- Develop, test, and maintain data models using dbt (Data Build Tool) to structure data for analysis (a brief illustrative sketch follows this list).
- Write advanced SQL queries to extract, manipulate, and analyze data, primarily in BigQuery.
- Create and manage visualizations in Looker, working with stakeholders to meet reporting needs.
- Utilize product analytics tools like Heap to analyze user interactions and provide product insights.
- Leverage Python for data manipulation, automation, and advanced analytics tasks.
- Implement version control and collaborative workflows using GitHub for data and code management.
- Support data pipeline optimization and reliability in partnership with other data team members.
- Document processes and data models thoroughly, ensuring knowledge transfer and maintainability.
- Conduct code reviews and assist team members with best practices in data modeling and analysis.
- Translate technical insights into business terms for non-technical stakeholders.
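To make the dbt and BigQuery work described above concrete, here is a minimal, illustrative dbt model sketch. The model name, the stg_events staging model, and all column names are hypothetical; the sketch only shows the expected working style (ref()-based dependencies and incremental materialization in BigQuery SQL), not a prescribed implementation.

```sql
-- models/marts/fct_daily_sessions.sql -- hypothetical model and source names
{{ config(materialized='incremental', unique_key='session_date') }}

with events as (
    -- assumes an upstream staging model named stg_events
    select * from {{ ref('stg_events') }}
)

select
    date(event_timestamp)      as session_date,
    count(distinct session_id) as sessions,
    count(distinct user_id)    as users
from events
{% if is_incremental() %}
  -- on incremental runs, only rebuild from the latest loaded date onward
  where date(event_timestamp) >= (select max(session_date) from {{ this }})
{% endif %}
group by 1
```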
Required Qualifications
- Education: Bachelor’s degree in Computer Science, Data Science, Information Systems, or a related field.
- Experience: At least 4 years of relevant work experience in a data analyst or analytics engineer role.
- BI Tools: Experience with data analysis and visualization in BI tools (Looker preferred).
- Data Transformation: Hands-on experience with dbt for data transformation and modeling.
- Advanced SQL Proficiency:
  - Expertise in advanced window functions (e.g., LAG, LEAD, NTH_VALUE) and complex data transformations (illustrated in the first sketch at the end of this section).
  - Capable of writing dynamic SQL queries and handling nested and repeated fields.
  - Skilled in designing and implementing materialized views, partitioning, and clustering strategies to optimize query performance in cloud databases (see the second sketch at the end of this section).
  - Proficient in reading query execution plans and using profiling tools to fine-tune performance.
  - Experience with recursive queries, query automation, and integrating SQL with ETL tools (e.g., dbt).
  - Strong understanding of data modeling techniques for building scalable and efficient schemas (e.g., Star Schema, Snowflake Schema).
- Version Control: Familiarity with GitHub for version control and collaboration.
- Product Analytics: Knowledge of tools like Heap for user behavior analysis.
- Programming: Experience with Python for data manipulation and automation.
- Skills: Strong analytical, problem-solving, and data visualization skills.
- Communication: Excellent communication skills with the ability to translate technical findings into business insights for non-technical audiences.
- Teamwork: Ability to work both independently and collaboratively within a team environment.
- Detail-Oriented: High attention to detail and commitment to producing reliable, high-quality work.
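The sketches below illustrate the kind of SQL covered by the Advanced SQL Proficiency items. All project, dataset, table, and column names are hypothetical, and the queries are written in BigQuery Standard SQL.

The first sketch combines window functions (LAG, LEAD) with a correlated UNNEST over a nested, repeated items field:

```sql
-- Hypothetical orders table: one row per order, with a repeated STRUCT column "items".
select
    user_id,
    order_id,
    order_ts,
    lag(order_ts)  over (partition by user_id order by order_ts) as prev_order_ts,
    lead(order_ts) over (partition by user_id order by order_ts) as next_order_ts,
    timestamp_diff(
        order_ts,
        lag(order_ts) over (partition by user_id order by order_ts),
        DAY
    ) as days_since_prev_order,
    -- correlated subquery over the nested, repeated field
    (select sum(i.quantity * i.unit_price) from unnest(items) as i) as order_value
from `project.dataset.orders`;
```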
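The second sketch shows the partitioning, clustering, and materialized-view strategies mentioned above in BigQuery DDL, again with hypothetical names:

```sql
-- Partition the raw events table by day and cluster it by the most common filter columns.
create table `project.dataset.events`
(
  event_ts   timestamp,
  user_id    string,
  event_name string,
  properties json
)
partition by date(event_ts)
cluster by user_id, event_name;

-- Pre-aggregate daily counts so frequent dashboards read the materialized view instead.
create materialized view `project.dataset.daily_event_counts` as
select
  date(event_ts) as event_date,
  event_name,
  count(*) as events
from `project.dataset.events`
group by event_date, event_name;
```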