We are seeking a Sr Data Consultant/Architect to join our Japan team in Tokyo. As a Sr Data Consultant, you will be accountable for developing data products within a decentralized data architecture. You will apply your data warehousing experience while learning modern, efficient ways of delivering data to business partners. You will be part of a global team working with multiple cutting-edge technologies to develop solutions for Data Mesh.
Responsibilities:
You will develop data products in Snowflake deployed on the Azure cloud platform
You will use DataOps.live to develop, manage, and deploy data vault models, and build transformation workflows with dbt (a minimal sketch follows this list)
You will ensure data product quality characteristics are met through QA
You will be responsible for releasing new versions of data products, including writing release notes, following the change request process, and automating that process where possible
You will work on tasks in Jira, following the agile processes of estimation, sprint planning, and sprint execution
You will document the results of your work in Confluence
You will engage in data governance, data management, business intelligence, big data analytics, and related disciplines
You will analyse data (structured and unstructured) to find actionable insights for improving the business, using tools such as SQL and Python
You will create and maintain BI dashboards (optional, nice to have)
You will educate internal and external stakeholders on the importance of data
You will collaborate effectively with your own team and across teams
You will debug and resolve technical problems
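For context, the dbt responsibility above tends to look like the following minimal sketch of a data vault hub model built for Snowflake. All model, table, column, and source names are hypothetical and purely illustrative:

    -- models/raw_vault/hub_customer.sql (hypothetical names throughout)
    {{ config(materialized='incremental', unique_key='customer_hk') }}

    select
        md5(upper(trim(customer_id))) as customer_hk,   -- surrogate hash key
        customer_id                   as customer_bk,   -- business key
        current_timestamp()           as load_dts,      -- load timestamp
        'crm'                         as record_source  -- originating system
    from {{ source('crm', 'customers') }}
    {% if is_incremental() %}
    -- on incremental runs, load only business keys not already in the hub
    where customer_id not in (select customer_bk from {{ this }})
    {% endif %}

dbt compiles the Jinja pieces ({{ config }}, {{ source }}, is_incremental()) into plain Snowflake SQL, so the same model serves both the initial full load and later incremental runs.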
Requirements:
Bachelor's degree in Computer Science or related field
Hands-on experience with the Snowflake Cloud Data Warehouse, including at least a couple of Snowflake implementations
Experience with Snowflake or similar technologies (building data warehouses with MS SQL Server, Oracle, AWS Redshift, etc.) and knowledge of SQL
Experience and demonstrated proficiency with Snowflake query logging, system logging, and other system management tools (an example query appears at the end of this section)
Experience with DataOps.live and dbt, or similar technologies (SSIS, Informatica, Apache Airflow, etc.)
Knowledge of data modelling methodologies: data vault, data products, and data mesh
Understanding of data warehouse (DWH) systems and of migration from DWH to data lakes/Snowflake
Understanding of data models and of transforming data into those models
Strong understanding of data analytics architecture components, including cloud data warehouses, metadata-driven processes, DevOps, and data as an asset
Good knowledge of ETL concepts (e.g., DataStage), data pipelines, and workflow management
Experience in writing, troubleshooting, and optimizing complex SQL
Experience with Python or Scala programming
Basic knowledge of Git and modern ways of working with code
Advanced SQL knowledge and experience with relational databases, including query authoring and working familiarity with a variety of database technologies
Working knowledge of Unix/shell scripting is a plus
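As a concrete example of the Snowflake system-management skills listed above, here is a hedged sketch of a monitoring query against Snowflake's ACCOUNT_USAGE share (the view and columns are standard Snowflake metadata; the 7-day window and the 20-row limit are arbitrary illustrative choices):

    -- Find the 20 slowest successful queries over the last 7 days.
    -- Assumes the active role can read the SNOWFLAKE.ACCOUNT_USAGE share.
    select
        query_id,
        user_name,
        warehouse_name,
        total_elapsed_time / 1000 as elapsed_seconds,  -- column is in milliseconds
        query_text
    from snowflake.account_usage.query_history
    where start_time >= dateadd('day', -7, current_timestamp())
      and execution_status = 'SUCCESS'
    order by total_elapsed_time desc
    limit 20;

Queries like this are a typical starting point for the SQL optimization and warehouse management work described in this role.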