- Design, develop, and implement large-scale data solutions on the Snowflake Data Warehouse.
- Translate requirements to database design and reporting design.
- Understand data transformation and translation requirements and the tools needed to automate data pipelines using cloud-based testing, and clearly document implementations.
- Build ETL pipelines to migrate data from Teradata to the Snowflake cloud data warehouse.
- Use Python to perform data loads into a Snowflake database.
- Use Snowpipe for continuous data loading and data synchronization.
- Write complex SnowSQL scripts in the Snowflake cloud data warehouse.
- Define virtual warehouse sizing in Snowflake for different types of workloads.
- Create cloned objects using zero-copy cloning.
- Identify various process metrics in the business.
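The cloning and warehouse-sizing duties above can be sketched in Python as generated SnowSQL statements. This is a minimal, hypothetical illustration: the object names, sizes, and helper functions are assumptions for demonstration, not from any real deployment.

```python
# Hypothetical sketch: building SnowSQL for zero-copy cloning and
# virtual warehouse sizing. All names and sizes are illustrative.

def clone_table_sql(source: str, target: str) -> str:
    """Zero-copy clone: the clone shares the source's storage
    until either side is modified, so it costs nothing up front."""
    return f"CREATE TABLE {target} CLONE {source};"

def create_warehouse_sql(name: str, size: str = "XSMALL",
                         auto_suspend_secs: int = 300) -> str:
    """Size a virtual warehouse for a workload; AUTO_SUSPEND
    limits credit burn while the warehouse sits idle."""
    return (f"CREATE WAREHOUSE {name} "
            f"WAREHOUSE_SIZE = {size} "
            f"AUTO_SUSPEND = {auto_suspend_secs};")

if __name__ == "__main__":
    print(clone_table_sql("sales.orders", "sales.orders_dev"))
    print(create_warehouse_sql("ETL_WH", size="LARGE"))
```

In practice these statements would be executed through a Snowflake session (e.g. the `snowflake-connector-python` package) rather than printed.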
- Must have a master’s degree or foreign equivalent in computer science, information systems, or a related field, plus 8 months of experience in the job offered or in software development.
- Experience must include architecting, designing, and operationalizing large-scale data analytics solutions on the Snowflake cloud data warehouse; defining virtual warehouse sizing for Snowflake; validating data from Oracle to Snowflake; and establishing data pipelines to Snowflake from different sources.
- In lieu of a master’s degree and 1 year of experience, the employer will accept a bachelor’s degree or its foreign equivalent and 5 years of progressive experience in the field.
- Employer will accept any suitable combination of education, training, and/or experience. Job located in Omaha, NE and other unanticipated locations across the U.S. Send resume to Apex Informatics, LLC, 3000 Farnam St, Suite 6 East, Omaha, NE 68131.
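The Oracle-to-Snowflake validation experience listed above can be sketched as a simple row-count reconciliation. This is a hedged illustration: the count dictionaries stand in for queries that would actually be run against each database, and the function name is hypothetical.

```python
# Hypothetical sketch of source-to-target validation (e.g. Oracle ->
# Snowflake): compare per-table row counts pulled from each system.
# The dicts below are stand-ins for real COUNT(*) query results.

def reconcile_counts(source: dict[str, int],
                     target: dict[str, int]) -> list[str]:
    """Return human-readable mismatches between source and target."""
    issues = []
    for table, src_rows in source.items():
        tgt_rows = target.get(table)
        if tgt_rows is None:
            issues.append(f"{table}: missing in target")
        elif tgt_rows != src_rows:
            issues.append(f"{table}: source={src_rows} target={tgt_rows}")
    return issues

if __name__ == "__main__":
    oracle_counts = {"orders": 1000, "customers": 250}      # assumed
    snowflake_counts = {"orders": 1000, "customers": 248}   # assumed
    print(reconcile_counts(oracle_counts, snowflake_counts))
```

A production pipeline would typically extend this with checksums or column-level comparisons, since matching row counts alone do not prove the data migrated intact.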