Responsibilities:
- Design the overall data architecture, ensuring that Snowflake's features (e.g., data sharing, scalability, secure data exchange) are fully utilized to meet business requirements.
- Create a blueprint for how data will be stored, processed, and accessed within the Snowflake platform.
- Optimize data pipelines and workflows for performance, scalability, and cost-efficiency.
- Design ETL (Extract, Transform, Load) or ELT (Extract, Load, Transform) processes, and optimize queries and data storage strategies.
- Integrate with other cloud services (e.g., AWS, Azure, GCP), third-party tools, and on-premises data systems.
- Design and implement strategies to control access to sensitive data, applying encryption, role-based access control, and data masking as necessary.
- Work closely with data engineers, data scientists, business analysts, and other stakeholders to understand their requirements and ensure the Snowflake environment meets those needs.
- Monitor the performance of the Snowflake environment, identify bottlenecks, and ensure optimal query performance.
- Automate administrative tasks using Snowflake SQL and scripting languages such as Python or shell scripting.
- Perform data loading using bulk loading (COPY INTO), Snowpipe for near-real-time ingestion, and external tables (see the illustrative sketch after this list).
- Use Snowflake cloning capabilities for databases and schemas.
- Configure and manage Snowflake virtual warehouses, including scaling, resizing, and auto-suspend/resume settings.
- Implement roles and privileges for secure access using Snowflake RBAC (Role-Based Access Control).
- Integrate Snowflake SSO (Single Sign-On) and SCIM (System for Cross-domain Identity Management) for secure access and identity management.
- Configure alerts and monitor data pipeline failures, resource spikes, and cost thresholds.
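Many of these responsibilities map directly to Snowflake SQL. The block below is a minimal, illustrative sketch only: the database, schema, stage, warehouse, role, and user names are hypothetical, and a production setup would add file formats, error handling, and environment-specific sizing.

```sql
-- Bulk load from a stage into a raw table (COPY INTO); object names are hypothetical
COPY INTO analytics.raw.claims
FROM @analytics.raw.claims_stage
FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
ON_ERROR = 'CONTINUE';

-- Continuous, near-real-time ingestion with Snowpipe
CREATE OR REPLACE PIPE analytics.raw.claims_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO analytics.raw.claims
  FROM @analytics.raw.claims_stage
  FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);

-- Zero-copy clone of a database for development or testing
CREATE DATABASE analytics_dev CLONE analytics;

-- Virtual warehouse with auto-suspend/resume settings
CREATE OR REPLACE WAREHOUSE reporting_wh
  WAREHOUSE_SIZE = 'MEDIUM'
  AUTO_SUSPEND = 300   -- seconds of inactivity before the warehouse suspends
  AUTO_RESUME = TRUE;

-- RBAC: a read-only analyst role with scoped privileges
CREATE ROLE IF NOT EXISTS analyst_role;
GRANT USAGE ON DATABASE analytics TO ROLE analyst_role;
GRANT USAGE ON SCHEMA analytics.marts TO ROLE analyst_role;
GRANT SELECT ON ALL TABLES IN SCHEMA analytics.marts TO ROLE analyst_role;
GRANT USAGE ON WAREHOUSE reporting_wh TO ROLE analyst_role;
GRANT ROLE analyst_role TO USER jane_doe;
```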
EXPERIENCE - Required Qualifications & Skills:
- 8 years: Data modeling, data integration, data warehousing, data governance, and data security.
- 8 years: Oracle and/or PostgreSQL in high-availability (HA) deployments, with expertise in data storage.
- 8 years: Proficiency in Snowflake architecture and its components.
- 8 years: Snowflake objects such as databases, procedures, tasks, and streams.
- 8 years: Using Snowflake's cloning capabilities for databases and schemas.
- 8 years: Managing Snowflake warehouses and optimizing performance for efficient query execution.
- 8 years: Proficiency in Snowflake RBAC (Role-Based Access Control), including implementation of roles and privileges.
- 8 years: Integrating Snowflake SSO (Single Sign-On) and SCIM (System for Cross-domain Identity Management) for secure access and identity management.
- 8 years: Working with data integration tools such as Informatica and ADF for seamless ETL/ELT processes.
- 8 years: Automating administrative tasks using Snowflake SQL and scripting languages such as Python or shell scripting.
- 8 years: Monitoring and troubleshooting Snowflake environments, including usage tracking and query profiling (see the usage-tracking sketch at the end of this posting).
- 8 years: Strong understanding of Snowflake's security features such as data masking, encryption, and network policies.
- 8 years: Technical writing and diagramming skills, including proficiency with modeling and mapping tools (e.g., Visio, Erwin), the Microsoft Office Suite (Word, Excel, and PowerPoint), and MS Project.
- 8 years: Experience on an agile sprint team.
- 8 years: Experience with JIRA software.
- 8 years: Experience working with multiple teams concurrently, with the ability to prioritize and complete work on time with high quality.
- 8 years: Knowledge of Informatica 10.5.
- 8 years: Developing reports in Cognos Analytics 11.1.
- 5 years: Familiarity with CI/CD pipelines and version control for managing Snowflake code deployments.
- 5 years: Prior experience in the healthcare industry.
- 5 years: Prior experience with an HHS agency.
- 5 years: Prior experience working with PII or PHI data.
- 5 years: Prior experience working with HL7 data.
- 5 years: Prior experience with Azure.
- 4 years: Bachelor's degree in Computer Science, Information Systems, or Business, or equivalent experience.

Contract Details:
- The primary work location will be 1609 Centre Creek, Austin, Texas 78754.
- This position is remote, but candidates are required to already reside in the State of Texas.
- Normal business hours are Monday through Friday from 8:00 AM to 5:00 PM, excluding State holidays when the agency is closed.
- Services are expected to start 01/20/2025 and to complete by 08/31/2025.
- Total estimated hours per candidate shall not exceed 1,550 hours.
- This service may be amended, renewed, and/or extended provided both parties agree to do so in writing.
- S#: 529501270

Job Types: Full-time, Contract
Pay: $48.56 - $53.92 per hour
Schedule: Monday to Friday

Application Question(s):
- Do you already reside in Austin, Texas? If not, do you reside in the State of Texas?

Experience:
- Snowflake (data sharing, scalability): 8 years (Required)
- ELT (Extract, Load, Transform): 8 years (Required)
- Cloud services such as AWS, Azure, GCP: 8 years (Required)
- Oracle, PostgreSQL, and SQL: 8 years (Required)
- Snowflake RBAC (Role-Based Access Control): 8 years (Required)
- Snowflake SSO (Single Sign-On) and SCIM: 8 years (Required)
- Informatica and ADF for seamless ETL/ELT processes: 8 years (Required)
- Scripting languages such as Python or shell scripting: 8 years (Required)
- Mapping tools such as Visio and Erwin: 1 year (Required)
- Agile sprint team and JIRA software: 1 year (Required)
- Developing reports in Cognos Analytics 11.1: 8 years (Required)
- Healthcare industry: 5 years (Required)
- HHS agency: 5 years (Required)

Work Location: Hybrid remote in Austin, TX 78754
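As a companion to the monitoring, usage-tracking, and cost-threshold items above, here is a minimal sketch of queries against Snowflake's ACCOUNT_USAGE views and a resource monitor definition; the warehouse name, credit quota, and look-back windows are hypothetical.

```sql
-- Longest-running queries over the past 7 days (a starting point for query profiling)
SELECT query_id, user_name, warehouse_name,
       total_elapsed_time / 1000 AS elapsed_seconds,
       bytes_scanned
FROM SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY
WHERE start_time >= DATEADD(day, -7, CURRENT_TIMESTAMP())
ORDER BY total_elapsed_time DESC
LIMIT 20;

-- Daily credit consumption per warehouse (cost-threshold tracking)
SELECT warehouse_name,
       DATE_TRUNC('day', start_time) AS usage_day,
       SUM(credits_used) AS credits
FROM SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY
WHERE start_time >= DATEADD(day, -30, CURRENT_TIMESTAMP())
GROUP BY warehouse_name, usage_day
ORDER BY usage_day DESC, credits DESC;

-- Resource monitor that notifies at 80% and suspends the warehouse at 100% of a monthly credit quota
CREATE OR REPLACE RESOURCE MONITOR reporting_rm
  WITH CREDIT_QUOTA = 100
  FREQUENCY = MONTHLY
  START_TIMESTAMP = IMMEDIATELY
  TRIGGERS ON 80 PERCENT DO NOTIFY
           ON 100 PERCENT DO SUSPEND;

ALTER WAREHOUSE reporting_wh SET RESOURCE_MONITOR = reporting_rm;
```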
