Associate Director of Data Operations
The Opportunity:
Design, build, and automate ETL/ELT workflows to ingest, transform, and integrate data from multiple sources (sales, marketing, clinical, etc.) into our cloud data platform.
Ensure pipelines are scalable and efficient, and minimize downtime through automation and monitoring.
Manage and optimize cloud-based data infrastructure (AWS and Azure) for data storage and processing.
Oversee the provisioning of resources, scheduling of jobs, and infrastructure-as-code deployments to ensure high availability and performance of data systems.
Implement data governance best practices, including data quality checks, validation processes, and metadata management.
Maintain data privacy and compliance with industry regulations (e.g., HIPAA, GDPR), ensuring that sensitive data is handled securely and ethically.
Develop continuous integration/continuous deployment (CI/CD) pipelines for data workflows and analytics applications.
Use modern DevOps tools and containerization (Docker, Kubernetes) to deploy updates to data pipelines, databases, and analytics tools rapidly and reliably.
Work closely with data scientists, BI analysts, and business stakeholders to understand data needs and translate them into technical solutions.
Ensure data is accessible and well-structured for analytics, machine learning models, and business intelligence dashboards.
Set up monitoring, alerting, and logging for data pipelines and databases to proactively identify issues and improve system reliability.
Troubleshoot and resolve data pipeline failures, data discrepancies, or performance bottlenecks in a timely manner to minimize impact on business operations.
Create and maintain clear documentation for data pipelines, infrastructure configurations, and processes.
Champion DataOps best practices across teams, mentoring junior engineers and guiding developers in efficient data engineering and operational excellence.
Qualifications:
Excellent written and verbal communication skills, able to clearly explain complex data pipelines and infrastructure concepts to both technical colleagues and non-technical stakeholders.
Strong team player who partners well with cross-functional teams on requirements and solutions, open to giving and receiving constructive feedback and sharing knowledge.
Analytical mindset with a solution-oriented approach, capable of troubleshooting issues across the tech stack (data, code, infrastructure) and driving problems to resolution.
Comfortable working in ambiguous environments: able to define operating models, processes, roles, and responsibilities while simultaneously executing and building capabilities and platforms.
Self-motivated and accountable, with a high sense of ownership over deliverables.
Strong experience with cloud platforms such as AWS or Azure (e.g., S3/ADLS, Lambda/Functions, EC2/VMs, Glue/Data Factory, etc.).
Ability to architect and manage data warehouses or lakehouse solutions on the cloud (Databricks preferred).
Proficiency in SQL for data querying and manipulation, as well as programming in Python (Pandas, PySpark, or similar data frameworks) for building pipeline logic and automation.
Experience with containerization and orchestration tools (Docker and Kubernetes) to deploy data services and ensure reproducible environments.
Knowledge of workflow orchestration and data integration platforms (Airflow, Airbyte, Fivetran, or similar) for scheduling and managing complex data workflows and integrations.
Hands-on experience implementing CI/CD pipelines using tools like Jenkins, GitLab CI/CD, GitHub Actions, or Azure DevOps.
Expertise in using infrastructure-as-code (Terraform, CloudFormation) and configuration management (Ansible, Helm) to automate deployments and environment management.
Experience with big data processing frameworks (Spark) or streaming platforms (Kafka, Kinesis).
Demonstrated ability to implement monitoring/logging (CloudWatch, Datadog, Splunk, or ELK stack) for data systems.
Familiarity with version control (Git) and collaborative development workflows.
Experience supporting data science and AI/ML workflows (e.g., provisioning data for machine learning models) or knowledge of MLOps principles.
Solid understanding of relational and NoSQL databases (e.g., PostgreSQL, SQL Server, MongoDB) and data modeling concepts.
Bachelor’s degree (or equivalent experience) in Computer Science, Data Engineering, Information Systems, or a related field.
5+ years of hands-on experience in Data Engineering, DevOps, or DataOps roles, with a track record of designing scalable data pipelines and infrastructure.
Experience in pharma/life sciences.
Experience with oncology data or with commercial/medical affairs pharma data at the time of launch.
Understanding of industry-specific data sources, terminology, and compliance requirements is a strong plus.
Familiarity with regulations and standards such as HIPAA, GDPR, and GxP as they pertain to data handling and software validation in pharma.
Ability to optimize data pipelines for analytics tools like R, SAS, or visualization platforms (Tableau, Power BI).
Relevant certifications, such as AWS Certified Data Analytics or Azure Data Engineer, that validate your expertise.
Experience leading data engineering projects or initiatives. Ability to coordinate work among team members, manage project timelines, and engage with stakeholders to gather requirements and report progress.