Job Description
DCI Donor Services (DCIDS) is looking for a dynamic and enthusiastic team member to join us to save lives! Our mission at DCI Donor Services is to save lives through organ donation, and we want professionals on our team who will embrace this important work! We are currently seeking a BI Data Architect. The BI Data Architect will design, deploy, and maintain a modern Azure Lakehouse architecture leveraging Databricks and the Medallion model. This role is central to integrating data from iTransplant and other third-party applications, improving scalability, performance, and data quality. The architect will lead technical implementations, set architectural standards, and partner with Business Intelligence to enable data-driven decision-making across the organization. This is a remote position; however, candidates must be based in California.
COMPANY OVERVIEW AND MISSION
For over four decades, DCI Donor Services has been a leader in working to end the transplant waiting list. Our unique approach to service allows for nationwide donation, transplantation, and distribution of organs and tissues while maintaining close ties to our local communities.
DCI Donor Services operates three organ procurement/tissue recovery organizations: New Mexico Donor Services, Sierra Donor Services, and Tennessee Donor Services. We also maximize the gift of life through the DCI Donor Services Tissue Bank and Sierra Donor Services Eye Bank.
Our performance is measured by the way we serve donor families and recipients. To be successful in this endeavor is our ultimate mission. By mobilizing the power of people and the potential of technology, we are honored to extend the reach of each donor’s gift and share the importance of the gift of life.
With the help of our employee-led strategy team, we will ensure that all communities feel welcome and safe with us because we are a model for fairness, belonging, and forward thinking.
Key responsibilities of this position include:
- Design, build, and maintain scalable end-to-end data pipelines using modern ETL/ELT and stream-processing tools.
- Architect and manage the Lakehouse environment (Databricks, Azure Data Lake), including Bronze/Silver/Gold Medallion layers.
- Optimize data models for analytical and operational use, enabling self-service analytics through intuitive structures.
- Establish and maintain architectural standards, data governance, and security practices in a regulated environment.
- Implement automated testing, CI/CD, and monitoring frameworks to ensure data quality, reliability, and system performance.
- Collaborate with BI and technical teams to integrate new data sources, prepare technical specifications, and improve visibility of data across the organization.
- Document and maintain architecture, processes, and data standards.
- Perform other related duties as assigned.
The ideal candidate will have:
TECHNICAL SKILLS:
- Strong expertise in SQL and Python
- Experience with Azure Databricks and Lakehouse architecture (Medallion model) with knowledge of warehouse integration patterns
- Proficiency in designing and implementing scalable data pipelines
- Proficiency in dimensional modeling and data design for both warehouse and Lakehouse environments
- Understanding of data governance principles, practices, and data security
- Familiarity with Power BI Service to support integration and enablement of self-service reporting
PHYSICAL TRAITS: Reads, writes, listens and observes. Communicates using both verbal and technological avenues. Walks, stands, lifts, and carries light loads.
QUALIFICATIONS:
Education Required:
Bachelor's degree in Computer Science, Data Science, Engineering, or a related technical field. Master's degree is preferred but not required. Equivalent combination of education and experience may be considered.
Experience:
Minimum of 7 years of professional experience in data engineering, with at least 3 years in a senior or lead role
Proven experience designing and implementing large-scale data pipelines and ELT processes
3+ years hands-on with Azure Databricks (or similar Spark-based platforms), with proven experience implementing Medallion architecture
Experience in applying data governance and security practices in regulated environments
Demonstrated ability to document and maintain data architecture, processes and standards
LICENSES/CERTIFICATION: Certifications in the following areas are preferred but not required:
- Azure Data Engineer Associate
- Databricks Certified Professional Data Engineer
- Databricks Certified Associate Developer for Apache Spark
- Security or data governance certifications (CISA, CIPT, CISSP)
We offer a competitive compensation package including:
- Up to 176 hours of PTO your first year
- Up to 72 hours of Sick Time your first year
- Two Medical Plans (your choice of a PPO or HDHP), Dental, and Vision Coverage
- 403(b) plan with matching contribution
- Company provided term life, AD&D, and long-term disability insurance
- Wellness Program
- Supplemental insurance benefits such as accident coverage and short-term disability
- Discounts on home/auto/renter/pet insurance
- Cell phone discounts through Verizon
- Monthly phone stipend
**New employees must have their first dose of the COVID-19 vaccine by their potential start date or be able to supply proof of vaccination.**
***This position does not offer visa sponsorship or OPT Training Plans.***
You will receive a confirmation e-mail upon successful submission of your application. The next step of the selection process will be to complete a video screening; instructions will be contained in the confirmation e-mail. Please note: you must complete the video screening within 5 days of submitting your application to be considered for the position.
DCIDS is an EOE/AA employer – M/F/Vet/Disability.