• Data Developer

    SCA Health - Myrtle Point, OR 97458

    Job #2681260423

    • Remote, United States

    • Surgical Care Affiliates

    • Technology

    • Regular

    • Full-time

    • USD $67,800.00 to $121,200.00 per year

    Job Description

    Overview

    Today, SCA Health has grown to 11,000 teammates who care for 1 million patients each year and support physician specialists holistically in many aspects of patient care. Together, our teammates create value in specialty care by aligning physicians, health plans and health systems around a common goal: delivering on the quadruple aim of high-quality outcomes and a better experience for patients and providers, all at a lower total cost of care.

    As part of Optum, we participate in an integrated care delivery system that enables us to support our partners as they navigate a complex healthcare environment. Only SCA Health has a dynamic group of physician-driven specialty care businesses that allows us to customize solutions, no matter the need or challenge:

    • We connect patients to physicians in new and differentiated ways as part of Optum and with our new Specialty Management Solutions business.
    • We have pioneered a physician-led, multi-site model of practice solutions that restores physician agency by aligning incentives to support growth and transition to value-based care.
    • We lead the industry in value-based payment solutions through our Global 1 bundled payment convener, which provides easy, predictable billing for patients.
    • We help physicians address everything beyond surgical procedures, including anesthesia and ancillary service lines.

    The new SCA Health represents who we are today and where we are going—and the growing career opportunities for YOU.

    Responsibilities

    This position is primarily responsible for:

    • Building and supporting data solutions with a focus on quality, performance, scalability, and resilience.

    • Participating in data solution planning, design, development, testing, deployment, and documentation activities.

    • Participating in operational workflows, including code reviews, change management, and release management.

    • Providing accurate estimates and delivering on commitments in an efficient and effective manner.

    • Providing support and guidance to the business concerning data artifacts, solutions, and analysis.

    • Providing production support as required to support business operations.

    • Assisting in team operational, solution, and process improvement initiatives.

    • Having a passion to learn, grow, and teach others both technically and professionally.

    Qualifications

    Minimum Qualifications

    • Bachelor’s degree in Computer Information Systems or another technology-related field, or equivalent experience.

    • 1+ years of experience using Azure DevOps for data solution management and delivery.

    • 2+ years of experience with Azure Data Factory.

    • 2+ years of experience with relational database platforms.

    • 1+ years of experience with Azure Synapse.

    • 1+ years of experience with Azure Databricks.

    • 1+ years of experience with Python scripting and development (an illustrative sketch follows this list).

    • 2+ years of experience working within the SDLC.
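
    The qualifications above center on Azure data tooling and Python. As a purely illustrative, non-authoritative sketch of that kind of work, the snippet below shows a small PySpark job of the sort typically run on Azure Databricks: reading raw files from a Data Lake path, applying a simple transformation, and writing a Delta table. All storage paths, column names, and rules here are hypothetical placeholders, not details taken from this posting.

    # Minimal PySpark sketch (assumes a Databricks or other Spark environment
    # with Delta Lake available). Paths and columns are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("claims_ingest_sketch").getOrCreate()

    # Hypothetical ADLS Gen2 landing zone holding raw CSV files.
    raw_path = "abfss://raw@examplestorage.dfs.core.windows.net/claims/"

    claims = spark.read.option("header", "true").csv(raw_path)

    # Illustrative transformation: parse the service date and keep only rows
    # that carry a claim identifier.
    cleaned = (
        claims
        .withColumn("service_date", F.to_date(F.col("service_date"), "yyyy-MM-dd"))
        .filter(F.col("claim_id").isNotNull())
    )

    # Persist as a Delta table for downstream use (e.g., Synapse or reporting).
    cleaned.write.format("delta").mode("overwrite").save(
        "abfss://curated@examplestorage.dfs.core.windows.net/claims/"
    )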

    Other Qualifications

    • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.

    • Experience supporting and working with cross-functional teams in a dynamic environment.

    • Experience with big data tools: Hadoop, Spark, Kafka, etc.

    • Design, implement, and maintain data pipelines for data ingestion, processing, and transformation in Azure.

    • Collaborate with data scientists and analysts to understand data needs and build effective data workflows.

    • Create and maintain data storage solutions, including Azure SQL Database, Azure Data Lake, and Azure Blob Storage.

    • Create and maintain ETL (Extract, Transform, Load) processes using Azure Data Factory or comparable technologies.

    • Implement data validation and cleansing procedures to ensure the quality, integrity, and reliability of the data (see the sketch following this list).

    • Improve the scalability, efficiency, and cost-effectiveness of data pipelines.

    • Monitor and resolve data pipeline issues to ensure data consistency and availability.
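
    As referenced in the validation and cleansing item above, the sketch below is a minimal, hypothetical example of basic data cleansing and quality checks in Python with pandas. The file names, column names, and rules are illustrative assumptions only and do not describe SCA Health's actual pipelines.

    # Minimal data cleansing / validation sketch (pandas). All names are
    # hypothetical placeholders.
    import pandas as pd

    SOURCE_FILE = "raw_encounters.csv"          # e.g., staged from Blob Storage
    TARGET_FILE = "curated_encounters.parquet"  # e.g., output to a curated zone


    def cleanse(df: pd.DataFrame) -> pd.DataFrame:
        """Drop duplicates, normalize the date column, and remove rows
        missing required identifiers or a parseable date."""
        df = df.drop_duplicates()
        df = df.assign(
            encounter_date=pd.to_datetime(df["encounter_date"], errors="coerce")
        )
        return df.dropna(subset=["encounter_id", "facility_id", "encounter_date"])


    def validate(df: pd.DataFrame) -> None:
        """Fail fast if simple quality rules are violated."""
        assert df["encounter_id"].is_unique, "duplicate encounter_id values found"
        assert df["encounter_date"].notna().all(), "unparsed encounter dates remain"


    if __name__ == "__main__":
        raw = pd.read_csv(SOURCE_FILE)
        curated = cleanse(raw)
        validate(curated)
        curated.to_parquet(TARGET_FILE, index=False)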

    USD $67,800.00 to $121,200.00 per year.

