
Adds support for on-device LLMs with SpeziLLMLocal #141

Workflow file for this run

#
# This source file is part of the Stanford HealthGPT project
#
# SPDX-FileCopyrightText: 2023 Stanford University & Project Contributors (see CONTRIBUTORS.md)
#
# SPDX-License-Identifier: MIT
#
name: Pull Request
on:
  pull_request:
  workflow_dispatch:
  workflow_call:

jobs:
  buildandtest:
    name: Build and Test Application
    uses: StanfordBDHG/.github/.github/workflows/xcodebuild-or-fastlane.yml@v2
    with:
      artifactname: HealthGPT.xcresult
      fastlanelane: test
      runsonlabels: '["macOS", "self-hosted"]'
  reuse_action:
    name: REUSE Compliance Check
    uses: StanfordBDHG/.github/.github/workflows/reuse.yml@v2
  swiftlint:
    name: SwiftLint
    uses: StanfordBDHG/.github/.github/workflows/swiftlint.yml@v2
  uploadcoveragereport:
    name: Upload Coverage Report
    needs: buildandtest
    uses: StanfordBDHG/.github/.github/workflows/create-and-upload-coverage-report.yml@v2
    with:
      coveragereports: HealthGPT.xcresult
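
Each job above invokes a reusable workflow from the StanfordBDHG/.github repository via `uses:`, passing parameters through `with:`. As a rough sketch only (the real xcodebuild-or-fastlane.yml in StanfordBDHG/.github is not shown here, so the input declarations, descriptions, and defaults below are assumptions), the called workflow would declare matching inputs under `workflow_call` roughly like this:

# Hypothetical sketch of the called reusable workflow's interface;
# the actual StanfordBDHG/.github workflow may differ.
name: Build and Test (Reusable)

on:
  workflow_call:
    inputs:
      artifactname:
        description: 'Name of the .xcresult artifact to upload'
        type: string
        required: false
      fastlanelane:
        description: 'Fastlane lane to run'
        type: string
        default: 'test'
      runsonlabels:
        description: 'JSON array of runner labels'
        type: string
        default: '["macOS", "self-hosted"]'

jobs:
  build:
    # Resolve the JSON label list passed by the caller into runner labels.
    runs-on: ${{ fromJSON(inputs.runsonlabels) }}
    steps:
      - uses: actions/checkout@v4
      - name: Run Fastlane lane
        run: fastlane ${{ inputs.fastlanelane }}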