Compare commits


108 Commits

Author SHA1 Message Date
suprstarrd
5d0ac4bf53 feat: add Ampersand to Trusted Sources
Signed-off-by: suprstarrd <business@suprstarrd.com>
2026-03-19 11:37:32 -04:00
suprstarrd
8624a8e919 feat: add Mona to Trusted Sources (#1210)
* feat: add Mona to Trusted Sources

Signed-off-by: suprstarrd <business@suprstarrd.com>
2026-03-17 12:28:56 -04:00
ny
6e9e0aee0a fix: revert 26.4 fix partially to fix everywhere 2026-02-28 18:03:35 -05:00
ny
a505d04215 fix: 26.4 patch, add correct dest
- update minimuxer
2026-02-27 22:21:41 -05:00
mahee96
95953ca0e9 IfManager: added String representation for readability 2026-02-28 06:45:12 +05:30
mahee96
3b9b45a06f pbxproj: was missing references to IfManager.swift 2026-02-28 04:41:33 +05:30
mahee96
ac277aa6eb minimuxer: added @nythepegasus's patch as-is 2026-02-28 04:24:03 +05:30
mahee96
7926452661 minimuxer: added @nythepegasus's patch as-is 2026-02-28 04:23:58 +05:30
mahee96
dfb01c2ae5 minimuxer: updated submodule to latest 2026-02-28 03:52:35 +05:30
mahee96
bff192be4e ALTSign cleanup 2026-02-25 08:16:42 +05:30
mahee96
a863b2f39a ALTSign cleanup 2026-02-25 08:15:43 +05:30
mahee96
bf72a0edfc CI: add what xcode is suggesting (xcode noise from auto create schemes) 2026-02-25 08:15:31 +05:30
mahee96
06ad6488cc CI: changelog in md was incorrect in release notes 2026-02-25 08:08:46 +05:30
mahee96
12f84b2365 CI: fixes for author 2026-02-25 07:46:54 +05:30
mahee96
118f64de8a CI: multiple fixes 2026-02-25 07:33:55 +05:30
mahee96
555bb3d985 CI: remove harcoded repo name and use incoming 2026-02-25 03:30:57 +05:30
mahee96
25925aceef CI: create tag if required 2026-02-25 03:18:31 +05:30
mahee96
5444fdd9bb CI: handle zipping irrespective of encryption is possible or not 2026-02-25 03:14:50 +05:30
mahee96
f91e0a6295 fix compilation issue after recent merge 2026-02-25 03:00:34 +05:30
mahee96
4e4f0f6a3f Merge branch 'cell-staging' into develop 2026-02-25 02:50:13 +05:30
mahee96
9706a43bc1 ci - decouple source_metadata.json generation from deploy 2026-02-25 02:41:58 +05:30
mahee96
d748a89b47 ci - do not encrypt build logs if password unavailable (forks CI friendly) 2026-02-25 01:45:23 +05:30
mahee96
f412f6df23 Merge branch 'cell-shortcut' into cell-staging
# Conflicts:
#	AltStore/Operations/SendAppOperation.swift
2026-02-25 01:20:56 +05:30
mahee96
b8354d3d0e cleanup - xcode brought these new entries for schemes 2026-02-25 01:18:09 +05:30
mahee96
191ce15d55 Merge branch 'staging' into develop 2026-02-25 00:50:28 +05:30
mahee96
7f8f4fa9a7 ci: fixes for yml 2026-02-25 00:42:02 +05:30
mahee96
ad54cbadc1 ci: fixes for yml 2026-02-25 00:40:09 +05:30
mahee96
eb251b89c9 ci: fixes for yml 2026-02-25 00:30:07 +05:30
mahee96
9efea00d09 ci: use runner number as build number 2026-02-25 00:25:11 +05:30
mahee96
7df29ca23b ci: fixes in workflow.py 2026-02-25 00:10:40 +05:30
mahee96
d06520d46f ci: fix - tag was not pushed 2026-02-25 00:08:41 +05:30
mahee96
1ad0fe23fc ci: fix - version number was inconsistent across deployment of beta channels 2026-02-25 00:04:42 +05:30
mahee96
21bbcd69f8 ci: updated stable workflow as per workflow.py and other workflows 2026-02-24 22:07:36 +05:30
mahee96
fd920be3bf ci: updated PR workflow to reflect latest 2026-02-24 21:15:28 +05:30
mahee96
16bb57c825 ci: use proper messaging for upstream tag 2026-02-24 20:51:25 +05:30
mahee96
bf5b1c935c ci: cleanup - removed obsolete stuffs 2026-02-24 20:43:35 +05:30
mahee96
1b1c7c58e2 ci: moved cache maintenance script into obsolete 2026-02-24 20:43:03 +05:30
Huge_Black
31e8eb7996 Merge pull request #1183 from LiveContainer/develop
Fix issues profile/afc and use main profile issue
2026-02-24 23:07:11 +08:00
mahee96
47db2c3d5d ci: fix typo in brew command 2026-02-24 20:35:50 +05:30
mahee96
967b9f7572 ci: removed caching for ldid and xcbeautify coz they take much more time to restore than install 2026-02-24 20:32:02 +05:30
mahee96
b91bcee70f ci: added some names to few steps 2026-02-24 20:27:46 +05:30
mahee96
25f34c6f69 ci: added back the nightly schedule checking for new commits and only then build 2026-02-24 20:23:12 +05:30
mahee96
b7085aaeca ci: improve speed by caching brew install step and reading project settings from dumped xcodebuild -showProjectSettings instead of invoking each time. 2026-02-24 18:54:41 +05:30
mahee96
046d2788b9 ci: fix indentation in release notes 2026-02-24 18:20:27 +05:30
mahee96
f44ed0a947 ci: fix indentation in release notes 2026-02-24 18:19:54 +05:30
Huge_Black
d356045b5d bug-fix: fix crash when viewing app ids 2026-02-24 16:08:15 +08:00
Huge_Black
a54881a1c8 bug-fix: fix profile not installed 2026-02-24 16:08:15 +08:00
Huge_Black
e27d44eece fix build issue 2026-02-24 16:08:15 +08:00
Huge_Black
c2cecb63ac bug-fix: fix useMainProfile not saved / not set during initialization
SideStore is killed by iOS before CoreData commit changes to db.

Detect if the app is installed with useMainProfile by checking if applicationIdentifier matches
2026-02-24 16:08:15 +08:00
Huge_Black
381e09d87c bug-fix: Fix crash when installing apps 2026-02-24 16:08:15 +08:00
mahee96
efbb40982e ci: fix indentation in release notes 2026-02-24 13:20:14 +05:30
mahee96
c1a033a627 staging: added some redundant files to gitignore 2026-02-24 13:14:40 +05:30
mahee96
1449f8c74f staging: prepare new branch for alpha release channels and high velocity development 2026-02-24 13:04:44 +05:30
mahee96
3961688b73 ci: more fixes 2026-02-24 12:42:55 +05:30
mahee96
f0da9cf8aa ci: more fixes 2026-02-24 12:25:12 +05:30
mahee96
aa224f68c7 ci: more fixes 2026-02-24 09:13:09 +05:30
mahee96
a02d1c49e8 ci: more fixes 2026-02-24 08:42:52 +05:30
mahee96
226f0dcc6b CI: improve more ci worflow 2026-02-24 08:19:56 +05:30
mahee96
bce38c8743 CI: improve more ci worflow 2026-02-24 07:41:06 +05:30
mahee96
0e72a33af8 CI: improve more ci worflow 2026-02-24 07:23:20 +05:30
mahee96
0677cc287e CI: improve more ci worflow 2026-02-24 07:21:14 +05:30
mahee96
b0bfbf5513 CI: improve more ci worflow 2026-02-24 07:12:56 +05:30
mahee96
ea86b98674 CI: improve more ci worflow 2026-02-24 07:00:17 +05:30
mahee96
3d47d486ef CI: improve more ci worflow 2026-02-24 06:22:03 +05:30
mahee96
3a05485c40 CI: improve more ci worflow 2026-02-24 05:47:38 +05:30
mahee96
31d07534d0 CI: improve more ci worflow 2026-02-24 05:27:15 +05:30
mahee96
99712f0020 CI: improve more ci worflow 2026-02-24 03:58:47 +05:30
mahee96
c5394be883 CI: improve more ci worflow 2026-02-24 03:53:26 +05:30
mahee96
a07657261d CI: improve more ci worflow 2026-02-24 03:47:15 +05:30
mahee96
db00202b37 CI: improve more ci worflow 2026-02-24 03:33:26 +05:30
mahee96
b16dda5590 CI: improve more ci worflow 2026-02-24 03:28:46 +05:30
mahee96
f8c4c558f6 CI: improve more ci worflow 2026-02-24 03:21:19 +05:30
mahee96
ae1bd49a99 CI: improve more ci worflow 2026-02-24 03:16:26 +05:30
mahee96
97b04094eb CI: improve more ci worflow 2026-02-24 03:13:55 +05:30
mahee96
675bdc63ae CI: improve more ci worflow 2026-02-24 02:59:08 +05:30
mahee96
8be9de3b11 CI: improve more ci worflow 2026-02-24 02:53:51 +05:30
mahee96
0403dc3278 CI: improve more ci worflow 2026-02-24 02:43:02 +05:30
mahee96
c546ff6642 CI: improve more ci worflow 2026-02-24 02:40:34 +05:30
mahee96
dc058938ef altsign updated to latest 2026-02-24 02:40:30 +05:30
mahee96
4984e5119f CI: improve more ci worflow 2026-02-24 02:29:13 +05:30
mahee96
bcadc92057 CI: improve more ci worflow 2026-02-24 02:25:50 +05:30
mahee96
cb93168516 CI: improve more ci worflow 2026-02-24 02:22:45 +05:30
mahee96
4a688f20fc CI: improve more ci worflow 2026-02-24 02:20:10 +05:30
mahee96
d37ba80ca1 CI: improve more ci worflow 2026-02-24 02:14:13 +05:30
mahee96
a3190fb016 re added openSSL from new path 2026-02-24 02:14:10 +05:30
mahee96
aec39c0ccd updated altsign to use xcframework for openSSL which was causing huge download of 1.2 GB each time 2026-02-24 02:13:48 +05:30
mahee96
c82662a72b CI: improve more ci worflow 2026-02-24 01:54:41 +05:30
mahee96
a6315442fb CI: improve more ci worflow 2026-02-24 01:54:05 +05:30
mahee96
97d8da1eab CI: improve more ci worflow 2026-02-24 01:51:35 +05:30
mahee96
4975ea4fcd CI: improve more ci worflow 2026-02-24 01:41:36 +05:30
mahee96
88f7122d8d CI: improve more ci worflow 2026-02-24 01:36:33 +05:30
mahee96
50482a9dc4 CI: improve more ci worflow 2026-02-24 01:26:13 +05:30
mahee96
af04388aea CI: improve more ci worflow 2026-02-24 01:20:37 +05:30
mahee96
7553e25bbf CI: improve more ci worflow 2026-02-24 01:10:38 +05:30
mahee96
7c01ba0db6 CI: improve more ci worflow 2026-02-24 00:59:05 +05:30
mahee96
e55351dbb0 CI: improve more ci worflow 2026-02-24 00:41:33 +05:30
mahee96
97ee0b2dac CI: full rewrite - moved logic into ci.py and kept workflow scripts mostly dummy 2026-02-23 20:21:59 +05:30
spidy123222
625389ab96 Add Exit Shortcut 2025-04-08 15:19:33 -07:00
spidy123222
f7e34cbbe9 Rewrite SendAppOperation execution to allow to wait for shortcut execution. 2025-04-08 15:19:33 -07:00
spidy123222
0fe8d7fed9 Move to SendAppOperation 2025-04-08 15:19:33 -07:00
spidy123222
1a1aa42e02 move it behind pendiungunitcount 60 2025-04-08 15:19:33 -07:00
spidy123222
7ff4b48223 Move attempt to a higher Stage. 2025-04-08 15:19:33 -07:00
spidy123222
4801f6e8f2 Attempt a million 2025-04-08 15:19:33 -07:00
Spidy123222
ff28f6fa8f Add files via upload
Signed-off-by: Spidy123222 <64176728+Spidy123222@users.noreply.github.com>
2025-04-08 15:19:33 -07:00
Spidy123222
2d141afbaf remove from install apps
Signed-off-by: Spidy123222 <64176728+Spidy123222@users.noreply.github.com>
2025-04-08 15:19:33 -07:00
Spidy123222
06e38aae00 Hopefully fix problem
Signed-off-by: Spidy123222 <64176728+Spidy123222@users.noreply.github.com>
2025-04-08 15:19:33 -07:00
Spidy123222
d8783230a7 fix error for open link
Signed-off-by: Spidy123222 <64176728+Spidy123222@users.noreply.github.com>
2025-04-08 15:19:33 -07:00
Spidy123222
6c479bfede test open URL
Signed-off-by: Spidy123222 <64176728+Spidy123222@users.noreply.github.com>
2025-04-08 15:19:33 -07:00
34 changed files with 3102 additions and 2290 deletions


@@ -1,63 +0,0 @@
import requests
import sys
import os

# Your GitHub Personal Access Token
GITHUB_TOKEN = os.getenv("GITHUB_TOKEN")

# Repository details
REPO_OWNER = "SideStore"
REPO_NAME = "SideStore"
API_URL = f"https://api.github.com/repos/{REPO_OWNER}/{REPO_NAME}/actions/caches"

# Common headers for GitHub API calls
HEADERS = {
    "Accept": "application/vnd.github+json",
    "Authorization": f"Bearer {GITHUB_TOKEN}"
}

def list_caches():
    response = requests.get(API_URL, headers=HEADERS)
    if response.status_code != 200:
        print(f"Failed to list caches. HTTP {response.status_code}")
        print("Response:", response.text)
        sys.exit(1)
    data = response.json()
    return data.get("actions_caches", [])

def delete_cache(cache_id):
    delete_url = f"{API_URL}/{cache_id}"
    response = requests.delete(delete_url, headers=HEADERS)
    return response.status_code

def main():
    caches = list_caches()
    if not caches:
        print("No caches found.")
        return
    print("Found caches:")
    for cache in caches:
        print(f"ID: {cache.get('id')}, Key: {cache.get('key')}")
    print("\nDeleting caches...")
    for cache in caches:
        cache_id = cache.get("id")
        status = delete_cache(cache_id)
        if status == 204:
            print(f"Successfully deleted cache with ID: {cache_id}")
        else:
            print(f"Failed to delete cache with ID: {cache_id}. HTTP status code: {status}")
    print("All caches processed.")

if __name__ == "__main__":
    main()

### How to use
'''
just export the GITHUB_TOKEN and then run this script via `python3 cache.py` to delete the caches
'''
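One caveat for anyone resurrecting this (now removed) script: the GitHub Actions caches endpoint is paginated, so a single GET can miss older entries. A minimal sketch of flattening successive page payloads into one ID list — `pages` here stands in for the parsed JSON bodies of consecutive requests and is an assumption for illustration, not part of the original script:

```python
def collect_cache_ids(pages):
    """Flatten the 'actions_caches' arrays from successive API pages into one list of IDs."""
    ids = []
    for payload in pages:
        # each payload mirrors the JSON shape the script already reads
        ids.extend(c["id"] for c in payload.get("actions_caches", []))
    return ids

pages = [
    {"actions_caches": [{"id": 1}, {"id": 2}]},
    {"actions_caches": [{"id": 3}]},
    {},  # an empty trailing page
]
print(collect_cache_ids(pages))  # → [1, 2, 3]
```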


@@ -1,28 +1,220 @@
-name: Alpha SideStore build
+name: Alpha SideStore Build
 on:
   push:
-    branches:
-      - develop-alpha
+    branches: [staging]
   workflow_dispatch:
+
+# cancel duplicate run if from same branch
+concurrency:
+  group: ${{ github.ref }}
+  cancel-in-progress: true
+
 jobs:
-  Reusable-build:
-    uses: ./.github/workflows/reusable-sidestore-build.yml
-    with:
-      # bundle_id: "com.SideStore.SideStore.Alpha"
-      bundle_id: "com.SideStore.SideStore"
-      # bundle_id_suffix: ".Alpha"
-      is_beta: true
-      publish: ${{ vars.PUBLISH_ALPHA_UPDATES == 'true' }}
-      is_shared_build_num: false
-      release_tag: "alpha"
-      release_name: "Alpha"
-      upstream_tag: "nightly"
-      upstream_name: "Nightly"
-    secrets:
-      CROSS_REPO_PUSH_KEY: ${{ secrets.CROSS_REPO_PUSH_KEY }}
-      BUILD_LOG_ZIP_PASSWORD: ${{ secrets.BUILD_LOG_ZIP_PASSWORD }}
+  build:
+    runs-on: macos-26
+    env:
+      DEPLOY_KEY: ${{ secrets.CROSS_REPO_PUSH_KEY }}
+      GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
+      RELEASE_NAME: Alpha
+      CHANNEL: alpha
+      UPSTREAM_CHANNEL: "nightly"
+    steps:
+      - uses: actions/checkout@v4
+        with:
+          submodules: recursive
+          fetch-depth: 0
+      - name: Find Last Successful commit
+        run: |
+          LAST_SUCCESSFUL_COMMIT=$(python3 scripts/ci/workflow.py last-successful-commit \
+            "false" "${{ env.CHANNEL }}" || echo "")
+          echo "LAST_SUCCESSFUL_COMMIT=$LAST_SUCCESSFUL_COMMIT" | tee -a $GITHUB_ENV
+      - run: brew install ldid xcbeautify
+
+      # --------------------------------------------------
+      # runtime env setup
+      # --------------------------------------------------
+      - name: Setup Env
+        run: |
+          BUILD_NUM="${{ github.run_number }}"
+          MARKETING_VERSION=$(python3 scripts/ci/workflow.py get-marketing-version)
+          SHORT_COMMIT=$(python3 scripts/ci/workflow.py commit-id)
+          NORMALIZED_VERSION=$(python3 scripts/ci/workflow.py compute-normalized \
+            "$MARKETING_VERSION" \
+            "$BUILD_NUM" \
+            "$SHORT_COMMIT")
+          python3 scripts/ci/workflow.py set-marketing-version "$NORMALIZED_VERSION"
+          echo "BUILD_NUM=$BUILD_NUM" | tee -a $GITHUB_ENV
+          echo "SHORT_COMMIT=$SHORT_COMMIT" | tee -a $GITHUB_ENV
+          echo "MARKETING_VERSION=$NORMALIZED_VERSION" | tee -a $GITHUB_ENV
+      - name: Setup Xcode
+        uses: maxim-lobanov/setup-xcode@v1.6.0
+        with:
+          xcode-version: "26.2"
+      - name: Restore Cache (exact)
+        id: xcode-cache-exact
+        uses: actions/cache/restore@v3
+        with:
+          path: |
+            ~/Library/Developer/Xcode/DerivedData
+            ~/Library/Caches/org.swift.swiftpm
+          key: xcode-build-cache-${{ github.ref_name }}-${{ github.sha }}
+      - name: Restore Cache (last)
+        if: steps.xcode-cache-exact.outputs.cache-hit != 'true'
+        id: xcode-cache-fallback
+        uses: actions/cache/restore@v3
+        with:
+          path: |
+            ~/Library/Developer/Xcode/DerivedData
+            ~/Library/Caches/org.swift.swiftpm
+          key: xcode-build-cache-${{ github.ref_name }}-
+
+      # --------------------------------------------------
+      # build and test
+      # --------------------------------------------------
+      - name: Clean
+        if: contains(github.event.head_commit.message, '[--clean-build]')
+        run: |
+          python3 scripts/ci/workflow.py clean
+          python3 scripts/ci/workflow.py clean-derived-data
+          python3 scripts/ci/workflow.py clean-spm-cache
+      - name: Boot simulator (async)
+        if: >
+          vars.ENABLE_TESTS == '1' &&
+          vars.ENABLE_TESTS_RUN == '1'
+        run: |
+          mkdir -p build/logs
+          python3 scripts/ci/workflow.py boot-sim-async "iPhone 17 Pro"
+      - name: Build
+        id: build
+        env:
+          BUILD_LOG_ZIP_PASSWORD: ${{ secrets.BUILD_LOG_ZIP_PASSWORD }}
+        run: |
+          python3 scripts/ci/workflow.py build; STATUS=$?
+          python3 scripts/ci/workflow.py encrypt-build
+          echo "encrypted=true" >> $GITHUB_OUTPUT
+          exit $STATUS
+      - name: Tests Build
+        id: test-build
+        if: >
+          vars.ENABLE_TESTS == '1' &&
+          vars.ENABLE_TESTS_BUILD == '1'
+        env:
+          BUILD_LOG_ZIP_PASSWORD: ${{ secrets.BUILD_LOG_ZIP_PASSWORD }}
+        run: |
+          python3 scripts/ci/workflow.py tests-build; STATUS=$?
+          python3 scripts/ci/workflow.py encrypt-tests-build
+          exit $STATUS
+      - name: Save Cache
+        if: ${{ steps.xcode-cache-fallback.outputs.cache-hit != 'true' }}
+        uses: actions/cache/save@v3
+        with:
+          path: |
+            ~/Library/Developer/Xcode/DerivedData
+            ~/Library/Caches/org.swift.swiftpm
+          key: xcode-build-cache-${{ github.ref_name }}-${{ github.sha }}
+      - name: Tests Run
+        id: test-run
+        if: >
+          vars.ENABLE_TESTS == '1' &&
+          vars.ENABLE_TESTS_RUN == '1'
+        env:
+          BUILD_LOG_ZIP_PASSWORD: ${{ secrets.BUILD_LOG_ZIP_PASSWORD }}
+        run: |
+          python3 scripts/ci/workflow.py tests-run "iPhone 17 Pro"; STATUS=$?
+          python3 scripts/ci/workflow.py encrypt-tests-run
+          exit $STATUS
+
+      # --------------------------------------------------
+      # artifacts
+      # --------------------------------------------------
+      - uses: actions/upload-artifact@v4
+        with:
+          name: build-logs-${{ env.MARKETING_VERSION }}.zip
+          path: build-logs.zip
+      - uses: actions/upload-artifact@v4
+        if: >
+          vars.ENABLE_TESTS == '1' &&
+          vars.ENABLE_TESTS_BUILD == '1'
+        with:
+          name: tests-build-logs-${{ env.SHORT_COMMIT }}.zip
+          path: tests-build-logs.zip
+      - uses: actions/upload-artifact@v4
+        if: >
+          vars.ENABLE_TESTS == '1' &&
+          vars.ENABLE_TESTS_RUN == '1'
+        with:
+          name: tests-run-logs-${{ env.SHORT_COMMIT }}.zip
+          path: tests-run-logs.zip
+      - uses: actions/upload-artifact@v4
+        with:
+          name: SideStore-${{ env.MARKETING_VERSION }}.ipa
+          path: SideStore.ipa
+      - uses: actions/upload-artifact@v4
+        with:
+          name: SideStore-${{ env.MARKETING_VERSION }}-dSYMs.zip
+          path: SideStore.dSYMs.zip
+      - uses: actions/checkout@v4
+        if: env.DEPLOY_KEY != ''
+        with:
+          repository: "SideStore/apps-v2.json"
+          ref: "main"
+          token: ${{ secrets.CROSS_REPO_PUSH_KEY }}
+          path: "SideStore/apps-v2.json"
+      - name: Generate Metadata
+        run: |
+          python3 scripts/ci/workflow.py dump-project-settings
+          PRODUCT_NAME=$(python3 scripts/ci/workflow.py read-product-name)
+          BUNDLE_ID=$(python3 scripts/ci/workflow.py read-bundle-id)
+          IPA_NAME="$PRODUCT_NAME.ipa"
+          python3 scripts/ci/workflow.py generate-metadata \
+            "$CHANNEL" \
+            "$SHORT_COMMIT" \
+            "$MARKETING_VERSION" \
+            "$CHANNEL" \
+            "$BUNDLE_ID" \
+            "$IPA_NAME" \
+            "$LAST_SUCCESSFUL_COMMIT"
+      - name: Deploy
+        if: env.DEPLOY_KEY != ''
+        run: |
+          SOURCE_JSON="_includes/source.json"
+          python3 scripts/ci/workflow.py deploy \
+            SideStore/apps-v2.json \
+            "$SOURCE_JSON" \
+            "$CHANNEL" \
+            "$MARKETING_VERSION"
+
+      # --------------------------------------------------
+      # upload release to GH
+      # --------------------------------------------------
+      - name: Upload Release
+        run: |
+          python3 scripts/ci/workflow.py upload-release \
+            "$RELEASE_NAME" \
+            "$CHANNEL" \
+            "$GITHUB_SHA" \
+            "$GITHUB_REPOSITORY" \
+            "$UPSTREAM_CHANNEL"
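The `compute-normalized` call in the Setup Env step is opaque from the workflow alone; judging by the channel tags used elsewhere in this changeset, it presumably assembles a semver-style pre-release string from the three inputs. A hedged sketch — the function body and exact format are assumptions, not taken from `scripts/ci/workflow.py`:

```python
def compute_normalized(marketing_version: str, build_num: str, short_commit: str,
                       channel: str = "alpha") -> str:
    # assumed shape: <version>-<channel>.<build>+<commit>
    # (semver pre-release identifier plus build metadata)
    return f"{marketing_version}-{channel}.{build_num}+{short_commit}"

print(compute_normalized("0.6.0", "108", "5d0ac4b"))  # → 0.6.0-alpha.108+5d0ac4b
```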


@@ -1,103 +0,0 @@
name: Beta SideStore build
on:
  push:
    tags:
      - '[0-9]+.[0-9]+.[0-9]+-beta.[0-9]+' # example: 1.0.0-beta.1

jobs:
  build:
    name: Build and upload SideStore Beta
    strategy:
      fail-fast: false
      matrix:
        include:
          - os: 'macos-14'
            version: '15.4'
    runs-on: ${{ matrix.os }}
    steps:
      - name: Checkout code
        uses: actions/checkout@v4
        with:
          submodules: recursive
      - name: Install dependencies
        run: brew install ldid
      - name: Change version to tag
        run: sed -e '/MARKETING_VERSION = .*/s/= .*/= ${{ github.ref_name }}/' -i '' Build.xcconfig
      - name: Get version
        id: version
        run: echo "version=$(grep MARKETING_VERSION Build.xcconfig | sed -e "s/MARKETING_VERSION = //g")" >> $GITHUB_OUTPUT
      - name: Echo version
        run: echo "${{ steps.version.outputs.version }}"
      - name: Setup Xcode
        uses: maxim-lobanov/setup-xcode@v1
        with:
          xcode-version: ${{ matrix.version }}
      - name: Cache Build
        uses: irgaly/xcode-cache@v1
        with:
          key: xcode-cache-deriveddata-${{ github.sha }}
          restore-keys: xcode-cache-deriveddata
      - name: Build SideStore
        run: make build | xcpretty && exit ${PIPESTATUS[0]}
      - name: Fakesign app
        run: make fakesign
      - name: Convert to IPA
        run: make ipa
      - name: Get current date
        id: date
        run: echo "date=$(date -u +'%c')" >> $GITHUB_OUTPUT
      - name: Get current date in AltStore date form
        id: date_altstore
        run: echo "date=$(date -u +'%Y-%m-%d')" >> $GITHUB_OUTPUT
      - name: Upload to new beta release
        uses: softprops/action-gh-release@v1
        with:
          token: ${{ secrets.GITHUB_TOKEN }}
          name: ${{ steps.version.outputs.version }}
          tag_name: ${{ github.ref_name }}
          draft: true
          prerelease: true
          files: SideStore.ipa
          body: |
            <!-- NOTE: to reset SideSource cache, go to `https://apps.sidestore.io/reset-cache/nightly/<sidesource key>`. This is not included in the GitHub Action since it makes draft releases so they can be edited and have a changelog. -->
            Beta builds are hand-picked builds from development commits that will allow you to try out new features earlier than normal. However, **they might contain bugs and other issues. Use at your own risk!**

            ## Changelog
            - TODO

            ## Build Info
            Built at (UTC): `${{ steps.date.outputs.date }}`
            Built at (UTC date): `${{ steps.date_altstore.outputs.date }}`
            Commit SHA: `${{ github.sha }}`
            Version: `${{ steps.version.outputs.version }}`
      - name: Add version to IPA file name
        run: mv SideStore.ipa SideStore-${{ steps.version.outputs.version }}.ipa
      - name: Upload SideStore.ipa Artifact
        uses: actions/upload-artifact@v4
        with:
          name: SideStore-${{ steps.version.outputs.version }}.ipa
          path: SideStore-${{ steps.version.outputs.version }}.ipa
      - name: Upload *.dSYM Artifact
        uses: actions/upload-artifact@v4
        with:
          name: SideStore-${{ steps.version.outputs.version }}-dSYM
          path: ./*.dSYM/


@@ -1,34 +0,0 @@
#!/usr/bin/env bash

# Ensure we are in root directory
cd "$(dirname "$0")/../.."

DATE=`date -u +'%Y.%m.%d'`
BUILD_NUM=1

# Use RELEASE_CHANNEL from the environment variable or default to "beta"
RELEASE_CHANNEL=${RELEASE_CHANNEL:-"beta"}

write() {
    sed -e "/MARKETING_VERSION = .*/s/$/-$RELEASE_CHANNEL.$DATE.$BUILD_NUM+$(git rev-parse --short HEAD)/" -i '' Build.xcconfig
    echo "$DATE,$BUILD_NUM" > build_number.txt
}

if [ ! -f "build_number.txt" ]; then
    write
    exit 0
fi

LAST_DATE=`cat build_number.txt | perl -n -e '/([^,]*),([^ ]*)$/ && print $1'`
LAST_BUILD_NUM=`cat build_number.txt | perl -n -e '/([^,]*),([^ ]*)$/ && print $2'`

# if [[ "$DATE" != "$LAST_DATE" ]]; then
#     write
# else
#     BUILD_NUM=`expr $LAST_BUILD_NUM + 1`
#     write
# fi

# Build number is always incremental
BUILD_NUM=`expr $LAST_BUILD_NUM + 1`
write
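The always-incremental branch at the end of the script (the date-reset variant is commented out) reduces to a small parse-and-increment over `build_number.txt`. A sketch of that logic, assuming the `DATE,NUM` line format the script itself writes:

```python
def next_build_num(build_number_line: str) -> int:
    # build_number.txt holds a single "DATE,NUM" line; the build number is
    # always incremented, regardless of whether the date rolled over
    _date, num = build_number_line.strip().split(",")
    return int(num) + 1

print(next_build_num("2026.02.24,5"))  # → 6
```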


@@ -1,82 +1,260 @@
 name: Nightly SideStore Build
 on:
   push:
-    branches:
-      - develop
+    branches: [develop]
   schedule:
-    - cron: '0 0 * * *' # Runs every night at midnight UTC
-  workflow_dispatch: # Allows manual trigger
+    - cron: "0 0 * * *"
+  workflow_dispatch:
+
+# cancel duplicate run if from same branch
+concurrency:
+  group: ${{ github.ref }}
+  cancel-in-progress: true
+
 jobs:
-  check-changes:
-    if: github.event_name == 'schedule'
-    runs-on: ubuntu-latest
-    outputs:
-      has_changes: ${{ steps.check.outputs.has_changes }}
+  build:
+    runs-on: macos-26
+    env:
+      DEPLOY_KEY: ${{ secrets.CROSS_REPO_PUSH_KEY }}
+      GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
+      RELEASE_NAME: Nightly
+      CHANNEL: nightly
+      UPSTREAM_CHANNEL: ""
     steps:
-      - name: Checkout repository
-        uses: actions/checkout@v4
+      - uses: actions/checkout@v4
         with:
-          fetch-depth: 0 # Ensure full history
+          submodules: recursive
+          fetch-depth: 0
-      - name: Get last successful workflow run
-        id: get_last_success
+      - name: Find Last Successful commit
         run: |
-          LAST_SUCCESS=$(gh run list --workflow "Nightly SideStore Build" --json createdAt,conclusion \
-            --jq '[.[] | select(.conclusion=="success")][0].createdAt' || echo "")
-          echo "Last successful run: $LAST_SUCCESS"
-          echo "last_success=$LAST_SUCCESS" >> $GITHUB_ENV
-        env:
-          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
+          LAST_SUCCESSFUL_COMMIT=$(python3 scripts/ci/workflow.py last-successful-commit \
+            "false" "${{ env.CHANNEL }}" || echo "")
+          echo "LAST_SUCCESSFUL_COMMIT=$LAST_SUCCESSFUL_COMMIT" | tee -a $GITHUB_ENV
-      - name: Check for new commits since last successful build
-        id: check
+      - name: Check for new changes (on schedule)
+        id: check_changes
+        if: github.event_name == 'schedule'
         run: |
-          if [ -n "$LAST_SUCCESS" ]; then
-            NEW_COMMITS=$(git rev-list --count --since="$LAST_SUCCESS" origin/develop)
-            COMMIT_LOG=$(git log --since="$LAST_SUCCESS" --pretty=format:"%h %s" origin/develop)
-          else
-            NEW_COMMITS=1
-            COMMIT_LOG=$(git log -n 10 --pretty=format:"%h %s" origin/develop) # Show last 10 commits if no history
-          fi
-          echo "Has changes: $NEW_COMMITS"
-          echo "New commits since last successful build:"
-          echo "$COMMIT_LOG"
-          if [ "$NEW_COMMITS" -gt 0 ]; then
-            echo "has_changes=true" >> $GITHUB_OUTPUT
-          else
-            echo "has_changes=false" >> $GITHUB_OUTPUT
-          fi
+          NEW_COMMITS=$(python3 scripts/ci/workflow.py count-new-commits "$LAST_SUCCESSFUL_COMMIT")
+          SHOULD_BUILD=$([ "${NEW_COMMITS:-0}" -ge 1 ] && echo true || echo false)
+          echo "should_build=$SHOULD_BUILD" >> $GITHUB_OUTPUT
+          echo "NEW_COMMITS=$NEW_COMMITS" | tee -a $GITHUB_ENV
+      - name: Should Skip Building (on schedule)
+        id: build_gate
+        run: |
+          SHOULD_SKIP=$(
+            { [ "${{ github.event_name }}" = "schedule" ] && \
+              [ "${{ steps.check_changes.outputs.should_build }}" != "true" ]; \
+            } && echo true || echo false
+          )
+          echo "should_skip=$SHOULD_SKIP" >> $GITHUB_OUTPUT
+      - run: brew install ldid xcbeautify
+        if: steps.build_gate.outputs.should_skip != 'true'
+
+      # --------------------------------------------------
+      # runtime env setup
+      # --------------------------------------------------
+      - name: Setup Env
+        if: steps.build_gate.outputs.should_skip != 'true'
+        run: |
+          BUILD_NUM="${{ github.run_number }}"
+          MARKETING_VERSION=$(python3 scripts/ci/workflow.py get-marketing-version)
+          SHORT_COMMIT=$(python3 scripts/ci/workflow.py commit-id)
+          NORMALIZED_VERSION=$(python3 scripts/ci/workflow.py compute-normalized \
+            "$MARKETING_VERSION" \
+            "$BUILD_NUM" \
+            "$SHORT_COMMIT")
+          python3 scripts/ci/workflow.py set-marketing-version "$NORMALIZED_VERSION"
+          echo "BUILD_NUM=$BUILD_NUM" | tee -a $GITHUB_ENV
+          echo "SHORT_COMMIT=$SHORT_COMMIT" | tee -a $GITHUB_ENV
+          echo "MARKETING_VERSION=$NORMALIZED_VERSION" | tee -a $GITHUB_ENV
+      - name: Setup Xcode
+        if: steps.build_gate.outputs.should_skip != 'true'
+        uses: maxim-lobanov/setup-xcode@v1.6.0
+        with:
+          xcode-version: "26.2"
+      - name: Restore Cache (exact)
+        if: steps.build_gate.outputs.should_skip != 'true'
+        id: xcode-cache-exact
+        uses: actions/cache/restore@v3
+        with:
+          path: |
+            ~/Library/Developer/Xcode/DerivedData
+            ~/Library/Caches/org.swift.swiftpm
+          key: xcode-build-cache-${{ github.ref_name }}-${{ github.sha }}
+      - name: Restore Cache (last)
+        if: >
+          steps.build_gate.outputs.should_skip != 'true' &&
+          steps.xcode-cache-exact.outputs.cache-hit != 'true'
+        id: xcode-cache-fallback
+        uses: actions/cache/restore@v3
+        with:
+          path: |
+            ~/Library/Developer/Xcode/DerivedData
+            ~/Library/Caches/org.swift.swiftpm
+          key: xcode-build-cache-${{ github.ref_name }}-
+
+      # --------------------------------------------------
+      # build and test
+      # --------------------------------------------------
+      - name: Clean
+        if: steps.build_gate.outputs.should_skip != 'true' && contains(github.event.head_commit.message, '[--clean-build]')
+        run: |
+          python3 scripts/ci/workflow.py clean
+          python3 scripts/ci/workflow.py clean-derived-data
+          python3 scripts/ci/workflow.py clean-spm-cache
+      - name: Boot simulator (async)
+        if: >
+          steps.build_gate.outputs.should_skip != 'true' &&
+          vars.ENABLE_TESTS == '1' &&
+          vars.ENABLE_TESTS_RUN == '1'
+        run: |
+          mkdir -p build/logs
+          python3 scripts/ci/workflow.py boot-sim-async "iPhone 17 Pro"
+      - name: Build
+        if: steps.build_gate.outputs.should_skip != 'true'
+        id: build
         env:
-          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
-          LAST_SUCCESS: ${{ env.last_success }}
+          BUILD_LOG_ZIP_PASSWORD: ${{ secrets.BUILD_LOG_ZIP_PASSWORD }}
         run: |
+          python3 scripts/ci/workflow.py build; STATUS=$?
+          python3 scripts/ci/workflow.py encrypt-build
+          echo "encrypted=true" >> $GITHUB_OUTPUT
+          exit $STATUS
-  Reusable-build:
-    if: |
-      always() &&
-      (github.event_name == 'push' ||
-       (github.event_name == 'schedule' && needs.check-changes.result == 'success' && needs.check-changes.outputs.has_changes == 'true'))
-    needs: check-changes
-    uses: ./.github/workflows/reusable-sidestore-build.yml
-    with:
-      # bundle_id: "com.SideStore.SideStore.Nightly"
-      bundle_id: "com.SideStore.SideStore"
-      # bundle_id_suffix: ".Nightly"
-      is_beta: true
-      publish: ${{ vars.PUBLISH_NIGHTLY_UPDATES == 'true' }}
-      is_shared_build_num: false
-      release_tag: "nightly"
-      release_name: "Nightly"
-      upstream_tag: "0.5.10"
-      upstream_name: "Stable"
-    secrets:
-      CROSS_REPO_PUSH_KEY: ${{ secrets.CROSS_REPO_PUSH_KEY }}
-      BUILD_LOG_ZIP_PASSWORD: ${{ secrets.BUILD_LOG_ZIP_PASSWORD }}
+      - name: Tests Build
+        if: >
+          steps.build_gate.outputs.should_skip != 'true' &&
+          vars.ENABLE_TESTS == '1' &&
+          vars.ENABLE_TESTS_BUILD == '1'
+        id: test-build
+        env:
+          BUILD_LOG_ZIP_PASSWORD: ${{ secrets.BUILD_LOG_ZIP_PASSWORD }}
+        run: |
+          python3 scripts/ci/workflow.py tests-build; STATUS=$?
+          python3 scripts/ci/workflow.py encrypt-tests-build
+          exit $STATUS
+      - name: Save Cache
+        if: >
+          steps.build_gate.outputs.should_skip != 'true' &&
+          steps.xcode-cache-fallback.outputs.cache-hit != 'true'
+        uses: actions/cache/save@v3
+        with:
+          path: |
+            ~/Library/Developer/Xcode/DerivedData
+            ~/Library/Caches/org.swift.swiftpm
+          key: xcode-build-cache-${{ github.ref_name }}-${{ github.sha }}
+      - name: Tests Run
+        if: >
+          steps.build_gate.outputs.should_skip != 'true' &&
+          vars.ENABLE_TESTS == '1' &&
+          vars.ENABLE_TESTS_RUN == '1'
+        id: test-run
+        env:
+          BUILD_LOG_ZIP_PASSWORD: ${{ secrets.BUILD_LOG_ZIP_PASSWORD }}
+        run: |
+          python3 scripts/ci/workflow.py tests-run "iPhone 17 Pro"; STATUS=$?
+          python3 scripts/ci/workflow.py encrypt-tests-run
+          exit $STATUS
+
+      # --------------------------------------------------
+      # artifacts
+      # --------------------------------------------------
+      - uses: actions/upload-artifact@v4
+        if: steps.build_gate.outputs.should_skip != 'true'
+        with:
+          name: build-logs-${{ env.MARKETING_VERSION }}.zip
+          path: build-logs.zip
+      - uses: actions/upload-artifact@v4
+        if: >
+          steps.build_gate.outputs.should_skip != 'true' &&
+          vars.ENABLE_TESTS == '1' &&
+          vars.ENABLE_TESTS_BUILD == '1'
+        with:
+          name: tests-build-logs-${{ env.SHORT_COMMIT }}.zip
+          path: tests-build-logs.zip
+      - uses: actions/upload-artifact@v4
+        if: >
+          steps.build_gate.outputs.should_skip != 'true' &&
+          vars.ENABLE_TESTS == '1' &&
+          vars.ENABLE_TESTS_RUN == '1'
+        with:
+          name: tests-run-logs-${{ env.SHORT_COMMIT }}.zip
+          path: tests-run-logs.zip
+      - uses: actions/upload-artifact@v4
+        if: steps.build_gate.outputs.should_skip != 'true'
+        with:
+          name: SideStore-${{ env.MARKETING_VERSION }}.ipa
+          path: SideStore.ipa
+      - uses: actions/upload-artifact@v4
+        if: steps.build_gate.outputs.should_skip != 'true'
+        with:
+          name: SideStore-${{ env.MARKETING_VERSION }}-dSYMs.zip
+          path: SideStore.dSYMs.zip
+      - uses: actions/checkout@v4
+        if: steps.build_gate.outputs.should_skip != 'true' && env.DEPLOY_KEY != ''
+        with:
+          repository: "SideStore/apps-v2.json"
+          ref: "main"
+          token: ${{ secrets.CROSS_REPO_PUSH_KEY }}
+          path: "SideStore/apps-v2.json"
+      - name: Generate Metadata
+        if: steps.build_gate.outputs.should_skip != 'true'
+        run: |
+          python3 scripts/ci/workflow.py dump-project-settings
+          PRODUCT_NAME=$(python3 scripts/ci/workflow.py read-product-name)
+          BUNDLE_ID=$(python3 scripts/ci/workflow.py read-bundle-id)
+          IPA_NAME="$PRODUCT_NAME.ipa"
+          python3 scripts/ci/workflow.py generate-metadata \
+            "$CHANNEL" \
+            "$SHORT_COMMIT" \
+            "$MARKETING_VERSION" \
+            "$CHANNEL" \
+            "$BUNDLE_ID" \
+            "$IPA_NAME" \
+            "$LAST_SUCCESSFUL_COMMIT"
+      - name: Deploy
+        if: steps.build_gate.outputs.should_skip != 'true' && env.DEPLOY_KEY != ''
+        run: |
+          SOURCE_JSON="_includes/source.json"
+          python3 scripts/ci/workflow.py deploy \
+            SideStore/apps-v2.json \
+            "$SOURCE_JSON" \
+            "$CHANNEL" \
+            "$MARKETING_VERSION"
+
+      # --------------------------------------------------
+      # upload release to GH
+      # --------------------------------------------------
+      - name: Upload Release
+        if: steps.build_gate.outputs.should_skip != 'true'
+        run: |
+          python3 scripts/ci/workflow.py upload-release \
+            "$RELEASE_NAME" \
+            "$CHANNEL" \
+            "$GITHUB_SHA" \
+            "$GITHUB_REPOSITORY" \
+            "$UPSTREAM_CHANNEL"
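The nightly workflow's `build_gate` step condenses to a two-input predicate: only scheduled runs with no new commits are skipped, while pushes and manual dispatches always build. A sketch of that decision (string inputs mirror how GitHub Actions passes step outputs):

```python
def should_skip(event_name: str, should_build: str) -> bool:
    # skip only when triggered by the nightly schedule AND the
    # change check reported nothing new since the last success
    return event_name == "schedule" and should_build != "true"

print(should_skip("schedule", "false"))  # → True
print(should_skip("push", ""))           # → False
```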


@@ -1,98 +1,90 @@
 name: Pull Request SideStore build
 on:
   pull_request:
     # types: [opened, synchronize, reopened, ready_for_review, converted_to_draft]
     types: [opened, synchronize, reopened, ready_for_review]
 concurrency:
   group: pr-${{ github.event.pull_request.number }}
   cancel-in-progress: true
 jobs:
   build:
     name: Build and upload SideStore
     if: ${{ github.event.pull_request.draft == false }}
-    strategy:
-      fail-fast: false
-      matrix:
-        include:
-          - os: 'macos-14'
-            version: '16.1'
-    runs-on: ${{ matrix.os }}
+    runs-on: macos-26
     steps:
-      - name: Checkout code
-        uses: actions/checkout@v4
+      - uses: actions/checkout@v4
         with:
           submodules: recursive
+          fetch-depth: 1 # shallow clone just for PR
-      - name: Install dependencies
-        run: brew install ldid
-      - name: Install xcbeautify
-        run: brew install xcbeautify
+      - run: brew install ldid xcbeautify
-      - name: Add PR suffix to version
-        run: sed -e "/MARKETING_VERSION = .*/s/\$/-pr.${{ github.event.pull_request.number }}+$(git rev-parse --short ${COMMIT:-HEAD})/" -i '' Build.xcconfig
-        env:
-          COMMIT: ${{ github.event.pull_request.head.sha }}
-      - name: Get version
-        id: version
-        run: echo "version=$(grep MARKETING_VERSION Build.xcconfig | sed -e "s/MARKETING_VERSION = //g")" >> $GITHUB_OUTPUT
-      - name: Echo version
-        run: echo "${{ steps.version.outputs.version }}"
+      - name: Setup Env
+        run: |
+          MARKETING_VERSION=$(python3 scripts/ci/workflow.py get-marketing-version)
+          SHORT_COMMIT=$(git rev-parse --short ${{ github.event.pull_request.head.sha }})
+          NORMALIZED_VERSION="${MARKETING_VERSION}-pr.${{ github.event.pull_request.number }}+${SHORT_COMMIT}"
+          python3 scripts/ci/workflow.py set-marketing-version "$NORMALIZED_VERSION"
+          echo "SHORT_COMMIT=$SHORT_COMMIT" | tee -a $GITHUB_ENV
+          echo "MARKETING_VERSION=$NORMALIZED_VERSION" | tee -a $GITHUB_ENV
       - name: Setup Xcode
         uses: maxim-lobanov/setup-xcode@v1.6.0
         with:
-          xcode-version: ${{ matrix.version }}
+          xcode-version: "26.2"
-      - name: Cache Build
-        uses: irgaly/xcode-cache@v1
+      - name: Restore Cache (exact)
+        id: xcode-cache-exact
+        uses: actions/cache/restore@v3
         with:
-          key: xcode-cache-deriveddata-${{ github.sha }}
-          restore-keys: xcode-cache-deriveddata-
-          swiftpm-cache-key: xcode-cache-sourcedata-${{ github.sha }}
-          swiftpm-cache-restore-keys: |
-            xcode-cache-sourcedata-
+          path: |
+            ~/Library/Developer/Xcode/DerivedData
+            ~/Library/Caches/org.swift.swiftpm
+          key: xcode-build-cache-${{ github.ref_name }}-${{ github.sha }}
-      - name: List Files and derived data
+      - name: Restore Cache (last)
+        if: steps.xcode-cache-exact.outputs.cache-hit != 'true'
+        id: xcode-cache-fallback
+        uses: actions/cache/restore@v3
+        with:
+          path: |
+            ~/Library/Developer/Xcode/DerivedData
+            ~/Library/Caches/org.swift.swiftpm
+          key: xcode-build-cache-${{ github.ref_name }}-
+      - name: Build
+        env:
+          BUILD_LOG_ZIP_PASSWORD: ${{ secrets.BUILD_LOG_ZIP_PASSWORD }}
         run: |
-          echo ">>>>>>>>> Workdir <<<<<<<<<<"
-          ls -la .
-          echo ""
-          echo ">>>>>>>>> SideStore <<<<<<<<<<"
-          find SideStore -maxdepth 2 -exec ls -ld {} + || true # List contents if directory exists
-          echo ""
-          echo ">>>>>>>>> Dependencies <<<<<<<<<<"
-          find Dependencies -maxdepth 2 -exec ls -ld {} + || true # List contents if directory exists
-          echo ""
-          echo ">>>>>>>>> Xcode-Derived-Data <<<<<<<<<<"
-          ls -la ~/Library/Developer/Xcode/DerivedData || true # List contents if directory exists
-          echo ""
+          python3 scripts/ci/workflow.py build; STATUS=$?
+          python3 scripts/ci/workflow.py encrypt-build
+          mv SideStore.ipa SideStore-${{ env.MARKETING_VERSION }}.ipa
+          exit $STATUS
-      - name: Build SideStore
-        run: NSUnbufferedIO=YES make build 2>&1 | xcbeautify --renderer github-actions && exit ${PIPESTATUS[0]}
- name: Fakesign app
run: make fakesign
- name: Convert to IPA
run: make ipa
- name: Add version to IPA file name
run: mv SideStore.ipa SideStore-${{ steps.version.outputs.version }}.ipa
- name: Upload SideStore.ipa Artifact
uses: actions/upload-artifact@v4
- name: Save Cache
if: ${{ steps.xcode-cache-fallback.outputs.cache-hit != 'true' }}
uses: actions/cache/save@v3
with:
name: SideStore-${{ steps.version.outputs.version }}.ipa
path: SideStore-${{ steps.version.outputs.version }}.ipa
path: |
~/Library/Developer/Xcode/DerivedData
~/Library/Caches/org.swift.swiftpm
key: xcode-build-cache-${{ github.ref_name }}-${{ github.sha }}
- name: Upload *.dSYM Artifact
uses: actions/upload-artifact@v4
- uses: actions/upload-artifact@v4
with:
name: SideStore-${{ steps.version.outputs.version }}-dSYM
path: ./SideStore.xcarchive/dSYMs/*
name: build-logs-${{ env.MARKETING_VERSION }}.zip
path: build-logs.zip
- uses: actions/upload-artifact@v4
with:
name: SideStore-${{ env.MARKETING_VERSION }}.ipa
path: SideStore-${{ env.MARKETING_VERSION }}.ipa
- uses: actions/upload-artifact@v4
with:
name: SideStore-${{ env.MARKETING_VERSION }}-dSYMs.zip
path: SideStore.dSYMs.zip


@@ -1,105 +0,0 @@
name: Reusable SideStore Build
on:
workflow_call:
inputs:
is_beta:
required: false
default: false
type: boolean
publish:
required: false
default: false
type: boolean
is_shared_build_num:
required: false
default: true
type: boolean
release_name:
required: true
type: string
release_tag:
required: true
type: string
upstream_tag:
required: true
type: string
upstream_name:
required: true
type: string
bundle_id:
default: com.SideStore.SideStore
required: true
type: string
bundle_id_suffix:
default: ''
required: false
type: string
secrets:
# GITHUB_TOKEN:
# required: true
CROSS_REPO_PUSH_KEY:
required: true
BUILD_LOG_ZIP_PASSWORD:
required: false
# since the build cache, tests-build cache, and tests-run cache are all involved, out-of-order execution (if concurrency were serialized per job) would wreak all sorts of havoc
# so we serialize on the entire workflow
concurrency:
group: serialize-workflow
jobs:
shared:
uses: ./.github/workflows/sidestore-shared.yml
secrets: inherit
build:
needs: shared
uses: ./.github/workflows/sidestore-build.yml
with:
is_beta: ${{ inputs.is_beta }}
is_shared_build_num: ${{ inputs.is_shared_build_num }}
release_tag: ${{ inputs.release_tag }}
short_commit: ${{ needs.shared.outputs.short-commit }}
bundle_id: ${{ inputs.bundle_id }}
bundle_id_suffix: ${{ inputs.bundle_id_suffix }}
secrets: inherit
# tests-build:
# if: ${{ vars.ENABLE_TESTS == '1' && vars.ENABLE_TESTS_BUILD == '1' }}
# needs: shared
# uses: ./.github/workflows/sidestore-tests-build.yml
# with:
# release_tag: ${{ inputs.release_tag }}
# short_commit: ${{ needs.shared.outputs.short-commit }}
# secrets: inherit
# tests-run:
# if: ${{ vars.ENABLE_TESTS == '1' && vars.ENABLE_TESTS_RUN == '1' }}
# needs: [shared, tests-build]
# uses: ./.github/workflows/sidestore-tests-run.yml
# with:
# release_tag: ${{ inputs.release_tag }}
# short_commit: ${{ needs.shared.outputs.short-commit }}
# secrets: inherit
deploy:
# needs: [shared, build, tests-build, tests-run] # Keep tests-run in needs
needs: [shared, build] # Keep tests-run in needs
if: ${{ always() && (needs.tests-run.result == 'skipped' || needs.tests-run.result == 'success') }}
uses: ./.github/workflows/sidestore-deploy.yml
with:
is_beta: ${{ inputs.is_beta }}
publish: ${{ inputs.publish }}
release_name: ${{ inputs.release_name }}
release_tag: ${{ inputs.release_tag }}
upstream_tag: ${{ inputs.upstream_tag }}
upstream_name: ${{ inputs.upstream_name }}
version: ${{ needs.build.outputs.version }}
short_commit: ${{ needs.shared.outputs.short-commit }}
release_channel: ${{ needs.build.outputs.release-channel }}
marketing_version: ${{ needs.build.outputs.marketing-version }}
bundle_id: ${{ inputs.bundle_id }}
secrets: inherit


@@ -1,358 +0,0 @@
name: SideStore Build
on:
workflow_call:
inputs:
is_beta:
type: boolean
is_shared_build_num:
type: boolean
release_tag:
type: string
bundle_id:
type: string
bundle_id_suffix:
type: string
short_commit:
type: string
secrets:
CROSS_REPO_PUSH_KEY:
required: true
BUILD_LOG_ZIP_PASSWORD:
required: false
outputs:
version:
value: ${{ jobs.build.outputs.version }}
marketing-version:
value: ${{ jobs.build.outputs.marketing-version }}
release-channel:
value: ${{ jobs.build.outputs.release-channel }}
jobs:
build:
name: Build SideStore - ${{ inputs.release_tag }}
strategy:
fail-fast: false
matrix:
include:
- os: 'macos-26'
version: '26.0'
runs-on: ${{ matrix.os }}
outputs:
version: ${{ steps.version.outputs.version }}
marketing-version: ${{ steps.marketing-version.outputs.MARKETING_VERSION }}
release-channel: ${{ steps.release-channel.outputs.RELEASE_CHANNEL }}
steps:
- name: Set beta status
run: echo "IS_BETA=${{ inputs.is_beta }}" >> $GITHUB_ENV
shell: bash
- name: Checkout code
uses: actions/checkout@v4
with:
submodules: recursive
fetch-depth: 0
- name: Install dependencies - ldid & xcbeautify
run: |
brew install ldid xcbeautify
- name: Set ref based on is_shared_build_num
if: ${{ inputs.is_beta }}
id: set_ref
run: |
if [ "${{ inputs.is_shared_build_num }}" == "true" ]; then
echo "ref=main" >> $GITHUB_ENV
else
echo "ref=${{ inputs.release_tag }}" >> $GITHUB_ENV
fi
shell: bash
- name: Checkout SideStore/beta-build-num repo
if: ${{ inputs.is_beta }}
uses: actions/checkout@v4
with:
repository: 'SideStore/beta-build-num'
ref: ${{ env.ref }}
token: ${{ secrets.CROSS_REPO_PUSH_KEY }}
path: 'SideStore/beta-build-num'
- name: Copy build_number.txt to repo root
if: ${{ inputs.is_beta }}
run: |
cp SideStore/beta-build-num/build_number.txt .
echo "cat build_number.txt"
cat build_number.txt
shell: bash
- name: Echo Build.xcconfig
run: |
echo "cat Build.xcconfig"
cat Build.xcconfig
shell: bash
- name: Set Release Channel info for build number bumper
id: release-channel
run: |
RELEASE_CHANNEL="${{ inputs.release_tag }}"
echo "RELEASE_CHANNEL=${RELEASE_CHANNEL}" >> $GITHUB_ENV
echo "RELEASE_CHANNEL=${RELEASE_CHANNEL}" >> $GITHUB_OUTPUT
echo "RELEASE_CHANNEL=${RELEASE_CHANNEL}"
shell: bash
- name: Increase build number for beta builds
if: ${{ inputs.is_beta }}
run: |
bash .github/workflows/increase-beta-build-num.sh
shell: bash
- name: Extract MARKETING_VERSION from Build.xcconfig
id: version
run: |
version=$(grep MARKETING_VERSION Build.xcconfig | sed -e 's/MARKETING_VERSION = //g')
echo "version=$version" >> $GITHUB_OUTPUT
echo "version=$version"
shell: bash
- name: Set MARKETING_VERSION
if: ${{ inputs.is_beta }}
id: marketing-version
run: |
# Extract version number (e.g., "0.6.0")
version=$(echo "${{ steps.version.outputs.version }}" | sed -E 's/^[^0-9]*([0-9]+\.[0-9]+\.[0-9]+).*/\1/')
# Extract date (YYYYMMDD) (e.g., "20250205")
date=$(echo "${{ steps.version.outputs.version }}" | sed -E 's/.*\.([0-9]{4})\.([0-9]{2})\.([0-9]{2})\..*/\1\2\3/')
# Extract build number (e.g., "2")
build_num=$(echo "${{ steps.version.outputs.version }}" | sed -E 's/.*\.([0-9]+)\+.*/\1/')
# Combine them into the final output
MARKETING_VERSION="${version}-${date}.${build_num}+${{ inputs.short_commit }}"
echo "MARKETING_VERSION=$MARKETING_VERSION" >> $GITHUB_ENV
echo "MARKETING_VERSION=$MARKETING_VERSION" >> $GITHUB_OUTPUT
echo "MARKETING_VERSION=$MARKETING_VERSION"
shell: bash
- name: Echo Updated Build.xcconfig, build_number.txt
if: ${{ inputs.is_beta }}
run: |
cat Build.xcconfig
cat build_number.txt
shell: bash
- name: Setup Xcode
uses: maxim-lobanov/setup-xcode@v1.6.0
with:
xcode-version: ${{ matrix.version }}
- name: (Build) Restore Xcode & SwiftPM Cache (Exact match)
id: xcode-cache-restore
uses: actions/cache/restore@v3
with:
path: |
~/Library/Developer/Xcode/DerivedData
~/Library/Caches/org.swift.swiftpm
key: xcode-cache-build-${{ github.ref_name }}-${{ github.sha }}
- name: (Build) Restore Xcode & SwiftPM Cache (Last Available)
id: xcode-cache-restore-recent
uses: actions/cache/restore@v3
with:
path: |
~/Library/Developer/Xcode/DerivedData
~/Library/Caches/org.swift.swiftpm
key: xcode-cache-build-${{ github.ref_name }}-
# - name: (Build) Cache Build
# uses: irgaly/xcode-cache@v1.8.1
# with:
# key: xcode-cache-deriveddata-build-${{ github.ref_name }}-${{ github.sha }}
# restore-keys: xcode-cache-deriveddata-build-${{ github.ref_name }}-
# swiftpm-cache-key: xcode-cache-sourcedata-build-${{ github.ref_name }}-${{ github.sha }}
# swiftpm-cache-restore-keys: |
# xcode-cache-sourcedata-build-${{ github.ref_name }}-
- name: (Build) Clean previous build artifacts
run: |
make clean
mkdir -p build/logs
shell: bash
- name: (Build) List Files and derived data
if: always()
shell: bash
run: |
echo ">>>>>>>>> Workdir <<<<<<<<<<"
ls -la .
echo ""
echo ">>>>>>>>> SideStore <<<<<<<<<<"
find SideStore -maxdepth 2 -exec ls -ld {} + || true # List contents if directory exists
echo ""
echo ">>>>>>>>> Dependencies <<<<<<<<<<"
find Dependencies -maxdepth 2 -exec ls -ld {} + || true # List contents if directory exists
echo ""
echo ">>>>>>>>> Xcode-Derived-Data <<<<<<<<<<"
ls -la ~/Library/Developer/Xcode/DerivedData || true # List contents if directory exists
echo ""
- name: Set BundleID Suffix for SideStore build
run: |
echo "BUNDLE_ID_SUFFIX=${{ inputs.bundle_id_suffix }}" >> $GITHUB_ENV
shell: bash
- name: Build SideStore.xcarchive
# using 'tee' to intercept stdout and log for detailed build-log
run: |
NSUnbufferedIO=YES make -B build 2>&1 | tee -a build/logs/build.log | xcbeautify --renderer github-actions && exit ${PIPESTATUS[0]}
shell: bash
- name: Fakesign app
run: make fakesign | tee -a build/logs/build.log
shell: bash
- name: Convert to IPA
run: make ipa | tee -a build/logs/build.log
shell: bash
- name: (Build) Save Xcode & SwiftPM Cache
id: cache-save
if: ${{ steps.xcode-cache-restore.outputs.cache-hit != 'true' }}
uses: actions/cache/save@v3
with:
path: |
~/Library/Developer/Xcode/DerivedData
~/Library/Caches/org.swift.swiftpm
key: xcode-cache-build-${{ github.ref_name }}-${{ github.sha }}
- name: (Build) List Files and Build artifacts
run: |
echo ">>>>>>>>> Workdir <<<<<<<<<<"
ls -la .
echo ""
echo ">>>>>>>>> Build <<<<<<<<<<"
find build -maxdepth 3 -exec ls -ld {} + || true # List contents if directory exists
echo ""
echo ">>>>>>>>> SideStore <<<<<<<<<<"
find SideStore -maxdepth 3 -exec ls -ld {} + || true # List contents if directory exists
echo ""
echo ">>>>>>>>> SideStore.xcarchive <<<<<<<<<<"
find SideStore.xcarchive -maxdepth 3 -exec ls -ld {} + || true # List contents if directory exists
echo ""
echo ">>>>>>>>> Xcode-Derived-Data <<<<<<<<<<"
ls -la ~/Library/Developer/Xcode/DerivedData || true # List contents if directory exists
echo ""
shell: bash
- name: Encrypt build-logs for upload
id: encrypt-build-log
run: |
DEFAULT_BUILD_LOG_PASSWORD=12345
BUILD_LOG_ZIP_PASSWORD=${{ secrets.BUILD_LOG_ZIP_PASSWORD }}
BUILD_LOG_ZIP_PASSWORD=${BUILD_LOG_ZIP_PASSWORD:-$DEFAULT_BUILD_LOG_PASSWORD}
if [ "$BUILD_LOG_ZIP_PASSWORD" == "$DEFAULT_BUILD_LOG_PASSWORD" ]; then
echo "Warning: BUILD_LOG_ZIP_PASSWORD is not set. Defaulting to '${DEFAULT_BUILD_LOG_PASSWORD}'."
fi
pushd build/logs && zip -e -P "$BUILD_LOG_ZIP_PASSWORD" ../../encrypted-build-logs.zip * || popd
echo "::set-output name=encrypted::true"
shell: bash
- name: Upload encrypted-build-logs.zip
id: attach-encrypted-build-log
if: ${{ always() && steps.encrypt-build-log.outputs.encrypted == 'true' }}
uses: actions/upload-artifact@v4
with:
name: encrypted-build-logs-${{ steps.version.outputs.version }}.zip
path: encrypted-build-logs.zip
- name: Upload SideStore.ipa Artifact
uses: actions/upload-artifact@v4
with:
name: SideStore-${{ steps.version.outputs.version }}.ipa
path: SideStore.ipa
- name: Zip dSYMs
run: zip -r -9 ./SideStore.dSYMs.zip ./SideStore.xcarchive/dSYMs
shell: bash
- name: Upload *.dSYM Artifact
uses: actions/upload-artifact@v4
with:
name: SideStore-${{ steps.version.outputs.version }}-dSYMs.zip
path: SideStore.dSYMs.zip
- name: Keep rolling the build numbers for each successful build
if: ${{ inputs.is_beta }}
run: |
pushd SideStore/beta-build-num/
echo "Configure Git user (committer details)"
git config user.name "GitHub Actions"
git config user.email "github-actions@github.com"
echo "Adding files to commit"
git add --verbose build_number.txt
git commit -m " - updated for ${{ inputs.release_tag }} - ${{ inputs.short_commit }} deployment" || echo "No changes to commit"
echo "Pushing to remote repo"
git push --verbose
popd
shell: bash
- name: Get last successful commit
id: get_last_commit
run: |
# Try to get the last successful workflow run commit
LAST_SUCCESS_SHA=$(gh run list --branch "${{ github.ref_name }}" --status success --json headSha --jq '.[0].headSha')
echo "LAST_SUCCESS_SHA=$LAST_SUCCESS_SHA" >> $GITHUB_OUTPUT
echo "LAST_SUCCESS_SHA=$LAST_SUCCESS_SHA" >> $GITHUB_ENV
env:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
shell: bash
- name: Create release notes
run: |
LAST_SUCCESS_SHA=${{ steps.get_last_commit.outputs.LAST_SUCCESS_SHA}}
echo "Last successful commit SHA: $LAST_SUCCESS_SHA"
FROM_COMMIT=$LAST_SUCCESS_SHA
# Check if we got a valid SHA
if [ -z "$LAST_SUCCESS_SHA" ] || [ "$LAST_SUCCESS_SHA" = "null" ]; then
echo "No successful run found, using initial commit of branch"
# Get the first commit of the branch (initial commit)
FROM_COMMIT=$(git rev-list --max-parents=0 HEAD)
fi
python3 update_release_notes.py $FROM_COMMIT ${{ inputs.release_tag }} ${{ github.ref_name }}
# cat release-notes.md
shell: bash
- name: Upload release-notes.md
uses: actions/upload-artifact@v4
with:
name: release-notes-${{ inputs.short_commit }}.md
path: release-notes.md
- name: Upload update_release_notes.py
uses: actions/upload-artifact@v4
with:
name: update_release_notes-${{ inputs.short_commit }}.py
path: update_release_notes.py
- name: Upload update_apps.py
uses: actions/upload-artifact@v4
with:
name: update_apps-${{ inputs.short_commit }}.py
path: update_apps.py
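The three `sed` expressions in the "Set MARKETING_VERSION" step above can be exercised in isolation. A sketch assuming an input already in the `X.Y.Z.YYYY.MM.DD.N+commit` form that step expects (the value below is illustrative):

```shell
# Hypothetical beta version string in the expected form.
raw="0.6.0.2025.02.05.2+abc1234"

# Same extractions the workflow performs.
version=$(echo "$raw" | sed -E 's/^[^0-9]*([0-9]+\.[0-9]+\.[0-9]+).*/\1/')          # 0.6.0
date=$(echo "$raw" | sed -E 's/.*\.([0-9]{4})\.([0-9]{2})\.([0-9]{2})\..*/\1\2\3/')  # 20250205
build_num=$(echo "$raw" | sed -E 's/.*\.([0-9]+)\+.*/\1/')                           # 2

# Recombined exactly as the step does.
echo "${version}-${date}.${build_num}+abc1234"   # 0.6.0-20250205.2+abc1234
```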


@@ -1,281 +0,0 @@
name: SideStore Deploy
on:
workflow_call:
inputs:
is_beta:
type: boolean
publish:
type: boolean
release_name:
type: string
release_tag:
type: string
upstream_tag:
type: string
upstream_name:
type: string
version:
type: string
short_commit:
type: string
marketing_version:
type: string
release_channel:
type: string
bundle_id:
type: string
secrets:
CROSS_REPO_PUSH_KEY:
required: true
# GITHUB_TOKEN:
# required: true
jobs:
deploy:
name: Deploy SideStore - ${{ inputs.release_tag }}
runs-on: macos-15
steps:
- name: Download IPA artifact
uses: actions/download-artifact@v4
with:
name: SideStore-${{ inputs.version }}.ipa
- name: Download dSYM artifact
uses: actions/download-artifact@v4
with:
name: SideStore-${{ inputs.version }}-dSYMs.zip
- name: Download encrypted-build-logs artifact
uses: actions/download-artifact@v4
with:
name: encrypted-build-logs-${{ inputs.version }}.zip
- name: Download encrypted-tests-build-logs artifact
if: ${{ vars.ENABLE_TESTS == '1' && vars.ENABLE_TESTS_BUILD == '1' }}
uses: actions/download-artifact@v4
with:
name: encrypted-tests-build-logs-${{ inputs.short_commit }}.zip
- name: Download encrypted-tests-run-logs artifact
if: ${{ vars.ENABLE_TESTS == '1' && vars.ENABLE_TESTS_RUN == '1' }}
uses: actions/download-artifact@v4
with:
name: encrypted-tests-run-logs-${{ inputs.short_commit }}.zip
- name: Download tests-recording artifact
if: ${{ vars.ENABLE_TESTS == '1' && vars.ENABLE_TESTS_RUN == '1' }}
uses: actions/download-artifact@v4
with:
name: tests-recording-${{ inputs.short_commit }}.mp4
- name: Download test-results artifact
if: ${{ vars.ENABLE_TESTS == '1' && vars.ENABLE_TESTS_RUN == '1' }}
uses: actions/download-artifact@v4
with:
name: test-results-${{ inputs.short_commit }}.zip
- name: Download release-notes.md
uses: actions/download-artifact@v4
with:
name: release-notes-${{ inputs.short_commit }}.md
- name: Download update_release_notes.py
uses: actions/download-artifact@v4
with:
name: update_release_notes-${{ inputs.short_commit }}.py
- name: Download update_apps.py
uses: actions/download-artifact@v4
with:
name: update_apps-${{ inputs.short_commit }}.py
- name: Read release notes
id: release_notes
run: |
CONTENT=$(python3 update_release_notes.py --retrieve ${{ inputs.release_tag }})
echo "content<<EOF" >> $GITHUB_OUTPUT
echo "$CONTENT" >> $GITHUB_OUTPUT
echo "EOF" >> $GITHUB_OUTPUT
shell: bash
- name: List files before upload
run: |
echo ">>>>>>>>> Workdir <<<<<<<<<<"
find . -maxdepth 4 -exec ls -ld {} + || true # List contents if directory exists
echo ""
shell: bash
- name: Get current date
id: date
run: echo "date=$(date -u +'%c')" >> $GITHUB_OUTPUT
shell: bash
- name: Get current date in AltStore date form
id: date_altstore
run: echo "date=$(date -u +'%Y-%m-%d')" >> $GITHUB_OUTPUT
shell: bash
- name: List files to upload
id: list_uploads
run: |
echo ">>>>>>>>> Workdir <<<<<<<<<<"
find . -maxdepth 4 -exec ls -ld {} + || true # List contents if directory exists
echo ""
FILES="SideStore.ipa SideStore.dSYMs.zip encrypted-build-logs.zip"
if [[ "${{ vars.ENABLE_TESTS }}" == "1" && "${{ vars.ENABLE_TESTS_BUILD }}" == "1" ]]; then
FILES="$FILES encrypted-tests-build-logs.zip"
fi
if [[ "${{ vars.ENABLE_TESTS }}" == "1" && "${{ vars.ENABLE_TESTS_RUN }}" == "1" ]]; then
FILES="$FILES encrypted-tests-run-logs.zip test-results.zip tests-recording.mp4"
fi
echo "Final upload list:"
for f in $FILES; do
if [[ -f "$f" ]]; then
echo " ✓ $f"
else
echo " - $f (missing)"
fi
done
echo "files=$FILES" >> $GITHUB_OUTPUT
- name: Set Upstream Recommendation
id: upstream_recommendation
run: |
UPSTREAM_NAME=$(echo "${{ inputs.upstream_name }}" | tr '[:upper:]' '[:lower:]')
if [[ "$UPSTREAM_NAME" != "nightly" ]]; then
echo "content<<EOF" >> $GITHUB_OUTPUT
echo "If you want to try out new features early but want a lower chance of bugs, you can look at [SideStore ${{ inputs.upstream_name }}](https://github.com/${{ github.repository }}/releases?q=${{ inputs.upstream_tag }})." >> $GITHUB_OUTPUT
echo "" >> $GITHUB_OUTPUT
echo "EOF" >> $GITHUB_OUTPUT
else
echo "content=" >> $GITHUB_OUTPUT
fi
shell: bash
- name: Upload to releases
uses: IsaacShelton/update-existing-release@v1.3.1
with:
token: ${{ secrets.GITHUB_TOKEN }}
release: ${{ inputs.release_name }}
tag: ${{ inputs.release_tag }}
prerelease: ${{ inputs.is_beta }}
files: ${{ steps.list_uploads.outputs.files }}
body: |
This is an ⚠️ **EXPERIMENTAL** ⚠️ ${{ inputs.release_name }} build for commit [${{ github.sha }}](https://github.com/${{ github.repository }}/commit/${{ github.sha }}).
${{ inputs.release_name }} builds are **extremely experimental builds only meant to be used by developers and beta testers. They often contain bugs and experimental features. Use at your own risk!**
${{ steps.upstream_recommendation.outputs.content }}
## Build Info
Built at (UTC): `${{ steps.date.outputs.date }}`
Built at (UTC date): `${{ steps.date_altstore.outputs.date }}`
Commit SHA: `${{ github.sha }}`
Version: `${{ inputs.version }}`
${{ steps.release_notes.outputs.content }}
- name: Get formatted date
run: |
FORMATTED_DATE=$(date -u +"%Y-%m-%dT%H:%M:%SZ")
echo "Formatted date: $FORMATTED_DATE"
echo "FORMATTED_DATE=$FORMATTED_DATE" >> $GITHUB_ENV
shell: bash
- name: Get size of IPA in bytes (macOS/Linux)
run: |
if [[ "$(uname)" == "Darwin" ]]; then
# macOS
IPA_SIZE=$(stat -f %z SideStore.ipa)
else
# Linux
IPA_SIZE=$(stat -c %s SideStore.ipa)
fi
echo "IPA size in bytes: $IPA_SIZE"
echo "IPA_SIZE=$IPA_SIZE" >> $GITHUB_ENV
shell: bash
- name: Compute SHA-256 of IPA
run: |
SHA256_HASH=$(shasum -a 256 SideStore.ipa | awk '{ print $1 }')
echo "SHA-256 Hash: $SHA256_HASH"
echo "SHA256_HASH=$SHA256_HASH" >> $GITHUB_ENV
shell: bash
- name: Set Release Info variables
run: |
echo "IS_BETA=${{ inputs.is_beta }}" >> $GITHUB_ENV
echo "BUNDLE_IDENTIFIER=${{ inputs.bundle_id }}" >> $GITHUB_ENV
echo "VERSION_IPA=${{ inputs.marketing_version }}" >> $GITHUB_ENV
echo "VERSION_DATE=$FORMATTED_DATE" >> $GITHUB_ENV
echo "RELEASE_CHANNEL=${{ inputs.release_channel }}" >> $GITHUB_ENV
echo "SIZE=$IPA_SIZE" >> $GITHUB_ENV
echo "SHA256=$SHA256_HASH" >> $GITHUB_ENV
echo "DOWNLOAD_URL=https://github.com/SideStore/SideStore/releases/download/${{ inputs.release_tag }}/SideStore.ipa" >> $GITHUB_ENV
# Format localized description
get_description() {
cat <<EOF
This is a release for:
- version: "${{ inputs.version }}"
- revision: "${{ inputs.short_commit }}"
- timestamp: "${{ steps.date.outputs.date }}"
Release Notes:
${{ steps.release_notes.outputs.content }}
EOF
}
LOCALIZED_DESCRIPTION=$(get_description)
echo "$LOCALIZED_DESCRIPTION"
# multiline strings
echo "LOCALIZED_DESCRIPTION<<EOF" >> $GITHUB_ENV
echo "$LOCALIZED_DESCRIPTION" >> $GITHUB_ENV
echo "EOF" >> $GITHUB_ENV
shell: bash
- name: Check if Publish updates is set
id: check_publish
run: |
echo "Publish updates to source.json = ${{ inputs.publish }}"
shell: bash
- name: Checkout SideStore/apps-v2.json
if: ${{ inputs.is_beta && inputs.publish }}
uses: actions/checkout@v4
with:
repository: 'SideStore/apps-v2.json'
ref: 'main' # this branch is shared by all beta builds, so beta build workflows are serialized
token: ${{ secrets.CROSS_REPO_PUSH_KEY }}
path: 'SideStore/apps-v2.json'
# for stable builds, let the user manually edit the source.json
- name: Publish to SideStore/apps-v2.json
if: ${{ inputs.is_beta && inputs.publish }}
id: publish-release
shell: bash
run: |
# Copy and execute the update script
pushd SideStore/apps-v2.json/
# Configure Git user (committer details)
git config user.name "GitHub Actions"
git config user.email "github-actions@github.com"
# update the source.json
python3 ../../update_apps.py "./_includes/source.json"
# Commit changes and push using SSH
git add --verbose ./_includes/source.json
git commit -m " - updated for ${{ inputs.short_commit }} deployment" || echo "No changes to commit"
git push --verbose
popd


@@ -1,24 +0,0 @@
name: SideStore Shared
on:
workflow_call:
outputs:
short-commit:
value: ${{ jobs.shared.outputs.short-commit }}
jobs:
shared:
name: Shared Steps
strategy:
fail-fast: false
runs-on: 'macos-15'
steps:
- name: Set short commit hash
id: commit-id
run: |
# SHORT_COMMIT="${{ github.sha }}"
SHORT_COMMIT=${GITHUB_SHA:0:7}
echo "Short commit hash: $SHORT_COMMIT"
echo "SHORT_COMMIT=$SHORT_COMMIT" >> $GITHUB_OUTPUT
outputs:
short-commit: ${{ steps.commit-id.outputs.SHORT_COMMIT }}
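The SHORT_COMMIT derivation above is bash substring expansion on `GITHUB_SHA`. A sketch with a made-up SHA:

```shell
# Illustrative full SHA; in the workflow this comes from the GITHUB_SHA env var.
GITHUB_SHA="5d0ac4bf53aabbccddeeff00112233445566"

# ${var:offset:length} is a bash-ism; the step runs under bash on the runner.
SHORT_COMMIT=${GITHUB_SHA:0:7}
echo "$SHORT_COMMIT"   # 5d0ac4b
```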


@@ -1,165 +0,0 @@
name: SideStore Tests Build
on:
workflow_call:
inputs:
release_tag:
type: string
short_commit:
type: string
secrets:
BUILD_LOG_ZIP_PASSWORD:
required: false
jobs:
tests-build:
name: Tests-Build SideStore - ${{ inputs.release_tag }}
if: ${{ vars.ENABLE_TESTS == '1' && vars.ENABLE_TESTS_BUILD == '1' }}
strategy:
fail-fast: false
matrix:
include:
- os: 'macos-26'
version: '26.0'
runs-on: ${{ matrix.os }}
steps:
- name: Checkout code
uses: actions/checkout@v4
with:
submodules: recursive
- name: Install dependencies - xcbeautify
run: |
brew install xcbeautify
shell: bash
- name: Setup Xcode
uses: maxim-lobanov/setup-xcode@v1.6.0
with:
xcode-version: '26.0'
# - name: (Tests-Build) Cache Build
# uses: irgaly/xcode-cache@v1.8.1
# with:
# key: xcode-cache-deriveddata-test-${{ github.ref_name }}-${{ github.sha }}
# # tests shouldn't restore cache unless it is same build
# # restore-keys: xcode-cache-deriveddata-test-${{ github.ref_name }}-
# swiftpm-cache-key: xcode-cache-sourcedata-test-${{ github.ref_name }}-${{ github.sha }}
# swiftpm-cache-restore-keys: |
# xcode-cache-sourcedata-test-${{ github.ref_name }}-
# delete-used-deriveddata-cache: true
- name: (Tests-Build) Restore Xcode & SwiftPM Cache (Exact match)
id: xcode-cache-restore
uses: actions/cache/restore@v3
with:
path: |
~/Library/Developer/Xcode/DerivedData
~/Library/Caches/org.swift.swiftpm
key: xcode-cache-tests-${{ github.ref_name }}-${{ github.sha }}
- name: (Tests-Build) Restore Xcode & SwiftPM Cache (Last Available)
id: xcode-cache-restore-recent
uses: actions/cache/restore@v3
with:
path: |
~/Library/Developer/Xcode/DerivedData
~/Library/Caches/org.swift.swiftpm
key: xcode-cache-tests-${{ github.ref_name }}-
- name: Clean Derived Data (if required)
if: ${{ vars.PERFORM_CLEAN_TESTS_BUILD == '1' }}
run: |
rm -rf ~/Library/Developer/Xcode/DerivedData/
make clean
xcodebuild clean
shell: bash
- name: (Tests-Build) Clean previous build artifacts
run: |
make clean
mkdir -p build/logs
shell: bash
- name: (Tests-Build) List Files and derived data
shell: bash
run: |
echo ">>>>>>>>> Workdir <<<<<<<<<<"
ls -la .
echo ""
echo ">>>>>>>>> SideStore <<<<<<<<<<"
find SideStore -maxdepth 2 -exec ls -ld {} + || true # List contents if directory exists
echo ""
echo ">>>>>>>>> Dependencies <<<<<<<<<<"
find Dependencies -maxdepth 2 -exec ls -ld {} + || true # List contents if directory exists
echo ""
echo ">>>>>>>>> Xcode-Derived-Data <<<<<<<<<<"
ls -la ~/Library/Developer/Xcode/DerivedData || true # List contents if directory exists
echo ""
- name: Build SideStore Tests
# using 'tee' to intercept stdout and log for detailed build-log
shell: bash
run: |
NSUnbufferedIO=YES make -B build-tests 2>&1 | tee -a build/logs/tests-build.log | xcbeautify --renderer github-actions && exit ${PIPESTATUS[0]}
- name: (Tests-Build) Save Xcode & SwiftPM Cache
id: cache-save
if: ${{ steps.xcode-cache-restore.outputs.cache-hit != 'true' }}
uses: actions/cache/save@v3
with:
path: |
~/Library/Developer/Xcode/DerivedData
~/Library/Caches/org.swift.swiftpm
key: xcode-cache-tests-${{ github.ref_name }}-${{ github.sha }}
- name: (Tests-Build) List Files and Build artifacts
if: always()
shell: bash
run: |
echo ">>>>>>>>> Workdir <<<<<<<<<<"
ls -la .
echo ""
echo ">>>>>>>>> Build <<<<<<<<<<"
find build -maxdepth 3 -exec ls -ld {} + || true # List contents if directory exists
echo ""
echo ">>>>>>>>> Xcode-Derived-Data <<<<<<<<<<"
find ~/Library/Developer/Xcode/DerivedData -maxdepth 8 -exec ls -ld {} + | grep "Build/Products" >> tests-build-deriveddata.txt || true
echo ""
- uses: actions/upload-artifact@v4
if: always()
with:
name: tests-build-deriveddata-${{ inputs.short_commit }}.txt
path: tests-build-deriveddata.txt
- name: Encrypt tests-build-logs for upload
id: encrypt-test-log
if: always()
shell: bash
run: |
DEFAULT_BUILD_LOG_PASSWORD=12345
BUILD_LOG_ZIP_PASSWORD=${{ secrets.BUILD_LOG_ZIP_PASSWORD }}
BUILD_LOG_ZIP_PASSWORD=${BUILD_LOG_ZIP_PASSWORD:-$DEFAULT_BUILD_LOG_PASSWORD}
if [ "$BUILD_LOG_ZIP_PASSWORD" == "$DEFAULT_BUILD_LOG_PASSWORD" ]; then
echo "Warning: BUILD_LOG_ZIP_PASSWORD is not set. Defaulting to '${DEFAULT_BUILD_LOG_PASSWORD}'."
fi
pushd build/logs && zip -e -P "$BUILD_LOG_ZIP_PASSWORD" ../../encrypted-tests-build-logs.zip * || popd
echo "::set-output name=encrypted::true"
- name: Upload encrypted-tests-build-logs.zip
id: attach-encrypted-test-log
if: always() && steps.encrypt-test-log.outputs.encrypted == 'true'
uses: actions/upload-artifact@v4
with:
name: encrypted-tests-build-logs-${{ inputs.short_commit }}.zip
path: encrypted-tests-build-logs.zip
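The encrypt steps in these workflows all use the standard `${VAR:-default}` parameter-expansion fallback so a missing `BUILD_LOG_ZIP_PASSWORD` secret does not break zipping. A minimal sketch simulating an unset secret:

```shell
DEFAULT_BUILD_LOG_PASSWORD=12345
BUILD_LOG_ZIP_PASSWORD=""   # simulate the secret being unset/empty

# Falls back to the default when the variable is unset or empty.
BUILD_LOG_ZIP_PASSWORD=${BUILD_LOG_ZIP_PASSWORD:-$DEFAULT_BUILD_LOG_PASSWORD}
if [ "$BUILD_LOG_ZIP_PASSWORD" = "$DEFAULT_BUILD_LOG_PASSWORD" ]; then
  echo "Warning: BUILD_LOG_ZIP_PASSWORD is not set. Defaulting to '${DEFAULT_BUILD_LOG_PASSWORD}'."
fi
echo "$BUILD_LOG_ZIP_PASSWORD"   # 12345
```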


@@ -1,196 +0,0 @@
name: SideStore Tests Run
on:
workflow_call:
inputs:
release_tag:
type: string
short_commit:
type: string
secrets:
BUILD_LOG_ZIP_PASSWORD:
required: false
jobs:
tests-run:
name: Tests-Run SideStore - ${{ inputs.release_tag }}
if: ${{ vars.ENABLE_TESTS == '1' && vars.ENABLE_TESTS_RUN == '1' }}
strategy:
fail-fast: false
matrix:
include:
- os: 'macos-26'
version: '26.0'
runs-on: ${{ matrix.os }}
steps:
- name: Checkout code
uses: actions/checkout@v4
with:
submodules: recursive
- name: Boot Simulator async(nohup) for testing
run: |
mkdir -p build/logs
nohup make -B boot-sim-async </dev/null >> build/logs/tests-run.log 2>&1 &
shell: bash
- name: Setup Xcode
uses: maxim-lobanov/setup-xcode@v1.6.0
with:
xcode-version: '26.0'
# - name: (Tests-Run) Cache Build
# uses: irgaly/xcode-cache@v1.8.1
# with:
# # This comes from
# key: xcode-cache-deriveddata-test-${{ github.ref_name }}-${{ github.sha }}
# swiftpm-cache-key: xcode-cache-sourcedata-test-${{ github.ref_name }}-${{ github.sha }}
- name: (Tests-Build) Restore Xcode & SwiftPM Cache (Exact match) [from tests-build job]
id: xcode-cache-restore
uses: actions/cache/restore@v3
with:
path: |
~/Library/Developer/Xcode/DerivedData
~/Library/Caches/org.swift.swiftpm
key: xcode-cache-tests-${{ github.ref_name }}-${{ github.sha }}
- name: (Tests-Run) Clean previous build artifacts
run: |
make clean
mkdir -p build/logs
shell: bash
- name: (Tests-Run) List Files and derived data
shell: bash
run: |
echo ">>>>>>>>> Workdir <<<<<<<<<<"
ls -la .
echo ""
echo ">>>>>>>>> SideStore <<<<<<<<<<"
find SideStore -maxdepth 2 -exec ls -ld {} + || true # List contents if directory exists
echo ""
echo ">>>>>>>>> Dependencies <<<<<<<<<<"
find Dependencies -maxdepth 2 -exec ls -ld {} + || true # List contents if directory exists
echo ""
echo ">>>>>>>>> Xcode-Derived-Data <<<<<<<<<<"
find ~/Library/Developer/Xcode/DerivedData -maxdepth 8 -exec ls -ld {} + | grep "Build/Products" >> tests-run-deriveddata.txt || true
echo ""
- uses: actions/upload-artifact@v4
if: always()
with:
name: tests-run-deriveddata-${{ inputs.short_commit }}.txt
path: tests-run-deriveddata.txt
# we expect the simulator to have booted by now; fail the job otherwise
- name: Simulator Boot Check
run: |
mkdir -p build/logs
make -B sim-boot-check | tee -a build/logs/tests-run.log
exit ${PIPESTATUS[0]}
shell: bash
- name: Start Recording UI tests (if DEBUG_RECORD_TESTS is set to 1)
if: ${{ vars.DEBUG_RECORD_TESTS == '1' }}
run: |
nohup xcrun simctl io booted recordVideo -f tests-recording.mp4 --codec h264 </dev/null > tests-recording.log 2>&1 &
RECORD_PID=$!
echo "RECORD_PID=$RECORD_PID" >> $GITHUB_ENV
shell: bash
- name: Run SideStore Tests
# using 'tee' to intercept stdout and log for detailed build-log
run: |
make run-tests 2>&1 | tee -a build/logs/tests-run.log && exit ${PIPESTATUS[0]}
# NSUnbufferedIO=YES make -B run-tests 2>&1 | tee build/logs/tests-run.log | xcpretty -r junit --output ./build/tests/test-results.xml && exit ${PIPESTATUS[0]}
shell: bash
- name: Stop Recording tests
if: ${{ always() && env.RECORD_PID != '' }}
run: |
kill -INT ${{ env.RECORD_PID }}
shell: bash
- name: (Tests-Run) List Files and Build artifacts
if: always()
run: |
echo ">>>>>>>>> Workdir <<<<<<<<<<"
ls -la .
echo ""
echo ">>>>>>>>> Build <<<<<<<<<<"
find build -maxdepth 3 -exec ls -ld {} + || true # List contents if directory exists
echo ""
shell: bash
- name: Encrypt tests-run-logs for upload
id: encrypt-test-log
if: always()
run: |
DEFAULT_BUILD_LOG_PASSWORD=12345
BUILD_LOG_ZIP_PASSWORD=${{ secrets.BUILD_LOG_ZIP_PASSWORD }}
BUILD_LOG_ZIP_PASSWORD=${BUILD_LOG_ZIP_PASSWORD:-$DEFAULT_BUILD_LOG_PASSWORD}
if [ "$BUILD_LOG_ZIP_PASSWORD" == "$DEFAULT_BUILD_LOG_PASSWORD" ]; then
echo "Warning: BUILD_LOG_ZIP_PASSWORD is not set. Defaulting to '${DEFAULT_BUILD_LOG_PASSWORD}'."
fi
pushd build/logs && zip -e -P "$BUILD_LOG_ZIP_PASSWORD" ../../encrypted-tests-run-logs.zip * || popd
echo "encrypted=true" >> "$GITHUB_OUTPUT"
shell: bash
- name: Upload encrypted-tests-run-logs.zip
id: attach-encrypted-test-log
if: always() && steps.encrypt-test-log.outputs.encrypted == 'true'
uses: actions/upload-artifact@v4
with:
name: encrypted-tests-run-logs-${{ inputs.short_commit }}.zip
path: encrypted-tests-run-logs.zip
- name: Print tests-recording.log contents (if exists)
if: ${{ always() && env.RECORD_PID != '' }}
run: |
if [ -f tests-recording.log ]; then
echo "tests-recording.log found. Its contents:"
cat tests-recording.log
else
echo "tests-recording.log not found."
fi
shell: bash
- name: Check for tests-recording.mp4 presence
id: check-recording
if: ${{ always() && env.RECORD_PID != '' }}
run: |
if [ -f tests-recording.mp4 ]; then
echo "found=true" >> "$GITHUB_OUTPUT"
echo "tests-recording.mp4 found."
else
echo "tests-recording.mp4 not found, skipping upload."
echo "found=false" >> "$GITHUB_OUTPUT"
fi
shell: bash
- name: Upload tests-recording.mp4
id: upload-recording
if: ${{ always() && steps.check-recording.outputs.found == 'true' }}
uses: actions/upload-artifact@v4
with:
name: tests-recording-${{ inputs.short_commit }}.mp4
path: tests-recording.mp4
- name: Zip test-results
run: zip -r -9 ./test-results.zip ./build/tests
shell: bash
- name: Upload Test Artifacts
uses: actions/upload-artifact@v4
with:
name: test-results-${{ inputs.short_commit }}.zip
path: test-results.zip

View File

@@ -1,242 +1,135 @@
name: Stable SideStore build
on:
push:
tags:
- '[0-9]+.[0-9]+.[0-9]+' # example: 1.0.0
- '[0-9]+.[0-9]+.[0-9]+'
workflow_dispatch:
concurrency:
group: ${{ github.ref }}
cancel-in-progress: true
jobs:
build:
name: Build SideStore - stable (on tag push)
strategy:
fail-fast: false
matrix:
include:
- os: 'macos-26'
version: '26.0'
runs-on: ${{ matrix.os }}
name: Build SideStore - stable
runs-on: macos-26
env:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
CHANNEL: stable
UPSTREAM_CHANNEL: ""
steps:
- name: Checkout code
uses: actions/checkout@v4
- uses: actions/checkout@v4
with:
submodules: recursive
fetch-depth: 0
- name: Echo Build.xcconfig
- name: Find Last Successful commit
run: |
echo "cat Build.xcconfig"
cat Build.xcconfig
shell: bash
LAST_SUCCESSFUL_COMMIT=$(python3 scripts/ci/workflow.py last-successful-commit \
"true" || echo "")
echo "LAST_SUCCESSFUL_COMMIT=$LAST_SUCCESSFUL_COMMIT" | tee -a $GITHUB_ENV
# - name: Change MARKETING_VERSION to the pushed tag that triggered this build
# run: sed -e '/MARKETING_VERSION = .*/s/= .*/= ${{ github.ref_name }}/' -i '' Build.xcconfig
- run: brew install ldid xcbeautify
- name: Echo Updated Build.xcconfig
- name: Setup Env
run: |
cat Build.xcconfig
shell: bash
MARKETING_VERSION=$(python3 scripts/ci/workflow.py get-marketing-version)
SHORT_COMMIT=$(python3 scripts/ci/workflow.py commit-id)
- name: Extract MARKETING_VERSION from Build.xcconfig
id: version
run: |
version=$(grep MARKETING_VERSION Build.xcconfig | sed -e 's/MARKETING_VERSION = //g')
echo "version=$version" >> $GITHUB_OUTPUT
echo "version=$version"
echo "MARKETING_VERSION=$version" >> $GITHUB_ENV
echo "MARKETING_VERSION=$version" >> $GITHUB_OUTPUT
echo "MARKETING_VERSION=$version"
shell: bash
- name: Fail the build if pushed tag and embedded MARKETING_VERSION in Build.xcconfig are mismatching
run: |
if [ "$MARKETING_VERSION" != "${{ github.ref_name }}" ]; then
echo 'Version mismatch: $tag != $marketing_version ... '
echo " expected-tag : $MARKETING_VERSION"
echo " pushed-tag : ${{ github.ref_name }}"
echo "Version mismatch"
echo "Build.xcconfig: $MARKETING_VERSION"
echo "Tag: ${{ github.ref_name }}"
exit 1
fi
echo 'Version matches: $tag == $marketing_version ... '
echo " expected-tag : $MARKETING_VERSION"
echo " pushed-tag : ${{ github.ref_name }}"
shell: bash
- name: Install dependencies - ldid & xcbeautify
run: |
brew install ldid xcbeautify
echo "MARKETING_VERSION=$MARKETING_VERSION" | tee -a $GITHUB_ENV
echo "SHORT_COMMIT=$SHORT_COMMIT" | tee -a $GITHUB_ENV
- name: Setup Xcode
uses: maxim-lobanov/setup-xcode@v1.6.0
with:
xcode-version: ${{ matrix.version }}
xcode-version: "26.0"
- name: (Build) Restore Xcode & SwiftPM Cache (Exact match)
id: xcode-cache-restore
- name: Restore Cache (exact)
id: xcode-cache-exact
uses: actions/cache/restore@v3
with:
path: |
~/Library/Developer/Xcode/DerivedData
~/Library/Caches/org.swift.swiftpm
key: xcode-cache-build-stable-${{ github.sha }}
key: xcode-build-cache-stable-${{ github.sha }}
- name: (Build) Restore Xcode & SwiftPM Cache (Last Available)
id: xcode-cache-restore-recent
- name: Restore Cache (last)
if: steps.xcode-cache-exact.outputs.cache-hit != 'true'
id: xcode-cache-fallback
uses: actions/cache/restore@v3
with:
path: |
~/Library/Developer/Xcode/DerivedData
~/Library/Caches/org.swift.swiftpm
key: xcode-cache-build-stable-
key: xcode-build-cache-stable-
- name: (Build) Clean previous build artifacts
- name: Build
id: build
env:
BUILD_LOG_ZIP_PASSWORD: ${{ secrets.BUILD_LOG_ZIP_PASSWORD }}
run: |
make clean
mkdir -p build/logs
shell: bash
python3 scripts/ci/workflow.py build; STATUS=$?
python3 scripts/ci/workflow.py encrypt-build
exit $STATUS
- name: (Build) List Files and derived data
if: always()
shell: bash
run: |
echo ">>>>>>>>> Workdir <<<<<<<<<<"
ls -la .
echo ""
echo ">>>>>>>>> SideStore <<<<<<<<<<"
find SideStore -maxdepth 2 -exec ls -ld {} + || true # List contents if directory exists
echo ""
echo ">>>>>>>>> Dependencies <<<<<<<<<<"
find Dependencies -maxdepth 2 -exec ls -ld {} + || true # List contents if directory exists
echo ""
echo ">>>>>>>>> Xcode-Derived-Data <<<<<<<<<<"
ls -la ~/Library/Developer/Xcode/DerivedData || true # List contents if directory exists
echo ""
- name: Build SideStore.xcarchive
# using 'tee' to intercept stdout and log for detailed build-log
run: |
NSUnbufferedIO=YES make -B build 2>&1 | tee -a build/logs/build.log | xcbeautify --renderer github-actions && exit ${PIPESTATUS[0]}
shell: bash
- name: Fakesign app
run: make fakesign | tee -a build/logs/build.log
shell: bash
- name: Convert to IPA
run: make ipa | tee -a build/logs/build.log
shell: bash
- name: (Build) Save Xcode & SwiftPM Cache
id: cache-save
if: ${{ steps.xcode-cache-restore.outputs.cache-hit != 'true' }}
- name: Save Cache
if: ${{ steps.xcode-cache-fallback.outputs.cache-hit != 'true' }}
uses: actions/cache/save@v3
with:
path: |
~/Library/Developer/Xcode/DerivedData
~/Library/Caches/org.swift.swiftpm
key: xcode-cache-build-stable-${{ github.sha }}
- name: (Build) List Files and Build artifacts
run: |
echo ">>>>>>>>> Workdir <<<<<<<<<<"
ls -la .
echo ""
key: xcode-build-cache-stable-${{ github.sha }}
echo ">>>>>>>>> Build <<<<<<<<<<"
find build -maxdepth 3 -exec ls -ld {} + || true # List contents if directory exists
echo ""
echo ">>>>>>>>> SideStore <<<<<<<<<<"
find SideStore -maxdepth 3 -exec ls -ld {} + || true # List contents if directory exists
echo ""
echo ">>>>>>>>> SideStore.xcarchive <<<<<<<<<<"
find SideStore.xcarchive -maxdepth 3 -exec ls -ld {} + || true # List contents if directory exists
echo ""
echo ">>>>>>>>> Xcode-Derived-Data <<<<<<<<<<"
ls -la ~/Library/Developer/Xcode/DerivedData || true # List contents if directory exists
echo ""
shell: bash
- name: Encrypt build-logs for upload
id: encrypt-build-log
run: |
DEFAULT_BUILD_LOG_PASSWORD=12345
BUILD_LOG_ZIP_PASSWORD=${{ secrets.BUILD_LOG_ZIP_PASSWORD }}
BUILD_LOG_ZIP_PASSWORD=${BUILD_LOG_ZIP_PASSWORD:-$DEFAULT_BUILD_LOG_PASSWORD}
if [ "$BUILD_LOG_ZIP_PASSWORD" == "$DEFAULT_BUILD_LOG_PASSWORD" ]; then
echo "Warning: BUILD_LOG_ZIP_PASSWORD is not set. Defaulting to '${DEFAULT_BUILD_LOG_PASSWORD}'."
fi
pushd build/logs && zip -e -P "$BUILD_LOG_ZIP_PASSWORD" ../../encrypted-build-logs.zip * || popd
echo "::set-output name=encrypted::true"
shell: bash
- name: Upload encrypted-build-logs.zip
id: attach-encrypted-build-log
if: ${{ always() && steps.encrypt-build-log.outputs.encrypted == 'true' }}
uses: actions/upload-artifact@v4
- uses: actions/upload-artifact@v4
with:
name: encrypted-build-logs-${{ steps.version.outputs.version }}.zip
path: encrypted-build-logs.zip
name: build-logs-${{ env.MARKETING_VERSION }}.zip
path: build-logs.zip
- name: Upload SideStore.ipa Artifact
uses: actions/upload-artifact@v4
- uses: actions/upload-artifact@v4
with:
name: SideStore-${{ steps.version.outputs.version }}.ipa
name: SideStore-${{ env.MARKETING_VERSION }}.ipa
path: SideStore.ipa
- name: Zip dSYMs
run: zip -r -9 ./SideStore.dSYMs.zip ./SideStore.xcarchive/dSYMs
shell: bash
- name: Upload *.dSYM Artifact
uses: actions/upload-artifact@v4
- uses: actions/upload-artifact@v4
with:
name: SideStore-${{ steps.version.outputs.version }}-dSYMs.zip
name: SideStore-${{ env.MARKETING_VERSION }}-dSYMs.zip
path: SideStore.dSYMs.zip
- name: Get current date
id: date
run: echo "date=$(date -u +'%c')" >> $GITHUB_OUTPUT
shell: bash
- name: Generate Metadata
run: |
python3 scripts/ci/workflow.py dump-project-settings
PRODUCT_NAME=$(python3 scripts/ci/workflow.py read-product-name)
BUNDLE_ID=$(python3 scripts/ci/workflow.py read-bundle-id)
IPA_NAME="$PRODUCT_NAME.ipa"
- name: Get current date in AltStore date form
id: date_altstore
run: echo "date=$(date -u +'%Y-%m-%d')" >> $GITHUB_OUTPUT
shell: bash
python3 scripts/ci/workflow.py generate-metadata \
"${{ github.ref_name }}" \
"$SHORT_COMMIT" \
"$MARKETING_VERSION" \
"$CHANNEL" \
"$BUNDLE_ID" \
"$IPA_NAME" \
"$LAST_SUCCESSFUL_COMMIT"
- name: Upload to releases
uses: IsaacShelton/update-existing-release@v1.3.1
with:
token: ${{ secrets.GITHUB_TOKEN }}
draft: true
release: ${{ github.ref_name }} # name
tag: ${{ github.ref_name }}
# stick with what the user pushed; do not auto-update to the latest commit.
# ex: to roll back to a previous release because of a hot issue, a dev can create a new tag
# pointing at that older working commit so it ships as an update. In that case the tag
# must not be auto-updated to latest.
updateTag: false
prerelease: false
files: >
SideStore.ipa
SideStore.dSYMs.zip
encrypted-build-logs.zip
body: |
<!-- NOTE: to reset SideSource cache, go to `https://apps.sidestore.io/reset-cache/nightly/<sidesource key>`. This is not included in the GitHub Action since it makes draft releases so they can be edited and have a changelog. -->
## Changelog
- TODO
## Build Info
Built at (UTC): `${{ steps.date.outputs.date }}`
Built at (UTC date): `${{ steps.date_altstore.outputs.date }}`
Commit SHA: `${{ github.sha }}`
Version: `${{ steps.version.outputs.version }}`
env:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
python3 scripts/ci/workflow.py upload-release \
"${{ github.ref_name }}" \
"${{ github.ref_name }}" \
"$GITHUB_SHA" \
"$GITHUB_REPOSITORY" \
"$UPSTREAM_CHANNEL" \
"true"

.gitignore vendored
View File

@@ -69,4 +69,7 @@ SideStore/.skip-prebuilt-fetch-em_proxy
test-recording.mp4
test-recording.log
altstore-sources.md
local-build.sh
local-build.sh
source-metadata.json
release-notes.md

File diff suppressed because it is too large

View File

@@ -92,7 +92,6 @@ final class AppDelegate: UIResponder, UIApplicationDelegate {
}
}
self.setTintColor()
self.setTintColor()
self.prepareImageCache()

View File

@@ -368,7 +368,7 @@ private extension FeaturedViewController
#keyPath(StoreApp._source._apps),
#keyPath(StoreApp.bundleIdentifier),
StoreApp.altstoreAppID,
#keyPath(StoreApp.installedApp),
#keyPath(StoreApp.installedApp)
)
let primaryFetchRequest = fetchRequest.copy() as! NSFetchRequest<StoreApp>

View File

@@ -997,6 +997,7 @@ extension AppManager
case .failure(let error): completionHandler(.failure(error))
case .success(let installedApp): completionHandler(.success(installedApp))
}
//UIApplication.shared.open(shortcutURLon, options: [:], completionHandler: nil)
}
installOperation.addDependency(sendAppOperation)

View File

@@ -11,6 +11,8 @@ import AltStoreCore
import AltSign
import Roxas
let shortcutURLonDelay = URL(string: "shortcuts://run-shortcut?name=TurnOnDataDelay")!
@objc(InstallAppOperation)
final class InstallAppOperation: ResultOperation<InstalledApp>
{
@@ -176,6 +178,13 @@ final class InstallAppOperation: ResultOperation<InstalledApp>
var installing = true
if installedApp.storeApp?.bundleIdentifier.range(of: Bundle.Info.appbundleIdentifier) != nil {
do {
// we need to flush changes to disk now in case they are lost when iOS kills the current process
try installedApp.managedObjectContext?.save()
} catch {
print("Failed to flush installedApp to disk: \(error)")
}
// Reinstalling ourself will hang until we leave the app, so we need to exit it without force closing
DispatchQueue.main.asyncAfter(deadline: .now() + 3) {
if UIApplication.shared.applicationState != .active {
@@ -204,6 +213,10 @@ final class InstallAppOperation: ResultOperation<InstalledApp>
let alert = UIAlertController(title: "Finish Refresh", message: "To finish refreshing, SideStore must be moved to the background. You can either go to the Home Screen manually or hit Continue. Please reopen SideStore after the process is finished.", preferredStyle: .alert)
alert.addAction(UIAlertAction(title: NSLocalizedString("Continue", comment: ""), style: .default, handler: { _ in
print("Going home")
// Cell Shortcut
UIApplication.shared.open(shortcutURLonDelay, options: [:]) { _ in
print("Cell OFF Shortcut finished execution.")}
UIApplication.shared.perform(#selector(NSXPCConnection.suspend))
}))
@@ -220,6 +233,8 @@ final class InstallAppOperation: ResultOperation<InstalledApp>
}
}
}
// Cell Shortcut
UIApplication.shared.open(shortcutURLonDelay, options: [:]) { _ in print("Cell OFF Shortcut finished execution.")}
UIApplication.shared.perform(#selector(NSXPCConnection.suspend))
}
}

View File

@@ -211,7 +211,6 @@ private extension PatchAppOperation
#if targetEnvironment(simulator)
throw PatchAppError.unsupportedOperatingSystemVersion(ProcessInfo.processInfo.operatingSystemVersion)
#else
let spotlightPath = "Applications/Spotlight.app/Spotlight"
let spotlightFileURL = self.patchDirectory.appendingPathComponent(spotlightPath)

View File

@@ -25,39 +25,48 @@ final class SendAppOperation: ResultOperation<()>
self.progress.totalUnitCount = 1
}
override func main()
{
override func main() {
super.main()
if let error = self.context.error
{
if let error = self.context.error {
return self.finish(.failure(error))
}
guard let resignedApp = self.context.resignedApp else {
return self.finish(.failure(OperationError.invalidParameters("SendAppOperation.main: self.resignedApp is nil")))
}
// self.context.resignedApp.fileURL points to the app bundle, but we want the .ipa.
let shortcutURLoff = URL(string: "shortcuts://run-shortcut?name=TurnOffData")!
let app = AnyApp(name: resignedApp.name, bundleIdentifier: self.context.bundleIdentifier, url: resignedApp.fileURL, storeApp: nil)
let fileURL = InstalledApp.refreshedIPAURL(for: app)
print("AFC App `fileURL`: \(fileURL.absoluteString)")
if let data = NSData(contentsOf: fileURL) {
do {
let bytes = Data(data)
try yeetAppAFC(app.bundleIdentifier, bytes)
self.progress.completedUnitCount += 1
self.finish(.success(()))
} catch {
self.finish(.failure(MinimuxerError.RwAfc))
self.progress.completedUnitCount += 1
self.finish(.success(()))
// Wait for Shortcut to Finish Before Proceeding
UIApplication.shared.open(shortcutURLoff, options: [:]) { _ in
print("Shortcut finished execution. Proceeding with file transfer.")
DispatchQueue.global().async {
self.processFile(at: fileURL, for: app.bundleIdentifier)
}
} else {
}
}
private func processFile(at fileURL: URL, for bundleIdentifier: String) {
guard let data = NSData(contentsOf: fileURL) else {
print("IPA doesn't exist????")
self.finish(.failure(OperationError(.appNotFound(name: resignedApp.name))))
return self.finish(.failure(OperationError(.appNotFound(name: bundleIdentifier))))
}
do {
let bytes = Data(data)
try yeetAppAFC(bundleIdentifier, bytes)
self.progress.completedUnitCount += 1
self.finish(.success(()))
} catch {
self.finish(.failure(MinimuxerError.RwAfc))
self.progress.completedUnitCount += 1
self.finish(.success(()))
}
}
}

View File

@@ -399,6 +399,40 @@ private extension DatabaseManager
// For backwards compatibility reasons, we cannot use localApp's buildVersion as storeBuildVersion,
// or else the latest update will _always_ be considered new because we don't use buildVersions in our source (yet).
installedApp = InstalledApp(resignedApp: localApp, originalBundleIdentifier: StoreApp.altstoreAppID, certificateSerialNumber: serialNumber, storeBuildVersion: nil, context: context)
// figure out if the current AltStoreApp is signed with the "Use Main Profile" option
// by checking whether the first extension's application-identifier entitlement matches the current one
repeat {
guard let pluginURL = Bundle.main.builtInPlugInsURL else {
installedApp.useMainProfile = true
break
}
guard let pluginFolders = try? FileManager.default.contentsOfDirectory(at: pluginURL, includingPropertiesForKeys: nil) else {
installedApp.useMainProfile = true
break
}
guard let pluginFolder = pluginFolders.first, let altPluginApp = ALTApplication(fileURL: pluginFolder) else {
installedApp.useMainProfile = true
break
}
let entitlements = altPluginApp.entitlements
guard let appId = entitlements[ALTEntitlement.applicationIdentifier] as? String else {
installedApp.useMainProfile = false
print("no ALTEntitlementApplicationIdentifier???")
break
}
if appId.hasSuffix(Bundle.main.bundleIdentifier!) {
installedApp.useMainProfile = true
} else {
installedApp.useMainProfile = false
}
} while(false)
installedApp.storeApp = storeApp
}

SideStore/IfManager.swift Normal file
View File

@@ -0,0 +1,132 @@
//
// IfManager.swift
// AltStore
//
// Created by ny on 2/27/26.
// Copyright © 2026 SideStore. All rights reserved.
//
import Foundation
import Network
fileprivate func uti(_ uint: UInt32) -> String? {
var buf = [CChar](repeating: 0, count: Int(NI_MAXHOST))
var addr = in_addr(s_addr: uint.bigEndian)
guard inet_ntop(AF_INET, &addr, &buf, UInt32(INET_ADDRSTRLEN)) != nil,
let str = String(utf8String: buf) else { return nil }
return str
}
fileprivate func socktouint(_ sock: inout sockaddr) -> UInt32 {
var buf = [CChar](repeating: 0, count: Int(NI_MAXHOST))
guard getnameinfo(&sock, socklen_t(sock.sa_len), &buf, socklen_t(buf.count), nil, socklen_t(0), NI_NUMERICHOST) == 0,
let name = String(utf8String: buf) else {
return 0
}
var addr = in_addr()
guard name.withCString({ cString in
inet_pton(AF_INET, cString, &addr)
}) == 1 else { return 0 }
return addr.s_addr.bigEndian
}
public struct NetInfo: Hashable, CustomStringConvertible {
public let name: String
public let hostIP: String
public let destIP: String
public let maskIP: String
private let host: UInt32
private let dest: UInt32
private let mask: UInt32
init(name: String, host: UInt32, dest: UInt32, mask: UInt32) {
self.name = name
self.host = host
self.dest = dest
self.mask = mask
self.hostIP = uti(host) ?? "10.7.0.0"
self.destIP = uti(dest) ?? "10.7.0.1"
self.maskIP = uti(mask) ?? "255.255.255.0"
}
init?(_ ifaddr: ifaddrs) {
guard
let ianame = String(utf8String: ifaddr.ifa_name)
else { return nil }
let host = socktouint(&ifaddr.ifa_addr.pointee)
let dest = socktouint(&ifaddr.ifa_dstaddr.pointee)
let mask = socktouint(&ifaddr.ifa_netmask.pointee)
self.init(name: ianame, host: host, dest: dest, mask: mask)
}
// computed networking values (still numeric internally)
public var minIP: UInt32 { host & mask }
public var maxIP: UInt32 { host | ~mask }
public var minIPString: String { uti(minIP) ?? "nil" }
public var maxIPString: String { uti(maxIP) ?? "nil" }
public var description: String {
"\(name) | ip=\(hostIP) dest=\(destIP) mask=\(maskIP) range=\(minIPString)-\(maxIPString)"
}
}
final class IfManager: Sendable {
public static let shared = IfManager()
nonisolated(unsafe) private(set) var addrs: Set<NetInfo> = Set()
private init() {
self.addrs = IfManager.query()
}
public func query() {
addrs = IfManager.query()
}
private static func query() -> Set<NetInfo> {
var addrs = Set<NetInfo>()
var head: UnsafeMutablePointer<ifaddrs>? = nil
guard getifaddrs(&head) == 0, let first = head else { return addrs }
defer { freeifaddrs(head) }
var cursor: UnsafeMutablePointer<ifaddrs>? = first
while let current = cursor {
// we only want v4 interfaces that aren't loopback and aren't masked 255.255.255.255
let entry = current.pointee
let flags = Int32(entry.ifa_flags)
let isIPv4 = entry.ifa_addr.pointee.sa_family == UInt8(AF_INET)
let isActive = (flags & (IFF_UP | IFF_RUNNING | IFF_LOOPBACK)) == (IFF_UP | IFF_RUNNING)
if isIPv4, isActive, let info = NetInfo(entry), info.maskIP != "255.255.255.255" {
addrs.insert(info)
}
cursor = entry.ifa_next
}
return addrs
}
private var nextLAN: NetInfo? {
addrs.first { $0.name.starts(with: "en") }
}
var nextProbableSideVPN: NetInfo? {
// try old 10.7.0.1 first, then fallback to next v4
// user should only be connected to StosVPN/LocalDevVPN
addrs.first {
$0.hostIP == "10.7.0.1" ||
$0.name.starts(with: "utun")
}
}
var sideVPNPatched: Bool {
nextLAN?.maskIP == nextProbableSideVPN?.maskIP &&
nextLAN?.maxIP == nextProbableSideVPN?.maxIP
}
}
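The subnet arithmetic in `NetInfo` (network base = `host & mask`, broadcast = `host | ~mask`) can be sanity-checked with a minimal Python sketch. The addresses below are illustrative only; `10.7.0.1` mirrors the VPN address the diff probes for, not a value produced by this code.

```python
import ipaddress

def subnet_range(host: str, mask: str) -> tuple[str, str]:
    """Compute (network base, broadcast) the same way NetInfo.minIP/maxIP do."""
    h = int(ipaddress.IPv4Address(host))
    m = int(ipaddress.IPv4Address(mask))
    lo = h & m                       # host & mask  -> minIP
    hi = h | (~m & 0xFFFFFFFF)       # host | ~mask -> maxIP (mask inverted within 32 bits)
    return str(ipaddress.IPv4Address(lo)), str(ipaddress.IPv4Address(hi))

print(subnet_range("10.7.0.1", "255.255.255.0"))  # -> ('10.7.0.0', '10.7.0.255')
```

`sideVPNPatched` then compares exactly these derived values (mask and `maxIP`) between the `en*` LAN interface and the `utun*` VPN interface.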

View File

@@ -13,14 +13,21 @@ var isMinimuxerReady: Bool {
print("isMinimuxerReady property is always true on simulator")
return true
#else
return minimuxer.ready()
IfManager.shared.query()
if #available(iOS 26.4, *) {
print("Running patched check")
return minimuxer.ready() && IfManager.shared.sideVPNPatched
} else {
return minimuxer.ready()
}
#endif
}
func minimuxerStartWithLogger(_ pairingFile: String,_ logPath: String,_ loggingEnabled: Bool) throws {
func minimuxerStartWithLogger(_ pairingFile: String, _ logPath: String, _ loggingEnabled: Bool) throws {
#if targetEnvironment(simulator)
print("minimuxerStartWithLogger(\(pairingFile), \(logPath), \(loggingEnabled) is no-op on simulator")
print("minimuxerStartWithLogger(\(pairingFile), \(logPath), \(loggingEnabled)) is no-op on simulator")
#else
print("minimuxerStartWithLogger(\(pairingFile), \(logPath), \(loggingEnabled))")
try minimuxer.startWithLogger(pairingFile, logPath, loggingEnabled)
#endif
}
@@ -37,7 +44,8 @@ func installProvisioningProfiles(_ profileData: Data) throws {
#if targetEnvironment(simulator)
print("installProvisioningProfiles(\(profileData)) is no-op on simulator")
#else
try minimuxer.install_provisioning_profile(profileData.toRustByteSlice().forRust())
let slice = profileData.toRustByteSlice()
try minimuxer.install_provisioning_profile(slice.forRust())
#endif
}
@@ -55,7 +63,8 @@ func yeetAppAFC(_ bundleId: String, _ rawBytes: Data) throws {
#if targetEnvironment(simulator)
print("yeetAppAFC(\(bundleId), \(rawBytes)) is no-op on simulator")
#else
try minimuxer.yeet_app_afc(bundleId, rawBytes.toRustByteSlice().forRust())
let slice = rawBytes.toRustByteSlice()
try minimuxer.yeet_app_afc(bundleId, slice.forRust())
#endif
}

View File

@@ -0,0 +1,298 @@
#!/usr/bin/env python3
import subprocess
import sys
import os
import re
from pathlib import Path
IGNORED_AUTHORS = []
TAG_MARKER = "###"
HEADER_MARKER = "####"
# ----------------------------------------------------------
# helpers
# ----------------------------------------------------------
def run(cmd: str) -> str:
return subprocess.check_output(cmd, shell=True, text=True).strip()
def commit_exists(rev: str) -> bool:
if not rev:
return False
try:
subprocess.check_output(
f"git rev-parse --verify {rev}^{{commit}}",
shell=True,
stderr=subprocess.DEVNULL,
)
return True
except subprocess.CalledProcessError:
return False
def head_commit():
return run("git rev-parse HEAD")
def first_commit():
return run("git rev-list --max-parents=0 HEAD").splitlines()[0]
def repo_url():
url = run("git config --get remote.origin.url")
if url.startswith("git@"):
url = url.replace("git@", "https://").replace(":", "/")
return url.removesuffix(".git")
def commit_messages(start, end="HEAD"):
out = run(f"git log {start}..{end} --pretty=format:%s")
return out.splitlines() if out else []
def authors(range_expr, fmt="%an"):
try:
out = run(f"git log {range_expr} --pretty=format:{fmt}")
result = {a.strip() for a in out.splitlines() if a.strip()}
return result - set(IGNORED_AUTHORS)
except subprocess.CalledProcessError:
return set()
def branch_base():
try:
default_ref = run("git rev-parse --abbrev-ref origin/HEAD")
default_branch = default_ref.split("/")[-1]
return run(f"git merge-base HEAD origin/{default_branch}")
except Exception:
return first_commit()
def fmt_msg(msg):
msg = msg.lstrip()
if msg.startswith("-"):
msg = msg[1:].strip()
return f"- {msg}"
def fmt_author(author):
return author if author.startswith("@") else f"@{author.split()[0]}"
# ----------------------------------------------------------
# release note generation
# ----------------------------------------------------------
def resolve_start_commit(last_successful: str):
if commit_exists(last_successful):
return last_successful
try:
return run("git rev-parse HEAD~10")
except Exception:
return first_commit()
def generate_release_notes(last_successful, tag, branch):
current = head_commit()
# fallback if missing/invalid
if not last_successful or not commit_exists(last_successful):
try:
last_successful = run("git rev-parse HEAD~10")
except Exception:
last_successful = first_commit()
messages = commit_messages(last_successful, current)
# fallback if empty range
if not messages:
try:
last_successful = run("git rev-parse HEAD~10")
except Exception:
last_successful = first_commit()
messages = commit_messages(last_successful, current)
section = f"{TAG_MARKER} {tag}\n"
section += f"{HEADER_MARKER} What's Changed\n"
if not messages or last_successful == current:
section += "- Nothing...\n"
else:
for m in messages:
section += f"{fmt_msg(m)}\n"
if commit_exists(branch):
previous_range = branch
else:
previous_range = last_successful
prev_authors = authors(previous_range)
recent_authors = authors(f"{last_successful}..{current}")
new_authors = recent_authors - prev_authors
if new_authors:
section += f"\n{HEADER_MARKER} New Contributors\n"
for a in sorted(new_authors):
section += f"- {fmt_author(a)} made their first contribution\n"
if messages and last_successful != current:
url = repo_url()
section += (
f"\n{HEADER_MARKER} Full Changelog: "
f"[{ref_display(last_successful)}...{ref_display(current)}]"
f"({url}/compare/{last_successful}...{current})\n"
)
return section
def ref_display(ref):
try:
tag = run(f'git describe --tags --exact-match "{ref}" 2>/dev/null || true').strip()
# allow only semantic version tags: X.Y.Z
if re.fullmatch(r'\d+\.\d+\.\d+', tag):
return tag
except Exception:
pass
return ref[:8]
# ----------------------------------------------------------
# markdown update
# ----------------------------------------------------------
def update_release_md(existing, new_section, tag):
if not existing:
return new_section
tag_lower = tag.lower()
is_special = tag_lower in {"alpha", "beta", "nightly"}
pattern = fr"(^{TAG_MARKER} .*$)"
parts = re.split(pattern, existing, flags=re.MULTILINE)
processed = []
special_seen = {"alpha": False, "beta": False, "nightly": False}
last_special_idx = -1
i = 0
while i < len(parts):
if i % 2 == 1:
header = parts[i]
name = header[3:].strip().lower()
if name in special_seen:
special_seen[name] = True
last_special_idx = len(processed)
if name == tag_lower:
i += 2
continue
processed.append(parts[i])
i += 1
insert_pos = 0
if is_special:
order = ["alpha", "beta", "nightly"]
for t in order:
if t == tag_lower:
break
if special_seen[t]:
idx = processed.index(f"{TAG_MARKER} {t}")
insert_pos = idx + 2
elif last_special_idx >= 0:
insert_pos = last_special_idx + 2
processed.insert(insert_pos, new_section)
result = ""
for part in processed:
if part.startswith(f"{TAG_MARKER} ") and not result.endswith("\n\n"):
result = result.rstrip("\n") + "\n\n"
result += part
return result.rstrip() + "\n"
# ----------------------------------------------------------
# retrieval
# ----------------------------------------------------------
def retrieve_tag(tag, file_path: Path):
if not file_path.exists():
return ""
content = file_path.read_text()
match = re.search(
fr"^{TAG_MARKER} {re.escape(tag)}$",
content,
re.MULTILINE | re.IGNORECASE,
)
if not match:
return ""
start = match.end()
if start < len(content) and content[start] == "\n":
start += 1
next_tag = re.search(fr"^{TAG_MARKER} ", content[start:], re.MULTILINE)
end = start + next_tag.start() if next_tag else len(content)
return content[start:end].strip()
# ----------------------------------------------------------
# entrypoint
# ----------------------------------------------------------
def main():
args = sys.argv[1:]
if not args:
sys.exit(
"Usage:\n"
" generate_release_notes.py <last_successful> [tag] [branch] [--output-dir DIR]\n"
" generate_release_notes.py --retrieve <tag> [--output-dir DIR]"
)
output_dir = Path.cwd()
if "--output-dir" in args:
idx = args.index("--output-dir")
output_dir = Path(args[idx + 1]).resolve()
del args[idx:idx + 2]
output_dir.mkdir(parents=True, exist_ok=True)
release_file = output_dir / "release-notes.md"
if args[0] == "--retrieve":
print(retrieve_tag(args[1], release_file))
return
last_successful = args[0]
tag = args[1] if len(args) > 1 else head_commit()
branch = args[2] if len(args) > 2 else (
os.environ.get("GITHUB_REF") or branch_base()
)
new_section = generate_release_notes(last_successful, tag, branch)
existing = release_file.read_text() if release_file.exists() else ""
updated = update_release_md(existing, new_section, tag)
release_file.write_text(updated)
print(new_section)
if __name__ == "__main__":
main()
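The section lookup in `retrieve_tag` can be exercised standalone. This sketch inlines the same regexes against an in-memory `release-notes.md`; the tag names and bullet text are made-up examples, not taken from the repo.

```python
import re

TAG_MARKER = "###"

def retrieve_section(tag: str, content: str) -> str:
    """Return the body under '### <tag>' up to the next '### ' header."""
    match = re.search(fr"^{TAG_MARKER} {re.escape(tag)}$", content,
                      re.MULTILINE | re.IGNORECASE)
    if not match:
        return ""
    start = match.end()
    if start < len(content) and content[start] == "\n":
        start += 1
    next_tag = re.search(fr"^{TAG_MARKER} ", content[start:], re.MULTILINE)
    end = start + next_tag.start() if next_tag else len(content)
    return content[start:end].strip()

notes = "### nightly\n- fix A\n\n### 1.2.3\n- fix B\n"
print(retrieve_section("nightly", notes))  # -> - fix A
```

Note that `re.escape(tag)` matters for version tags like `1.2.3`, where the dots would otherwise match any character.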

View File

@@ -0,0 +1,176 @@
#!/usr/bin/env python3
import datetime
import hashlib
import json
import subprocess
from pathlib import Path
import argparse
import sys
SCRIPT_DIR = Path(__file__).resolve().parent
# ----------------------------------------------------------
# helpers
# ----------------------------------------------------------
def resolve_script(name: str) -> Path:
p = Path.cwd() / name
if p.exists():
return p
return SCRIPT_DIR / name
def sh(cmd: str, cwd: Path) -> str:
try:
return subprocess.check_output(
cmd,
shell=True,
cwd=cwd,
stderr=subprocess.STDOUT,
).decode().strip()
except subprocess.CalledProcessError as e:
print(e.output.decode(), file=sys.stderr)
raise SystemExit(f"Command failed: {cmd}")
def file_size(path: Path) -> int:
if not path.exists():
raise SystemExit(f"Missing file: {path}")
return path.stat().st_size
def sha256(path: Path) -> str:
h = hashlib.sha256()
with open(path, "rb") as f:
while chunk := f.read(1024 * 1024):
h.update(chunk)
return h.hexdigest()
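The chunked `sha256` helper above streams the file in 1 MiB blocks so a large IPA is never loaded whole. A quick self-check against hashing the full bytes in one call, using a throwaway temp file (the size is arbitrary, chosen to span several chunks plus an unaligned tail):

```python
import hashlib
import tempfile
from pathlib import Path

def sha256_chunked(path: Path, chunk_size: int = 1024 * 1024) -> str:
    """Stream the file through sha256 one chunk at a time."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

payload = b"x" * (3 * 1024 * 1024 + 17)  # >1 chunk, non-aligned tail
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(payload)
    name = tmp.name

assert sha256_chunked(Path(name)) == hashlib.sha256(payload).hexdigest()
print("ok")
```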
# ----------------------------------------------------------
# entry
# ----------------------------------------------------------
def main():
p = argparse.ArgumentParser()
p.add_argument("--repo-root", required=True)
p.add_argument("--ipa", required=True)
p.add_argument("--output-dir", required=True)
p.add_argument(
"--output-name",
default="source_metadata.json",
)
p.add_argument("--release-notes-dir", required=True)
p.add_argument("--release-tag", required=True)
p.add_argument("--marketing-version", required=True)
p.add_argument("--short-commit", required=True)
p.add_argument("--release-channel", required=True)
p.add_argument("--bundle-id", required=True)
# optional
p.add_argument("--last-successful-commit")
p.add_argument("--is-beta", action="store_true")
args = p.parse_args()
repo_root = Path(args.repo_root).resolve()
ipa_path = Path(args.ipa).resolve()
out_dir = Path(args.output_dir).resolve()
notes_dir = Path(args.release_notes_dir).resolve()
if not repo_root.is_dir():
raise SystemExit(f"Invalid repo root: {repo_root}")
if not ipa_path.is_file():
raise SystemExit(f"Invalid IPA path: {ipa_path}")
notes_dir.mkdir(parents=True, exist_ok=True)
out_dir.mkdir(parents=True, exist_ok=True)
out_file = out_dir / args.output_name
# ------------------------------------------------------
# generate release notes
# ------------------------------------------------------
print("Generating release notes…")
script = resolve_script("generate_release_notes.py")
if args.last_successful_commit:
gen_cmd = (
f"python3 {script} "
f"{args.last_successful_commit} {args.release_tag} "
f'--output-dir "{notes_dir}"'
)
else:
gen_cmd = (
f"python3 {script} "
f"{args.short_commit} {args.release_tag} "
f'--output-dir "{notes_dir}"'
)
sh(gen_cmd, cwd=repo_root)
# ------------------------------------------------------
# retrieve release notes
# ------------------------------------------------------
notes = sh(
(
f"python3 {script} "
f"--retrieve {args.release_tag} "
f"--output-dir \"{notes_dir}\""
),
cwd=repo_root,
)
# ------------------------------------------------------
# compute metadata
# ------------------------------------------------------
now = datetime.datetime.now(datetime.timezone.utc)
formatted = now.strftime("%Y-%m-%dT%H:%M:%SZ")
human = now.strftime("%c")
localized_description = getFormattedLocalizedDescription(args.marketing_version, args.short_commit, human, notes)
metadata = {
"is_beta": bool(args.is_beta),
"bundle_identifier": args.bundle_id,
"version_ipa": args.marketing_version,
"version_date": formatted,
"release_channel": args.release_channel.lower(),
"size": file_size(ipa_path),
"sha256": sha256(ipa_path),
"download_url": (
"https://github.com/SideStore/SideStore/releases/download/"
f"{args.release_tag}/SideStore.ipa"
),
"localized_description": localized_description,
}
with open(out_file, "w", encoding="utf-8") as f:
json.dump(metadata, f, indent=2, ensure_ascii=False)
print(f"Wrote {out_file}")
def getFormattedLocalizedDescription(marketing_version, short_commit, human, notes):
return f"""
This is a release for:
- version: "{marketing_version}"
- revision: "{short_commit}"
- timestamp: "{human}"
Release Notes:
{notes}
""".lstrip("\n")
if __name__ == "__main__":
main()
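The `sha256` helper in this script streams the IPA in 1 MiB chunks rather than reading it whole. A minimal standalone sketch of the same pattern (the file here is a throwaway temp file, not part of the CI scripts), cross-checked against a one-shot digest:

```python
import hashlib
import tempfile

def sha256_chunked(path, chunk_size=1024 * 1024):
    # stream the file in chunks so a large IPA never loads fully into memory
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

# cross-check against a one-shot digest on a multi-chunk, non-aligned payload
payload = b"x" * (3 * 1024 * 1024 + 17)
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(payload)
    name = tmp.name

assert sha256_chunked(name) == hashlib.sha256(payload).hexdigest()
```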


@@ -0,0 +1,181 @@
#!/usr/bin/env python3
import json
import sys
from pathlib import Path
# ----------------------------------------------------------
# metadata
# ----------------------------------------------------------
def load_metadata(metadata_file: Path):
if not metadata_file.exists():
raise SystemExit(f"Missing metadata file: {metadata_file}")
with open(metadata_file, "r", encoding="utf-8") as f:
meta = json.load(f)
print(" ====> Required parameter list <====")
for k, v in meta.items():
print(f"{k}: {v}")
required = [
"bundle_identifier",
"version_ipa",
"version_date",
"release_channel",
"size",
"sha256",
"localized_description",
"download_url",
]
for r in required:
if not meta.get(r):
raise SystemExit(f"Missing required metadata field: {r}")
meta["size"] = int(meta["size"])
meta["release_channel"] = meta["release_channel"].lower()
return meta
# ----------------------------------------------------------
# source loading
# ----------------------------------------------------------
def load_source(source_file: Path):
if source_file.exists():
with open(source_file, "r", encoding="utf-8") as f:
data = json.load(f)
else:
print("source.json missing — creating minimal structure")
data = {"version": 2, "apps": []}
if int(data.get("version", 1)) < 2:
raise SystemExit("Only v2 and above are supported")
return data
# ----------------------------------------------------------
# locate app
# ----------------------------------------------------------
def ensure_app(data, bundle_id):
apps = data.setdefault("apps", [])
app = next(
(a for a in apps if a.get("bundleIdentifier") == bundle_id),
None,
)
if app is None:
print("App entry missing — creating new app entry")
app = {
"bundleIdentifier": bundle_id,
"releaseChannels": [],
}
apps.append(app)
return app
# ----------------------------------------------------------
# update storefront
# ----------------------------------------------------------
def update_storefront_if_needed(app, meta):
if meta["release_channel"] == "stable":
app.update({
"version": meta["version_ipa"],
"versionDate": meta["version_date"],
"size": meta["size"],
"sha256": meta["sha256"],
"localizedDescription": meta["localized_description"],
"downloadURL": meta["download_url"],
})
# ----------------------------------------------------------
# update release channel (ORIGINAL FORMAT)
# ----------------------------------------------------------
def update_release_channel(app, meta):
channels = app.setdefault("releaseChannels", [])
new_version = {
"version": meta["version_ipa"],
"date": meta["version_date"],
"localizedDescription": meta["localized_description"],
"downloadURL": meta["download_url"],
"size": meta["size"],
"sha256": meta["sha256"],
}
tracks = [
t for t in channels
if isinstance(t, dict)
and t.get("track") == meta["release_channel"]
]
if len(tracks) > 1:
raise SystemExit(f"Multiple tracks named {meta['release_channel']}")
if not tracks:
channels.insert(0, {
"track": meta["release_channel"],
"releases": [new_version],
})
else:
track = tracks[0]
releases = track.setdefault("releases", [])
if not releases:
releases.append(new_version)
else:
releases[0] = new_version
# ----------------------------------------------------------
# save
# ----------------------------------------------------------
def save_source(source_file: Path, data):
print("\nUpdated Sources File:\n")
print(json.dumps(data, indent=2, ensure_ascii=False))
source_file.parent.mkdir(parents=True, exist_ok=True)
with open(source_file, "w", encoding="utf-8") as f:
json.dump(data, f, indent=2, ensure_ascii=False)
print("JSON successfully updated.")
# ----------------------------------------------------------
# main
# ----------------------------------------------------------
def main():
if len(sys.argv) < 3:
print("Usage: python3 update_apps.py <metadata.json> <source.json>")
sys.exit(1)
metadata_file = Path(sys.argv[1])
source_file = Path(sys.argv[2])
meta = load_metadata(metadata_file)
data = load_source(source_file)
app = ensure_app(data, meta["bundle_identifier"])
update_storefront_if_needed(app, meta)
update_release_channel(app, meta)
save_source(source_file, data)
if __name__ == "__main__":
main()
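The track-update rule above replaces the newest release in a channel instead of appending, and creates the track on first deploy. A minimal in-memory sketch of that behaviour (hypothetical bundle id and version strings, not tied to the real source.json):

```python
def update_release_channel(app, channel, new_version):
    # create the track if missing; otherwise overwrite its newest release
    channels = app.setdefault("releaseChannels", [])
    tracks = [t for t in channels if isinstance(t, dict) and t.get("track") == channel]
    if len(tracks) > 1:
        raise ValueError(f"Multiple tracks named {channel}")
    if not tracks:
        channels.insert(0, {"track": channel, "releases": [new_version]})
    else:
        releases = tracks[0].setdefault("releases", [])
        if releases:
            releases[0] = new_version  # replace the top entry only
        else:
            releases.append(new_version)

app = {"bundleIdentifier": "com.example.app"}
update_release_channel(app, "nightly", {"version": "0.1.0"})
update_release_channel(app, "nightly", {"version": "0.1.1"})
# the second deploy replaced 0.1.0 rather than adding a second entry
assert app["releaseChannels"][0]["releases"] == [{"version": "0.1.1"}]
```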

scripts/ci/workflow.py Normal file

@@ -0,0 +1,571 @@
#!/usr/bin/env python3
import os
import sys
import subprocess
import datetime
from pathlib import Path
import time
import json
import re
from os import getcwd  # os is portable; the posix module is POSIX-only
# REPO ROOT relative to script dir
ROOT = Path(__file__).resolve().parents[2]
SCRIPTS = ROOT / 'scripts/ci'
BUILD_SETTINGS_OUTFILE = "project-build-settings.txt"
# ----------------------------------------------------------
# helpers
# ----------------------------------------------------------
def run(cmd, check=True, cwd=None):
wd = cwd if cwd is not None else ROOT
print(f"$ {cmd}", flush=True, file=sys.stderr)
subprocess.run(
cmd,
shell=True,
cwd=wd,
check=check,
stdout=sys.stderr,
stderr=sys.stderr,
)
print("", flush=True, file=sys.stderr)
def runAndGet(cmd, cwd=None):
wd = cwd if cwd is not None else ROOT
print(f"$ {cmd}", flush=True, file=sys.stderr)
result = subprocess.run(
cmd,
shell=True,
cwd=wd,
stdout=subprocess.PIPE,
stderr=sys.stderr,
text=True,
check=True,
)
out = result.stdout.strip()
print(out, flush=True, file=sys.stderr)
print("", flush=True, file=sys.stderr)
return out
def getenv(name, default=""):
return os.environ.get(name, default)
# ----------------------------------------------------------
# SHARED
# ----------------------------------------------------------
def short_commit():
return runAndGet("git rev-parse --short HEAD")
def count_new_commits(last_commit):
if not last_commit or not last_commit.strip():
return 0
try:
total = int(runAndGet("git rev-list --count HEAD"))
if total == 1:
head = runAndGet("git rev-parse HEAD")
return 1 if head != last_commit else 0
out = runAndGet(f"git rev-list --count {last_commit}..HEAD")
return int(out)
except Exception:
return 0
# ----------------------------------------------------------
# PROJECT INFO
# ----------------------------------------------------------
def dump_project_settings(outdir=None):
outfile = Path(outdir).resolve() / BUILD_SETTINGS_OUTFILE if outdir else BUILD_SETTINGS_OUTFILE
run(f"xcodebuild -showBuildSettings > '{outfile}' 2>&1")  # stdout first so stderr follows it into the file
def _extract_setting(cmd):
out = runAndGet(cmd + " || true").strip() # prevent grep failure from aborting
return out if out else None
def _read_dumped_build_setting(name):
return _extract_setting(
f"cat '{BUILD_SETTINGS_OUTFILE}' "
f"| grep '{name} = ' "
"| tail -1 "
"| sed -e 's/.*= //g'"
)
def query_build_setting(name):
return _extract_setting(
f"xcodebuild -showBuildSettings 2>&1 "
f"| grep '{name} = ' "
"| tail -1 "
"| sed -e 's/.*= //g'"
)
def get_product_name(): return query_build_setting("PRODUCT_NAME")
def get_bundle_id(): return query_build_setting("PRODUCT_BUNDLE_IDENTIFIER")
def read_product_name(): return _read_dumped_build_setting("PRODUCT_NAME")
def read_bundle_id(): return _read_dumped_build_setting("PRODUCT_BUNDLE_IDENTIFIER")
def get_marketing_version():
return runAndGet(f"grep MARKETING_VERSION {ROOT}/Build.xcconfig | sed -e 's/MARKETING_VERSION = //g'")
def set_marketing_version(version):
run(
f"sed -E -i '' "
f"'s/^MARKETING_VERSION = .*/MARKETING_VERSION = {version}/' "
f"{ROOT}/Build.xcconfig"
)
def compute_normalized_version(marketing, build_num, short):
now = datetime.datetime.now(datetime.timezone.utc)  # timezone.utc works on pre-3.11 Pythons too
date = now.strftime("%Y%m%d") # normalized date
base = marketing.strip()
return f"{base}-{date}.{build_num}+{short}"
# ----------------------------------------------------------
# CLEAN
# ----------------------------------------------------------
def clean():
run("make clean")
def clean_derived_data():
run("rm -rf ~/Library/Developer/Xcode/DerivedData/*", check=False)
def clean_spm_cache():
run("rm -rf ~/Library/Caches/org.swift.swiftpm/*", check=False)
# ----------------------------------------------------------
# BUILD
# ----------------------------------------------------------
def build():
run("mkdir -p build/logs")
run(
"set -o pipefail && "
"NSUnbufferedIO=YES make -B build "
"2>&1 | tee -a build/logs/build.log | xcbeautify --renderer github-actions"
)
run("make fakesign | tee -a build/logs/build.log")
run("make ipa | tee -a build/logs/build.log")
run("zip -r -9 ./SideStore.dSYMs.zip ./SideStore.xcarchive/dSYMs")
# ----------------------------------------------------------
# TESTS BUILD
# ----------------------------------------------------------
def tests_build():
run("mkdir -p build/logs")
run(
"NSUnbufferedIO=YES make -B build-tests "
"2>&1 | tee -a build/logs/tests-build.log | xcbeautify --renderer github-actions"
)
# ----------------------------------------------------------
# TESTS RUN
# ----------------------------------------------------------
def is_sim_booted(model):
out = runAndGet(f'xcrun simctl list devices "{model}"')
return "Booted" in out
def boot_sim_async(model):
log = ROOT / "build/logs/tests-run.log"
log.parent.mkdir(parents=True, exist_ok=True)
if is_sim_booted(model):
run(f'echo "Simulator {model} already booted." | tee -a {log}')
return
run(f'echo "Booting simulator {model} asynchronously..." | tee -a {log}')
with open(log, "a") as f:
subprocess.Popen(
["xcrun", "simctl", "boot", model],
cwd=ROOT,
stdout=f,
stderr=subprocess.STDOUT,
start_new_session=True,
)
def boot_sim_sync(model):
run("mkdir -p build/logs")
for i in range(1, 7):
if is_sim_booted(model):
run('echo "Simulator booted." | tee -a build/logs/tests-run.log')
return
run(f'echo "Simulator not ready (attempt {i}/6), retrying in 10s..." | tee -a build/logs/tests-run.log')
time.sleep(10)
raise SystemExit("Simulator failed to boot")
def tests_run(model):
run("mkdir -p build/logs")
if not is_sim_booted(model):
boot_sim_sync(model)
run("make run-tests 2>&1 | tee -a build/logs/tests-run.log")
run("zip -r -9 ./test-results.zip ./build/tests")
# ----------------------------------------------------------
# LOG ENCRYPTION
# ----------------------------------------------------------
def encrypt_logs(name):
pwd = getenv("BUILD_LOG_ZIP_PASSWORD")
cwd = getcwd()
if not pwd or not pwd.strip():
print("BUILD_LOG_ZIP_PASSWORD not set — logs will be uploaded UNENCRYPTED", file=sys.stderr)
run(f'cd {cwd}/build/logs && zip -r {cwd}/{name}.zip *')
return
run(f'cd {cwd}/build/logs && zip -r -e -P "{pwd}" {cwd}/{name}.zip *')
# ----------------------------------------------------------
# RELEASE NOTES
# ----------------------------------------------------------
def release_notes(tag):
run(
f"python3 {SCRIPTS}/generate_release_notes.py "
f"{tag} "
f"--repo-root {ROOT} "
f"--output-dir {ROOT}"
)
def retrieve_release_notes(tag):
return runAndGet(
f"python3 {SCRIPTS}/generate_release_notes.py "
f"--retrieve {tag} "
f"--output-dir {ROOT}"
)
# ----------------------------------------------------------
# DEPLOY SOURCE.JSON
# ----------------------------------------------------------
def generate_metadata(release_tag, short_commit, marketing_version, channel, bundle_id, ipa_name, last_successful_commit=None):
ipa_path = ROOT / ipa_name
metadata = 'source-metadata.json'
if not ipa_path.exists():
raise SystemExit(f"{ipa_path} missing")
cmd = (
f"python3 {SCRIPTS}/generate_source_metadata.py "
f"--repo-root {ROOT} "
f"--ipa {ipa_path} "
f"--output-dir {ROOT} "
f"--output-name {metadata} "
f"--release-notes-dir {ROOT} "
f"--release-tag {release_tag} "
f"--marketing-version {marketing_version} "
f"--short-commit {short_commit} "
f"--release-channel {channel} "
f"--bundle-id {bundle_id}"
)
if last_successful_commit:
cmd += f" --last-successful-commit {last_successful_commit}"
run(cmd)
def deploy(repo, source_json, release_tag, marketing_version):
repo = (ROOT / repo).resolve()
source_json_path = repo / source_json
metadata = 'source-metadata.json'
if not repo.exists():
raise SystemExit(f"{repo} repo missing")
if not (repo / ".git").exists():
print("Repo is not a git repository, skipping deploy", file=sys.stderr)
return
if not source_json_path.exists():
raise SystemExit(f"{source_json} missing inside repo")
run("git config user.name 'GitHub Actions'", check=False, cwd=repo)
run("git config user.email 'github-actions@github.com'", check=False, cwd=repo)
# ------------------------------------------------------
run("git fetch origin main", check=False, cwd=repo)
run("git switch main || git switch -c main origin/main", cwd=repo)
run("git reset --hard origin/main", cwd=repo)
# ------------------------------------------------------
max_attempts = 5
for attempt in range(1, max_attempts + 1):
if attempt > 1:
run("git fetch --depth=1 origin HEAD", check=False, cwd=repo)
run("git reset --hard FETCH_HEAD", check=False, cwd=repo)
# regenerate after reset so we don't lose changes
run(f"python3 {SCRIPTS}/update_source_metadata.py '{ROOT}/{metadata}' '{source_json_path}'", cwd=repo)
run(f"git add --verbose {source_json}", cwd=repo)
run(f"git commit -m '{release_tag} - deployed {marketing_version}' || true", cwd=repo)
rc = subprocess.call("git push", shell=True, cwd=repo)
if rc == 0:
print("Deploy push succeeded", file=sys.stderr)
break
print(f"Push rejected (attempt {attempt}/{max_attempts}), retrying...", file=sys.stderr)
time.sleep(0.5)
else:
raise SystemExit("Deploy push failed after retries")
def last_successful_commit(is_stable, tag=None):
is_stable = str(is_stable).lower() in ("1", "true", "yes")
try:
if is_stable:
prev_tag = runAndGet(
r'git tag --sort=-v:refname '
r'| grep -E "^[0-9]+\.[0-9]+\.[0-9]+$" '
r'| sed -n "2p" || true'
).strip()
if prev_tag:
return runAndGet(f'git rev-parse "{prev_tag}^{{commit}}"')
return None
if tag:
exists = subprocess.call(
f'git rev-parse -q --verify "refs/tags/{tag}"',
shell=True,
stdout=subprocess.DEVNULL,
stderr=subprocess.DEVNULL,
) == 0
if exists:
return runAndGet(f'git rev-parse "{tag}^{{commit}}"')
except Exception:
pass
return None
def upload_release(release_name, release_tag, commit_sha, repo, upstream_tag_recommended, is_stable=False):
is_stable = str(is_stable).lower() in ("1", "true", "yes")
draft = False
prerelease = True
latest = False
if is_stable:
prerelease = False
latest = True
token = getenv("GH_TOKEN")
if token:
os.environ["GH_TOKEN"] = token
metadata_path = ROOT / "source-metadata.json"
if not metadata_path.exists():
raise SystemExit("source-metadata.json missing")
meta = json.loads(metadata_path.read_text())
marketing_version = meta.get("version_ipa")
build_datetime = meta.get("version_date")
dt = datetime.datetime.fromisoformat(
build_datetime.replace("Z", "+00:00")
)
built_time = dt.strftime("%a %b %d %H:%M:%S %Y")
built_date = dt.strftime("%Y-%m-%d")
release_notes = runAndGet(
f"python3 {SCRIPTS}/generate_release_notes.py "
f"--retrieve {release_tag} "
f"--output-dir {ROOT}"
)
if is_stable:
release_notes = re.sub(
r'(?im)^[ \t]*#{1,6}[ \t]*what[\']?s[ \t]+changed[ \t]*$',
"## What's Changed",
release_notes,
flags=re.IGNORECASE | re.MULTILINE,
)
upstream_block = ""
if upstream_tag_recommended and upstream_tag_recommended.strip():
tag = upstream_tag_recommended.strip()
upstream_block = (
f"If you want to try out new features early but want a lower chance of bugs, "
f"you can look at [{repo} {tag}]"
f"(https://github.com/{repo}/releases?q={tag}).\n\n"
)
header = getFormattedUploadMsg(
release_name, commit_sha, repo, upstream_block,
built_time, built_date, marketing_version, is_stable,
)
body = header + release_notes.lstrip() + "\n"
body_file = ROOT / "release_body.md"
body_file.write_text(body, encoding="utf-8")
draft_flag = "--draft" if draft else ""
prerelease_flag = "--prerelease" if prerelease else ""
latest_flag = "--latest=true" if latest else ""
# create release if it doesn't exist
exists = subprocess.call(
f'gh release view "{release_tag}"',
shell=True,
cwd=ROOT,
stdout=subprocess.DEVNULL,
stderr=subprocess.DEVNULL,
) == 0
if exists:
run(
f'gh release edit "{release_tag}" '
f'--title "{release_name}" '
f'--notes-file "{body_file}" '
f'{draft_flag} {prerelease_flag} {latest_flag}'
)
else:
run(
f'gh release create "{release_tag}" '
f'--title "{release_name}" '
f'--notes-file "{body_file}" '
f'{draft_flag} {prerelease_flag} {latest_flag}'
)
run(
f'gh release upload "{release_tag}" '
f'SideStore.ipa SideStore.dSYMs.zip build-logs.zip '
f'--clobber'
)
run(f'git tag -f "{release_tag}" "{commit_sha}"')
run(f'git push origin "refs/tags/{release_tag}" --force')
def getFormattedUploadMsg(release_name, commit_sha, repo, upstream_block, built_time, built_date, marketing_version, is_stable):
experimental_header = ""
if not is_stable:
experimental_header = f"""
This is an ⚠️ **EXPERIMENTAL** ⚠️ {release_name} build for commit [{commit_sha}](https://github.com/{repo}/commit/{commit_sha}).
{release_name} builds are **extremely experimental, intended only for developers and beta testers. They often contain bugs and experimental features. Use at your own risk!**
""".lstrip("\n")
header = f"""
{experimental_header}{upstream_block}## Build Info
Built at (UTC): `{built_time}`
Built at (UTC date): `{built_date}`
Commit SHA: `{commit_sha}`
Version: `{marketing_version}`
""".lstrip("\n")
return header
# ----------------------------------------------------------
# ENTRYPOINT
# ----------------------------------------------------------
COMMANDS = {
# ----------------------------------------------------------
# SHARED
# ----------------------------------------------------------
"commit-id" : (short_commit, 0, ""),
"count-new-commits" : (count_new_commits, 1, "<last_successful_commit>"),
# ----------------------------------------------------------
# PROJECT INFO
# ----------------------------------------------------------
"get-marketing-version" : (get_marketing_version, 0, ""),
"set-marketing-version" : (set_marketing_version, 1, "<normalized_version>"),
"compute-normalized" : (compute_normalized_version,3, "<marketing> <build_num> <short_commit>"),
"get-product-name" : (get_product_name, 0, ""),
"get-bundle-id" : (get_bundle_id, 0, ""),
"dump-project-settings" : (dump_project_settings, 0, ""),
"read-product-name" : (read_product_name, 0, ""),
"read-bundle-id" : (read_bundle_id, 0, ""),
# ----------------------------------------------------------
# CLEAN
# ----------------------------------------------------------
"clean" : (clean, 0, ""),
"clean-derived-data" : (clean_derived_data, 0, ""),
"clean-spm-cache" : (clean_spm_cache, 0, ""),
# ----------------------------------------------------------
# BUILD
# ----------------------------------------------------------
"build" : (build, 0, ""),
# ----------------------------------------------------------
# TESTS
# ----------------------------------------------------------
"tests-build" : (tests_build, 0, ""),
"tests-run" : (tests_run, 1, "<model>"),
"boot-sim-async" : (boot_sim_async, 1, "<model>"),
"boot-sim-sync" : (boot_sim_sync, 1, "<model>"),
# ----------------------------------------------------------
# LOG ENCRYPTION
# ----------------------------------------------------------
"encrypt-build" : (lambda: encrypt_logs("build-logs"), 0, ""),
"encrypt-tests-build" : (lambda: encrypt_logs("tests-build-logs"), 0, ""),
"encrypt-tests-run" : (lambda: encrypt_logs("tests-run-logs"), 0, ""),
# ----------------------------------------------------------
# RELEASE / DEPLOY
# ----------------------------------------------------------
"last-successful-commit" : (last_successful_commit, 1, "<is_stable> [tag]"),
"release-notes" : (release_notes, 1, "<tag>"),
"retrieve-release-notes" : (retrieve_release_notes, 1, "<tag>"),
"generate-metadata" : (generate_metadata, 7,
"<release_tag> <short_commit> <marketing_version> <channel> <bundle_id> <ipa_name> [last_successful_commit]"),
"deploy" : (deploy, 4,
"<repo> <source_json> <release_tag> <marketing_version>"),
"upload-release" : (upload_release, 5,
"<release_name> <release_tag> <commit_sha> <repo> <upstream_tag_recommended> [is_stable]"),}
def main():
def usage():
lines = ["Available commands:"]
for name, (_, argc, arg_usage) in COMMANDS.items():
suffix = f" {arg_usage}" if arg_usage else ""
lines.append(f" - {name}{suffix}")
return "\n".join(lines)
if len(sys.argv) < 2:
raise SystemExit(usage())
cmd = sys.argv[1]
if cmd not in COMMANDS:
raise SystemExit(
f"Unknown command '{cmd}'.\n\n{usage()}"
)
func, argc, arg_usage = COMMANDS[cmd]
if len(sys.argv) - 2 < argc:
suffix = f" {arg_usage}" if arg_usage else ""
raise SystemExit(f"Usage: workflow.py {cmd}{suffix}")
args = sys.argv[2:]
result = func(*args) if args else func()
# ONLY real outputs go to stdout
if result is not None:
sys.stdout.write(str(result))
sys.stdout.flush()
if __name__ == "__main__":
main()
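workflow.py's entrypoint is driven by a name → (function, minimum arity, usage hint) table. A stripped-down sketch of the same dispatch pattern (toy commands here, not the real CI ones):

```python
# name -> (function, minimum arg count, usage hint), mirroring the COMMANDS table
COMMANDS = {
    "add":  (lambda a, b: int(a) + int(b), 2, "<a> <b>"),
    "echo": (lambda *a: " ".join(a),       0, "[words...]"),
}

def dispatch(argv):
    # argv is the command name followed by its arguments
    if not argv or argv[0] not in COMMANDS:
        raise SystemExit("Unknown command")
    func, argc, usage = COMMANDS[argv[0]]
    args = argv[1:]
    if len(args) < argc:
        raise SystemExit(f"Usage: {argv[0]} {usage}")
    return func(*args)

assert dispatch(["add", "2", "3"]) == 5
assert dispatch(["echo", "hello", "world"]) == "hello world"
```

Keeping the arity in the table lets the usage message be generated from the same data that drives dispatch, which is why the real script can print every command's signature from one loop.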


@@ -75,6 +75,14 @@
{
"identifier": "thatstel.la.altsource",
"sourceURL": "https://alt.thatstel.la/"
},
{
"identifier": "com.deliacheminot.mona",
"sourceURL": "https://raw.githubusercontent.com/delia-cheminot/mona-hrt/refs/heads/main/ios_source.json"
},
{
"identifier": "moe.ampersand.app.source",
"sourceURL": "https://github.com/NyaomiDEV/Ampersand/releases/latest/download/altstore.json"
}
],
"sources": [
@@ -148,6 +156,14 @@
{
"identifier": "thatstel.la.altsource",
"sourceURL": "https://alt.thatstel.la/"
},
{
"identifier": "com.deliacheminot.mona",
"sourceURL": "https://raw.githubusercontent.com/delia-cheminot/mona-hrt/refs/heads/main/ios_source.json"
},
{
"identifier": "moe.ampersand.app.source",
"sourceURL": "https://github.com/NyaomiDEV/Ampersand/releases/latest/download/altstore.json"
}
]
}


@@ -1,192 +0,0 @@
#!/usr/bin/env python3
import os
import json
import sys
# SIDESTORE_BUNDLE_ID = "com.SideStore.SideStore"
# Set environment variables with default values
VERSION_IPA = os.getenv("VERSION_IPA")
VERSION_DATE = os.getenv("VERSION_DATE")
IS_BETA = os.getenv("IS_BETA")
RELEASE_CHANNEL = os.getenv("RELEASE_CHANNEL")
SIZE = os.getenv("SIZE")
SHA256 = os.getenv("SHA256")
LOCALIZED_DESCRIPTION = os.getenv("LOCALIZED_DESCRIPTION")
DOWNLOAD_URL = os.getenv("DOWNLOAD_URL")
# BUNDLE_IDENTIFIER = os.getenv("BUNDLE_IDENTIFIER", SIDESTORE_BUNDLE_ID)
BUNDLE_IDENTIFIER = os.getenv("BUNDLE_IDENTIFIER")
# Uncomment to debug/test by simulating dummy input locally
# VERSION_IPA = os.getenv("VERSION_IPA", "0.0.0")
# VERSION_DATE = os.getenv("VERSION_DATE", "2000-12-18T00:00:00Z")
# IS_BETA = os.getenv("IS_BETA", True)
# RELEASE_CHANNEL = os.getenv("RELEASE_CHANNEL", "alpha")
# SIZE = int(os.getenv("SIZE", "0")) # Convert to integer
# SHA256 = os.getenv("SHA256", "")
# LOCALIZED_DESCRIPTION = os.getenv("LOCALIZED_DESCRIPTION", "Invalid Update")
# DOWNLOAD_URL = os.getenv("DOWNLOAD_URL", "https://github.com/SideStore/SideStore/releases/download/0.0.0/SideStore.ipa")
# Check if input file is provided
if len(sys.argv) < 2:
print("Usage: python3 update_apps.py <input_file>")
sys.exit(1)
input_file = sys.argv[1]
print(f"Input File: {input_file}")
# Debugging the environment variables
print(" ====> Required parameter list <====")
print("Bundle Identifier:", BUNDLE_IDENTIFIER)
print("Version:", VERSION_IPA)
print("Version Date:", VERSION_DATE)
print("IsBeta:", IS_BETA)
print("ReleaseChannel:", RELEASE_CHANNEL)
print("Size:", SIZE)
print("Sha256:", SHA256)
print("Localized Description:", LOCALIZED_DESCRIPTION)
print("Download URL:", DOWNLOAD_URL)
if IS_BETA is None:
print("Setting IS_BETA = False since no value was provided")
IS_BETA = False
if str(IS_BETA).lower() in ["true", "1", "yes"]:
IS_BETA = True
# Read the input JSON file
try:
with open(input_file, "r") as file:
data = json.load(file)
except Exception as e:
print(f"Error reading the input file: {e}")
sys.exit(1)
if (not BUNDLE_IDENTIFIER or
not VERSION_IPA or
not VERSION_DATE or
not RELEASE_CHANNEL or
not SIZE or
not SHA256 or
not LOCALIZED_DESCRIPTION or
not DOWNLOAD_URL):
print("One or more required parameter(s) were not defined as environment variable(s)")
sys.exit(1)
# Convert to integer
SIZE = int(SIZE)
# Process the JSON data
updated = False
# apps = data.get("apps", [])
# appsToUpdate = [app for app in apps if app.get("bundleIdentifier") == BUNDLE_IDENTIFIER]
# if len(appsToUpdate) == 0:
# print("No app with the specified bundle identifier found.")
# sys.exit(1)
# if len(appsToUpdate) > 1:
# print(f"Multiple apps with same `bundleIdentifier` = ${BUNDLE_IDENTIFIER} are not allowed!")
# sys.exit(1)
# app = appsToUpdate[0]
# # Update app-level metadata for store front page
# app.update({
# "beta": IS_BETA,
# })
# versions = app.get("versions", [])
# versionIfExists = [version for version in versions if version == VERSION_IPA]
# if versionIfExists: # current version is a duplicate, so reject it
# print(f"`version` = ${VERSION_IPA} already exists!, new build cannot have an existing version, Aborting!")
# sys.exit(1)
# # create an entry and keep ready
# new_version = {
# "version": VERSION_IPA,
# "date": VERSION_DATE,
# "localizedDescription": LOCALIZED_DESCRIPTION,
# "downloadURL": DOWNLOAD_URL,
# "size": SIZE,
# "sha256": SHA256,
# }
# if versions is []:
# versions.append(new_version)
# else:
# # versions.insert(0, new_version) # insert at front
# versions[0] = new_version # replace top one
# make it lowercase
RELEASE_CHANNEL = RELEASE_CHANNEL.lower()
version = data.get("version", 1)
if int(version) < 2:
print("Only v2 and above are supported for direct updates to sources.json on push")
sys.exit(1)
for app in data.get("apps", []):
if app.get("bundleIdentifier") == BUNDLE_IDENTIFIER:
if RELEASE_CHANNEL == "stable" :
# Update app-level metadata for store front page
app.update({
"version": VERSION_IPA,
"versionDate": VERSION_DATE,
"size": SIZE,
"sha256": SHA256,
"localizedDescription": LOCALIZED_DESCRIPTION,
"downloadURL": DOWNLOAD_URL,
})
# Process the versions array
channels = app.get("releaseChannels", [])
if not channels:
app["releaseChannels"] = channels
# create an entry and keep ready
new_version = {
"version": VERSION_IPA,
"date": VERSION_DATE,
"localizedDescription": LOCALIZED_DESCRIPTION,
"downloadURL": DOWNLOAD_URL,
"size": SIZE,
"sha256": SHA256,
}
tracks = [track for track in channels if track.get("track") == RELEASE_CHANNEL]
if len(tracks) > 1:
print(f"Multiple tracks with same `track` name = {RELEASE_CHANNEL} are not allowed!")
sys.exit(1)
if not tracks:
# there was no entries in this release channel so create one
track = {
"track": RELEASE_CHANNEL,
"releases": [new_version]
}
channels.insert(0, track)
else:
track = tracks[0] # first result is the selected track
# Update the existing TOP version object entry
track["releases"][0] = new_version
updated = True
break
if not updated:
print("No app with the specified bundle identifier found.")
sys.exit(1)
# Save the updated JSON to the input file
try:
print("\nUpdated Sources File:\n")
print(json.dumps(data, indent=2, ensure_ascii=False))
with open(input_file, "w", encoding="utf-8") as file:
json.dump(data, file, indent=2, ensure_ascii=False)
print("JSON successfully updated.")
except Exception as e:
print(f"Error writing to the file: {e}")
sys.exit(1)


@@ -1,381 +0,0 @@
#!/usr/bin/env python3
import subprocess
import sys
import os
import re
IGNORED_AUTHORS = [
]
TAG_MARKER = "###"
HEADER_MARKER = "####"
def run_command(cmd):
"""Run a shell command and return its trimmed output."""
return subprocess.check_output(cmd, shell=True, text=True).strip()
def get_head_commit():
"""Return the HEAD commit SHA."""
return run_command("git rev-parse HEAD")
def get_commit_messages(last_successful, current="HEAD"):
"""Return a list of commit messages between last_successful and current."""
cmd = f"git log {last_successful}..{current} --pretty=format:%s"
output = run_command(cmd)
if not output:
return []
return output.splitlines()
def get_authors_in_range(commit_range, fmt="%an"):
"""Return a set of commit authors in the given commit range using the given format."""
cmd = f"git log {commit_range} --pretty=format:{fmt}"
output = run_command(cmd)
if not output:
return set()
authors = set(line.strip() for line in output.splitlines() if line.strip())
authors = set(authors) - set(IGNORED_AUTHORS)
return authors
def get_first_commit_of_repo():
"""Return the first commit in the repository (root commit)."""
cmd = "git rev-list --max-parents=0 HEAD"
output = run_command(cmd)
return output.splitlines()[0]
def get_branch():
"""
Attempt to determine the branch base (the commit where the current branch diverged
from the default remote branch). Falls back to the repo's first commit.
"""
try:
default_ref = run_command("git rev-parse --abbrev-ref origin/HEAD")
default_branch = default_ref.split('/')[-1]
base_commit = run_command(f"git merge-base HEAD origin/{default_branch}")
return base_commit
except Exception:
return get_first_commit_of_repo()
def get_repo_url():
"""Extract and clean the repository URL from the remote 'origin'."""
url = run_command("git config --get remote.origin.url")
if url.startswith("git@"):
url = url.replace("git@", "https://").replace(":", "/")
if url.endswith(".git"):
url = url[:-4]
return url
def format_contributor(author):
"""
Convert an author name to a GitHub username or first name.
If the author already starts with '@', return it;
otherwise, take the first token and prepend '@'.
"""
if author.startswith('@'):
return author
return f"@{author.split()[0]}"
def format_commit_message(msg):
"""Format a commit message as a bullet point for the release notes."""
msg_clean = msg.lstrip() # remove leading spaces
if msg_clean.startswith("-"):
msg_clean = msg_clean[1:].strip() # remove leading '-' and spaces
return f"- {msg_clean}"
def generate_release_notes(last_successful, tag, branch):
"""Generate release notes for the given tag."""
current_commit = get_head_commit()
try:
# Try to get commit messages using the provided last_successful commit
messages = get_commit_messages(last_successful, current_commit)
except subprocess.CalledProcessError:
        # If the range is invalid (e.g. a force push made last_successful obsolete),
        # fall back to the last 5 commits on the current branch.
        print("\nInvalid revision range error, using last 5 commits as fallback.\n")
fallback_commit = run_command("git rev-parse HEAD~5")
messages = get_commit_messages(fallback_commit, current_commit)
last_successful = fallback_commit
# Start with the tag header
new_section = f"{TAG_MARKER} {tag}\n"
# What's Changed section (always present)
new_section += f"{HEADER_MARKER} What's Changed\n"
if not messages or last_successful == current_commit:
new_section += "- Nothing...\n"
else:
for msg in messages:
new_section += f"{format_commit_message(msg)}\n"
# New Contributors section (only if there are new contributors)
all_previous_authors = get_authors_in_range(f"{branch}")
recent_authors = get_authors_in_range(f"{last_successful}..{current_commit}")
new_contributors = recent_authors - all_previous_authors
if new_contributors:
new_section += f"\n{HEADER_MARKER} New Contributors\n"
for author in sorted(new_contributors):
new_section += f"- {format_contributor(author)} made their first contribution\n"
# Full Changelog section (only if there are changes)
if messages and last_successful != current_commit:
repo_url = get_repo_url()
changelog_link = f"{repo_url}/compare/{last_successful}...{current_commit}"
new_section += f"\n{HEADER_MARKER} Full Changelog: [{last_successful[:8]}...{current_commit[:8]}]({changelog_link})\n"
return new_section
def update_release_md(existing_content, new_section, tag):
"""
Update input based on rules:
1. If tag exists, update it
2. Special tags (alpha, beta, nightly) stay at the top in that order
3. Numbered tags follow special tags
4. Remove duplicate tags
5. Insert new numbered tags at the top of the numbered section
"""
tag_lower = tag.lower()
is_special_tag = tag_lower in ["alpha", "beta", "nightly"]
# Parse the existing content into sections
if not existing_content:
return new_section
# Split the content into sections by headers
pattern = fr'(^{TAG_MARKER} .*$)'
sections = re.split(pattern, existing_content, flags=re.MULTILINE)
# Create a list to store the processed content
processed_sections = []
# Track special tag positions and whether tag was found
special_tags_map = {"alpha": False, "beta": False, "nightly": False}
last_special_index = -1
tag_found = False
numbered_tag_index = -1
i = 0
while i < len(sections):
# Check if this is a header
if i % 2 == 1: # Headers are at odd indices
header = sections[i]
content = sections[i+1] if i+1 < len(sections) else ""
            current_tag = header[len(TAG_MARKER):].strip().lower()
# Check for special tags to track their positions
if current_tag in special_tags_map:
special_tags_map[current_tag] = True
last_special_index = len(processed_sections)
# Check if this is the first numbered tag
elif re.match(r'^[0-9]+\.[0-9]+(\.[0-9]+)?$', current_tag) and numbered_tag_index == -1:
numbered_tag_index = len(processed_sections)
# If this is the tag we're updating, mark it but don't add yet
if current_tag == tag_lower:
if not tag_found: # Replace the first occurrence
tag_found = True
i += 2 # Skip the content
continue
else: # Skip duplicate occurrences
i += 2
continue
# Add the current section
processed_sections.append(sections[i])
i += 1
# Determine where to insert the new section
if tag_found:
# We need to determine the insertion point
if is_special_tag:
# For special tags, insert after last special tag or at beginning
# Find position to insert
insert_pos = 0
for pos, t in enumerate(["alpha", "beta", "nightly"]):
if t == tag_lower:
break
                if special_tags_map[t]:
                    # Headers keep their original case, so match case-insensitively
                    insert_pos = next(
                        idx for idx, s in enumerate(processed_sections)
                        if s.strip().lower() == f"{TAG_MARKER} {t}"
                    )
                    insert_pos += 2  # Move past the header and its content
# Insert at the determined position
processed_sections.insert(insert_pos, new_section)
if insert_pos > 0 and not processed_sections[insert_pos-1].endswith('\n\n'):
processed_sections.insert(insert_pos, '\n\n')
else:
# For numbered tags, insert after special tags but before other numbered tags
insert_pos = 0
if last_special_index >= 0:
# Insert after the last special tag
insert_pos = last_special_index + 2 # +2 to skip header and content
processed_sections.insert(insert_pos, new_section)
if insert_pos > 0 and not processed_sections[insert_pos-1].endswith('\n\n'):
processed_sections.insert(insert_pos, '\n\n')
else:
# Tag doesn't exist yet, determine insertion point
if is_special_tag:
# For special tags, maintain alpha, beta, nightly order
special_tags = ["alpha", "beta", "nightly"]
insert_pos = 0
            for idx, t in enumerate(special_tags):
                if t == tag_lower:
                    # Check whether preceding special tags exist
                    for prev_tag in special_tags[:idx]:
                        if special_tags_map[prev_tag]:
                            # Headers keep their original case, so match case-insensitively
                            prev_index = next(
                                p for p, s in enumerate(processed_sections)
                                if s.strip().lower() == f"{TAG_MARKER} {prev_tag}"
                            )
                            insert_pos = prev_index + 2  # Skip header and content
processed_sections.insert(insert_pos, new_section)
if insert_pos > 0 and not processed_sections[insert_pos-1].endswith('\n\n'):
processed_sections.insert(insert_pos, '\n\n')
else:
# For numbered tags, insert after special tags but before other numbered tags
insert_pos = 0
if last_special_index >= 0:
# Insert after the last special tag
insert_pos = last_special_index + 2 # +2 to skip header and content
processed_sections.insert(insert_pos, new_section)
if insert_pos > 0 and not processed_sections[insert_pos-1].endswith('\n\n'):
processed_sections.insert(insert_pos, '\n\n')
# Combine sections ensuring proper spacing
result = ""
for i, section in enumerate(processed_sections):
if i > 0 and section.startswith(f"{TAG_MARKER} "):
# Ensure single blank line before headers
if not result.endswith("\n\n"):
result = result.rstrip("\n") + "\n\n"
result += section
return result.rstrip() + "\n"
def retrieve_tag_content(tag, file_path):
    """Return the release-notes body for the given tag, or "" if the tag or file is missing."""
    if not os.path.exists(file_path):
        return ""
with open(file_path, "r") as f:
content = f.read()
# Create a pattern for the tag header (case-insensitive)
pattern = re.compile(fr'^{TAG_MARKER} ' + re.escape(tag) + r'$', re.MULTILINE | re.IGNORECASE)
# Find the tag header
match = pattern.search(content)
if not match:
return ""
# Start after the tag line
start_pos = match.end()
# Skip a newline if present
if start_pos < len(content) and content[start_pos] == "\n":
start_pos += 1
# Find the next tag header after the current tag's content
next_tag_match = re.search(fr'^{TAG_MARKER} ', content[start_pos:], re.MULTILINE)
if next_tag_match:
end_pos = start_pos + next_tag_match.start()
return content[start_pos:end_pos].strip()
else:
# Return until the end of the file if this is the last tag
return content[start_pos:].strip()
def main():
    # Release notes are generated into (and retrieved from) this file
    release_file = "release-notes.md"
# Usage: python release.py <last_successful_commit> [tag] [branch]
# Or: python release.py --retrieve <tagname>
args = sys.argv[1:]
if len(args) < 1:
print("Usage: python release.py <last_successful_commit> [tag] [branch]")
print(" or: python release.py --retrieve <tagname>")
sys.exit(1)
# Check if we're retrieving a tag
if args[0] == "--retrieve":
if len(args) < 2:
print("Error: Missing tag name after --retrieve")
sys.exit(1)
tag_content = retrieve_tag_content(args[1], file_path=release_file)
if tag_content:
print(tag_content)
else:
print(f"Tag '{args[1]}' not found in '{release_file}'")
return
# Original functionality for generating release notes
last_successful = args[0]
tag = args[1] if len(args) > 1 else get_head_commit()
branch = args[2] if len(args) > 2 else (os.environ.get("GITHUB_REF") or get_branch())
# Generate release notes
new_section = generate_release_notes(last_successful, tag, branch)
existing_content = ""
if os.path.exists(release_file):
with open(release_file, "r") as f:
existing_content = f.read()
updated_content = update_release_md(existing_content, new_section, tag)
with open(release_file, "w") as f:
f.write(updated_content)
# Output the new section for display
print(new_section)
if __name__ == "__main__":
main()