Compare commits

...

104 commits

Author SHA1 Message Date
4299ab3906 ilot/authentik: upgrade to 2025.2.4 2025-04-09 01:09:52 +00:00
5cf12d0754
ilot/freescout: upgrade to 1.8.174
All checks were successful
/ lint (pull_request) Successful in 30s
/ build-x86_64 (pull_request) Successful in 1m1s
/ deploy-x86_64 (pull_request) Successful in 28s
/ deploy-aarch64 (pull_request) Successful in 55s
/ build-aarch64 (pull_request) Successful in 2m47s
2025-04-08 20:28:55 -04:00
9080e7c6ba
ilot/mastodon: upgrade to 4.2.20
Some checks failed
/ lint (pull_request) Successful in 30s
/ build-aarch64 (pull_request) Successful in 1m19s
/ deploy-aarch64 (pull_request) Failing after 2m3s
/ deploy-x86_64 (pull_request) Successful in 34s
/ build-x86_64 (pull_request) Successful in 6m9s
2025-04-02 15:29:02 -04:00
46ebb5bf61
ilot/authentik: upgrade to 2025.2.3
All checks were successful
/ lint (pull_request) Successful in 31s
/ deploy-x86_64 (pull_request) Successful in 31s
/ build-x86_64 (pull_request) Successful in 38m31s
/ deploy-aarch64 (pull_request) Successful in 1m1s
/ build-aarch64 (pull_request) Successful in 2h38m50s
2025-03-31 19:56:01 -04:00
8db2d8e280 ilot/py3-kadmin: remove in favor of py3-kadmin-rs 2025-03-31 23:39:50 +00:00
bedb27d660 ilot/authentik: upgrade to 2024.12.4 2025-03-31 23:39:50 +00:00
cb5f704a49 ilot/py3-kadmin-rs: new aport 2025-03-31 23:39:50 +00:00
2f581245cb ilot/forgejo-aneksajo: upgrade to 10.0.3-git0 2025-03-31 20:07:37 +00:00
23e8d38f52
ilot/wikijs: upgrade to 2.5.307
Some checks failed
/ lint (pull_request) Successful in 50s
/ deploy-aarch64 (pull_request) Failing after 2m7s
/ build-aarch64 (pull_request) Successful in 1m16s
/ build-x86_64 (pull_request) Successful in 7m18s
/ deploy-x86_64 (pull_request) Successful in 30s
2025-03-31 15:38:38 -04:00
567715dc3a ilot/nextcloud30: upgrade to 30.0.8 2025-03-22 18:00:34 +00:00
84f28b7e5c ilot/forgejo-aneksajo: upgrade to 10.0.1_git1 2025-03-22 17:59:06 +00:00
2263b1374c ilot/codeberg-pages-server: add post-install script giving right to binary to host on 443,80 2025-03-22 13:57:43 -04:00
1a2203ba3c ilot/codeberg-pages-server: upgrade to 6.2.1 2025-03-22 17:56:09 +00:00
2d900374a5 ilot/freescout: upgrade to 1.8.173 2025-03-22 17:51:26 +00:00
322956f740
ilot/mastodon: upgrade to 4.2.19
Some checks failed
/ lint (pull_request) Successful in 29s
/ deploy-x86_64 (pull_request) Successful in 36s
/ build-x86_64 (pull_request) Successful in 7m40s
/ build-aarch64 (pull_request) Successful in 1m14s
/ deploy-aarch64 (pull_request) Failing after 2m13s
2025-03-22 13:34:37 -04:00
0476da5791
ilot/nextcloud: new aport
Some checks failed
/ lint (pull_request) Failing after 28s
/ deploy-x86_64 (pull_request) Successful in 52s
/ build-x86_64 (pull_request) Successful in 3m54s
/ deploy-aarch64 (pull_request) Successful in 1m13s
/ build-aarch64 (pull_request) Successful in 12m2s
2025-03-01 09:38:53 -05:00
55d8000018 forgejo: preprend authorization with token 2025-02-18 12:56:39 -05:00
602730b1d2 forgejo: fix authorization issues 2025-02-18 12:09:11 -05:00
8963a82eaa forgejo: rename FORGEJO_TOKEN to ISSUE_TOKEN 2025-02-18 11:41:08 -05:00
3e68c6c54e
ilot/ruby3.2: upgrade to 3.2.6
All checks were successful
/ lint (pull_request) Successful in 29s
/ deploy-x86_64 (pull_request) Successful in 38s
/ build-x86_64 (pull_request) Successful in 10m47s
/ deploy-aarch64 (pull_request) Successful in 58s
/ build-aarch64 (pull_request) Successful in 17m6s
2025-02-15 15:20:00 -05:00
4b7eb3b1c7 ilot/mastodon: upgrade to 4.2.15 2025-02-15 15:19:56 -05:00
9f89de44e0 ilot/ruby3.2-minitest: move from archives 2025-02-15 20:07:24 +00:00
4a83fcd6aa ilot/ruby3.2-rake: move from archives 2025-02-15 20:07:24 +00:00
90af006af3 ilot/ruby3.2-bundler: move from archives 2025-02-15 20:07:24 +00:00
58430a7691 ilot/ruby3.2: move from archives 2025-02-15 20:07:24 +00:00
5cce213407 ilot/mastodon: move from archives 2025-02-15 20:07:24 +00:00
d2dfb88a4c ilot/forgejo-aneksajo: upgrade to 10.0.1_git0 2025-02-15 19:03:42 +00:00
d03ddceaed ilot/wikijs: upgrade to 2.5.306 2025-02-15 18:54:51 +00:00
abf94b58f3 ilot/freescout: upgrade to 1.8.171 2025-02-15 18:52:28 +00:00
e0509e9cf6
ilot/codeberg-pages-server: new aport
All checks were successful
/ lint (pull_request) Successful in 32s
/ deploy-x86_64 (pull_request) Successful in 28s
/ build-x86_64 (pull_request) Successful in 2m54s
/ deploy-aarch64 (pull_request) Successful in 57s
/ build-aarch64 (pull_request) Successful in 10m24s
2025-02-14 23:14:49 -05:00
fd4a1e4702 ilot/authentik: drop unneeded dependencies 2025-01-06 00:46:44 +00:00
be89e3f4df ilot/authentik: upgrade to 2024.10.5 2025-01-06 00:46:44 +00:00
71f41b1a18 ilot/py3-kadmin: new aport 2025-01-06 00:46:44 +00:00
0f2e2e1be3 ilot/freescout: upgrade to 1.8.160 2025-01-05 23:58:15 +00:00
7327176cad ilot/py3-opentelemetry-sdk: upgrade to 1.29.0 2025-01-05 23:37:07 +00:00
cbc3ff961b ilot/py3-msal: upgrade to 1.31.1 2025-01-05 23:32:11 +00:00
0fc71d957f ilot/py3-microsoft-kiota-serialization-text: upgrade to 1.6.8 2025-01-05 23:32:02 +00:00
9eecf16a8a ilot/py3-microsoft-kiota-serialization-multipart: upgrade to 1.6.8 2025-01-05 23:32:02 +00:00
34ddb36624 ilot/py3-microsoft-kiota-serialization-json: upgrade to 1.6.8 2025-01-05 23:32:02 +00:00
bf26fe61b6 ilot/py3-microsoft-kiota-serialization-form: upgrade to 1.6.8 2025-01-05 23:32:02 +00:00
21862f26e9 ilot/py3-microsoft-kiota-http: upgrade to 1.6.8 2025-01-05 23:32:02 +00:00
8a3d97b422 ilot/py3-microsoft-kiota-authentication-azure: upgrade to 1.6.8 2025-01-05 23:32:02 +00:00
9f3ba9e531 ilot/py3-microsoft-kiota-abstractions: upgrade to 1.6.8 2025-01-05 23:32:02 +00:00
30eaecfc7d ilot/py3-std-uritemplate: upgrade to 2.0.1 2025-01-05 23:32:02 +00:00
52b34da89e ilot/py3-msgraph-core: upgrade to 1.1.8 2025-01-05 23:26:42 +00:00
723b39f43c ilot/py3-msgraph-sdk: upgrade to 1.16.0 2025-01-05 23:26:42 +00:00
bfd897eb7b ilot/uptime-kuma: upgrade to 1.23.16 2025-01-05 23:12:10 +00:00
7443d8d886
ilot/forgejo-aneksajo: upgrade to 9.0.3_git0
All checks were successful
/ lint (pull_request) Successful in 59s
/ deploy-x86_64 (pull_request) Successful in 28s
/ build-x86_64 (pull_request) Successful in 5m11s
/ deploy-aarch64 (pull_request) Successful in 1m5s
/ build-aarch64 (pull_request) Successful in 18m32s
2025-01-05 15:06:05 -05:00
bdf770d7d7
ilot/uvicorn: new aport
Some checks failed
/ lint (pull_request) Failing after 28s
/ deploy-x86_64 (pull_request) Successful in 26s
/ build-x86_64 (pull_request) Successful in 2m35s
/ deploy-aarch64 (pull_request) Successful in 54s
/ build-aarch64 (pull_request) Successful in 3m48s
2025-01-05 11:35:07 -05:00
01a8bc900f ilot/authentik: fix impersonnate api 2024-12-07 09:34:07 -05:00
8a3689b757 ilot/wikijs: remove prebuilts 2024-12-01 15:44:39 -05:00
37e23225dd ilot/authentik: fix depend syntax 2024-12-01 15:44:03 -05:00
ea28bc8eee ilot/authentik: missing uvicorn 2024-12-01 15:32:17 -05:00
f20094dd9e forgejo-ci: check v3.21 branch 2024-12-01 15:15:31 -05:00
06cfdea8b8 ilot/listmonk: upgrade to 4.1.0 2024-12-01 15:07:36 -05:00
1ecc969d96 ilot/py3-django-tenant-schemas: drop due to in community 2024-12-01 15:01:00 -05:00
f5b189f460 ilot/py3-pyrad: drop due to in community 2024-12-01 15:00:08 -05:00
d8e3e4f9f4 ilot/py3-drf-orjson-renderer: drop due to in community 2024-12-01 14:59:26 -05:00
0689e2c3c7 ilot/py3-django-tenants: drop due to in community 2024-12-01 14:58:47 -05:00
ba5cf2f94c backports/forgejo-runner: drop due to in community 2024-12-01 14:57:23 -05:00
d059d73a8e
ilot/authentik: upgrade to 2024.8.6
Some checks failed
/ lint (pull_request) Successful in 30s
/ build-x86_64 (pull_request) Successful in 44m22s
/ deploy-x86_64 (pull_request) Successful in 31s
/ deploy-aarch64 (pull_request) Has been cancelled
/ build-aarch64 (pull_request) Has been cancelled
2024-11-23 19:30:29 -05:00
77d3ecf5d9
ilot/authentik: upgrade to 2024.8.5
All checks were successful
/ lint (pull_request) Successful in 29s
/ deploy-x86_64 (pull_request) Successful in 31s
/ build-x86_64 (pull_request) Successful in 46m27s
/ deploy-aarch64 (pull_request) Successful in 1m3s
/ build-aarch64 (pull_request) Successful in 2h30m29s
2024-11-21 09:39:40 -05:00
d22514437b
ilot/forgejo-aneksajo: fix GITEA_VERSION bad format
All checks were successful
/ lint (pull_request) Successful in 26s
/ deploy-x86_64 (pull_request) Successful in 28s
/ build-x86_64 (pull_request) Successful in 4m40s
/ build-aarch64 (pull_request) Successful in 18m23s
/ deploy-aarch64 (pull_request) Successful in 58s
2024-11-07 12:56:10 -05:00
7e1fce21a3 ilot/forgejo-aneksajo: upgrade to 8.0.3-git-annex2 2024-11-05 16:37:52 +00:00
0310eecafc
ilot/py3-microsoft-kiota-serialization-multipart: upgrade to 1.6.0
All checks were successful
/ lint (pull_request) Successful in 28s
/ deploy-x86_64 (pull_request) Successful in 37s
/ build-x86_64 (pull_request) Successful in 6m29s
/ build-aarch64 (pull_request) Successful in 19m55s
/ deploy-aarch64 (pull_request) Successful in 59s
2024-11-02 20:30:23 -04:00
6cf8bcea90 ilot/py3-msgraph-core: upgrade to 1.1.6 2024-11-02 20:30:21 -04:00
f5389215f4 ilot/py3-msgraph-sdk: upgrade to 1.11.0 2024-11-02 20:30:19 -04:00
2866ff3f02 ilot/py3-azure-core: upgrade to 1.32.0 2024-11-02 20:30:17 -04:00
a2414fadf7 ilot/py3-azure-identity: upgrade to 1.19.0 2024-11-02 20:30:15 -04:00
dadbb6f590 ilot/py3-microsoft-kiota-serialization-text: upgrade to 1.6.0 2024-11-02 20:30:13 -04:00
de00f92f6c ilot/py3-microsoft-kiota-abstractions: upgrade to 1.6.0 2024-11-02 20:30:09 -04:00
d564d63986 ilot/py3-microsoft-kiota-authentication-azure: upgrade to 1.6.0 2024-11-02 20:21:12 -04:00
d4eeae6ddd ilot/py3-microsoft-kiota-http: upgrade to 1.6.0 2024-11-02 20:20:21 -04:00
360b60188f ilot/py3-microsoft-kiota-serialization-form: upgrade to 1.6.0 2024-11-02 20:20:12 -04:00
de8877ef98 ilot/py3-microsoft-kiota-serialization-json: upgrade to 1.6.0 2024-11-02 20:20:02 -04:00
717d3ba696
ilot/authentik: upgrade to 2024.8.4
All checks were successful
/ lint (pull_request) Successful in 33s
/ build-x86_64 (pull_request) Successful in 48m3s
/ deploy-x86_64 (pull_request) Successful in 31s
/ deploy-aarch64 (pull_request) Successful in 1m3s
/ build-aarch64 (pull_request) Successful in 2h35m55s
2024-10-30 19:26:00 -04:00
30ab52a3f7 ilot/authentik: bump build 2024-10-30 23:24:27 +00:00
7d8ed26c17 forgejo: Fix is_it_old logics 2024-10-29 09:00:06 -04:00
eb8242c6f0
forgejo: chose highest version when dealing with multiple downstream_versions
Some checks failed
/ lint (pull_request) Successful in 26s
/ deploy-aarch64 (pull_request) Failing after 2m4s
/ build-aarch64 (pull_request) Successful in 1m13s
/ deploy-x86_64 (pull_request) Has been cancelled
/ build-x86_64 (pull_request) Has been cancelled
2024-10-29 08:54:18 -04:00
d07957ab63
forgejo: add special case for forgejo-aneksajo when checking upstream version
Some checks failed
/ lint (pull_request) Successful in 26s
/ deploy-x86_64 (pull_request) Successful in 26s
/ build-x86_64 (pull_request) Successful in 4m34s
/ deploy-aarch64 (pull_request) Has been cancelled
/ build-aarch64 (pull_request) Has been cancelled
2024-10-29 08:42:13 -04:00
b9cb527dae ilot/forgejo-aneksajo: change version scheme 2024-10-29 08:42:01 -04:00
64a3309cba
ilot/wikijs: upgrade to 2.5.305
Some checks failed
/ lint (pull_request) Successful in 26s
/ build-aarch64 (pull_request) Successful in 1m11s
/ deploy-aarch64 (pull_request) Failing after 2m4s
/ deploy-x86_64 (pull_request) Successful in 30s
/ build-x86_64 (pull_request) Successful in 7m10s
2024-10-29 08:09:30 -04:00
0aabc7bd0c forgejo: only remove files that already exist 2024-10-29 08:02:03 -04:00
178543f2aa ilot/uptime-kuma: upgrade to 1.23.15 2024-10-29 11:52:49 +00:00
b5286d3c38 ilot/py3-sentry-sdk: drop due to inclusion in community 2024-10-29 11:30:14 +00:00
1674b6f550 ilot/py3-scim2-filter-parser: drop due to inclusion in community 2024-10-29 11:30:14 +00:00
5c78bf6cf2 ilot/py3-django-pglock: drop due to inclusion in community 2024-10-29 11:30:14 +00:00
c4a06a6117 ilot/py3-django-pgactivity: drop due to inclusion in community 2024-10-29 11:30:14 +00:00
28175614b7 ilot/py3-django-dynamic-fixture: drop due to inclusion in community 2024-10-29 11:30:14 +00:00
702d61a5dc ilot/py3-django-cte: drop due to inclusion in community 2024-10-29 11:30:14 +00:00
bd438f03e4
Check every day at 5 am instead of hourly
All checks were successful
/ lint (pull_request) Successful in 27s
/ deploy-aarch64 (pull_request) Successful in 59s
/ build-aarch64 (pull_request) Successful in 7m39s
/ build-x86_64 (pull_request) Successful in 35s
/ deploy-x86_64 (pull_request) Successful in 28s
2024-10-28 08:37:24 -04:00
bf00362653 ilot/listmonk: upgrade to 4.0.1 2024-10-28 08:37:19 -04:00
c5c4d3ec50 forgejo: update is_it_old to use new title format 2024-10-27 17:01:58 -04:00
b5dc442a11 forgejo: add package update check 2024-10-27 15:12:17 -04:00
af58598623 ilot/authentik: upgrade to 2024.8.3 2024-10-18 04:27:15 +00:00
3ca03d544d ilot/py3-django-tenants: drop py3-django-tenant-schemas dependency 2024-10-18 04:27:15 +00:00
e32f82ba99
ilot/forgejo-aneksajo: upgrade to 8.0.3-git-annex1
All checks were successful
/ lint (pull_request) Successful in 32s
/ deploy-x86_64 (pull_request) Successful in 33s
/ build-x86_64 (pull_request) Successful in 4m55s
/ build-aarch64 (pull_request) Successful in 18m10s
/ deploy-aarch64 (pull_request) Successful in 1m2s
2024-10-16 07:01:16 -04:00
13836bf48a ilot/forgejo-aneksajo: upgrade to 8.0.3 2024-09-30 23:49:14 +00:00
4da2f29af2
ilot/py3-django-tenant-schemas: new aport
All checks were successful
/ lint (pull_request) Successful in 25s
/ deploy-x86_64 (pull_request) Successful in 26s
/ build-x86_64 (pull_request) Successful in 1m1s
/ deploy-aarch64 (pull_request) Successful in 59s
/ build-aarch64 (pull_request) Successful in 2m14s
2024-09-30 08:52:56 -04:00
d64b003da5 ilot/py3-django-tenants: add py3-django-tenant-schemas depend 2024-09-30 08:52:28 -04:00
5f8fbe32aa
ilot/authentik: clean-up packaging, add pyc subpkg
Some checks failed
/ lint (pull_request) Successful in 25s
/ deploy-aarch64 (pull_request) Has been cancelled
/ build-aarch64 (pull_request) Has been cancelled
/ deploy-x86_64 (pull_request) Has been cancelled
/ build-x86_64 (pull_request) Has been cancelled
2024-09-22 18:43:16 -04:00
7566c53ff7 forgejo-ci: change hostname to what is in hosts 2024-09-21 07:51:06 -04:00
69891fb74f ilot/py3-sentry-sdk: new aport 2024-09-21 07:51:04 -04:00
a5f12565b4 ilot/authentik: enable check 2024-09-21 07:50:58 -04:00
97 changed files with 2838 additions and 4516 deletions

.forgejo/bin/check_ver.sh Executable file

@@ -0,0 +1,34 @@
#!/bin/bash
# expects the following env variables:
# downstream: downstream repo
repo=${downstream/*\/}
curl --silent $downstream/x86_64/APKINDEX.tar.gz | tar -O -zx APKINDEX > APKINDEX
owned_by_you=$(awk -v RS= -v ORS="\n\n" '/m:Antoine Martin \(ayakael\) <dev@ayakael.net>/' APKINDEX | awk -F ':' '{if($1=="o"){print $2}}' | sort | uniq)
echo "Found $(printf '%s\n' $owned_by_you | wc -l ) packages owned by you"
rm -f out_of_date not_in_anitya
for pkg in $owned_by_you; do
upstream_version=$(curl --fail -X GET -sS -H 'Content-Type: application/json' "https://release-monitoring.org/api/v2/packages/?name=$pkg&distribution=Alpine" | jq -r '.items.[].stable_version')
downstream_version=$(sed -n "/^P:$pkg$/,/^$/p" APKINDEX | awk -F ':' '{if($1=="V"){print $2}}' | sort -V | tail -n 1)
downstream_version=${downstream_version/-*}
# special case for forgejo-aneksajo:
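# e.g. maps an upstream "10.0.3-git-annex0" tag to the downstream "10.0.3_git0" version scheme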
upstream_version=${upstream_version/-git-annex/_git}
if [ -z "$upstream_version" ]; then
echo "$pkg not in anitya"
echo "$pkg" >> not_in_anitya
elif [ "$downstream_version" != "$(printf '%s\n' $upstream_version $downstream_version | sort -V | head -n 1)" ]; then
echo "$pkg higher downstream"
continue
elif [ "$upstream_version" != "$downstream_version" ]; then
echo "$pkg upstream version $upstream_version does not match downstream version $downstream_version"
echo "$pkg $downstream_version $upstream_version $repo" >> out_of_date
fi
done
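For a quick run outside CI, the script only needs the downstream variable documented in its header. A minimal sketch (the wrapper below is not part of the committed tree; the URL is the one exported by the check-user workflow further down this diff):

#!/bin/sh
# hypothetical local invocation of .forgejo/bin/check_ver.sh
# requires curl, jq, gawk and tar on PATH (see the workflow's "apk add" line)
export downstream=https://forge.ilot.io/api/packages/ilot/alpine/v3.21/ilot
./.forgejo/bin/check_ver.sh
# out_of_date and not_in_anitya are only written when matching packages are found
cat out_of_date not_in_anitya 2>/dev/null || true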

.forgejo/bin/create_issue.sh Executable file

@@ -0,0 +1,165 @@
#!/bin/bash
# expects:
# env variable ISSUE_TOKEN
# file out_of_date
IFS='
'
repo=${downstream/*\/}
does_it_exist() {
name=$1
downstream_version=$2
upstream_version=$3
repo=$4
query="$repo/$name: upgrade to $upstream_version"
query="$(echo $query | sed 's| |%20|g' | sed 's|:|%3A|g' | sed 's|/|%2F|g' )"
result="$(curl --silent -X 'GET' \
"$GITHUB_SERVER_URL/api/v1/repos/$GITHUB_REPOSITORY/issues?state=open&q=$query&type=issues" \
-H 'accept: application/json' \
-H "Authorization: token $ISSUE_TOKEN"
)"
if [ "$result" == "[]" ]; then
return 1
fi
}
is_it_old() {
name=$1
downstream_version=$2
upstream_version=$3
repo=$4
query="$repo/$name: upgrade to"
query="$(echo $query | sed 's| |%20|g' | sed 's|:|%3A|g' | sed 's|/|%2F|g' )"
result="$(curl --silent -X 'GET' \
"$GITHUB_SERVER_URL/api/v1/repos/$GITHUB_REPOSITORY/issues?state=open&q=$query&type=issues" \
-H 'accept: application/json' \
-H "authorization: token $ISSUE_TOKEN"
)"
result_title="$(echo $result | jq -r '.[].title' )"
result_id="$(echo $result | jq -r '.[].number' )"
result_upstream_version="$(echo $result_title | awk '{print $4}')"
if [ "$upstream_version" != "$result_upstream_version" ]; then
echo $result_id
else
echo 0
fi
}
update_title() {
name=$1
downstream_version=$2
upstream_version=$3
repo=$4
id=$5
result=$(curl --silent -X 'PATCH' \
"$GITHUB_SERVER_URL/api/v1/repos/$GITHUB_REPOSITORY/issues/$id" \
-H 'accept: application/json' \
-H "authorization: token $ISSUE_TOKEN" \
-H 'Content-Type: application/json' \
-d "{
\"title\": \"$repo/$name: upgrade to $upstream_version\"
}"
)
return 0
}
create_issue() {
name=$1
downstream_version=$2
upstream_version=$3
repo=$4
result=$(curl --silent -X 'POST' \
"$GITHUB_SERVER_URL/api/v1/repos/$GITHUB_REPOSITORY/issues" \
-H 'accept: application/json' \
-H "authorization: token $ISSUE_TOKEN" \
-H 'Content-Type: application/json' \
-d "{
\"title\": \"$repo/$name: upgrade to $upstream_version\",
\"labels\": [
$LABEL_NUMBER
]
}")
return 0
}
if [ -f out_of_date ]; then
out_of_date="$(cat out_of_date)"
echo "Detected $(wc -l out_of_date) out-of-date packages, creating issues"
for pkg in $out_of_date; do
name="$(echo $pkg | awk '{print $1}')"
downstream_version="$(echo $pkg | awk '{print $2}')"
upstream_version="$(echo $pkg | awk '{print $3}')"
repo="$(echo $pkg | awk '{print $4}')"
if does_it_exist $name $downstream_version $upstream_version $repo; then
echo "Issue for $repo/$name already exists"
continue
fi
id=$(is_it_old $name $downstream_version $upstream_version $repo)
if [ "$id" != "0" ] && [ -n "$id" ]; then
echo "Issue for $repo/$name needs updating"
update_title $name $downstream_version $upstream_version $repo $id
continue
fi
echo "Creating issue for $repo/$name"
create_issue $name $downstream_version $upstream_version $repo
done
fi
if [ -f not_in_anitya ]; then
query="Add missing $repo packages to anitya"
query="$(echo $query | sed 's| |%20|g')"
result="$(curl --silent -X 'GET' \
"$GITHUB_SERVER_URL/api/v1/repos/$GITHUB_REPOSITORY/issues?state=open&q=$query&type=issues" \
-H 'accept: application/json' \
-H "authorization: token $ISSUE_TOKEN"
)"
if [ "$result" == "[]" ]; then
echo "Creating anitya issue"
result=$(curl --silent -X 'POST' \
"$GITHUB_SERVER_URL/api/v1/repos/$GITHUB_REPOSITORY/issues" \
-H 'accept: application/json' \
-H "authorization: token $ISSUE_TOKEN" \
-H 'Content-Type: application/json' \
-d "{
\"title\": \"Add missing $repo packages to anitya\",
\"body\": \"- [ ] $(sed '{:q;N;s/\n/\\n- [ ] /g;t q}' not_in_anitya)\",
\"labels\": [
$LABEL_NUMBER
]
}")
else
echo "Updating anitya issue"
result_id="$(echo $result | jq -r '.[].number' )"
result=$(curl --silent -X 'PATCH' \
"$GITHUB_SERVER_URL/api/v1/repos/$GITHUB_REPOSITORY/issues/$result_id" \
-H 'accept: application/json' \
-H "authorization: token $ISSUE_TOKEN" \
-H 'Content-Type: application/json' \
-d "{
\"body\": \"- [ ] $(sed '{:q;N;s/\n/\\n- [ ] /g;t q}' not_in_anitya)\"
}"
)
fi
fi
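Run by hand, create_issue.sh expects the out_of_date / not_in_anitya files from check_ver.sh plus the Forgejo API variables it reads. A hedged sketch; every value below is a placeholder, not the real repository or token:

#!/bin/sh
# hypothetical manual run of .forgejo/bin/create_issue.sh (illustration only)
export GITHUB_SERVER_URL=https://forge.example.org   # Forgejo instance receiving the API calls
export GITHUB_REPOSITORY=owner/aports                # repository where the issues are filed
export ISSUE_TOKEN=changeme                          # API token with issue read/write scope
export LABEL_NUMBER=8                                # label id attached to new issues, as in the workflow
export downstream=https://forge.example.org/api/packages/owner/alpine/v3.21/ilot
./.forgejo/bin/check_ver.sh       # produces out_of_date / not_in_anitya
./.forgejo/bin/create_issue.sh    # opens or updates the matching issues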

View file

@@ -3,7 +3,7 @@
# shellcheck disable=SC3040
set -eu -o pipefail
readonly REPOS="backports ilot"
readonly REPOS="backports user"
readonly BASEBRANCH=$GITHUB_BASE_REF
readonly TARGET_REPO=$CI_ALPINE_REPO
@@ -14,18 +14,12 @@ for apk in $apkgs; do
arch=$(echo $apk | awk -F '/' '{print $3}')
name=$(echo $apk | awk -F '/' '{print $4}')
# always clear out package before deploying
for delarch in x86_64 aarch64 armv7 armhf s390x ppc64le riscv64 loongarch64 x86; do
curl -s --user $FORGE_REPO_USER:$FORGE_REPO_TOKEN -X DELETE $TARGET_REPO/$BASEBRANCH/$branch/$delarch/$name 2>&1 > /dev/null
done
if [ "$(curl -s $GITHUB_SERVER_URL/api/v1/repos/$GITHUB_REPOSITORY/pulls/$GITHUB_EVENT_NUMBER | jq .draft)" == "true" ]; then
# if draft, send to -testing branch
branch="$branch-testing"
fi
echo "Sending $name of arch $arch to $TARGET_REPO/$BASEBRANCH/$branch"
curl -s --user $FORGE_REPO_USER:$FORGE_REPO_TOKEN --upload-file $apk $TARGET_REPO/$BASEBRANCH/$branch
return=$(curl -s --user $FORGE_REPO_USER:$FORGE_REPO_TOKEN --upload-file $apk $TARGET_REPO/$BASEBRANCH/$branch 2>&1)
echo $return
if [ "$return" == "package file already exists" ]; then
echo "Package already exists, refreshing..."
curl -s --user $FORGE_REPO_USER:$FORGE_REPO_TOKEN -X DELETE $TARGET_REPO/$BASEBRANCH/$branch/$arch/$name
curl -s --user $FORGE_REPO_USER:$FORGE_REPO_TOKEN --upload-file $apk $TARGET_REPO/$BASEBRANCH/$branch
fi
done

View file

@@ -19,7 +19,8 @@ jobs:
    steps:
      - name: Environment setup
        run: |
          doas apk add nodejs git patch curl
          doas apk add nodejs git patch curl net-tools
          doas hostname host.docker.internal
          cd /etc/apk/keys
          doas curl -JO https://forge.ilot.io/api/packages/ilot/alpine/key
      - name: Repo pull

View file

@@ -19,7 +19,8 @@ jobs:
    steps:
      - name: Environment setup
        run: |
          doas apk add nodejs git patch curl
          doas apk add nodejs git patch curl net-tools
          doas hostname host.docker.internal
          cd /etc/apk/keys
          doas curl -JO https://forge.ilot.io/api/packages/ilot/alpine/key
      - name: Repo pull

View file

@@ -0,0 +1,27 @@
on:
  workflow_dispatch:
  schedule:
    - cron: '0 5 * * *'
jobs:
  check-user:
    name: Check user repo
    runs-on: x86_64
    container:
      image: alpine:latest
    env:
      downstream: https://forge.ilot.io/api/packages/ilot/alpine/v3.21/ilot
      ISSUE_TOKEN: ${{ secrets.issue_token }}
      LABEL_NUMBER: 8
    steps:
      - name: Environment setup
        run: apk add grep coreutils gawk curl wget bash nodejs git jq sed
      - name: Get scripts
        uses: actions/checkout@v4
        with:
          fetch-depth: 1
      - name: Check out-of-date packages
        run: ${{ github.workspace }}/.forgejo/bin/check_ver.sh
      - name: Create issues
        run: ${{ github.workspace }}/.forgejo/bin/create_issue.sh

View file

@@ -1,47 +0,0 @@
# Contributor: Patrycja Rosa <alpine@ptrcnull.me>
# Maintainer: Patrycja Rosa <alpine@ptrcnull.me>
pkgname=forgejo-runner
pkgver=3.5.0
pkgrel=2
pkgdesc="CI/CD job runner for Forgejo"
url="https://code.forgejo.org/forgejo/runner"
arch="all"
license="MIT"
makedepends="go"
install="$pkgname.pre-install $pkgname.pre-upgrade"
subpackages="$pkgname-openrc"
source="$pkgname-$pkgver.tar.gz::https://code.forgejo.org/forgejo/runner/archive/v$pkgver.tar.gz
forgejo-runner.logrotate
forgejo-runner.initd
forgejo-runner.confd
"
builddir="$srcdir/runner"
options="!check" # tests require running forgejo
build() {
go build \
-o forgejo-runner \
-ldflags "-X gitea.com/gitea/act_runner/internal/pkg/ver.version=$pkgver"
./forgejo-runner generate-config > config.example.yaml
}
check() {
go test ./...
}
package() {
install -Dm755 forgejo-runner -t "$pkgdir"/usr/bin/
install -Dm644 config.example.yaml -t "$pkgdir"/etc/forgejo-runner/
install -Dm755 "$srcdir"/forgejo-runner.initd "$pkgdir"/etc/init.d/forgejo-runner
install -Dm644 "$srcdir"/forgejo-runner.confd "$pkgdir"/etc/conf.d/forgejo-runner
install -Dm644 "$srcdir"/forgejo-runner.logrotate "$pkgdir"/etc/logrotate.d/forgejo-runner
}
sha512sums="
e78968a5f9b6e797fb759a5c8cbf46a5c2fef2083dabc88599c9017729faface963576c63a948b0add424cb267902e864fb1a1b619202660296976d93e670713 forgejo-runner-3.5.0.tar.gz
a3c7238b0c63053325d31e09277edd88690ef5260854517f82d9042d6173fb5d24ebfe36e1d7363673dd8801972638a6e69b6af8ad43debb6057515c73655236 forgejo-runner.logrotate
bb0c6fbe90109c77f9ef9cb0d35d20b8033be0e4b7a60839b596aa5528dfa24309ec894d8c04066bf8fb30143e63a5fd8cc6fc89aac364422b583e0f840e2da6 forgejo-runner.initd
e11eab27f88f1181112389befa7de3aa0bac7c26841861918707ede53335535425c805e6682e25704e9c8a6aecba3dc13e20900a99df1183762b012b62f26d5f forgejo-runner.confd
"

View file

@@ -1,17 +0,0 @@
# Configuration for /etc/init.d/forgejo-runner
# Path to the config file (--config).
#cfgfile="/etc/forgejo-runner/config.yaml"
# Path to the working directory (--working-directory).
#datadir="/var/lib/forgejo-runner"
# Path to the log file where stdout/stderr will be redirected.
# Leave empty/commented out to use syslog instead.
#output_log="/var/log/forgejo-runner.log"
# You may change this to root, e.g. to run jobs in LXC
#command_user="forgejo-runner"
# Comment out to run without process supervisor.
supervisor=supervise-daemon

View file

@@ -1,38 +0,0 @@
#!/sbin/openrc-run
description="Forgejo CI Runner"
name="Forgejo Runner"
: ${cfgfile:="/etc/forgejo-runner/config.yaml"}
: ${datadir:="/var/lib/forgejo-runner"}
: ${command_user:="forgejo-runner"}
command="/usr/bin/forgejo-runner"
command_args="daemon --config $cfgfile"
command_background="yes"
directory="$datadir"
pidfile="/run/$RC_SVCNAME.pid"
depend() {
need net
use dns logger
}
start_pre() {
checkpath -d -o "$command_user" /etc/forgejo-runner
checkpath -d -o "$command_user" "$datadir"
if ! [ -e "$cfgfile" ]; then
eerror "Config file $cfgfile doesn't exist."
eerror "You can generate it with: forgejo-runner generate-config,"
eerror "or use the auto-generated one in /etc/forgejo-runner/config.example.yaml"
return 1
fi
if [ "$error_log" ]; then
output_log="$error_log"
else
output_logger="logger -t '${RC_SVCNAME}' -p daemon.info"
error_logger="logger -t '${RC_SVCNAME}' -p daemon.error"
fi
}

View file

@@ -1,5 +0,0 @@
/var/log/forgejo-runner.log {
copytruncate
missingok
notifempty
}

View file

@@ -1,14 +0,0 @@
#!/bin/sh
addgroup -S forgejo-runner 2>/dev/null
adduser -S -D -H -h /var/lib/forgejo-runner -s /sbin/nologin -G forgejo-runner -g forgejo-runner forgejo-runner 2>/dev/null
cat >&2 <<EOF
* In order to setup the runner, create a config file
* in /etc/forgejo-runner/config.yaml (either from .example.yaml,
* or generating your own with 'forgejo-runner generate-config'),
* then register it with 'doas -u forgejo-runner forgejo-runner register'
* ran in the /var/lib/forgejo-runner directory.
EOF
exit 0

View file

@@ -1 +0,0 @@
forgejo-runner.pre-install

View file

@@ -1,7 +1,7 @@
# Contributor: Antoine Martin (ayakael) <dev@ayakael.net>
# Maintainer: Antoine Martin (ayakael) <dev@ayakael.net>
pkgname=authentik
pkgver=2024.8.2
pkgver=2025.2.4
pkgrel=0
pkgdesc="An open-source Identity Provider focused on flexibility and versatility"
url="https://github.com/goauthentik/authentik"
@@ -10,6 +10,9 @@ url="https://github.com/goauthentik/authentik"
# ppc64le: not supported by Rollup build
arch="aarch64 x86_64"
license="MIT"
# following depends aren't direct dependencies, but are needed:
# py3-asn1crypto, py3-cbor2, py3-email-validator, py3-websockets
# py3-openssl, py3-uvloop, py3-httptools
depends="
bash
libcap-setcap
@@ -17,150 +20,103 @@ depends="
postgresql
procps
pwgen
py3-aiohttp
py3-aiosignal
py3-amqp
py3-anyio
py3-asgiref
py3-asn1
py3-asn1crypto
py3-async-timeout
py3-attrs
py3-autobahn
py3-automat
py3-bcrypt
py3-billiard
py3-cachetools
py3-cbor2
py3-celery
py3-certifi
py3-cffi
py3-channels
py3-channels_redis
py3-charset-normalizer
py3-click
py3-click-didyoumean
py3-click-plugins
py3-click-repl
py3-codespell
py3-colorama
py3-constantly
py3-cparser
py3-cryptography
py3-dacite
py3-daphne
py3-dateutil
py3-deepmerge
py3-defusedxml
py3-deprecated
py3-dnspython
py3-docker-py
py3-django
py3-django-countries
py3-django-cte
py3-django-filter
py3-django-guardian
py3-django-model-utils
py3-django-otp
py3-django-prometheus
py3-django-pglock
py3-django-redis
py3-django-rest-framework~=3.14.0
py3-django-rest-framework~3.14.0
py3-django-rest-framework-guardian
py3-django-storages
py3-django-tenants
py3-docker-py
py3-dotenv
py3-dumb-init
py3-duo_client
py3-duo-client
py3-drf-orjson-renderer
py3-drf-spectacular
py3-email-validator
py3-fido2
py3-flower
py3-frozenlist
py3-geoip2
py3-google-auth
py3-geopy
py3-google-api-python-client
py3-gunicorn
py3-h11
py3-httptools
py3-humanize
py3-hyperlink
py3-idna
py3-incremental
py3-inflection
py3-jsonschema
py3-jsonpatch
py3-jwt
py3-kombu
py3-jwcrypto
py3-kadmin-rs
py3-kubernetes
py3-ldap3
py3-lxml
py3-maxminddb
py3-msgpack
py3-msgraph-sdk
py3-multidict
py3-oauthlib
py3-opencontainers
py3-openssl
py3-packaging
py3-paramiko
py3-parsing
py3-prometheus-client
py3-prompt_toolkit
py3-psycopg
py3-psycopg-c
py3-pydantic
py3-pydantic-scim
py3-pynacl
py3-pyrsistent
py3-pyrad
py3-python-jwt
py3-redis
py3-requests
py3-python-gssapi
py3-requests-oauthlib
py3-rsa
py3-scim2-filter-parser
py3-setproctitle
py3-sentry-sdk
py3-service_identity
py3-setuptools
py3-six
py3-sniffio
py3-sqlparse
py3-structlog
py3-swagger-spec-validator
py3-tornado
py3-twilio
py3-txaio
py3-tenant-schemas-celery
py3-typing-extensions
py3-tz
py3-ua-parser
py3-uritemplate
py3-unidecode
py3-urllib3-secure-extra
py3-uvloop
py3-vine
py3-watchdog
py3-watchfiles
py3-wcwidth
py3-webauthn
py3-websocket-client
py3-websockets
py3-wrapt
py3-wsproto
py3-xmlsec
py3-yaml
py3-yarl
py3-zope-interface
py3-zxcvbn
redis
valkey
uvicorn
"
makedepends="go npm"
# checkdepends scooped up by poetry due to number
checkdepends="poetry py3-coverage"
# tests disabled for now
options="!check"
makedepends="go npm py3-packaging"
checkdepends="
py3-pip
py3-coverage
py3-codespell
py3-colorama
py3-pytest
py3-pytest-django
py3-pytest-randomly
py3-pytest-timeout
py3-freezegun
py3-boto3
py3-requests-mock
py3-k5test
"
install="$pkgname.post-install $pkgname.post-upgrade $pkgname.pre-install"
source="
$pkgname-$pkgver.tar.gz::https://github.com/goauthentik/authentik/archive/refs/tags/version/$pkgver.tar.gz
@@ -174,7 +130,7 @@ source="
go-downgrade-1.22.patch
"
builddir="$srcdir/"authentik-version-$pkgver
subpackages="$pkgname-openrc $pkgname-doc"
subpackages="$pkgname-openrc $pkgname-doc $pkgname-pyc"
pkgusers="authentik"
pkggroups="authentik"
@@ -204,63 +160,137 @@ build() {
npm run build
}
# test failure neutralized due to:
# relation authentik_core_user_pb_groups_id_seq does not exist
check() {
msg "Setting up test environments"
export POSTGRES_DB=authentik
export POSTGRES_USER=authentik
export POSTGRES_PASSWORD="EK-5jnKfjrGRm<77"
export AUTHENTIK_POSTGRESQL__TEST__NAME=authentik
rm -Rf "$srcdir"/tmp
initdb -D "$srcdir"/tmp
postgres -D "$srcdir"/tmp --unix-socket-directories="$srcdir" > "$srcdir"/tmp/psql.log 2>&1 &
valkey-server > "$srcdir"/tmp/valkey.log 2>&1 &
trap "pkill valkey-server; pkill postgres" EXIT
sleep 5
psql -h "$srcdir" -d postgres -c "CREATE ROLE $POSTGRES_USER PASSWORD '$POSTGRES_PASSWORD' INHERIT LOGIN;"
psql -h "$srcdir" -d postgres -c "CREATE DATABASE $POSTGRES_DB OWNER $POSTGRES_USER ENCODING 'UTF-8';"
psql -h "$srcdir" -d postgres -c "CREATE DATABASE test_$POSTGRES_DB OWNER $POSTGRES_USER ENCODING 'UTF-8';"
# .github/actions/setup/action.yml: Generate config + csrf
python3 -c "
from authentik.lib.generators import generate_id
from yaml import safe_dump
with open(\"local.env.yml\", \"w\") as _config:
    safe_dump(
        {
            \"log_level\": \"debug\",
            \"secret_key\": generate_id(),
            \"csrf\": { \"trusted_origins\": ['https://*']},
        },
        _config,
        default_flow_style=False,
    )
"
python -m lifecycle.migrate
# no selenium package
pip install selenium drf_jsonschema_serializer pdoc --break-system-packages
msg "Starting tests"
make test || true
# TODO: Fix go-tests
# make go-test
pkill valkey-server
pkill postgres
}
package() {
msg "Packaging $pkgname"
mkdir -p "$pkgdir"/usr/share/webapps/authentik/web
mkdir -p "$pkgdir"/usr/share/webapps/authentik/website
mkdir -p "$pkgdir"/var/lib/authentik
mkdir -p "$pkgdir"/usr/share/doc
mkdir -p "$pkgdir"/usr/bin
cp -dr "$builddir"/authentik "$pkgdir"/usr/share/webapps/authentik
cp -dr "$builddir"/web/dist "$pkgdir"/usr/share/webapps/authentik/web/dist
cp -dr "$builddir"/web/authentik "$pkgdir"/usr/share/webapps/authentik/web/authentik
cp -dr "$builddir"/website/build "$pkgdir"/usr/share/doc/authentik
cp -dr "$builddir"/tests "$pkgdir"/usr/share/webapps/authentik/tests
cp -dr "$builddir"/lifecycle "$pkgdir"/usr/share/webapps/authentik/lifecycle
cp -dr "$builddir"/locale "$pkgdir"/usr/share/webapps/authentik/locale
cp -dr "$builddir"/blueprints "$pkgdir"/var/lib/authentik/blueprints
install -Dm755 "$builddir"/manage.py "$pkgdir"/usr/share/webapps/authentik/manage.py
install -Dm755 "$builddir"/server "$pkgdir"/usr/share/webapps/authentik/server
ln -s "/etc/authentik/config.yml" "$pkgdir"/usr/share/webapps/authentik/local.env.yml
local prefix="/usr/share/webapps"
local destdir="$pkgdir"$prefix/authentik
install -Dm755 "$builddir"/proxy "$pkgdir"/usr/bin/authentik-proxy
install -Dm755 "$builddir"/ldap "$pkgdir"/usr/bin/authentik-ldap
install -Dm755 "$builddir"/radius "$pkgdir"/usr/bin/authentik-radius
# authentik install
install -d -m755 \
"$destdir" \
"$destdir"/web \
"$pkgdir"/usr/bin \
"$pkgdir"/usr/share/doc \
"$pkgdir"/var/lib/authentik
install -Dm755 "$srcdir"/$pkgname.openrc \
"$pkgdir"/etc/init.d/$pkgname
install -Dm755 "$srcdir"/$pkgname-worker.openrc \
"$pkgdir"/etc/init.d/$pkgname-worker
install -Dm755 "$srcdir"/$pkgname-ldap.openrc \
"$pkgdir"/etc/init.d/$pkgname-ldap
install -Dm640 "$srcdir"/$pkgname-ldap.conf \
"$pkgdir"/etc/conf.d/$pkgname-ldap
cp -rl authentik lifecycle locale tests \
"$destdir"/
cp -rl blueprints \
"$pkgdir"/var/lib/authentik/
cp -rl web/dist web/authentik \
"$destdir"/web/
install -m755 -t "$destdir" \
"$builddir"/server \
"$builddir"/ldap \
"$builddir"/radius \
"$builddir"/proxy \
"$builddir"/manage.py
cp -rl website/build/ "$pkgdir"/usr/share/doc/authentik/
# symbolic bin links to usr/bin
for i in server proxy ldap radius; do
ln -s $prefix/authentik/$i "$pkgdir"/usr/bin/authentik-$i
done
# openrc install
for i in $pkgname $pkgname-worker $pkgname-ldap; do
install -Dm755 "$srcdir"/$i.openrc "$pkgdir"/etc/init.d/$i
done
# config file setup
install -Dm640 "$builddir"/authentik/lib/default.yml \
"$pkgdir"/etc/authentik/config.yml
ln -s "/etc/authentik/config.yml" "$pkgdir"/usr/share/webapps/authentik/local.env.yml
chown root:www-data "$pkgdir"/etc/authentik/config.yml
mv "$pkgdir"/usr/share/webapps/authentik/web/dist/custom.css "$pkgdir"/etc/authentik/custom.css
ln -s "/etc/authentik/custom.css" "$pkgdir"/usr/share/webapps/authentik/web/dist/custom.css
chown root:www-data "$pkgdir"/etc/authentik/custom.css
sed -i 's|cert_discovery_dir.*|cert_discovery_dir: /var/lib/authentik/certs|' "$pkgdir"/etc/authentik/config.yml
sed -i 's|blueprints_dir.*|blueprints_dir: /var/lib/authentik/blueprints|' "$pkgdir"/etc/authentik/config.yml
sed -i 's|template_dir.*|template_dir: /var/lib/authentik/templates|' "$pkgdir"/etc/authentik/config.yml
printf "\ncsrf:\n trusted_origins: ['auth.example.com']" >> "$pkgdir"/etc/authentik/config.yml
printf "\nsecret_key: '@@SECRET_KEY@@'" >> "$pkgdir"/etc/authentik/config.yml
# custom css location change
mv "$pkgdir"/usr/share/webapps/authentik/web/dist/custom.css "$pkgdir"/etc/authentik/custom.css
ln -s "/etc/authentik/custom.css" "$pkgdir"/usr/share/webapps/authentik/web/dist/custom.css
chown root:www-data "$pkgdir"/etc/authentik/custom.css
# Install wrapper script to /usr/bin.
install -m755 -D "$srcdir"/authentik-manage.sh "$pkgdir"/usr/bin/authentik-manage
}
pyc() {
default_pyc
cd "$pkgdir"
# shellcheck disable=SC3003
local IFS=$'\n'
# shellcheck disable=SC2046
amove $(find usr/share/webapps/authentik -type d -name __pycache__)
}
sha512sums="
02e54183fa35e7a06780f68239db7b3b5e2ccd3c6e1fcaf97690d9b596077c7a5345dbb5b005f39ff67a0dae83bd9b71d1c6d18ba8fae9cc7174d5d856360bff authentik-2024.8.2.tar.gz
75928b3ab9ae126f3cbe88ff1256de8adba7add099b0d93615abb8c91a2b7f275e83664a232e8c5393c5031bd9757af2f20fdb9d0153dacdf9a482b6b4bb8b00 authentik-2025.2.4.tar.gz
4defb4fe3a4230f4aa517fbecd5e5b8bcef2a64e1b40615660ae9eec33597310a09df5e126f4d39ce7764bd1716c0a7040637699135c103cbc1879593c6c06f1 authentik.openrc
6cb03b9b69df39bb4539fe05c966536314d766b2e9307a92d87070ba5f5b7e7ab70f1b5ee1ab3c0c50c23454f9c5a4caec29e63fdf411bbb7a124ad687569b89 authentik-worker.openrc
351e6920d987861f8bf0d7ab2f942db716a8dbdad1f690ac662a6ef29ac0fd46cf817cf557de08f1c024703503d36bc8b46f0d9eb1ecaeb399dce4c3bb527d17 authentik-ldap.openrc
89ee5f0ffdade1c153f3a56ff75b25a7104aa81d8c7a97802a8f4b0eab34850cee39f874dabe0f3c6da3f71d6a0f938f5e8904169e8cdd34d407c8984adee6b0 authentik-ldap.conf
f1a3cb215b6210fa7d857a452a9f2bc4dc0520e49b9fa7027547cff093d740a7e2548f1bf1f8831f7d5ccb80c8e523ee0c8bafcc4dc42d2788725f2137d21bee authentik-manage.sh
3e47db684a3f353dcecdb7bab8836b9d5198766735d77f676a51d952141a0cf9903fcb92e6306c48d2522d7a1f3028b37247fdc1dc74d4d6e043da7eb4f36d49 fix-ak-bash.patch
3d38076606d18a438a2d76cdd2067774d5471bb832e641050630726b4d7bd8b8c2218d25d7e987a1fb46ee6a4a81d13e899145f015b3c94204cece039c7fb182 fix-ak-bash.patch
5c60e54b6a7829d611af66f5cb8184a002b5ae927efbd024c054a7c176fcb9efcfbe5685279ffcf0390b0f0abb3bb03e02782c6867c2b38d1ad2d508aae83fa0 root-settings-csrf_trusted_origins.patch
badff70b19aad79cf16046bd46cb62db25c2a8b85b2673ce7c44c42eb60d42f6fcb1b9a7a7236c00f24803b25d3c66a4d64423f7ce14a59763b8415db292a5b9 go-downgrade-1.22.patch
"

View file

@@ -1,10 +1,10 @@
diff --git a/lifecycle/ak.orig b/lifecycle/ak
index 615bfe9..1646274 100755
index 44dc480..49a0cef 100755
--- a/lifecycle/ak.orig
+++ b/lifecycle/ak
@@ -1,4 +1,4 @@
-#!/usr/bin/env -S bash -e
-#!/usr/bin/env -S bash
+#!/usr/bin/env bash
set -e -o pipefail
MODE_FILE="${TMPDIR}/authentik-mode"
function log {

View file

@@ -0,0 +1,49 @@
# Contributor: Antoine Martin (ayakael) <dev@ayakael.net>
# Maintainer: Antoine Martin (ayakael) <dev@ayakael.net>
pkgname=codeberg-pages-server
pkgver=6.2.1
pkgrel=1
pkgdesc="The Codeberg Pages Server with custom domain support, per-repo pages using the pages branch, caching and more."
url="https://codeberg.org/Codeberg/pages-server"
arch="all"
license="EUPL-1.2"
depends="libcap-setcap nginx"
makedepends="go just"
install="$pkgname.post-install"
# tests disabled for now
options="!check"
source="
$pkgname-$pkgver.tar.gz::https://codeberg.org/Codeberg/pages-server/archive/v$pkgver.tar.gz
codeberg-pages-server.openrc
downgrade-go.patch
"
builddir="$srcdir/"pages-server
subpackages="$pkgname-openrc"
pkgusers="git"
pkggroups="www-data"
export GOPATH=$srcdir/go
export GOCACHE=$srcdir/go-build
export GOTMPDIR=$srcdir
build() {
just build
}
package() {
msg "Packaging $pkgname"
install -Dm755 "$builddir"/build/codeberg-pages-server \
"$pkgdir"/usr/bin/codeberg-pages-server
install -Dm755 "$srcdir"/$pkgname.openrc \
"$pkgdir"/etc/init.d/$pkgname
install -Dm600 "$builddir"/example_config.toml \
"$pkgdir"/etc/codeberg-pages-server/pages.conf
}
sha512sums="
87992a244a580ef109fa891fd4e4ab5bf8320076f396c63e23b83e2c49e3c34fed2d6562283fc57dd89ebc13596dd7b8cbdfa7202eee43cbbd86b6a7f3b52c26 codeberg-pages-server-6.2.1.tar.gz
4808057de5d539fd9ad3db67b650d45ed60c53e07eff840115af09729ac198791b465b61da547eac1dffd0633e5855c348aa7663d6f6cb5984f7fc999be08589 codeberg-pages-server.openrc
1f02e3e9a6f0aab9b516fa7ffaaeb92da3ab839fbcf07f672398063d784c8c0ca373edc0f9a26132d40a60345c4894a5f757c13bf7500f5753f5ffcdf10c52db downgrade-go.patch
"

View file

@@ -0,0 +1,23 @@
#!/sbin/openrc-run
: ${config:=/etc/codeberg-pages-server/pages.conf}
name="$RC_SVCNAME"
cfgfile="/etc/conf.d/$RC_SVCNAME.conf"
pidfile="/run/$RC_SVCNAME.pid"
working_directory="/var/lib/codeberg-pages-server"
command="/usr/bin/codeberg-pages-server"
command_args="--config-file $config"
command_user="nginx"
command_group="nginx"
start_stop_daemon_args=""
command_background="yes"
output_log="/var/log/codeberg-pages-server/$RC_SVCNAME.log"
error_log="/var/log/codeberg-pages-server/$RC_SVCNAME.err"
start_pre() {
checkpath --directory --owner $command_user:$command_group --mode 0775 \
/var/log/codeberg-pages-server \
/var/lib/codeberg-pages-server
cd "$working_directory"
}

View file

@@ -0,0 +1,10 @@
#!/bin/sh
set -eu
setcap 'cap_net_bind_service=+ep' /usr/bin/codeberg-pages-server
cat >&2 <<-EOF
*
* 1. Adjust settings in /etc/codeberg-pages-server/pages.conf
*
EOF

View file

@@ -0,0 +1,12 @@
diff --git a/go.mod.orig b/go.mod
index bff6b77..2b9f2e4 100644
--- a/go.mod.orig
+++ b/go.mod
@@ -1,6 +1,6 @@
module codeberg.org/codeberg/pages
-go 1.24.0
+go 1.23.6
require (
code.gitea.io/sdk/gitea v0.20.0

View file

@@ -4,14 +4,14 @@
# Contributor: Patrycja Rosa <alpine@ptrcnull.me>
# Maintainer: Antoine Martin (ayakael) <dev@ayakael.net>
pkgname=forgejo-aneksajo
pkgver=8.0.2
_gittag=v$pkgver-git-annex0
pkgver=10.0.3_git0
_gittag=v${pkgver/_git/-git-annex}
pkgrel=0
pkgdesc="Self-hosted Git service written in Go with git-annex support"
url="https://forgejo.org"
# riscv64: builds fail https://codeberg.org/forgejo/forgejo/issues/3025
arch="all !riscv64"
license="MIT"
license="GPL-3.0-or-later"
depends="git git-lfs gnupg"
makedepends="go nodejs npm"
checkdepends="bash openssh openssh-keygen sqlite tzdata"
@@ -55,7 +55,7 @@ build() {
# XXX: LARGEFILE64
export CGO_CFLAGS="$CFLAGS -O2 -D_LARGEFILE64_SOURCE"
export TAGS="bindata sqlite sqlite_unlock_notify"
export GITEA_VERSION="$pkgver"
export GITEA_VERSION="${pkgver/_git/-git-annex}"
export EXTRA_GOFLAGS="$GOFLAGS"
export CGO_LDFLAGS="$LDFLAGS"
unset LDFLAGS
@@ -106,7 +106,7 @@ package() {
}
sha512sums="
25d6353aca66e292b4ff51c7d10900f142b62357578fd81069caabc7361cee2de761bc03ef08d809fbbbec5cb65e370a79a1cd91153b73611fa7d9966631383c forgejo-aneksajo-v8.0.2-git-annex0.tar.gz
eb93a9f6c8f204de5c813f58727015f53f9feaab546589e016c60743131559f04fc1518f487b6d2a0e7fa8fab6d4a67cd0cd9713a7ccd9dec767a8c1ddebe129 forgejo-aneksajo.initd
e32c919228df167374e8f3099e2e59bfab610aac6c87465318efe1cac446d014535e270f57b0bf8b2a7eb3843c5dcb189eac4dad2e230b57acd9096ead647eca forgejo-aneksajo-v10.0.3-git-annex0.tar.gz
497d8575f2eb5ac43baf82452e76007ef85e22cca2cc769f1cf55ffd03d7ce4d50ac4dc2b013e23086b7a5577fc6de5a4c7e5ec7c287f0e3528e908aaa2982aa forgejo-aneksajo.initd
b537b41b6b3a945274a6028800f39787b48c318425a37cf5d40ace0d1b305444fd07f17b4acafcd31a629bedd7d008b0bb3e30f82ffeb3d7e7e947bdbe0ff4f3 forgejo-aneksajo.ini
"

View file

@@ -1,15 +1,24 @@
#!/sbin/openrc-run
supervisor=supervise-daemon
: ${command_user:="forgejo:www-data"}
: ${cfgfile:="/etc/forgejo/app.ini"}
: ${directory:="/var/lib/forgejo"}
: ${output_log="/var/log/forgejo/http.log"}
: ${error_log="/var/log/forgejo/http.log"}
: ${supervisor="supervise-daemon"}
name=forgejo
command="/usr/bin/forgejo"
command_user="${FORGEJO_USER:-forgejo}:www-data"
command_args="web --config '${FORGEJO_CONF:-/etc/forgejo/app.ini}'"
supervise_daemon_args="--env FORGEJO_WORK_DIR='${FORGEJO_WORK_DIR:-/var/lib/forgejo}' --chdir '${FORGEJO_WORK_DIR:-/var/lib/forgejo}' --stdout '${FORGEJO_LOG_FILE:-/var/log/forgejo/http.log}' --stderr '${FORGEJO_LOG_FILE:-/var/log/forgejo/http.log}'"
pidfile="/run/forgejo.pid"
command_args="web --config '$cfgfile' $command_args"
command_background="yes"
pidfile="/run/$RC_SVCNAME.pid"
required_files="$cfgfile"
export FORGEJO_WORK_DIR="$directory"
depend() {
use logger dns
need net
after firewall mysql postgresql
use logger dns
need net
after firewall mysql postgresql
}

View file

@@ -1,7 +1,7 @@
# Maintainer: Antoine Martin (ayakael) <dev@ayakael.net>
# Contributor: Antoine Martin (ayakael) <dev@ayakael.net>
pkgname=freescout
pkgver=1.8.152
pkgver=1.8.174
pkgrel=0
pkgdesc="Free self-hosted help desk & shared mailbox"
arch="noarch"
@@ -9,7 +9,7 @@ url="freescout.net"
license="AGPL-3.0"
_php=php83
_php_mods="-fpm -mbstring -xml -imap -zip -gd -curl -intl -tokenizer -pdo_pgsql -openssl -session -iconv -fileinfo -dom -pcntl"
depends="$_php ${_php_mods//-/$_php-} nginx postgresql pwgen"
depends="$_php ${_php_mods//-/$_php-} nginx postgresql pwgen bash"
makedepends="composer pcre"
install="$pkgname.post-install $pkgname.post-upgrade $pkgname.pre-install"
source="
@@ -17,6 +17,7 @@ source="
freescout.nginx
freescout-manage.sh
rename-client-to-membre-fr-en.patch
fix-laravel-log-viewer.patch
"
pkgusers="freescout"
pkggroups="freescout"
@@ -75,8 +76,9 @@ package() {
install -m755 -D "$srcdir"/freescout-manage.sh "$pkgdir"/usr/bin/freescout-manage
}
sha512sums="
0e4d6d4a1aaeba2d39db8678e436f3c46c1a1fd79ea2b37c9ac95cbb319306b818991981987f6ac7dc8100a084d4189fa12f7639b24e2744705fa409ac349864 freescout-1.8.152.tar.gz
c5ec40b3dd7f6f593a950d96632e69d8e0a43e17f566f3d83b52aa44e2aac8ef98c536e9408faa834051d7fb3f07e003642f5e6e2a25a69ea51cf7b96290fb1d freescout-1.8.174.tar.gz
e4af6c85dc12f694bef2a02e4664e31ed50b2c109914d7ffad5001c2bbd764ef25b17ecaa59ff55ef41bccf17169bf910d1a08888364bdedd0ecc54d310e661f freescout.nginx
7ce9b3ee3a979db44f5e6d7daa69431e04a5281f364ae7be23e5a0a0547f96abc858d2a8010346be2fb99bd2355fb529e7030ed20d54f310249e61ed5db4d0ba freescout-manage.sh
3416da98d71aea5a7093913ea34e783e21ff05dca90bdc5ff3d00c548db5889f6d0ec98441cd65ab9f590be5cd59fdd0d7f1c98b5deef7bb3adbc8db435ec9bf rename-client-to-membre-fr-en.patch
0cba00b7d945ce84f72a2812d40028a073a5278856f610e46dbfe0ac78deff6bf5eba7643635fa4bc64d070c4d49eb47d24ea0a05ba1e6ea76690bfd77906366 rename-client-to-membre-fr-en.patch
2c651db6adac6d53597ba36965d0c65e005293f9b030e6be167853e4089384920524737aa947c5066877ee8caefb46741ccba797f653e7c2678556063540d261 fix-laravel-log-viewer.patch
"

View file

@@ -0,0 +1,13 @@
diff --git a/vendor/composer/installed.json.orig b/vendor/composer/installed.json
index 0b826f5..9d14ec8 100644
--- a/vendor/composer/installed.json.orig
+++ b/vendor/composer/installed.json
@@ -4494,7 +4494,7 @@
"installation-source": "dist",
"autoload": {
"classmap": [
- "src/controllers"
+ "src/"
],
"psr-0": {
"Rap2hpoutre\\LaravelLogViewer\\": "src/"

View file

@@ -38,7 +38,7 @@ index 00000000..82d26052
+}
\ No newline at end of file
diff --git a/resources/lang/fr.json.orig b/resources/lang/fr.json
index ff8d9d4..98d158f 100644
index 6264973..8a7037e 100644
--- a/resources/lang/fr.json.orig
+++ b/resources/lang/fr.json
@@ -26,8 +26,8 @@
@ -201,8 +201,8 @@ index ff8d9d4..98d158f 100644
- "This number is not visible to customers. It is only used to track conversations within :app_name": "Ce numéro n'est pas visible pour les clients. Il est uniquement utilisé pour suivre les conversations dans :app_name",
+ "This number is not visible to customers. It is only used to track conversations within :app_name": "Ce numéro n'est pas visible pour les membres. Il est uniquement utilisé pour suivre les conversations dans :app_name",
"This password is incorrect.": "Ce mot de passe est incorrect.",
- "This reply will go to the customer. :%switch_start%Switch to a note:switch_end if you are replying to :user_name.": "Cette réponse ira au client. :%switch_start%Passez à une note:switch_end si vous répondez à :user_name.",
+ "This reply will go to the customer. :%switch_start%Switch to a note:switch_end if you are replying to :user_name.": "Cette réponse ira au membre. :%switch_start%Passez à une note:switch_end si vous répondez à :user_name.",
- "This reply will go to the customer. :%switch_start%Switch to a note:%switch_end% if you are replying to :user_name.": "Cette réponse ira au client. :%switch_start%Passez à une note:%switch_end% si vous répondez à :user_name.",
+ "This reply will go to the customer. :%switch_start%Switch to a note:%switch_end% if you are replying to :user_name.": "Cette réponse ira au membre. :%switch_start%Passez à une note:%switch_end% si vous répondez à :user_name.",
"This setting gives you control over what page loads after you perform an action (send a reply, add a note, change conversation status or assignee).": "Ce paramètre vous permet de contrôler la page qui se charge après avoir effectué une action (envoyer une réponse, ajouter une note, etc.).",
- "This text will be added to the beginning of each email reply sent to a customer.": "Ce texte sera ajouté au début de chaque réponse par e-mail envoyée à un client.",
+ "This text will be added to the beginning of each email reply sent to a customer.": "Ce texte sera ajouté au début de chaque réponse par e-mail envoyée à un membre.",

View file

@@ -1,8 +1,8 @@
# Contributor: Antoine Martin (ayakael) <dev@ayakael.net>
# Maintainer: Antoine Martin (ayakael) <dev@ayakael.net>
pkgname=listmonk
pkgver=3.0.0
pkgrel=2
pkgver=4.1.0
pkgrel=0
pkgdesc='Self-hosted newsletter and mailing list manager with a modern dashboard'
arch="all"
url=https://listmonk.app
@@ -10,6 +10,7 @@ license="AGPL3"
depends="
libcap-setcap
postgresql
postgresql-contrib
procps
"
makedepends="go npm nodejs yarn"
@@ -52,6 +53,7 @@ package() {
install -Dm644 -t "$pkgdir"/usr/share/webapps/listmonk/ \
schema.sql \
queries.sql \
permissions.json \
config.toml.sample
install -Dm755 listmonk "$pkgdir"/usr/share/webapps/listmonk/
install -Dm644 -t "$pkgdir"/usr/share/webapps/listmonk/frontend/dist/ \
@@ -65,7 +67,7 @@ package() {
ln -s /etc/listmonk/config.toml "$pkgdir"/usr/share/webapps/listmonk/config.toml
}
sha512sums="
afd0ea1d4d2b2753c3043526590cf09c45a541a2d818f5d1581644ffd10818326fd553a3b04bca59494860a7bb6e96364b08afd33d337a9fc5c71bedd1a5ee6c listmonk-3.0.0.tar.gz
936b33d6de1d69ee4e7f768810116ac997c516754aace0371089bc8106bebee944197864afc11b7bc5725afa9a4f195d6629957bfcdd37c847e3780aa34558ec listmonk-4.1.0.tar.gz
939450af4b23708e3d23a5a88fad4c24b957090bdd21351a6dd520959e52e45e5fcac117a3eafa280d9506616dae39ad3943589571f008cac5abe1ffd8062424 listmonk.sh
8e9c0b1f335c295fb741418246eb17c7566e5e4200a284c6483433e8ddbf5250aa692435211cf062ad1dfcdce3fae9148def28f03f2492d33fe5e66cbeebd4bd listmonk.openrc
"

View file

@@ -10,6 +10,12 @@ if [ "${0##*.}" = 'post-upgrade' ]; then
*
* listmonk --upgrade
*
* If upgrading from v3.0.0, please first set the following env variables:
*
* export LISTMONK_ADMIN_USER=your-admin-user
* export LISTMONK_ADMIN_PASSWORD=your-admin-password
* listmonk --upgrade
*
EOF
else
cat >&2 <<-EOF

View file

@@ -2,9 +2,9 @@
# Maintainer: Antoine Martin (ayakael) <dev@ayakael.net>
pkgname=mastodon
_pkgname=$pkgname
pkgver=4.2.10
pkgver=4.2.20
_gittag=v$pkgver
pkgrel=1
pkgrel=0
pkgdesc="Self-hosted social media and network server based on ActivityPub and OStatus"
arch="x86_64"
url="https://github.com/mastodon/mastodon"
@@ -192,7 +192,7 @@ assets() {
}
sha512sums="
1fe5417136bc020a83b83eaccef7f1f46c13fc8318681f12ba556b1b6b03e25ef7b6335c28f4e6722101e97b63020cbd0d3fbacdaf9b3b5a4b73c3cf3e230813 mastodon-v4.2.10.tar.gz
132df11b54bf0f900e2ee6e149ddb730706a67fc6130ead63b327028fa590944f21a19bcba07d859885717208b6abc005d0aee7675fd8e0fb09ad8d6f8f631b7 mastodon-v4.2.20.tar.gz
d49fea9451c97ccefe5e35b68e4274aeb427f9d1e910b89c1f6c810489c3bec1ccff72952fdaef95abf944b8aff0da84a52347540d36ff1fba5ccc19e1d935c6 mastodon.initd
eefe12a31268245f802222c0001dac884e03adb0d301e53a1512a3cd204836ca03ad083908cd14d146cf0dce99e3a4366570efd0e40a9a490ccd381d4c63c32f mastodon.web.initd
8fc9249c01693bb02b8d1a6177288d5d3549addde8c03eb35cc7a32dde669171872ebc2b5deb8019dc7a12970098f1af707171fa41129be31b04e1dc1651a777 mastodon.sidekiq.initd

ilot/nextcloud30/APKBUILD Normal file

@@ -0,0 +1,325 @@
# Contributor: Jakub Jirutka <jakub@jirutka.cz>
# Contributor: jahway603 <jahway603@protonmail.com>
# Maintainer: Leonardo Arena <rnalrd@alpinelinux.org>
_pkgname=nextcloud
pkgver=30.0.8
pkgrel=0
is_latest=true
_pkgvermaj=${pkgver%%.*}
pkgname=nextcloud$_pkgvermaj
_replaced_ver=$(( _pkgvermaj - 1 ))
pkgdesc="A safe home for all your data"
url="https://nextcloud.com/"
arch="noarch"
license="AGPL-3.0-only"
_php=php83
_php_mods="-bcmath -ctype -curl -dom -gd -fileinfo -gmp -iconv -intl
-mbstring -opcache -openssl -pcntl -posix -session
-simplexml -xml -xmlreader -xmlwriter -zip"
depends="ca-certificates $_php ${_php_mods//-/$_php-}"
makedepends="xmlstarlet"
$is_latest && provides="$_pkgname=$pkgver-r$pkgrel
$_pkgname-accessibility=$pkgver-r$pkgrel
$pkgname-accessibility=$pkgver-r$pkgrel
$_pkgname-bruteforcesettings=$pkgver-r$pkgrel
$pkgname-bruteforcesettings=$pkgver-r$pkgrel
$_pkgname-contactsinteraction=$pkgver-r$pkgrel
$pkgname-contactsinteraction=$pkgver-r$pkgrel
$_pkgname-cloud_federation_api=$pkgver-r$pkgrel
$pkgname-cloud_federation_api=$pkgver-r$pkgrel
$_pkgname-dav=$pkgver-r$pkgrel
$pkgname-dav=$pkgver-r$pkgrel
$_pkgname-files=$pkgver-r$pkgrel
$pkgname-files=$pkgver-r$pkgrel
$_pkgname-files_videoplayer=$pkgver-r$pkgrel
$pkgname-files_videoplayer=$pkgver-r$pkgrel
$_pkgname-federatedfilesharing=$pkgver-r$pkgrel
$pkgname-federatedfilesharing=$pkgver-r$pkgrel
$_pkgname-lookup_server_connector=$pkgver-r$pkgrel
$pkgname-lookup_server_connector=$pkgver-r$pkgrel
$_pkgname-oauth2=$pkgver-r$pkgrel
$pkgname-oauth2=$pkgver-r$pkgrel
$_pkgname-provisioning_api=$pkgver-r$pkgrel
$pkgname-provisioning_api=$pkgver-r$pkgrel
$_pkgname-related_resources=$pkgver-r$pkgrel
$pkgname-related_resources=$pkgver-r$pkgrel
$_pkgname-settings=$pkgver-r$pkgrel
$pkgname-settings=$pkgver-r$pkgrel
$_pkgname-theming=$pkgver-r$pkgrel
$pkgname-theming=$pkgver-r$pkgrel
$_pkgname-twofactor_backupcodes=$pkgver-r$pkgrel
$pkgname-twofactor_backupcodes=$pkgver-r$pkgrel
$_pkgname-twofactor_nextcloud_notification=$pkgver-r$pkgrel
$pkgname-twofactor_nextcloud_notification=$pkgver-r$pkgrel
$_pkgname-twofactor_totp=$pkgver-r$pkgrel
$pkgname-twofactor_totp=$pkgver-r$pkgrel
$_pkgname-viewer=$pkgver-r$pkgrel
$pkgname-viewer=$pkgver-r$pkgrel
$_pkgname-workflowengine=$pkgver-r$pkgrel
$pkgname-workflowengine=$pkgver-r$pkgrel
" || provides="$pkgname-accessibility=$pkgver-r$pkgrel
$pkgname-bruteforcesettings=$pkgver-r$pkgrel
$pkgname-contactsinteraction=$pkgver-r$pkgrel
$pkgname-cloud_federation_api=$pkgver-r$pkgrel
$pkgname-dav=$pkgver-r$pkgrel
$pkgname-files=$pkgver-r$pkgrel
$pkgname-files_videoplayer=$pkgver-r$pkgrel
$pkgname-federatedfilesharing=$pkgver-r$pkgrel
$pkgname-lookup_server_connector=$pkgver-r$pkgrel
$pkgname-oauth2=$pkgver-r$pkgrel
$pkgname-provisioning_api=$pkgver-r$pkgrel
$pkgname-related_resources=$pkgver-r$pkgrel
$pkgname-settings=$pkgver-r$pkgrel
$pkgname-theming=$pkgver-r$pkgrel
$pkgname-twofactor_backupcodes=$pkgver-r$pkgrel
$pkgname-twofactor_nextcloud_notification=$pkgver-r$pkgrel
$pkgname-twofactor_totp=$pkgver-r$pkgrel
$pkgname-viewer=$pkgver-r$pkgrel
$pkgname-workflowengine=$pkgver-r$pkgrel
"
replaces="nextcloud$_replaced_ver"
install="$pkgname.pre-install $pkgname.pre-upgrade $pkgname.post-upgrade $pkgname.post-install
$pkgname-initscript.post-install"
subpackages="$pkgname-doc $pkgname-initscript $pkgname-mysql $pkgname-pgsql $pkgname-sqlite
$pkgname-default-apps:_default_apps $pkgname-occ"
source="https://download.nextcloud.com/server/releases/nextcloud-$pkgver.tar.bz2
nextcloud-dont-chmod.patch
dont-update-htaccess.patch
disable-integrity-check-as-default.patch
use-external-docs-if-local-not-avail.patch
$_pkgname-config.php
$_pkgname.logrotate
$_pkgname.confd
$_pkgname.cron
$_pkgname-mysql.cnf
fpm-pool.conf
occ
"
options="!check"
pkgusers="nextcloud"
pkggroups="www-data"
builddir="$srcdir"/$_pkgname
# List of bundled apps to separate into subpackages. Keep it in sync!
# Note: Don't add "bruteforcesettings", "contactsinteraction",
# "cloud_federation_api", "dav", "files",
# "federatedfilesharing", "lookup_server_connector", "provisioning_api",
# "oauth2", "settings", "twofactor_backupcodes", "twofactor_totp",
# "twofactor_nextcloud_notification", "theming", "viewer",
# "workflowengine", "related_resources"
# here, these should be always installed.
_apps="activity
admin_audit
circles
comments
dashboard
encryption
federation
files_downloadlimit
files_external
files_pdfviewer
files_reminders
files_sharing
files_trashbin
files_versions
firstrunwizard
logreader
nextcloud_announcements
notifications
password_policy
photos
privacy
recommendations
serverinfo
support
sharebymail
survey_client
suspicious_login
systemtags
text
user_ldap
user_status
weather_status
webhook_listeners
"
for _i in $_apps; do
subpackages="$subpackages $pkgname-$_i:_package_app"
done
# Directory for apps shipped with Nextcloud.
_appsdir="usr/share/webapps/$_pkgname/apps"
package() {
local basedir="var/lib/$_pkgname"
local datadir="$basedir/data"
local wwwdir="usr/share/webapps/$_pkgname"
local confdir="etc/$_pkgname"
mkdir -p "$pkgdir"
cd "$pkgdir"
mkdir -p ./${wwwdir%/*}
cp -a "$builddir" ./$wwwdir
chmod +x ./$wwwdir/occ
chmod 664 ./$wwwdir/.htaccess \
./$wwwdir/.user.ini
# Let's not ship upstream's 'updatenotification' app and updater, which
# has zero chance of working and a big chance of blowing things up.
rm -r ./$wwwdir/apps/updatenotification \
./$wwwdir/lib/private/Updater/VersionCheck.php
# Replace bundled CA bundle with ours.
ln -sf /etc/ssl/certs/ca-certificates.crt ./$wwwdir/resources/config/ca-bundle.crt
install -d -m 770 -o nextcloud -g www-data \
./$confdir ./$datadir ./$basedir/apps
install -d -m 775 -o nextcloud -g www-data \
./var/log/$_pkgname
# Create symlink from web root to site-apps, so web server can find
# assets w/o explicit configuration for this layout.
ln -s /$basedir/apps ./$wwwdir/apps-appstore
mv ./$wwwdir/config/* ./$confdir/
rm -r ./$wwwdir/config
ln -s /$confdir ./$wwwdir/config
mkdir -p ./usr/share/doc/$pkgname
mv ./$wwwdir/core/doc ./usr/share/doc/$pkgname/core
install -m 660 -o nextcloud -g www-data \
"$srcdir"/$_pkgname-config.php ./$confdir/config.php
install -m 644 -D "$srcdir"/$_pkgname.logrotate ./etc/logrotate.d/$_pkgname
install -m 755 -D "$srcdir"/occ ./usr/bin/occ
# Clean some unnecessary files.
find . -name .gitignore -delete \
-o -name .bower.json -delete \
-o -name 'README*' -delete \
-o -name 'CHANGELOG*' -delete \
-o -name 'CONTRIBUTING*' -delete
find . -name .github -type d -prune -exec rm -r {} \;
}
doc() {
replaces="nextcloud$_replaced_ver-doc"
$is_latest && provides="$_pkgname-doc=$pkgver-r$pkgrel"
default_doc
local target="$subpkgdir"/usr/share/webapps/$_pkgname/core/doc
mkdir -p "${target%/*}"
ln -s ../../../doc/$pkgname/core "$target"
install -m644 README.alpine "$subpkgdir"/usr/share/webapps/$_pkgname/README.alpine
}
initscript() {
pkgdesc="Init script that runs Nextcloud with php-fpm"
depends="$pkgname $_php-fpm"
replaces="nextcloud$_replaced_ver-initscript"
$is_latest && provides="$_pkgname-initscript=$pkgver-r$pkgrel"
local confdir="$subpkgdir/etc/$_php/php-fpm.d"
local fpm_name="php-fpm${_php#php}"
install -m 644 -D "$srcdir"/fpm-pool.conf "$confdir"/$_pkgname.conf
install -m 644 -D "$srcdir"/$_pkgname.confd "$subpkgdir"/etc/conf.d/$_pkgname
install -m 755 -D "$srcdir"/$_pkgname.cron "$subpkgdir"/etc/periodic/15min/$_pkgname
mkdir -p "$subpkgdir"/etc/init.d
ln -s $fpm_name "$subpkgdir"/etc/init.d/$_pkgname
}
pgsql() {
pkgdesc="Nextcloud PostgreSQL support"
depends="$pkgname $_php-pgsql $_php-pdo_pgsql"
replaces="nextcloud$_replaced_ver-pgsql"
$is_latest && provides="$_pkgname-pgsql=$pkgver-r$pkgrel"
mkdir -p "$subpkgdir"
}
sqlite() {
pkgdesc="Nextcloud SQLite support"
depends="$pkgname $_php-sqlite3 $_php-pdo_sqlite"
replaces="nextcloud$_replaced_ver-sqlite"
$is_latest && provides="$_pkgname-sqlite=$pkgver-r$pkgrel"
mkdir -p "$subpkgdir"
}
mysql() {
pkgdesc="Nextcloud MySQL support"
depends="$pkgname $_php-pdo_mysql"
replaces="nextcloud$_replaced_ver-mysql"
$is_latest && provides="$_pkgname-mysql=$pkgver-r$pkgrel"
mkdir -p "$subpkgdir"
install -m 644 -D "$srcdir"/$_pkgname-mysql.cnf "$subpkgdir"/etc/my.cnf.d/$_pkgname.cnf
}
occ() {
pkgdesc="Nextcloud OCC cmd"
replaces="nextcloud$_replaced_ver-occ"
$is_latest && provides="$_pkgname-occ=$pkgver-r$pkgrel"
mkdir -p "$subpkgdir/usr/share/webapps/$_pkgname"
amove "usr/share/webapps/$_pkgname/occ"
amove "usr/bin/occ"
}
_default_apps() {
pkgdesc="Nextcloud default apps"
depends="$pkgname"
replaces="nextcloud$_replaced_ver-default-apps"
$is_latest && provides="$_pkgname-default-apps=$pkgver-r$pkgrel"
local path; for path in "$pkgdir"/"$_appsdir"/*; do
if grep -q '<default_enable\s*/>' "$path"/appinfo/info.xml; then
depends="$depends $pkgname-${path##*/}"
fi
done
mkdir -p "$subpkgdir"
}
_package_app() {
local appname="${subpkgname#"$pkgname"-}"
local appinfo="$pkgdir/$_appsdir/$appname/appinfo/info.xml"
local name=$(xmlstarlet sel -t -v 'info/name/text()' "$appinfo")
pkgdesc="Nextcloud ${name:-$appname} app"
replaces="nextcloud$_replaced_ver-$appname"
$is_latest && provides="$_pkgname-$appname=$pkgver-r$pkgrel"
local php_deps=$(xmlstarlet sel -t -v 'info/dependencies/lib/text()' "$appinfo" \
| xargs -r -n1 printf "$_php-%s\n")
local app_deps=""
case "$appname" in
files_sharing) app_deps="-federatedfilesharing"
;;
serverinfo) app_deps="-files_sharing"
esac
depends="$pkgname $php_deps ${app_deps//-/$pkgname-}"
mkdir -p "$subpkgdir"/$_appsdir
mv "$pkgdir"/$_appsdir/$appname "$subpkgdir"/$_appsdir/
}
sha512sums="
0bca2f42ccfb7db4befdd2aeeb1df72d2f9acad88907706f8524ced55bd0213b30b687a5e4c623615e59f22246562e195fd74bbb409c4f60b713482e1237d755 nextcloud-30.0.8.tar.bz2
daeabeaa315bb908cc1e49612cce4b2debd71d17acb84b5d14e15fe124c907884b72d54e9aa669ec209eee1b1934d0bc242d72a28d8db7339cfb08383f66fd5c nextcloud-dont-chmod.patch
12f4a39aef0f81a0115c81bf2b345cc194537a7e8300748b800b0e35bc07928091296074b23c2019c17aced69854a11d1ed7225f67eefd27cf00c3969a75c5b0 dont-update-htaccess.patch
cb04252d01407c7030e87dd54616c621ea0f85ef0212674b1161288182538cae0fb31c67e7cc07c66f9607075774c64e386009cc66365b1f1b155f6ad4f83ac0 disable-integrity-check-as-default.patch
c0a9b7c31c8beaca711f8e97d98441007b3dca7fb3d316d2eacd28a73b5233def6f846c02d98202f75efb9cb248b8787a80e20b07c32d1c7534a0e54bb20feab use-external-docs-if-local-not-avail.patch
5f73cd9399fa484ef15bd47e803c93381deffbc7699eceadbb5c27e43b20156806d74e5021a64d28f0165ef87b519e962780651711a37bceb9f0b04455dfdce1 nextcloud-config.php
7388458a9e8b7afd3d3269718306410ffa59c3c23da4bef367a4d7f6d2570136fae9dd421b19c1441e7ffb15a5405e18bb5da67b1a15f9f45e8b98d3fda532ba nextcloud.logrotate
dcc57735d7d4af4a7ebbdd1186d301e51d2ae4675022aea6bf1111222dfa188a3a490ebd6e7c8a7ac30046cb7d93f81cec72a51acbc60d0c10b7fb64630c637a nextcloud.confd
06a62deae219d09df7acbf38ccb1dcac691dd882459ef70243b5583d7ed21d1ea81dbf0751b4e7199c0de9878755a3882e139d9ccc280bf2e90cbe33fb565487 nextcloud.cron
b9ad5434c384c96608f00d65c45f782e279c6756da8fb706f62ecaf7b7aa420077cb6989da5e85becc47418884ec0672e7db874174454ca679fdca84a50f537f nextcloud-mysql.cnf
78ef204ee7c12b228c0b7b04333514e561c1c8e19153f5507224fa4fdd112aaaa6331747014f3b72181298f52ecd4223bcff4bd963b49b49153265254b07e79b fpm-pool.conf
be54ad9308c8250ff3aef3514b10b228487fc2fbdefa1d28dbbb18a4770f7d9fda90e80c722de8e3c25ce752d124ff79314f16f783b1e5ad67df4e1fe6e880f9 occ
"


@ -0,0 +1,5 @@
## nextcloud-serverinfo package
If you are using the provided Nextcloud php-fpm configuration, the
nextcloud-serverinfo package requires the 'shell_exec' function to be
enabled in the PHP configuration file 'nextcloud.conf'.
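
As an illustration only (this note is not part of the packaged README), one way
to do that is to drop shell_exec from the disable_functions list in the
nextcloud php-fpm pool file (path assumed from the initscript subpackage,
/etc/php83/php-fpm.d/nextcloud.conf), e.g.:

; shell_exec removed from the shipped default disable_functions list
php_admin_value[disable_functions] = exec,passthru,system,proc_open,curl_multi_exec,show_source

Then restart the pool, e.g. 'rc-service nextcloud restart'.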


@ -0,0 +1,23 @@
We patch some files and Nextcloud's integrity check doesn't like it.
APK already verifies the integrity of all installed files, so Nextcloud's own
integrity check doesn't add any value here.
---
lib/private/IntegrityCheck/Checker.php | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/lib/private/IntegrityCheck/Checker.php b/lib/private/IntegrityCheck/Checker.php
index e8fd087e..cfbaeb7d 100644
--- a/lib/private/IntegrityCheck/Checker.php
+++ b/lib/private/IntegrityCheck/Checker.php
@@ -91,7 +91,7 @@ class Checker {
* applicable for very specific scenarios and we should not advertise it
* too prominent. So please do not add it to config.sample.php.
*/
- return !($this->config?->getSystemValueBool('integrity.check.disabled', false) ?? false);
+ return !($this->config?->getSystemValueBool('integrity.check.disabled', true) ?? true);
}
/**
--
2.44.0


@ -0,0 +1,42 @@
Don't mess with .htaccess files.
Patch ported from https://src.fedoraproject.org/cgit/rpms/nextcloud.git/tree/nextcloud-9.1.0-dont_update_htacess.patch
---
core/register_command.php | 1 -
lib/private/Updater.php | 8 --------
2 files changed, 9 deletions(-)
diff --git a/core/register_command.php b/core/register_command.php
index 4a84e551..a5158dc4 100644
--- a/core/register_command.php
+++ b/core/register_command.php
@@ -136,7 +136,6 @@ if ($config->getSystemValueBool('installed', false)) {
$application->add(Server::get(Command\Maintenance\Mimetype\UpdateDB::class));
$application->add(Server::get(Command\Maintenance\Mimetype\UpdateJS::class));
$application->add(Server::get(Command\Maintenance\Mode::class));
- $application->add(Server::get(Command\Maintenance\UpdateHtaccess::class));
$application->add(Server::get(Command\Maintenance\UpdateTheme::class));
$application->add(Server::get(Command\Upgrade::class));
diff --git a/lib/private/Updater.php b/lib/private/Updater.php
index 09866273..59144308 100644
--- a/lib/private/Updater.php
+++ b/lib/private/Updater.php
@@ -230,14 +230,6 @@ class Updater extends BasicEmitter {
throw new \Exception('Updates between multiple major versions and downgrades are unsupported.');
}
- // Update .htaccess files
- try {
- Setup::updateHtaccess();
- Setup::protectDataDirectory();
- } catch (\Exception $e) {
- throw new \Exception($e->getMessage());
- }
-
// create empty file in data dir, so we can later find
// out that this is indeed an ownCloud data directory
// (in case it didn't exist before)
--
2.44.0


@ -0,0 +1,200 @@
[global]
; Error log file
; Default Value: log/php-fpm.log
error_log = /var/log/nextcloud/php-fpm.log
; Log level
; Possible Values: alert, error, warning, notice, debug
; Default Value: notice
log_level = warning
; If this number of child processes exit with SIGSEGV or SIGBUS within the time
; interval set by emergency_restart_interval then FPM will restart. A value
; of '0' means 'Off'.
; Default Value: 0
emergency_restart_threshold = 10
; Interval of time used by emergency_restart_interval to determine when
; a graceful restart will be initiated. This can be useful to work around
; accidental corruptions in an accelerator's shared memory.
; Available Units: s(econds), m(inutes), h(ours), or d(ays)
; Default Unit: seconds
; Default Value: 0
emergency_restart_interval = 1m
; Time limit for child processes to wait for a reaction on signals from master.
; Available units: s(econds), m(inutes), h(ours), or d(ays)
; Default Unit: seconds
; Default Value: 0
process_control_timeout = 10s
[nextcloud]
user = nextcloud
group = www-data
; The address on which to accept FastCGI requests.
; Valid syntaxes are:
; 'ip.add.re.ss:port' - to listen on a TCP socket to a specific address on
; a specific port;
; 'port' - to listen on a TCP socket to all addresses on a
; specific port;
; '/path/to/unix/socket' - to listen on a unix socket (the path is *not*
; relative to chroot!)
; Note: This value is mandatory.
listen = /run/nextcloud/fastcgi.sock
; Set permissions for unix socket, if one is used. In Linux, read/write
; permissions must be set in order to allow connections from a web server. Many
; BSD-derived systems allow connections regardless of permissions.
; Default Values: user and group are set as the running user
; mode is set to 0666
listen.mode = 0660
; Choose how the process manager will control the number of child processes.
; Possible Values:
; static ... a fixed number of child processes.
; dynamic ... the number of child processes are set dynamically.
; ondemand ... no children are created at startup; children will be forked
; when new requests will connect.
; Note: This value is mandatory.
pm = ondemand
; The number of child processes to be created when pm is set to 'static' and the
; maximum number of child processes when pm is set to 'dynamic' or 'ondemand'.
; This value sets the limit on the number of simultaneous requests that will be
; served.
; Note: Used when pm is set to 'static', 'dynamic' or 'ondemand'
; Note: This value is mandatory.
pm.max_children = 10
; The number of seconds after which an idle process will be killed.
; Note: Used only when pm is set to 'ondemand'
; Default Value: 10s
pm.process_idle_timeout = 120s
; The number of requests each child process should execute before respawning.
; This can be useful to work around memory leaks in 3rd party libraries. For
; endless request processing specify '0'. Equivalent to PHP_FCGI_MAX_REQUESTS.
; Default Value: 0
pm.max_requests = 500
; The URI to view the FPM status page. If this value is not set, no URI will be
; recognized as a status page.
; Note: The value must start with a leading slash (/). The value can be
; anything, but it may not be a good idea to use the .php extension or it
; may conflict with a real PHP file.
; Default Value: not set
pm.status_path =
; The ping URI to call the monitoring page of FPM. If this value is not set, no
; URI will be recognized as a ping page. This could be used to test from outside
; that FPM is alive and responding, or to
; - create a graph of FPM availability (rrd or such);
; - remove a server from a group if it is not responding (load balancing);
; - trigger alerts for the operating team (24/7).
; Note: The value must start with a leading slash (/). The value can be
; anything, but it may not be a good idea to use the .php extension or it
; may conflict with a real PHP file.
; Default Value: not set
ping.path = /ping
; The timeout for serving a single request after which the worker process will
; be killed. This option should be used when the 'max_execution_time' ini option
; does not stop script execution for some reason. A value of '0' means 'off'.
; Available units: s(econds)(default), m(inutes), h(ours), or d(ays)
; Default Value: 0
;request_terminate_timeout = 0
; The timeout for serving a single request after which a PHP backtrace will be
; dumped to the 'slowlog' file. A value of '0s' means 'off'.
; Available units: s(econds)(default), m(inutes), h(ours), or d(ays)
; Default Value: 0
;request_slowlog_timeout = 0
; The log file for slow requests
; Default Value: not set
; Note: slowlog is mandatory if request_slowlog_timeout is set
; Note: the path is *not* relative to chroot.
;slowlog = /var/log/nextcloud/php-fpm.slow.log
; Redirect worker stdout and stderr into main error log. If not set, stdout and
; stderr will be redirected to /dev/null according to FastCGI specs.
; Note: on a highly loaded environment, this can cause some delay in the page
; processing time (several ms).
; Default Value: no
;catch_workers_output = yes
; Pass environment variables like LD_LIBRARY_PATH. All $VARIABLEs are taken from
; the current environment.
; Default Value: clean env
env[PATH] = /usr/local/bin:/usr/bin:/bin
env[TMP] = /tmp
env[TMPDIR] = /tmp
env[TEMP] = /tmp
; Additional php.ini defines, specific to this pool of workers. These settings
; overwrite the values previously defined in the php.ini. The directives are the
; same as the PHP SAPI:
; php_value/php_flag - you can set classic ini defines which can
; be overwritten from PHP call 'ini_set'.
; php_admin_value/php_admin_flag - these directives won't be overwritten by
; PHP call 'ini_set'
; For php_*flag, valid values are on, off, 1, 0, true, false, yes or no.
;
; Defining 'extension' will load the corresponding shared extension from
; extension_dir. Defining 'disable_functions' or 'disable_classes' will not
; overwrite previously defined php.ini values, but will append the new value
; instead.
;
; Note: path INI options can be relative and will be expanded with the prefix
; (pool, global or /usr/lib/php7.x)
; Allow HTTP file uploads.
php_admin_flag[file_uploads] = true
; Maximum size of a file that can be uploaded via the web interface.
php_admin_value[memory_limit] = 512M
php_admin_value[post_max_size] = 513M
php_admin_value[upload_max_filesize] = 513M
; Where to store temporary files.
php_admin_value[session.save_path] = /var/tmp/nextcloud
php_admin_value[sys_temp_dir] = /var/tmp/nextcloud
php_admin_value[upload_tmp_dir] = /var/tmp/nextcloud
; Log errors to specified file.
php_admin_flag[log_errors] = on
php_admin_value[error_log] = /var/log/nextcloud/php.error.log
; OPcache error_log file name. Empty string assumes "stderr"
php_admin_value[opcache.error_log] = /var/log/nextcloud/php.error.log
; Output buffering is a mechanism for controlling how much output data
; (excluding headers and cookies) PHP should keep internally before pushing that
; data to the client. If your application's output exceeds this setting, PHP
; will send that data in chunks of roughly the size you specify.
; This must be disabled for ownCloud.
php_admin_flag[output_buffering] = false
; Overload (replace) single-byte functions with mbstring functions.
; This must be disabled for ownCloud.
php_admin_flag[mbstring.func_overload] = false
; Never populate the $HTTP_RAW_POST_DATA variable.
; http://php.net/always-populate-raw-post-data
php_admin_value[always_populate_raw_post_data] = -1
; Disable certain functions for security reasons.
; http://php.net/disable-functions
php_admin_value[disable_functions] = exec,passthru,shell_exec,system,proc_open,curl_multi_exec,show_source
; Set recommended settings for OpCache.
; https://docs.nextcloud.com/server/13/admin_manual/configuration_server/server_tuning.html#enable-php-opcache
php_admin_flag[opcache.enable] = true
php_admin_flag[opcache.enable_cli] = true
php_admin_flag[opcache.save_comments] = true
php_admin_value[opcache.interned_strings_buffer] = 8
php_admin_value[opcache.max_accelerated_files] = 10000
php_admin_value[opcache.memory_consumption] = 128
php_admin_value[opcache.revalidate_freq] = 1


@ -0,0 +1,37 @@
<?php
$CONFIG = array (
'datadirectory' => '/var/lib/nextcloud/data',
'logfile' => '/var/log/nextcloud/nextcloud.log',
'apps_paths' => array (
// Read-only location for apps shipped with Nextcloud and installed by apk.
0 => array (
'path' => '/usr/share/webapps/nextcloud/apps',
'url' => '/apps',
'writable' => false,
),
// Writable location for apps installed from AppStore.
1 => array (
'path' => '/var/lib/nextcloud/apps',
'url' => '/apps-appstore',
'writable' => true,
),
),
'updatechecker' => false,
'check_for_working_htaccess' => false,
// Uncomment to enable the APCu local memory cache.
//'memcache.local' => '\OC\Memcache\APCu',
// Uncomment this and add user nextcloud to the redis group to enable Redis
// cache for file locking. This is highly recommended, see
// https://github.com/nextcloud/server/issues/9305.
//'memcache.locking' => '\OC\Memcache\Redis',
//'redis' => array(
// 'host' => '/run/redis/redis.sock',
// 'port' => 0,
// 'dbindex' => 0,
// 'timeout' => 1.5,
//),
'installed' => false,
);


@ -0,0 +1,46 @@
commit d8f09abd65e5fd620b8b0d720daee293c355660c
Author: Leonardo Arena <rnalrd@alpinelinux.org>
Date: Mon Aug 31 06:59:15 2020 +0000
Don't chmod. The package takes care of setting the right permissions for directories and files
diff --git a/lib/private/Config.php b/lib/private/Config.php
index cbdbc5b2..1118981b 100644
--- a/lib/private/Config.php
+++ b/lib/private/Config.php
@@ -242,9 +242,6 @@ class Config {
touch($this->configFilePath);
$filePointer = fopen($this->configFilePath, 'r+');
- // Prevent others not to read the config
- chmod($this->configFilePath, 0640);
-
// File does not exist, this can happen when doing a fresh install
if (!is_resource($filePointer)) {
throw new HintException(
diff --git a/lib/private/Log/File.php b/lib/private/Log/File.php
index 9e9abb11..7db25286 100644
--- a/lib/private/Log/File.php
+++ b/lib/private/Log/File.php
@@ -82,9 +82,6 @@ class File extends LogDetails implements IWriter, IFileBased {
public function write(string $app, $message, int $level) {
$entry = $this->logDetailsAsJSON($app, $message, $level);
$handle = @fopen($this->logFile, 'a');
- if ($this->logFileMode > 0 && is_file($this->logFile) && (fileperms($this->logFile) & 0777) != $this->logFileMode) {
- @chmod($this->logFile, $this->logFileMode);
- }
if ($handle) {
fwrite($handle, $entry."\n");
fclose($handle);
diff --git a/lib/private/legacy/OC_Util.php b/lib/private/legacy/OC_Util.php
index 71f6edba..216abdf8 100644
--- a/lib/private/legacy/OC_Util.php
+++ b/lib/private/legacy/OC_Util.php
@@ -1004,7 +1004,6 @@ class OC_Util {
. ' cannot be listed by other users.');
$perms = substr(decoct(@fileperms($dataDirectory)), -3);
if (substr($perms, -1) !== '0') {
- chmod($dataDirectory, 0770);
clearstatcache();
$perms = substr(decoct(@fileperms($dataDirectory)), -3);
if ($perms[2] !== '0') {


@ -0,0 +1,3 @@
[server]
# See https://github.com/nextcloud/server/issues/25436
innodb_read_only_compressed=off


@ -0,0 +1,8 @@
# Config file for /etc/init.d/nextcloud
name="Nextcloud"
user="nextcloud"
group="www-data"
# Uncomment if you use Nextcloud with Redis for caching.
#rc_need="redis"


@ -0,0 +1,6 @@
#!/bin/sh
# Run only when the nextcloud service is running.
if rc-service nextcloud -q status >/dev/null 2>&1; then
su nextcloud -s /bin/sh -c 'php83 -f /usr/share/webapps/nextcloud/cron.php'
fi


@ -0,0 +1,6 @@
/var/log/nextcloud/*.log {
daily
compress
copytruncate
su nextcloud www-data
}


@ -0,0 +1,28 @@
#!/bin/sh
# These don't need to be writable by the www-data group when running with php-fpm.
for dir in /etc/nextcloud \
/etc/nextcloud/config.php \
/var/lib/nextcloud/data \
/var/lib/nextcloud/apps
do
chmod g-w $dir
done
chgrp root /etc/nextcloud/config.php
# This must be writable (only) by nextcloud user.
chmod 750 /var/log/nextcloud
mkdir /var/tmp/nextcloud # If /var/tmp doesn't exist there's a big problem
chown nextcloud /var/tmp/nextcloud
chmod 700 /var/tmp/nextcloud
cat <<EOF
*
* Point your web server to /run/nextcloud/fastcgi.sock and start Nextcloud with
* /etc/init.d/nextcloud start. You can modify php-fpm settings in
* /etc/php83/fpm.d/nextcloud.conf.
*
EOF
exit 0


@ -0,0 +1,11 @@
#!/bin/sh
# This package could be installed as a consequence of an upgrade.
echo
echo ' * If you are upgrading to a versioned package (starts at v29.0.4-r1),' >&2
echo ' please do the following:' >&2
echo
echo ' * Run "apk upgrade -a" a second time to complete the upgrade of all' >&2
echo ' nextcloud packages' >&2
echo ' * Run "occ upgrade" to finish upgrading your Nextcloud instance' >&2
echo ' * NOTE: since v29.0.4-r1 "occ" command is now in package "nextcloudNN-occ"' >&2
echo


@ -0,0 +1,47 @@
#!/bin/sh
ver_new="$1"
ver_old="$2"
if [ $(apk version -t "$ver_old" '12.0.0-r2') = '<' ]; then
cat >&2 <<-EOF
*
* All Nextcloud's bundled apps (except "files" and "dav") have been moved to
* separate subpackages (e.g. nextcloud-activity). If you want to install
* all apps that are enabled by default at once, run:
*
* apk add nextcloud-default-apps
*
EOF
if [ "$(ls -A /var/lib/nextcloud/apps)" ]; then
cat >&2 <<-EOF
*
* Nextcloud's bundled apps have been moved from /var/lib/nextcloud/apps
* to /usr/share/webapps/nextcloud/apps. Only apps installed from App Store
* should be stored in /var/lib/nextcloud/apps.
*
* It seems that you have installed some apps from App Store, so you have to
* add /var/lib/nextcloud/apps to your apps_paths. Copy "apps_paths" key
* from /etc/nextcloud/config.php.apk-new to your config.php.
*
EOF
fi
fi
if [ $(apk version -t "$ver_old" '15.0.2-r0') = '<' ]; then
cat >&2 <<-EOF
*
* App "user_external" is no longer available via release channel.
* You need to uninstall the package and install it via appstore:
*
* apk del nextcloud-user_external
*
EOF
fi
if [ "${ver_new%-r*}" != "${ver_old%-r*}" ]; then
echo ' * Run "occ upgrade" to finish upgrading your NextCloud instance!' >&2
echo ' * NOTE: since v29.0.4-r1 "occ" command is now in package "nextcloudNN-occ"' >&2
fi


@ -0,0 +1,6 @@
#!/bin/sh
addgroup -S -g 82 www-data 2>/dev/null
adduser -S -D -H -h /var/lib/nextcloud -s /sbin/nologin -G www-data -g Nextcloud nextcloud 2>/dev/null
exit 0


@ -0,0 +1,10 @@
#!/bin/sh
ver_old="$2"
apps_link='/usr/share/webapps/nextcloud/apps'
# Remove apps symlink before replacing files to avoid losing installed apps.
# This is a workaround for some issue in apk.
if [ $(apk version -t "$ver_old" '12.0.0-r2') = '<' ] && [ -L "$apps_link" ]; then
rm "$apps_link"
fi

ilot/nextcloud30/occ

@ -0,0 +1,10 @@
#!/bin/sh
NEXTCLOUD_DIR='/usr/share/webapps/nextcloud'
: ${NEXTCLOUD_USER:="nextcloud"}
if [ "$(id -un)" != "$NEXTCLOUD_USER" ]; then
exec su -s /bin/sh "$NEXTCLOUD_USER" -c '$0 "$@"' -- php83 $NEXTCLOUD_DIR/occ "$@"
else
exec php83 $NEXTCLOUD_DIR/occ "$@"
fi
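
For illustration only (not part of the packaged file): with this wrapper on the
PATH, the administrative steps mentioned in the install messages can be run
directly as root and are re-executed as the nextcloud user, e.g.:

# hypothetical session; 'occ upgrade' is the step the post-upgrade notes ask for
occ maintenance:mode --on
occ upgrade
occ maintenance:mode --off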


@ -0,0 +1,36 @@
From f17c14956c51206ad82acc5d9b66fd752f0e3c03 Mon Sep 17 00:00:00 2001
From: Jakub Jirutka <jakub@jirutka.cz>
Date: Tue, 19 Dec 2023 07:53:40 +0000
Subject: [PATCH] use external docs if local not available
---
apps/settings/templates/help.php | 11 +++++++++++
1 file changed, 11 insertions(+)
diff --git a/apps/settings/templates/help.php b/apps/settings/templates/help.php
index 649178c1..29b5ac4c 100644
--- a/apps/settings/templates/help.php
+++ b/apps/settings/templates/help.php
@@ -48,8 +48,19 @@
</div>
<div id="app-content" class="help-includes">
+ <?php if ($_['localDocs']) { ?>
<iframe src="<?php print_unescaped($_['url']); ?>" class="help-iframe" tabindex="0">
</iframe>
+ <?php } else { ?>
+ <div class="section">
+ <h2>Local documentation is not installed</h2>
+ <p>Please use
+ <a href="<?php print_unescaped($_['url']); ?>" target="_blank" rel="noreferrer">
+ <?php p($l->t('online documentation')); ?> ↗
+ </a>
+ </p>
+ </div>
+ <?php } ?>
</div>
<?php else: ?>
<div id="app-content">
--
2.42.0


@ -3,7 +3,7 @@
pkgname=py3-azure-core
#_pkgreal is used by apkbuild-pypi to find modules at PyPI
_pkgreal=azure-core
pkgver=1.31.0
pkgver=1.32.0
pkgrel=0
pkgdesc="Microsoft Azure Core Library for Python"
url="https://pypi.python.org/project/microsoft-kiota-authentication-azure"
@ -35,5 +35,5 @@ package() {
}
sha512sums="
be2fc27610034ee5c345ed11f59233ec81d8ad628c4b732531a24e0d54720b81f22d855e5a4d9214c6a8234e479da059b37a40c7ad15e738e2dd46fb4755dad6 py3-azure-core-1.31.0.tar.gz
d258a2ca3bc2c9514dec91bf2dbb19c0ee4c0c0bec73a4301b47fb43be768be836f32621b70a8cdb0e39f1491a522191a82a00f318ee7c901e8861a62439e934 py3-azure-core-1.32.0.tar.gz
"


@ -3,8 +3,7 @@
pkgname=py3-azure-identity
#_pkgreal is used by apkbuild-pypi to find modules at PyPI
_pkgreal=azure-identity
pkgver=1.18.0
_pkgver=${pkgver}b2
pkgver=1.19.0
pkgrel=0
pkgdesc="Microsoft Azure Identity Library for Python"
url="https://pypi.org/project/azure-identity/"
@ -19,8 +18,8 @@ depends="
checkdepends="py3-pytest"
makedepends="py3-setuptools py3-gpep517 py3-wheel py3-flit"
options="!check" #todo
source="$pkgname-$pkgver.tar.gz::https://github.com/Azure/azure-sdk-for-python/archive/refs/tags/azure-identity_$_pkgver.tar.gz"
builddir="$srcdir"/azure-sdk-for-python-azure-identity_$_pkgver/sdk/identity/azure-identity
source="$pkgname-$pkgver.tar.gz::https://github.com/Azure/azure-sdk-for-python/archive/refs/tags/azure-identity_$pkgver.tar.gz"
builddir="$srcdir"/azure-sdk-for-python-azure-identity_$pkgver/sdk/identity/azure-identity
subpackages="$pkgname-pyc"
build() {
@ -41,5 +40,5 @@ package() {
}
sha512sums="
84defc19db3aea614b13dbf2d24ee3ea13c210a05460a4ae2968b01d34f136c81eb9d77b7ce1f0c4590e6f36af0b6fe114787fc7897ffa0f2d8093a9bcb48bf4 py3-azure-identity-1.18.0.tar.gz
090aed812a7a72c649ded2574dc0a05dd7d9db41675e3d86921ab0555f8af7c83999cb879a2f2e0984880874b3b6dfead6b8de0563d8a99d81775715640a9e01 py3-azure-identity-1.19.0.tar.gz
"


@ -1,39 +0,0 @@
# Contributor: Antoine Martin (ayakael) <dev@ayakael.net>
# Maintainer: Antoine Martin (ayakael) <dev@ayakael.net>
pkgname=py3-django-cte
#_pkgreal is used by apkbuild-pypi to find modules at PyPI
_pkgreal=django-cte
pkgver=1.3.3
pkgrel=0
pkgdesc="Common Table Expressions (CTE) for Django"
url="https://pypi.python.org/project/django-cte"
arch="noarch"
license="BSD-Clause-3"
depends="py3-django"
checkdepends="py3-pytest"
makedepends="py3-setuptools py3-gpep517 py3-wheel"
source="$pkgname-$pkgver.tar.gz::https://github.com/dimagi/django-cte/archive/refs/tags/v$pkgver.tar.gz"
options="!check" # Requires active psql
builddir="$srcdir/$_pkgreal-$pkgver"
subpackages="$pkgname-pyc"
build() {
gpep517 build-wheel \
--wheel-dir .dist \
--output-fd 3 3>&1 >&2
}
check() {
python3 -m venv --clear --without-pip --system-site-packages .testenv
.testenv/bin/python3 -m installer .dist/*.whl
DJANGO_SETTINGS_MODULE=tests.settings .testenv/bin/python3 -m pytest -v
}
package() {
python3 -m installer -d "$pkgdir" \
.dist/*.whl
}
sha512sums="
543ca5d7ae546d00facadaccb23498534f9016ef7f1ce68cf0ca7640e8d0d4bbf0e1354e2d1b02d6787cc51c58098deda5783100ff9b8974fd9abe37ec662402 py3-django-cte-1.3.3.tar.gz
"


@ -1,38 +0,0 @@
# Contributor: Antoine Martin (ayakael) <dev@ayakael.net>
# Maintainer: Antoine Martin (ayakael) <dev@ayakael.net>
pkgname=py3-django-dynamic-fixture
#_pkgreal is used by apkbuild-pypi to find modules at PyPI
_pkgreal=django-dynamic-fixture
pkgver=4.0.1
pkgrel=0
pkgdesc="A complete library to create dynamic model instances for testing purposes."
url="https://pypi.python.org/project/django-dynamic-fixture"
arch="noarch"
license="BDS-3-Clause"
depends="py3-django"
checkdepends="py3-pytest py3-pytest-django py3-pytest-cov py3-psycopg2"
makedepends="py3-setuptools py3-gpep517 py3-wheel poetry"
source="$pkgname-$pkgver.tar.gz::https://github.com/paulocheque/django-dynamic-fixture/archive/refs/tags/$pkgver.tar.gz"
builddir="$srcdir/$_pkgreal-$pkgver"
subpackages="$pkgname-pyc"
build() {
gpep517 build-wheel \
--wheel-dir .dist \
--output-fd 3 3>&1 >&2
}
check() {
python3 -m venv --clear --without-pip --system-site-packages .testenv
.testenv/bin/python3 -m installer .dist/*.whl
.testenv/bin/python3 -m pytest -v
}
package() {
python3 -m installer -d "$pkgdir" \
.dist/*.whl
}
sha512sums="
e2a1a67459cf35f092b8f3e34a6162ef3334062f2f779df593f1f7e738371e28996a7e96d6a08397385a7b29d59be143284cf4a921cb2da972ee238d1ff6660d py3-django-dynamic-fixture-4.0.1.tar.gz
"


@ -1,39 +0,0 @@
# Contributor: Antoine Martin (ayakael) <dev@ayakael.net>
# Maintainer: Antoine Martin (ayakael) <dev@ayakael.net>
pkgname=py3-django-pgactivity
#_pkgreal is used by apkbuild-pypi to find modules at PyPI
_pkgreal=django-pgactivity
pkgver=1.5.0
pkgrel=0
pkgdesc="View, filter, and kill Postgres queries."
url="https://pypi.python.org/project/django-pgactivity"
arch="noarch"
license="BDS-3-Clause"
depends="py3-django"
checkdepends="py3-pytest py3-pytest-cov py3-pytest-django py3-pytest-mock py3-django-extensions py3-dj-database-url py3-psycopg2 py3-typing-extensions py3-django-dynamic-fixture"
makedepends="py3-setuptools py3-gpep517 py3-wheel poetry"
options="!check" # requires database setup
source="$pkgname-$pkgver.tar.gz::https://github.com/Opus10/django-pgactivity/archive/refs/tags/$pkgver.tar.gz"
builddir="$srcdir/$_pkgreal-$pkgver"
subpackages="$pkgname-pyc"
build() {
gpep517 build-wheel \
--wheel-dir .dist \
--output-fd 3 3>&1 >&2
}
check() {
python3 -m venv --clear --without-pip --system-site-packages .testenv
.testenv/bin/python3 -m installer .dist/*.whl
.testenv/bin/python3 -m pytest -v
}
package() {
python3 -m installer -d "$pkgdir" \
.dist/*.whl
}
sha512sums="
e052965e05878f75e790c30c4faeafec1900080f9d6119e20331c4418c3e9b6f2331b17b97902c41b538e88d6182466513674c33b146e2cbf566c56cbed20a81 py3-django-pgactivity-1.5.0.tar.gz
"


@ -1,40 +0,0 @@
# Contributor: Antoine Martin (ayakael) <dev@ayakael.net>
# Maintainer: Antoine Martin (ayakael) <dev@ayakael.net>
pkgname=py3-django-pglock
#_pkgreal is used by apkbuild-pypi to find modules at PyPI
_pkgreal=django-pglock
pkgver=1.6.2
pkgrel=0
pkgdesc="Postgres advisory locks, table locks, and blocking lock management."
url="https://pypi.python.org/project/django-pglock"
arch="noarch"
license="BDS-3-Clause"
depends="py3-django py3-django-pgactivity"
# missing py3-pytest-reraise py3-pytest-dotenv
checkdepends="py3-pytest py3-pytest-cov py3-pytest-django py3-pytest-mock py3-django-extensions py3-django-dynamic-fixture"
makedepends="py3-setuptools py3-gpep517 py3-wheel poetry"
source="$pkgname-$pkgver.tar.gz::https://github.com/Opus10/django-pglock/archive/refs/tags/$pkgver.tar.gz"
options="!check" # requires test database
builddir="$srcdir/$_pkgreal-$pkgver"
subpackages="$pkgname-pyc"
build() {
gpep517 build-wheel \
--wheel-dir .dist \
--output-fd 3 3>&1 >&2
}
check() {
python3 -m venv --clear --without-pip --system-site-packages .testenv
.testenv/bin/python3 -m installer .dist/*.whl
.testenv/bin/python3 -m pytest -v
}
package() {
python3 -m installer -d "$pkgdir" \
.dist/*.whl
}
sha512sums="
f87914934c72686d38c175d9a617b96ded332d3ad2caa5e1d5f1dc9cd0e3444a50bf29d66b47162221fd9cd12a27056c8d3cad3154f4a09f69b3fd92becb896f py3-django-pglock-1.6.2.tar.gz
"


@ -0,0 +1,48 @@
# Contributor: Antoine Martin (ayakael) <dev@ayakael.net>
# Maintainer: Antoine Martin (ayakael) <dev@ayakael.net>
pkgname=py3-django-tenant-schemas
#_pkgreal is used by apkbuild-pypi to find modules at PyPI
_pkgreal=django-tenant-schemas
pkgver=1.12.0
pkgrel=0
pkgdesc="Tenant support for Django using PostgreSQL schemas."
url="https://pypi.python.org/project/django-tenant-schemas"
arch="noarch"
license="MIT"
depends="
py3-django
py3-ordered-set
py3-six
py3-psycopg2
"
checkdepends="py3-pytest"
makedepends="py3-setuptools py3-setuptools_scm py3-gpep517 py3-wheel"
source="
$pkgname-$pkgver.tar.gz::https://github.com/bernardopires/django-tenant-schemas/archive/refs/tags/v$pkgver.tar.gz
"
options="!check" # requires pg
builddir="$srcdir/$_pkgreal-$pkgver"
subpackages="$pkgname-pyc"
build() {
export SETUPTOOLS_SCM_PRETEND_VERSION=$pkgver
gpep517 build-wheel \
--wheel-dir .dist \
--output-fd 3 3>&1 >&2
}
check() {
python3 -m venv --clear --without-pip --system-site-packages .testenv
.testenv/bin/python3 -m installer .dist/*.whl
cd tenant_schemas
DJANGO_SETTINGS_MODULE=tests.settings ../.testenv/bin/python3 -m pytest -v
}
package() {
python3 -m installer -d "$pkgdir" \
.dist/*.whl
}
sha512sums="
758f68dc834d4c0074097b166d742a7d63c86b6426ad67d3ce2f56983d417666bf05ae9c46b3ee89a04dee2d888892463651355d26eda7c265ebee8971992319 py3-django-tenant-schemas-1.12.0.tar.gz
"

File diff suppressed because it is too large.


@ -1,43 +0,0 @@
# Contributor: Antoine Martin (ayakael) <dev@ayakael.net>
# Maintainer: Antoine Martin (ayakael) <dev@ayakael.net>
pkgname=py3-django-tenants
#_pkgreal is used by apkbuild-pypi to find modules at PyPI
_pkgreal=django-tenants
pkgver=3.6.1
pkgrel=3
pkgdesc="Tenant support for Django using PostgreSQL schemas."
url="https://pypi.python.org/project/django-tenants"
arch="noarch"
license="MIT"
depends="py3-django py3-psycopg py3-gunicorn py3-coverage"
checkdepends="python3-dev py3-pytest"
makedepends="py3-setuptools py3-gpep517 py3-wheel"
source="
$pkgname-$pkgver.tar.gz::https://codeload.github.com/django-tenants/django-tenants/tar.gz/refs/tags/v$pkgver
997_update-from-pgclone-schema.patch
"
builddir="$srcdir/$_pkgreal-$pkgver"
options="!check" # Requires setting up test database
subpackages="$pkgname-pyc"
build() {
gpep517 build-wheel \
--wheel-dir .dist \
--output-fd 3 3>&1 >&2
}
check() {
python3 -m venv --clear --without-pip --system-site-packages .testenv
.testenv/bin/python3 -m installer .dist/*.whl
DJANGO_SETTINGS_MODULE=tests.settings .testenv/bin/python3 -m pytest -v
}
package() {
python3 -m installer -d "$pkgdir" \
.dist/*.whl
}
sha512sums="
b18afce81ccc89e49fcc4ebe85d90be602415ca898c1660a4e71e2bef6a3ed2e8c724e94b61d8c6f48f3fb19eb2a87d6a6f5bbf449b3e2f661f87e4b5638eafb py3-django-tenants-3.6.1.tar.gz
f2424bb188db2e3c7d13c15e5bdf0959c6f794e68dbc677c8b876d4faa321f78aded5565539f1bfd97583c6df0fcc19ec05abe203b08407e4446dd7194756825 997_update-from-pgclone-schema.patch
"


@ -1,39 +0,0 @@
# Contributor: Antoine Martin (ayakael) <dev@ayakael.net>
# Maintainer: Antoine Martin (ayakael) <dev@ayakael.net>
pkgname=py3-drf-orjson-renderer
#_pkgreal is used by apkbuild-pypi to find modules at PyPI
_pkgreal=drf_orjson_renderer
pkgver=1.7.3
_gittag=9a59352f82e262bd78ccc0228361bcb321a33623
pkgrel=0
pkgdesc="Django RestFramework JSON Renderer Backed by orjson"
url="https://pypi.python.org/project/drf-orjson-renderer"
arch="noarch"
license="MIT"
depends="py3-django-rest-framework py3-orjson"
checkdepends="py3-pytest-django py3-numpy"
makedepends="py3-setuptools py3-gpep517 py3-wheel"
source="$pkgname-$pkgver.tar.gz::https://github.com/brianjbuck/drf_orjson_renderer/archive/$_gittag.tar.gz"
builddir="$srcdir/$_pkgreal-$_gittag"
subpackages="$pkgname-pyc"
build() {
gpep517 build-wheel \
--wheel-dir .dist \
--output-fd 3 3>&1 >&2
}
check() {
python3 -m venv --clear --without-pip --system-site-packages .testenv
.testenv/bin/python3 -m installer .dist/*.whl
.testenv/bin/python3 -m pytest -v
}
package() {
python3 -m installer -d "$pkgdir" \
.dist/*.whl
}
sha512sums="
7870aebf6bcc249228b1620f4b50124eef54e251dcac236e23be4287284461617d630b073d2e9122f66779a908dfd69c5e16b486b23de0114b06b3df6b468e95 py3-drf-orjson-renderer-1.7.3.tar.gz
"


@ -0,0 +1,56 @@
# Contributor: Antoine Martin (ayakael) <dev@ayakael.net>
# Maintainer: Antoine Martin (ayakael) <dev@ayakael.net>
pkgname=py3-kadmin-rs
pkgver=0.5.3
pkgrel=0
pkgdesc="Rust and Python interfaces to the Kerberos administration interface (kadm5)"
url="https://github.com/authentik-community/kadmin-rs"
arch="all"
license="MIT"
checkdepends="py3-pytest py3-k5test"
makedepends="
cargo
cargo-auditable
clang-libclang
py3-setuptools
py3-setuptools-rust
py3-gpep517
py3-wheel
poetry
python3-dev
sccache
"
source="$pkgname-$pkgver.tar.gz::https://github.com/authentik-community/kadmin-rs/archive/refs/tags/kadmin/version/$pkgver.tar.gz"
builddir="$srcdir"/kadmin-rs-kadmin-version-$pkgver
subpackages="$pkgname-pyc"
prepare() {
default_prepare
cargo fetch --target="$CTARGET" --locked
}
build() {
cargo auditable build --release --locked
gpep517 build-wheel \
--wheel-dir .dist \
--output-fd 3 3>&1 >&2
}
check() {
cargo test --locked
python3 -m venv --clear --without-pip --system-site-packages .testenv
.testenv/bin/python3 -m installer .dist/*.whl
.testenv/bin/python3 -m unittest python/tests/test_*.py
}
package() {
python3 -m installer -d "$pkgdir" \
.dist/*.whl
}
sha512sums="
61d3ddfe619827cef83af944b2281f2cf6966d95c3d4a5883b82169bf1f34e6b7173cfa086198e3e0f9a227590a497dcb1c9b209cd4d0c6d361fdfce9b98eec0 py3-kadmin-rs-0.5.3.tar.gz
"


@ -3,21 +3,23 @@
pkgname=py3-microsoft-kiota-abstractions
#_pkgreal is used by apkbuild-pypi to find modules at PyPI
_pkgreal=microsoft-kiota-abstractions
pkgver=1.3.3
pkgver=1.6.8
pkgrel=0
pkgdesc="Abstractions library for Kiota generated Python clients"
url="https://pypi.python.org/project/microsoft-kiota-abstractions"
arch="noarch"
license="MIT"
depends="
py3-std-uritemplate
py3-std-uritemplate<2.0.0
py3-opentelemetry-sdk
py3-importlib-metadata
"
checkdepends="py3-pytest"
makedepends="py3-setuptools py3-gpep517 py3-wheel py3-flit"
source="$pkgname-$pkgver.tar.gz::https://github.com/microsoft/kiota-abstractions-python/archive/refs/tags/v$pkgver.tar.gz"
builddir="$srcdir/kiota-abstractions-python-$pkgver"
checkdepends="py3-pytest py3-pytest-asyncio"
makedepends="poetry py3-gpep517 py3-wheel py3-flit"
source="
$pkgname-$pkgver.tar.gz::https://github.com/microsoft/kiota-python/archive/refs/tags/microsoft-kiota-abstractions-v$pkgver.tar.gz
"
builddir="$srcdir/kiota-python-microsoft-kiota-abstractions-v$pkgver/packages/abstractions"
subpackages="$pkgname-pyc"
build() {
@ -38,5 +40,5 @@ package() {
}
sha512sums="
b416b14cc68dab4eb99d8abc2378c8691781c984f453c7328eefe5bc10788d8244bdc0e3c98d4c2cdbad60d5f672893da4eeed99037d4e361849bcef458547e1 py3-microsoft-kiota-abstractions-1.3.3.tar.gz
55341b1ff3fb1a516ceb84817db991d6e6aa83b01326f64cf21690dee1ab84e9c9c4f7162f9f71ec1261b4e0380b73b13284128bd786b80da29faf968720b355 py3-microsoft-kiota-abstractions-1.6.8.tar.gz
"


@ -3,7 +3,7 @@
pkgname=py3-microsoft-kiota-authentication-azure
#_pkgreal is used by apkbuild-pypi to find modules at PyPI
_pkgreal=microsoft-kiota-authentication-azure
pkgver=1.1.0
pkgver=1.6.8
pkgrel=0
pkgdesc="Authentication provider for Kiota using Azure Identity"
url="https://pypi.python.org/project/microsoft-kiota-authentication-azure"
@ -15,10 +15,12 @@ depends="
py3-importlib-metadata
"
checkdepends="py3-pytest"
makedepends="py3-setuptools py3-gpep517 py3-wheel py3-flit"
makedepends="poetry py3-gpep517 py3-wheel py3-flit"
source="
$pkgname-$pkgver.tar.gz::https://github.com/microsoft/kiota-python/archive/refs/tags/microsoft-kiota-authentication-azure-v$pkgver.tar.gz
"
options="!check" # TODO
source="$pkgname-$pkgver.tar.gz::https://github.com/microsoft/kiota-authentication-azure-python/archive/refs/tags/v$pkgver.tar.gz"
builddir="$srcdir/kiota-authentication-azure-python-$pkgver"
builddir="$srcdir/kiota-python-microsoft-kiota-authentication-azure-v$pkgver/packages/authentication/azure"
subpackages="$pkgname-pyc"
build() {
@ -39,5 +41,5 @@ package() {
}
sha512sums="
4a58a49c027951dd856bc24b03c6ba44b448949bcd3210237d2574e3ceec32eefb403057720e4d517027494d6f977874dd48abbfb5cf856399eb5d1c895376fc py3-microsoft-kiota-authentication-azure-1.1.0.tar.gz
d661d379f036b45bf356e349e28d3478f4a10b351dfde2d1b11a429c0f2160cde9696990cc18d72a224cfd3cc4c90bdc2e6f07d9e4763bd126cd9f66a09b9bec py3-microsoft-kiota-authentication-azure-1.6.8.tar.gz
"


@ -3,7 +3,7 @@
pkgname=py3-microsoft-kiota-http
#_pkgreal is used by apkbuild-pypi to find modules at PyPI
_pkgreal=microsoft-kiota-http
pkgver=1.3.3
pkgver=1.6.8
pkgrel=0
pkgdesc="Kiota http request adapter implementation for httpx library"
url="https://pypi.python.org/project/microsoft-kiota-http"
@ -14,10 +14,12 @@ depends="
py3-httpx
"
checkdepends="py3-pytest"
makedepends="py3-setuptools py3-gpep517 py3-wheel py3-flit"
source="$pkgname-$pkgver.tar.gz::https://github.com/microsoft/kiota-http-python/archive/refs/tags/v$pkgver.tar.gz"
makedepends="poetry py3-gpep517 py3-wheel py3-flit"
source="
$pkgname-$pkgver.tar.gz::https://github.com/microsoft/kiota-python/archive/refs/tags/microsoft-kiota-http-v$pkgver.tar.gz
"
options="!check" # TODO
builddir="$srcdir/kiota-http-python-$pkgver"
builddir="$srcdir/kiota-python-microsoft-kiota-http-v$pkgver/packages/http/httpx"
subpackages="$pkgname-pyc"
build() {
@ -38,5 +40,5 @@ package() {
}
sha512sums="
fff2dc37a0e379ad5689ff9532b43e6ee62ca97589b2feed39898c17a45c5cdb17c20bd714c46cd6ae6e2522de695b6c747aaf5fb0ef96dfd504cd37a6169a87 py3-microsoft-kiota-http-1.3.3.tar.gz
c453c89d31cc062f2d8be4a28bda0666dbde6b5a8e42855892cda72e5d104e6bb5516db01d9feb7f619b8fa77237c9e3badd24b29326f627f95b69210835321d py3-microsoft-kiota-http-1.6.8.tar.gz
"


@ -3,7 +3,7 @@
pkgname=py3-microsoft-kiota-serialization-form
#_pkgreal is used by apkbuild-pypi to find modules at PyPI
_pkgreal=microsoft-kiota-serialization-form
pkgver=0.1.1
pkgver=1.6.8
pkgrel=0
pkgdesc="Kiota Form encoded serialization implementation for Python"
url="https://pypi.python.org/project/microsoft-kiota-serialization-form"
@ -14,9 +14,11 @@ depends="
py3-pendulum
"
checkdepends="py3-pytest"
makedepends="py3-setuptools py3-gpep517 py3-wheel py3-flit"
source="$pkgname-$pkgver.tar.gz::https://github.com/microsoft/kiota-serialization-form-python/archive/refs/tags/v$pkgver.tar.gz"
builddir="$srcdir/kiota-serialization-form-python-$pkgver"
makedepends="poetry py3-gpep517 py3-wheel py3-flit"
source="
$pkgname-$pkgver.tar.gz::https://github.com/microsoft/kiota-python/archive/refs/tags/microsoft-kiota-serialization-form-v$pkgver.tar.gz
"
builddir="$srcdir/kiota-python-microsoft-kiota-serialization-form-v$pkgver/packages/serialization/form"
subpackages="$pkgname-pyc"
build() {
@ -37,5 +39,5 @@ package() {
}
sha512sums="
0afb2b3f0f7d325e630b8a11d17a98b2c42446cb803384e36406074c62ade2be994e29b9d7cb098d9de55609dda28c339eed6397ec373375caaf158b139a5449 py3-microsoft-kiota-serialization-form-0.1.1.tar.gz
0e4fabe18980612ca3f55fd7350148d2393da3f35dc79cd4fe56b01f50bc2af147bde5e294580d83b97b4a549d77e6581ece8ddb19ea09ee92fd6cbfead0d3db py3-microsoft-kiota-serialization-form-1.6.8.tar.gz
"


@ -3,7 +3,7 @@
pkgname=py3-microsoft-kiota-serialization-json
#_pkgreal is used by apkbuild-pypi to find modules at PyPI
_pkgreal=microsoft-kiota-serialization-json
pkgver=1.3.2
pkgver=1.6.8
pkgrel=0
pkgdesc="JSON serialization implementation for Kiota clients in Python"
url="https://pypi.python.org/project/microsoft-kiota-serialization-json"
@ -14,10 +14,12 @@ depends="
py3-pendulum
"
checkdepends="py3-pytest"
makedepends="py3-setuptools py3-gpep517 py3-wheel py3-flit"
source="$pkgname-$pkgver.tar.gz::https://github.com/microsoft/kiota-serialization-json-python/archive/refs/tags/v$pkgver.tar.gz"
makedepends="poetry py3-gpep517 py3-wheel py3-flit"
source="
$pkgname-$pkgver.tar.gz::https://github.com/microsoft/kiota-python/archive/refs/tags/microsoft-kiota-serialization-json-v$pkgver.tar.gz
"
options="!check" # TODO
builddir="$srcdir/kiota-serialization-json-python-$pkgver"
builddir="$srcdir/kiota-python-microsoft-kiota-serialization-json-v$pkgver/packages/serialization/json"
subpackages="$pkgname-pyc"
build() {
@ -38,5 +40,5 @@ package() {
}
sha512sums="
bdf2a42d4509b7a6f093295c8f5d661e771d040965ebdd7fb7772503482fbc6d449c5ac7b16f5f497d9005018d463d3a68650b4b4da0f1a7fbcb0ad3377d12b5 py3-microsoft-kiota-serialization-json-1.3.2.tar.gz
42b8e1d2bfb175e52876314a598647de7b70acb8140cefbfb20d0f8de241bbb03a1cfe6c7108a56047f2a8e3f8f781a23fe54d5612d68a5966340279ff0eb8bc py3-microsoft-kiota-serialization-json-1.6.8.tar.gz
"


@ -3,7 +3,7 @@
pkgname=py3-microsoft-kiota-serialization-multipart
#_pkgreal is used by apkbuild-pypi to find modules at PyPI
_pkgreal=microsoft-kiota-serialization-multipart
pkgver=0.1.0
pkgver=1.6.8
pkgrel=0
pkgdesc="Multipart serialization implementation for python based kiota clients"
url="https://pypi.python.org/project/microsoft-kiota-serialization-multipart"
@ -11,9 +11,11 @@ arch="noarch"
license="MIT"
depends="py3-microsoft-kiota-abstractions py3-microsoft-kiota-serialization-json"
checkdepends="py3-pytest"
makedepends="py3-setuptools py3-gpep517 py3-wheel py3-flit"
source="$pkgname-$pkgver.tar.gz::https://github.com/microsoft/kiota-serialization-multipart-python/archive/refs/tags/v$pkgver.tar.gz"
builddir="$srcdir/kiota-serialization-multipart-python-$pkgver"
makedepends="poetry py3-gpep517 py3-wheel py3-flit"
source="
$pkgname-$pkgver.tar.gz::https://github.com/microsoft/kiota-python/archive/refs/tags/microsoft-kiota-serialization-multipart-v$pkgver.tar.gz
"
builddir="$srcdir/kiota-python-microsoft-kiota-serialization-multipart-v$pkgver/packages/serialization/multipart"
subpackages="$pkgname-pyc"
build() {
@ -34,5 +36,5 @@ package() {
}
sha512sums="
a402f4fc891a70789c8ac6cb16ae30f2059e6aed4013c7601a33f37b959446067cbf0abc630f15aadeb4c85eb04703cead3c19fbbff628332efdebce3c4badb8 py3-microsoft-kiota-serialization-multipart-0.1.0.tar.gz
d6d6d36fe55f4aa595d380e43f93f3de7674633edba676aec16fc26254a12e4f700427fedf1bedfddde30a7f708c93ccbbe586bb0e6950748a2debe609bf44c1 py3-microsoft-kiota-serialization-multipart-1.6.8.tar.gz
"


@ -3,7 +3,7 @@
pkgname=py3-microsoft-kiota-serialization-text
#_pkgreal is used by apkbuild-pypi to find modules at PyPI
_pkgreal=microsoft-kiota-serialization-text
pkgver=1.0.0
pkgver=1.6.8
pkgrel=0
pkgdesc="Text serialization implementation for Kiota generated clients in Python"
url="https://pypi.python.org/project/microsoft-kiota-abstractions"
@ -14,9 +14,11 @@ depends="
py3-dateutil
"
checkdepends="py3-pytest"
makedepends="py3-setuptools py3-gpep517 py3-wheel py3-flit"
source="$pkgname-$pkgver.tar.gz::https://github.com/microsoft/kiota-serialization-text-python/archive/refs/tags/v$pkgver.tar.gz"
builddir="$srcdir/kiota-serialization-text-python-$pkgver"
makedepends="poetry py3-gpep517 py3-wheel py3-flit"
source="
$pkgname-$pkgver.tar.gz::https://github.com/microsoft/kiota-python/archive/refs/tags/microsoft-kiota-serialization-text-v$pkgver.tar.gz
"
builddir="$srcdir/kiota-python-microsoft-kiota-serialization-text-v$pkgver/packages/serialization/text"
subpackages="$pkgname-pyc"
build() {
@ -37,5 +39,5 @@ package() {
}
sha512sums="
b3b0d0a7e69c70c14ed606f70179a49107f6df6f2ba577e9bacbdb15b3071062a180d2e6b77a43d82fe7a67264ad24aa685c71695042ffd54ea4406f9b990208 py3-microsoft-kiota-serialization-text-1.0.0.tar.gz
55dbc87253819f496e2f25de2bf24b170761f335117da414bb35c6db9008e9ca8c6fd13d5e429914c322a850a57858d9abdee7dc209ad55e469182995290d568 py3-microsoft-kiota-serialization-text-1.6.8.tar.gz
"


@ -3,7 +3,7 @@
pkgname=py3-msal
#_pkgreal is used by apkbuild-pypi to find modules at PyPI
_pkgreal=msal
pkgver=1.31.0
pkgver=1.31.1
pkgrel=0
pkgdesc="Microsoft Authentication Library (MSAL) for Python"
url="https://pypi.org/project/msal"
@ -39,5 +39,5 @@ package() {
}
sha512sums="
712342167c7cc958c16c45d9c21a58d83efd9ff3dccf4494d7c83fb226678ed944fef1751a4002fcb292450884c682f1b5d00cdca248d1def54d6f884cdb5dc2 py3-msal-1.31.0.tar.gz
f75541337f09ba29d4de13206346ad7793b3f2bdbdbf8fcb050ee7976b397ca666d61aee21121a4efdd7c150c9d2f87f75812e7b8aa96a5f8ac5219e7a946af2 py3-msal-1.31.1.tar.gz
"


@ -3,7 +3,7 @@
pkgname=py3-msgraph-core
#_pkgreal is used by apkbuild-pypi to find modules at PyPI
_pkgreal=msgraph-core
pkgver=1.1.3
pkgver=1.1.8
pkgrel=0
pkgdesc="The Microsoft Graph Python SDK"
url="https://pypi.python.org/project/msgraph-core"
@ -39,5 +39,5 @@ package() {
}
sha512sums="
48b47b5b02062fe05f9f917d1c6461f539f9ff6dfbafb4a2dcfbe91237725eb7427b2673aec7eb994f733fab109879d96e06e122d72ecab69ff77a1f76fafe49 py3-msgraph-core-1.1.3.tar.gz
0cae6f76cb1373d1ef76448e47b9951e5076a144140c19edc14186f7bfd92930e50c9f6c459170e3362ef267903cdf261d1897566983a7302beab205f9d61389 py3-msgraph-core-1.1.8.tar.gz
"


@ -3,7 +3,7 @@
pkgname=py3-msgraph-sdk
#_pkgreal is used by apkbuild-pypi to find modules at PyPI
_pkgreal=msgraph-sdk
pkgver=1.8.0
pkgver=1.16.0
pkgrel=0
pkgdesc="The Microsoft Graph Python SDK"
url="https://pypi.python.org/project/msgraph-sdk"
@ -40,5 +40,5 @@ package() {
}
sha512sums="
e7d93a4b0f29023dcce0529b54a397b568f44ff40b1efe52e1c060b4552dd055e6a62e0ebcb72cbf3c1babe00440c41e6f897e86a01c3e261a8b88b23cd3af2c py3-msgraph-sdk-1.8.0.tar.gz
af930e5e470f6ac78724650885f70cf447482a53f90043d326b3e00dc7572fd0d476658ebb1677118010e38b54f1e4e609dcfb5fcef5664f05b25062786d11af py3-msgraph-sdk-1.16.0.tar.gz
"


@ -3,7 +3,7 @@
pkgname=py3-opentelemetry-sdk
#_pkgreal is used by apkbuild-pypi to find modules at PyPI
_pkgreal=opentelemetry-sdk
pkgver=1.27.0
pkgver=1.29.0
pkgrel=0
pkgdesc="OpenTelemetry Python SDK"
url="https://github.com/open-telemetry/opentelemetry-python/tree/main"
@ -71,5 +71,5 @@ proto() {
}
sha512sums="
d8b5a617c7e804b4e6e1b508395e87481a3dcc3b375573110750830a1cf6037cfeb5c09dba3e7cfa472e385dbf619afedd79b1c31c5bfe4e87d44ea65f4d2f0b py3-opentelemetry-sdk-1.27.0.tar.gz
92c90e6a684d8cfab3bba4d72612ccf53ae54cdd9784e3434b25adc3730fe114f21fd7aa21da80edf6e0e7c80b39c64ee31fb16f68b04809289bbf5d49d4ca2e py3-opentelemetry-sdk-1.29.0.tar.gz
"


@ -1,39 +0,0 @@
# Contributor: Antoine Martin (ayakael) <dev@ayakael.net>
# Maintainer: Antoine Martin (ayakael) <dev@ayakael.net>
pkgname=py3-pyrad
#_pkgreal is used by apkbuild-pypi to find modules at PyPI
_pkgreal=pyrad
pkgver=2.4
pkgrel=0
pkgdesc="Python RADIUS Implementation"
url="https://pypi.python.org/project/pyrad"
arch="noarch"
license="BSD-3-Clause"
depends="py3-netaddr"
checkdepends="py3-pytest"
makedepends="py3-setuptools py3-gpep517 py3-wheel poetry"
source="$pkgname-$pkgver.tar.gz::https://github.com/pyradius/pyrad/archive/refs/tags/$pkgver.tar.gz"
options="!check" # TODO
builddir="$srcdir/$_pkgreal-$pkgver"
subpackages="$pkgname-pyc"
build() {
gpep517 build-wheel \
--wheel-dir .dist \
--output-fd 3 3>&1 >&2
}
check() {
python3 -m venv --clear --without-pip --system-site-packages .testenv
.testenv/bin/python3 -m installer .dist/*.whl
.testenv/bin/python3 -m pytest -v
}
package() {
python3 -m installer -d "$pkgdir" \
.dist/*.whl
}
sha512sums="
e4f4c687596bd226cf2cdb409a8d940c7b665fb7f722d09113dd9a1b05ab176ce8f920b235918ec01695f262930d13b4057b199cf6aac72afa54950c1fb59166 py3-pyrad-2.4.tar.gz
"


@ -1,38 +0,0 @@
# Contributor: Antoine Martin (ayakael) <dev@ayakael.net>
# Maintainer: Antoine Martin (ayakael) <dev@ayakael.net>
pkgname=py3-scim2-filter-parser
#_pkgreal is used by apkbuild-pypi to find modules at PyPI
_pkgreal=scim2-filter-parser
pkgver=0.5.0
pkgrel=1
pkgdesc="A customizable parser/transpiler for SCIM2.0 filters"
url="https://pypi.python.org/project/scim2-filter-parser"
arch="noarch"
license="MIT"
depends="py3-django py3-sly"
checkdepends="py3-pytest"
makedepends="py3-setuptools py3-gpep517 py3-wheel poetry"
source="$pkgname-$pkgver.tar.gz::https://github.com/15five/scim2-filter-parser/archive/refs/tags/$pkgver.tar.gz"
builddir="$srcdir/$_pkgreal-$pkgver"
subpackages="$pkgname-pyc"
build() {
gpep517 build-wheel \
--wheel-dir .dist \
--output-fd 3 3>&1 >&2
}
check() {
python3 -m venv --clear --without-pip --system-site-packages .testenv
.testenv/bin/python3 -m installer .dist/*.whl
.testenv/bin/python3 -m pytest -v
}
package() {
python3 -m installer -d "$pkgdir" \
.dist/*.whl
}
sha512sums="
5347852af6b82a764a32bc491a7e0f05f06b4f4d93dfa375668b5ca1a15ee58f488702536e350100fe5c96a5c94c492ea8cbd0e1952c5920d5a10e1453357f8c py3-scim2-filter-parser-0.5.0.tar.gz
"


@ -3,7 +3,7 @@
pkgname=py3-std-uritemplate
#_pkgreal is used by apkbuild-pypi to find modules at PyPI
_pkgreal=std-uritemplate
pkgver=1.0.6
pkgver=2.0.1
pkgrel=0
pkgdesc="A complete and maintained cross-language implementation of the Uri Template specification RFC 6570 Level 4"
url="https://pypi.python.org/project/std-uritemplate"
@ -37,5 +37,5 @@ package() {
}
sha512sums="
4873ce356170aea1b45479d5ded0b596265782c097d3fd9d1bb4cc8ad902067bab654057173a2e2b1da37e5ac36ebee024feca43b0e4298b103dc979f97e7c1c py3-std-uritemplate-1.0.6.tar.gz
e073a1204d65bb639cc93480b0f68e1edfe5ac3cff607b72c8da8916b7660eea2b2b246b5db02979cd5c856087958c84dc3bc5e9d76a9540f2ac2a7da8cd18df py3-std-uritemplate-2.0.1.tar.gz
"


@ -1,41 +0,0 @@
# Contributor: Antoine Martin (ayakael) <dev@ayakael.net>
# Maintainer: Antoine Martin (ayakael) <dev@ayakael.net>
pkgname=py3-tenant-schemas-celery
#_pkgreal is used by apkbuild-pypi to find modules at PyPI
_pkgreal=tenant-schemas-celery
pkgver=2.2.0
pkgrel=2
pkgdesc="Celery integration for django-tenant-schemas and django-tenants"
url="https://pypi.python.org/project/tenant-schemas-celery"
arch="noarch"
license="MIT"
depends="py3-django-tenants py3-celery"
checkdepends="python3-dev py3-pytest"
makedepends="py3-setuptools py3-gpep517 py3-wheel"
source="
$pkgname-$pkgver.tar.gz::https://codeload.github.com/maciej-gol/tenant-schemas-celery/tar.gz/refs/tags/$pkgver
"
options="!check" # Test suite wants docker
builddir="$srcdir/$_pkgreal-$pkgver"
subpackages="$pkgname-pyc"
build() {
gpep517 build-wheel \
--wheel-dir .dist \
--output-fd 3 3>&1 >&2
}
check() {
python3 -m venv --clear --without-pip --system-site-packages .testenv
.testenv/bin/python3 -m installer .dist/*.whl
DJANGO_SETTINGS_MODULE=tests.settings .testenv/bin/python3 -m pytest -v
}
package() {
python3 -m installer -d "$pkgdir" \
.dist/*.whl
}
sha512sums="
dad71011306936dc84d966797b113008780750e9e973513092bec892be0d1468e0a0e7e8e2fcca9765309a27767e1c72bdaad7c8aca16353ae1eef783c239148 py3-tenant-schemas-celery-2.2.0.tar.gz
"


@ -3,6 +3,10 @@
# Maintainer: Jakub Jirutka <jakub@jirutka.cz>
#
# secfixes:
# 3.2.4-r0:
# - CVE-2024-27282
# - CVE-2024-27281
# - CVE-2024-27280
# 3.1.4-r0:
# - CVE-2023-28755
# - CVE-2023-28756
@ -58,7 +62,7 @@ pkgname=ruby3.2
# When upgrading, upgrade also each ruby-<name> aport listed in file
# gems/bundled_gems. If some aport is missing or not in the main repo,
# create/move it.
pkgver=3.2.2
pkgver=3.2.6
_abiver="${pkgver%.*}.0"
pkgrel=0
pkgdesc="An object-oriented language for quick and easy programming"
@ -73,6 +77,7 @@ depends_dev="
libucontext-dev
"
makedepends="$depends_dev
cargo
autoconf
gdbm-dev
libffi-dev
@ -245,7 +250,7 @@ full() {
}
sha512sums="
bcc68f3f24c1c8987d9c80b57332e5791f25b935ba38daf5addf60dbfe3a05f9dcaf21909681b88e862c67c6ed103150f73259c6e35c564f13a00f432e3c1e46 ruby-3.2.2.tar.gz
26ae9439043cf40e5eddde6b92ae51c9e1fa4e89c8ec6da36732c59c14873b022c683fb3007950d372f35de9b62a4fabbbc3ef1f4ef58cd53058bd56e1552cbe ruby-3.2.6.tar.gz
16fc1f35aee327d1ecac420b091beaa53c675e0504d5a6932004f17ca68a2c38f57b053b0a3903696f2232c5add160d363e3972a962f7f7bcb52e4e998c7315d test_insns-lower-recursion-depth.patch
42cd45c1db089a1ae57834684479a502e357ddba82ead5fa34e64c13971e7ab7ad2919ddd60a104a817864dd3e2e35bdbedb679210eb41d82cab36a0687e43d4 fix-get_main_stack.patch
a77da5e5eb7d60caf3f1cabb81e09b88dc505ddd746e34efd1908c0096621156d81cc65095b846ba9bdb66028891aefce883a43ddec6b56b5beb4aac5e4ee33f dont-install-bundled-gems.patch


@ -1,8 +1,8 @@
# Contributor: Antoine Martin (ayakael) <dev@ayakael.net>
# Maintainer: Antoine Martin (ayakael) <dev@ayakael.net>
pkgname=uptime-kuma
pkgver=1.23.13
pkgrel=2
pkgver=1.23.16
pkgrel=0
pkgdesc='A fancy self-hosted monitoring tool'
arch="all"
url="https://github.com/louislam/uptime-kuma"
@ -43,7 +43,7 @@ package() {
mv "$pkgdir"/usr/share/webapps/uptime-kuma/LICENSE "$pkgdir"/usr/share/licenses/uptime-kuma/.
}
sha512sums="
9045cdc69d46ce34011f7866844a8d1866eee21850be6eede3226e77b9c0d3ecc0190481671f04f25da40345b29cc2d13de07bcc27e7baeff7901b4bd9c8b93f uptime-kuma-1.23.13.tar.gz
a132d1cd796fbd868782627edfd45d2a6bd3d2fadece23e0bbf000e6a30482659062a43c4590c98e390cac9b8c1926efd8ff01c5b358b7ccea4438259b86f24e uptime-kuma-1.23.16.tar.gz
0ceddb98a6f318029b8bd8b5a49b55c883e77a5f8fffe2b9b271c9abf0ac52dc7a6ea4dbb4a881124a7857f1e43040f18755c1c2a034479e6a94d2b65a73d847 uptime-kuma.openrc
1dbae536b23e3624e139155abbff383bba3209ff2219983da2616b4376b1a5041df812d1e5164716fc6e967a8446d94baae3b96ee575d400813cc6fdc2cc274e uptime-kuma.conf
"

@ -0,0 +1,618 @@
diff --git a/docs/deployment.md b/docs/deployment.md
index d69fcf8..99dfbf3 100644
--- a/docs/deployment.md
+++ b/docs/deployment.md
@@ -60,7 +60,7 @@ Options:
--loop [auto|asyncio|uvloop] Event loop implementation. [default: auto]
--http [auto|h11|httptools] HTTP protocol implementation. [default:
auto]
- --ws [auto|none|websockets|wsproto]
+ --ws [auto|none|websockets|websockets-sansio|wsproto]
WebSocket protocol implementation.
[default: auto]
--ws-max-size INTEGER WebSocket max size message in bytes
diff --git a/docs/index.md b/docs/index.md
index bb6fc32..50e2ab9 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -130,7 +130,7 @@ Options:
--loop [auto|asyncio|uvloop] Event loop implementation. [default: auto]
--http [auto|h11|httptools] HTTP protocol implementation. [default:
auto]
- --ws [auto|none|websockets|wsproto]
+ --ws [auto|none|websockets|websockets-sansio|wsproto]
WebSocket protocol implementation.
[default: auto]
--ws-max-size INTEGER WebSocket max size message in bytes
diff --git a/pyproject.toml b/pyproject.toml
index 0a89966..8771bfb 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -92,6 +92,10 @@ filterwarnings = [
"ignore:Uvicorn's native WSGI implementation is deprecated.*:DeprecationWarning",
"ignore: 'cgi' is deprecated and slated for removal in Python 3.13:DeprecationWarning",
"ignore: remove second argument of ws_handler:DeprecationWarning:websockets",
+ "ignore: websockets.legacy is deprecated.*:DeprecationWarning",
+ "ignore: websockets.server.WebSocketServerProtocol is deprecated.*:DeprecationWarning",
+ "ignore: websockets.client.connect is deprecated.*:DeprecationWarning",
+ "ignore: websockets.exceptions.InvalidStatusCode is deprecated",
]
[tool.coverage.run]
diff --git a/tests/conftest.py b/tests/conftest.py
index 1b0c0e8..7061a14 100644
--- a/tests/conftest.py
+++ b/tests/conftest.py
@@ -233,9 +233,9 @@ def unused_tcp_port() -> int:
marks=pytest.mark.skipif(not importlib.util.find_spec("wsproto"), reason="wsproto not installed."),
id="wsproto",
),
+ pytest.param("uvicorn.protocols.websockets.websockets_impl:WebSocketProtocol", id="websockets"),
pytest.param(
- "uvicorn.protocols.websockets.websockets_impl:WebSocketProtocol",
- id="websockets",
+ "uvicorn.protocols.websockets.websockets_sansio_impl:WebSocketsSansIOProtocol", id="websockets-sansio"
),
]
)
diff --git a/tests/middleware/test_logging.py b/tests/middleware/test_logging.py
index f27633a..63d7daf 100644
--- a/tests/middleware/test_logging.py
+++ b/tests/middleware/test_logging.py
@@ -49,7 +49,9 @@ async def app(scope: Scope, receive: ASGIReceiveCallable, send: ASGISendCallable
await send({"type": "http.response.body", "body": b"", "more_body": False})
-async def test_trace_logging(caplog: pytest.LogCaptureFixture, logging_config, unused_tcp_port: int):
+async def test_trace_logging(
+ caplog: pytest.LogCaptureFixture, logging_config: dict[str, typing.Any], unused_tcp_port: int
+):
config = Config(
app=app,
log_level="trace",
@@ -91,8 +93,8 @@ async def test_trace_logging_on_http_protocol(http_protocol_cls, caplog, logging
async def test_trace_logging_on_ws_protocol(
ws_protocol_cls: WSProtocol,
- caplog,
- logging_config,
+ caplog: pytest.LogCaptureFixture,
+ logging_config: dict[str, typing.Any],
unused_tcp_port: int,
):
async def websocket_app(scope: Scope, receive: ASGIReceiveCallable, send: ASGISendCallable):
@@ -104,7 +106,7 @@ async def test_trace_logging_on_ws_protocol(
elif message["type"] == "websocket.disconnect":
break
- async def open_connection(url):
+ async def open_connection(url: str):
async with websockets.client.connect(url) as websocket:
return websocket.open
diff --git a/tests/middleware/test_proxy_headers.py b/tests/middleware/test_proxy_headers.py
index 0ade974..d300c45 100644
--- a/tests/middleware/test_proxy_headers.py
+++ b/tests/middleware/test_proxy_headers.py
@@ -465,6 +465,7 @@ async def test_proxy_headers_websocket_x_forwarded_proto(
host, port = scope["client"]
await send({"type": "websocket.accept"})
await send({"type": "websocket.send", "text": f"{scheme}://{host}:{port}"})
+ await send({"type": "websocket.close"})
app_with_middleware = ProxyHeadersMiddleware(websocket_app, trusted_hosts="*")
config = Config(
diff --git a/tests/protocols/test_websocket.py b/tests/protocols/test_websocket.py
index 15ccfdd..e728544 100644
--- a/tests/protocols/test_websocket.py
+++ b/tests/protocols/test_websocket.py
@@ -7,6 +7,8 @@ from copy import deepcopy
import httpx
import pytest
import websockets
+import websockets.asyncio
+import websockets.asyncio.client
import websockets.client
import websockets.exceptions
from typing_extensions import TypedDict
@@ -601,12 +603,9 @@ async def test_connection_lost_before_handshake_complete(
await send_accept_task.wait()
disconnect_message = await receive() # type: ignore
- response: httpx.Response | None = None
-
async def websocket_session(uri: str):
- nonlocal response
async with httpx.AsyncClient() as client:
- response = await client.get(
+ await client.get(
f"http://127.0.0.1:{unused_tcp_port}",
headers={
"upgrade": "websocket",
@@ -623,9 +622,6 @@ async def test_connection_lost_before_handshake_complete(
send_accept_task.set()
await asyncio.sleep(0.1)
- assert response is not None
- assert response.status_code == 500, response.text
- assert response.text == "Internal Server Error"
assert disconnect_message == {"type": "websocket.disconnect", "code": 1006}
await task
@@ -920,6 +916,9 @@ async def test_server_reject_connection_with_body_nolength(
async def test_server_reject_connection_with_invalid_msg(
ws_protocol_cls: WSProtocol, http_protocol_cls: HTTPProtocol, unused_tcp_port: int
):
+ if ws_protocol_cls.__name__ == "WebSocketsSansIOProtocol":
+ pytest.skip("WebSocketsSansIOProtocol sends both start and body messages in one message.")
+
async def app(scope: Scope, receive: ASGIReceiveCallable, send: ASGISendCallable):
assert scope["type"] == "websocket"
assert "extensions" in scope and "websocket.http.response" in scope["extensions"]
@@ -951,6 +950,9 @@ async def test_server_reject_connection_with_invalid_msg(
async def test_server_reject_connection_with_missing_body(
ws_protocol_cls: WSProtocol, http_protocol_cls: HTTPProtocol, unused_tcp_port: int
):
+ if ws_protocol_cls.__name__ == "WebSocketsSansIOProtocol":
+ pytest.skip("WebSocketsSansIOProtocol sends both start and body messages in one message.")
+
async def app(scope: Scope, receive: ASGIReceiveCallable, send: ASGISendCallable):
assert scope["type"] == "websocket"
assert "extensions" in scope and "websocket.http.response" in scope["extensions"]
@@ -986,6 +988,8 @@ async def test_server_multiple_websocket_http_response_start_events(
The server should raise an exception if it sends multiple
websocket.http.response.start events.
"""
+ if ws_protocol_cls.__name__ == "WebSocketsSansIOProtocol":
+ pytest.skip("WebSocketsSansIOProtocol sends both start and body messages in one message.")
exception_message: str | None = None
async def app(scope: Scope, receive: ASGIReceiveCallable, send: ASGISendCallable):
diff --git a/uvicorn/config.py b/uvicorn/config.py
index 664d191..cbfeea6 100644
--- a/uvicorn/config.py
+++ b/uvicorn/config.py
@@ -25,7 +25,7 @@ from uvicorn.middleware.proxy_headers import ProxyHeadersMiddleware
from uvicorn.middleware.wsgi import WSGIMiddleware
HTTPProtocolType = Literal["auto", "h11", "httptools"]
-WSProtocolType = Literal["auto", "none", "websockets", "wsproto"]
+WSProtocolType = Literal["auto", "none", "websockets", "websockets-sansio", "wsproto"]
LifespanType = Literal["auto", "on", "off"]
LoopSetupType = Literal["none", "auto", "asyncio", "uvloop"]
InterfaceType = Literal["auto", "asgi3", "asgi2", "wsgi"]
@@ -47,6 +47,7 @@ WS_PROTOCOLS: dict[WSProtocolType, str | None] = {
"auto": "uvicorn.protocols.websockets.auto:AutoWebSocketsProtocol",
"none": None,
"websockets": "uvicorn.protocols.websockets.websockets_impl:WebSocketProtocol",
+ "websockets-sansio": "uvicorn.protocols.websockets.websockets_sansio_impl:WebSocketsSansIOProtocol",
"wsproto": "uvicorn.protocols.websockets.wsproto_impl:WSProtocol",
}
LIFESPAN: dict[LifespanType, str] = {
diff --git a/uvicorn/protocols/websockets/websockets_sansio_impl.py b/uvicorn/protocols/websockets/websockets_sansio_impl.py
new file mode 100644
index 0000000..994af07
--- /dev/null
+++ b/uvicorn/protocols/websockets/websockets_sansio_impl.py
@@ -0,0 +1,405 @@
+from __future__ import annotations
+
+import asyncio
+import logging
+from asyncio.transports import BaseTransport, Transport
+from http import HTTPStatus
+from typing import Any, Literal, cast
+from urllib.parse import unquote
+
+from websockets import InvalidState
+from websockets.extensions.permessage_deflate import ServerPerMessageDeflateFactory
+from websockets.frames import Frame, Opcode
+from websockets.http11 import Request
+from websockets.server import ServerProtocol
+
+from uvicorn._types import (
+ ASGIReceiveEvent,
+ ASGISendEvent,
+ WebSocketAcceptEvent,
+ WebSocketCloseEvent,
+ WebSocketDisconnectEvent,
+ WebSocketReceiveEvent,
+ WebSocketResponseBodyEvent,
+ WebSocketResponseStartEvent,
+ WebSocketScope,
+ WebSocketSendEvent,
+)
+from uvicorn.config import Config
+from uvicorn.logging import TRACE_LOG_LEVEL
+from uvicorn.protocols.utils import (
+ ClientDisconnected,
+ get_local_addr,
+ get_path_with_query_string,
+ get_remote_addr,
+ is_ssl,
+)
+from uvicorn.server import ServerState
+
+
+class WebSocketsSansIOProtocol(asyncio.Protocol):
+ def __init__(
+ self,
+ config: Config,
+ server_state: ServerState,
+ app_state: dict[str, Any],
+ _loop: asyncio.AbstractEventLoop | None = None,
+ ) -> None:
+ if not config.loaded:
+ config.load() # pragma: no cover
+
+ self.config = config
+ self.app = config.loaded_app
+ self.loop = _loop or asyncio.get_event_loop()
+ self.logger = logging.getLogger("uvicorn.error")
+ self.root_path = config.root_path
+ self.app_state = app_state
+
+ # Shared server state
+ self.connections = server_state.connections
+ self.tasks = server_state.tasks
+ self.default_headers = server_state.default_headers
+
+ # Connection state
+ self.transport: asyncio.Transport = None # type: ignore[assignment]
+ self.server: tuple[str, int] | None = None
+ self.client: tuple[str, int] | None = None
+ self.scheme: Literal["wss", "ws"] = None # type: ignore[assignment]
+
+ # WebSocket state
+ self.queue: asyncio.Queue[ASGIReceiveEvent] = asyncio.Queue()
+ self.handshake_initiated = False
+ self.handshake_complete = False
+ self.close_sent = False
+ self.initial_response: tuple[int, list[tuple[str, str]], bytes] | None = None
+
+ extensions = []
+ if self.config.ws_per_message_deflate:
+ extensions = [ServerPerMessageDeflateFactory()]
+ self.conn = ServerProtocol(
+ extensions=extensions,
+ max_size=self.config.ws_max_size,
+ logger=logging.getLogger("uvicorn.error"),
+ )
+
+ self.read_paused = False
+ self.writable = asyncio.Event()
+ self.writable.set()
+
+ # Buffers
+ self.bytes = b""
+
+ def connection_made(self, transport: BaseTransport) -> None:
+ """Called when a connection is made."""
+ transport = cast(Transport, transport)
+ self.connections.add(self)
+ self.transport = transport
+ self.server = get_local_addr(transport)
+ self.client = get_remote_addr(transport)
+ self.scheme = "wss" if is_ssl(transport) else "ws"
+
+ if self.logger.level <= TRACE_LOG_LEVEL:
+ prefix = "%s:%d - " % self.client if self.client else ""
+ self.logger.log(TRACE_LOG_LEVEL, "%sWebSocket connection made", prefix)
+
+ def connection_lost(self, exc: Exception | None) -> None:
+ code = 1005 if self.handshake_complete else 1006
+ self.queue.put_nowait({"type": "websocket.disconnect", "code": code})
+ self.connections.remove(self)
+
+ if self.logger.level <= TRACE_LOG_LEVEL:
+ prefix = "%s:%d - " % self.client if self.client else ""
+ self.logger.log(TRACE_LOG_LEVEL, "%sWebSocket connection lost", prefix)
+
+ self.handshake_complete = True
+ if exc is None:
+ self.transport.close()
+
+ def eof_received(self) -> None:
+ pass
+
+ def shutdown(self) -> None:
+ if self.handshake_complete:
+ self.queue.put_nowait({"type": "websocket.disconnect", "code": 1012})
+ self.conn.send_close(1012)
+ output = self.conn.data_to_send()
+ self.transport.write(b"".join(output))
+ else:
+ self.send_500_response()
+ self.transport.close()
+
+ def data_received(self, data: bytes) -> None:
+ self.conn.receive_data(data)
+ parser_exc = self.conn.parser_exc
+ if parser_exc is not None:
+ self.handle_parser_exception()
+ return
+ self.handle_events()
+
+ def handle_events(self) -> None:
+ for event in self.conn.events_received():
+ if isinstance(event, Request):
+ self.handle_connect(event)
+ if isinstance(event, Frame):
+ if event.opcode == Opcode.CONT:
+ self.handle_cont(event)
+ elif event.opcode == Opcode.TEXT:
+ self.handle_text(event)
+ elif event.opcode == Opcode.BINARY:
+ self.handle_bytes(event)
+ elif event.opcode == Opcode.PING:
+ self.handle_ping(event)
+ elif event.opcode == Opcode.CLOSE:
+ self.handle_close(event)
+
+ # Event handlers
+
+ def handle_connect(self, event: Request) -> None:
+ self.request = event
+ self.response = self.conn.accept(event)
+ self.handshake_initiated = True
+ if self.response.status_code != 101:
+ self.handshake_complete = True
+ self.close_sent = True
+ self.conn.send_response(self.response)
+ output = self.conn.data_to_send()
+ self.transport.write(b"".join(output))
+ self.transport.close()
+ return
+
+ headers = [
+ (key.encode("ascii"), value.encode("ascii", errors="surrogateescape"))
+ for key, value in event.headers.raw_items()
+ ]
+ raw_path, _, query_string = event.path.partition("?")
+ self.scope: WebSocketScope = {
+ "type": "websocket",
+ "asgi": {"version": self.config.asgi_version, "spec_version": "2.3"},
+ "http_version": "1.1",
+ "scheme": self.scheme,
+ "server": self.server,
+ "client": self.client,
+ "root_path": self.root_path,
+ "path": unquote(raw_path),
+ "raw_path": raw_path.encode("ascii"),
+ "query_string": query_string.encode("ascii"),
+ "headers": headers,
+ "subprotocols": event.headers.get_all("Sec-WebSocket-Protocol"),
+ "state": self.app_state.copy(),
+ "extensions": {"websocket.http.response": {}},
+ }
+ self.queue.put_nowait({"type": "websocket.connect"})
+ task = self.loop.create_task(self.run_asgi())
+ task.add_done_callback(self.on_task_complete)
+ self.tasks.add(task)
+
+ def handle_cont(self, event: Frame) -> None:
+ self.bytes += event.data
+ if event.fin:
+ self.send_receive_event_to_app()
+
+ def handle_text(self, event: Frame) -> None:
+ self.bytes = event.data
+ self.curr_msg_data_type: Literal["text", "bytes"] = "text"
+ if event.fin:
+ self.send_receive_event_to_app()
+
+ def handle_bytes(self, event: Frame) -> None:
+ self.bytes = event.data
+ self.curr_msg_data_type = "bytes"
+ if event.fin:
+ self.send_receive_event_to_app()
+
+ def send_receive_event_to_app(self) -> None:
+ data_type = self.curr_msg_data_type
+ msg: WebSocketReceiveEvent
+ if data_type == "text":
+ msg = {"type": "websocket.receive", data_type: self.bytes.decode()}
+ else:
+ msg = {"type": "websocket.receive", data_type: self.bytes}
+ self.queue.put_nowait(msg)
+ if not self.read_paused:
+ self.read_paused = True
+ self.transport.pause_reading()
+
+ def handle_ping(self, event: Frame) -> None:
+ output = self.conn.data_to_send()
+ self.transport.write(b"".join(output))
+
+ def handle_close(self, event: Frame) -> None:
+ if not self.close_sent and not self.transport.is_closing():
+ disconnect_event: WebSocketDisconnectEvent = {
+ "type": "websocket.disconnect",
+ "code": self.conn.close_rcvd.code, # type: ignore[union-attr]
+ "reason": self.conn.close_rcvd.reason, # type: ignore[union-attr]
+ }
+ self.queue.put_nowait(disconnect_event)
+ output = self.conn.data_to_send()
+ self.transport.write(b"".join(output))
+ self.transport.close()
+
+ def handle_parser_exception(self) -> None:
+ disconnect_event: WebSocketDisconnectEvent = {
+ "type": "websocket.disconnect",
+ "code": self.conn.close_sent.code, # type: ignore[union-attr]
+ "reason": self.conn.close_sent.reason, # type: ignore[union-attr]
+ }
+ self.queue.put_nowait(disconnect_event)
+ output = self.conn.data_to_send()
+ self.transport.write(b"".join(output))
+ self.close_sent = True
+ self.transport.close()
+
+ def on_task_complete(self, task: asyncio.Task[None]) -> None:
+ self.tasks.discard(task)
+
+ async def run_asgi(self) -> None:
+ try:
+ result = await self.app(self.scope, self.receive, self.send)
+ except ClientDisconnected:
+ self.transport.close()
+ except BaseException:
+ self.logger.exception("Exception in ASGI application\n")
+ self.send_500_response()
+ self.transport.close()
+ else:
+ if not self.handshake_complete:
+ msg = "ASGI callable returned without completing handshake."
+ self.logger.error(msg)
+ self.send_500_response()
+ self.transport.close()
+ elif result is not None:
+ msg = "ASGI callable should return None, but returned '%s'."
+ self.logger.error(msg, result)
+ self.transport.close()
+
+ def send_500_response(self) -> None:
+ if self.initial_response or self.handshake_complete:
+ return
+ response = self.conn.reject(500, "Internal Server Error")
+ self.conn.send_response(response)
+ output = self.conn.data_to_send()
+ self.transport.write(b"".join(output))
+
+ async def send(self, message: ASGISendEvent) -> None:
+ await self.writable.wait()
+
+ message_type = message["type"]
+
+ if not self.handshake_complete and self.initial_response is None:
+ if message_type == "websocket.accept":
+ message = cast(WebSocketAcceptEvent, message)
+ self.logger.info(
+ '%s - "WebSocket %s" [accepted]',
+ self.scope["client"],
+ get_path_with_query_string(self.scope),
+ )
+ headers = [
+ (name.decode("latin-1").lower(), value.decode("latin-1").lower())
+ for name, value in (self.default_headers + list(message.get("headers", [])))
+ ]
+ accepted_subprotocol = message.get("subprotocol")
+ if accepted_subprotocol:
+ headers.append(("Sec-WebSocket-Protocol", accepted_subprotocol))
+ self.response.headers.update(headers)
+
+ if not self.transport.is_closing():
+ self.handshake_complete = True
+ self.conn.send_response(self.response)
+ output = self.conn.data_to_send()
+ self.transport.write(b"".join(output))
+
+ elif message_type == "websocket.close":
+ message = cast(WebSocketCloseEvent, message)
+ self.queue.put_nowait({"type": "websocket.disconnect", "code": 1006})
+ self.logger.info(
+ '%s - "WebSocket %s" 403',
+ self.scope["client"],
+ get_path_with_query_string(self.scope),
+ )
+ response = self.conn.reject(HTTPStatus.FORBIDDEN, "")
+ self.conn.send_response(response)
+ output = self.conn.data_to_send()
+ self.close_sent = True
+ self.handshake_complete = True
+ self.transport.write(b"".join(output))
+ self.transport.close()
+ elif message_type == "websocket.http.response.start" and self.initial_response is None:
+ message = cast(WebSocketResponseStartEvent, message)
+ if not (100 <= message["status"] < 600):
+ raise RuntimeError("Invalid HTTP status code '%d' in response." % message["status"])
+ self.logger.info(
+ '%s - "WebSocket %s" %d',
+ self.scope["client"],
+ get_path_with_query_string(self.scope),
+ message["status"],
+ )
+ headers = [
+ (name.decode("latin-1"), value.decode("latin-1"))
+ for name, value in list(message.get("headers", []))
+ ]
+ self.initial_response = (message["status"], headers, b"")
+ else:
+ msg = (
+ "Expected ASGI message 'websocket.accept', 'websocket.close' "
+ "or 'websocket.http.response.start' "
+ "but got '%s'."
+ )
+ raise RuntimeError(msg % message_type)
+
+ elif not self.close_sent and self.initial_response is None:
+ try:
+ if message_type == "websocket.send":
+ message = cast(WebSocketSendEvent, message)
+ bytes_data = message.get("bytes")
+ text_data = message.get("text")
+ if text_data:
+ self.conn.send_text(text_data.encode())
+ elif bytes_data:
+ self.conn.send_binary(bytes_data)
+ output = self.conn.data_to_send()
+ self.transport.write(b"".join(output))
+
+ elif message_type == "websocket.close" and not self.transport.is_closing():
+ message = cast(WebSocketCloseEvent, message)
+ code = message.get("code", 1000)
+ reason = message.get("reason", "") or ""
+ self.queue.put_nowait({"type": "websocket.disconnect", "code": code})
+ self.conn.send_close(code, reason)
+ output = self.conn.data_to_send()
+ self.transport.write(b"".join(output))
+ self.close_sent = True
+ self.transport.close()
+ else:
+ msg = "Expected ASGI message 'websocket.send' or 'websocket.close'," " but got '%s'."
+ raise RuntimeError(msg % message_type)
+ except InvalidState:
+ raise ClientDisconnected()
+ elif self.initial_response is not None:
+ if message_type == "websocket.http.response.body":
+ message = cast(WebSocketResponseBodyEvent, message)
+ body = self.initial_response[2] + message["body"]
+ self.initial_response = self.initial_response[:2] + (body,)
+ if not message.get("more_body", False):
+ response = self.conn.reject(self.initial_response[0], body.decode())
+ response.headers.update(self.initial_response[1])
+ self.queue.put_nowait({"type": "websocket.disconnect", "code": 1006})
+ self.conn.send_response(response)
+ output = self.conn.data_to_send()
+ self.close_sent = True
+ self.transport.write(b"".join(output))
+ self.transport.close()
+ else:
+ msg = "Expected ASGI message 'websocket.http.response.body' " "but got '%s'."
+ raise RuntimeError(msg % message_type)
+
+ else:
+ msg = "Unexpected ASGI message '%s', after sending 'websocket.close'."
+ raise RuntimeError(msg % message_type)
+
+ async def receive(self) -> ASGIReceiveEvent:
+ message = await self.queue.get()
+ if self.read_paused and self.queue.empty():
+ self.read_paused = False
+ self.transport.resume_reading()
+ return message
diff --git a/uvicorn/server.py b/uvicorn/server.py
index cca2e85..50c5ed2 100644
--- a/uvicorn/server.py
+++ b/uvicorn/server.py
@@ -23,9 +23,10 @@ if TYPE_CHECKING:
from uvicorn.protocols.http.h11_impl import H11Protocol
from uvicorn.protocols.http.httptools_impl import HttpToolsProtocol
from uvicorn.protocols.websockets.websockets_impl import WebSocketProtocol
+ from uvicorn.protocols.websockets.websockets_sansio_impl import WebSocketsSansIOProtocol
from uvicorn.protocols.websockets.wsproto_impl import WSProtocol
- Protocols = Union[H11Protocol, HttpToolsProtocol, WSProtocol, WebSocketProtocol]
+ Protocols = Union[H11Protocol, HttpToolsProtocol, WSProtocol, WebSocketProtocol, WebSocketsSansIOProtocol]
HANDLED_SIGNALS = (
signal.SIGINT, # Unix signal 2. Sent by Ctrl+C.
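
The hunks above register the new sans-I/O backend in WS_PROTOCOLS and expose it through the existing --ws option; with this patched uvicorn installed, opting into it would look roughly like the following (the ASGI app path is a placeholder):

# Select the websockets-sansio implementation added by 2540_add-websocketssansioprotocol.patch.
# "example.app:application" is a hypothetical ASGI application path.
uvicorn example.app:application --ws websockets-sansio --host 127.0.0.1 --port 8000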

@ -0,0 +1,567 @@
diff --git a/requirements.txt b/requirements.txt
index e26e6b3..b16569f 100644
--- a/requirements.txt
+++ b/requirements.txt
@@ -7,7 +7,7 @@ h11 @ git+https://github.com/python-hyper/h11.git@master
# Explicit optionals
a2wsgi==1.10.7
wsproto==1.2.0
-websockets==13.1
+websockets==14.1
# Packaging
build==1.2.2.post1
diff --git a/tests/middleware/test_logging.py b/tests/middleware/test_logging.py
index 63d7daf..5aef174 100644
--- a/tests/middleware/test_logging.py
+++ b/tests/middleware/test_logging.py
@@ -8,8 +8,7 @@ import typing
import httpx
import pytest
-import websockets
-import websockets.client
+from websockets.asyncio.client import connect
from tests.utils import run_server
from uvicorn import Config
@@ -107,8 +106,8 @@ async def test_trace_logging_on_ws_protocol(
break
async def open_connection(url: str):
- async with websockets.client.connect(url) as websocket:
- return websocket.open
+ async with connect(url):
+ return True
config = Config(
app=websocket_app,
diff --git a/tests/middleware/test_proxy_headers.py b/tests/middleware/test_proxy_headers.py
index d300c45..4b5f195 100644
--- a/tests/middleware/test_proxy_headers.py
+++ b/tests/middleware/test_proxy_headers.py
@@ -5,7 +5,7 @@ from typing import TYPE_CHECKING
import httpx
import httpx._transports.asgi
import pytest
-import websockets.client
+from websockets.asyncio.client import connect
from tests.response import Response
from tests.utils import run_server
@@ -479,7 +479,7 @@ async def test_proxy_headers_websocket_x_forwarded_proto(
async with run_server(config):
url = f"ws://127.0.0.1:{unused_tcp_port}"
headers = {X_FORWARDED_FOR: "1.2.3.4", X_FORWARDED_PROTO: forwarded_proto}
- async with websockets.client.connect(url, extra_headers=headers) as websocket:
+ async with connect(url, additional_headers=headers) as websocket:
data = await websocket.recv()
assert data == expected
diff --git a/tests/protocols/test_websocket.py b/tests/protocols/test_websocket.py
index e728544..b9035ec 100644
--- a/tests/protocols/test_websocket.py
+++ b/tests/protocols/test_websocket.py
@@ -12,6 +12,8 @@ import websockets.asyncio.client
import websockets.client
import websockets.exceptions
from typing_extensions import TypedDict
+from websockets.asyncio.client import ClientConnection, connect
+from websockets.exceptions import ConnectionClosed, ConnectionClosedError, InvalidHandshake, InvalidStatus
from websockets.extensions.permessage_deflate import ClientPerMessageDeflateFactory
from websockets.typing import Subprotocol
@@ -130,8 +132,8 @@ async def test_accept_connection(ws_protocol_cls: WSProtocol, http_protocol_cls:
await self.send({"type": "websocket.accept"})
async def open_connection(url: str):
- async with websockets.client.connect(url) as websocket:
- return websocket.open
+ async with connect(url):
+ return True
config = Config(app=App, ws=ws_protocol_cls, http=http_protocol_cls, lifespan="off", port=unused_tcp_port)
async with run_server(config):
@@ -146,7 +148,7 @@ async def test_shutdown(ws_protocol_cls: WSProtocol, http_protocol_cls: HTTPProt
config = Config(app=App, ws=ws_protocol_cls, http=http_protocol_cls, lifespan="off", port=unused_tcp_port)
async with run_server(config) as server:
- async with websockets.client.connect(f"ws://127.0.0.1:{unused_tcp_port}"):
+ async with connect(f"ws://127.0.0.1:{unused_tcp_port}"):
# Attempt shutdown while connection is still open
await server.shutdown()
@@ -160,8 +162,8 @@ async def test_supports_permessage_deflate_extension(
async def open_connection(url: str):
extension_factories = [ClientPerMessageDeflateFactory()]
- async with websockets.client.connect(url, extensions=extension_factories) as websocket:
- return [extension.name for extension in websocket.extensions]
+ async with connect(url, extensions=extension_factories) as websocket:
+ return [extension.name for extension in websocket.protocol.extensions]
config = Config(app=App, ws=ws_protocol_cls, http=http_protocol_cls, lifespan="off", port=unused_tcp_port)
async with run_server(config):
@@ -180,8 +182,8 @@ async def test_can_disable_permessage_deflate_extension(
# enable per-message deflate on the client, so that we can check the server
# won't support it when it's disabled.
extension_factories = [ClientPerMessageDeflateFactory()]
- async with websockets.client.connect(url, extensions=extension_factories) as websocket:
- return [extension.name for extension in websocket.extensions]
+ async with connect(url, extensions=extension_factories) as websocket:
+ return [extension.name for extension in websocket.protocol.extensions]
config = Config(
app=App,
@@ -203,8 +205,8 @@ async def test_close_connection(ws_protocol_cls: WSProtocol, http_protocol_cls:
async def open_connection(url: str):
try:
- await websockets.client.connect(url)
- except websockets.exceptions.InvalidHandshake:
+ await connect(url)
+ except InvalidHandshake:
return False
return True # pragma: no cover
@@ -224,8 +226,8 @@ async def test_headers(ws_protocol_cls: WSProtocol, http_protocol_cls: HTTPProto
await self.send({"type": "websocket.accept"})
async def open_connection(url: str):
- async with websockets.client.connect(url, extra_headers=[("username", "abraão")]) as websocket:
- return websocket.open
+ async with connect(url, additional_headers=[("username", "abraão")]):
+ return True
config = Config(app=App, ws=ws_protocol_cls, http=http_protocol_cls, lifespan="off", port=unused_tcp_port)
async with run_server(config):
@@ -239,8 +241,9 @@ async def test_extra_headers(ws_protocol_cls: WSProtocol, http_protocol_cls: HTT
await self.send({"type": "websocket.accept", "headers": [(b"extra", b"header")]})
async def open_connection(url: str):
- async with websockets.client.connect(url) as websocket:
- return websocket.response_headers
+ async with connect(url) as websocket:
+ assert websocket.response
+ return websocket.response.headers
config = Config(app=App, ws=ws_protocol_cls, http=http_protocol_cls, lifespan="off", port=unused_tcp_port)
async with run_server(config):
@@ -258,8 +261,8 @@ async def test_path_and_raw_path(ws_protocol_cls: WSProtocol, http_protocol_cls:
await self.send({"type": "websocket.accept"})
async def open_connection(url: str):
- async with websockets.client.connect(url) as websocket:
- return websocket.open
+ async with connect(url):
+ return True
config = Config(app=App, ws=ws_protocol_cls, http=http_protocol_cls, lifespan="off", port=unused_tcp_port)
async with run_server(config):
@@ -276,7 +279,7 @@ async def test_send_text_data_to_client(
await self.send({"type": "websocket.send", "text": "123"})
async def get_data(url: str):
- async with websockets.client.connect(url) as websocket:
+ async with connect(url) as websocket:
return await websocket.recv()
config = Config(app=App, ws=ws_protocol_cls, http=http_protocol_cls, lifespan="off", port=unused_tcp_port)
@@ -294,7 +297,7 @@ async def test_send_binary_data_to_client(
await self.send({"type": "websocket.send", "bytes": b"123"})
async def get_data(url: str):
- async with websockets.client.connect(url) as websocket:
+ async with connect(url) as websocket:
return await websocket.recv()
config = Config(app=App, ws=ws_protocol_cls, http=http_protocol_cls, lifespan="off", port=unused_tcp_port)
@@ -313,7 +316,7 @@ async def test_send_and_close_connection(
await self.send({"type": "websocket.close"})
async def get_data(url: str):
- async with websockets.client.connect(url) as websocket:
+ async with connect(url) as websocket:
data = await websocket.recv()
is_open = True
try:
@@ -342,7 +345,7 @@ async def test_send_text_data_to_server(
await self.send({"type": "websocket.send", "text": _text})
async def send_text(url: str):
- async with websockets.client.connect(url) as websocket:
+ async with connect(url) as websocket:
await websocket.send("abc")
return await websocket.recv()
@@ -365,7 +368,7 @@ async def test_send_binary_data_to_server(
await self.send({"type": "websocket.send", "bytes": _bytes})
async def send_text(url: str):
- async with websockets.client.connect(url) as websocket:
+ async with connect(url) as websocket:
await websocket.send(b"abc")
return await websocket.recv()
@@ -387,7 +390,7 @@ async def test_send_after_protocol_close(
await self.send({"type": "websocket.send", "text": "123"})
async def get_data(url: str):
- async with websockets.client.connect(url) as websocket:
+ async with connect(url) as websocket:
data = await websocket.recv()
is_open = True
try:
@@ -407,14 +410,14 @@ async def test_missing_handshake(ws_protocol_cls: WSProtocol, http_protocol_cls:
async def app(scope: Scope, receive: ASGIReceiveCallable, send: ASGISendCallable):
pass
- async def connect(url: str):
- await websockets.client.connect(url)
+ async def open_connection(url: str):
+ await connect(url)
config = Config(app=app, ws=ws_protocol_cls, http=http_protocol_cls, lifespan="off", port=unused_tcp_port)
async with run_server(config):
- with pytest.raises(websockets.exceptions.InvalidStatusCode) as exc_info:
- await connect(f"ws://127.0.0.1:{unused_tcp_port}")
- assert exc_info.value.status_code == 500
+ with pytest.raises(InvalidStatus) as exc_info:
+ await open_connection(f"ws://127.0.0.1:{unused_tcp_port}")
+ assert exc_info.value.response.status_code == 500
async def test_send_before_handshake(
@@ -423,14 +426,14 @@ async def test_send_before_handshake(
async def app(scope: Scope, receive: ASGIReceiveCallable, send: ASGISendCallable):
await send({"type": "websocket.send", "text": "123"})
- async def connect(url: str):
- await websockets.client.connect(url)
+ async def open_connection(url: str):
+ await connect(url)
config = Config(app=app, ws=ws_protocol_cls, http=http_protocol_cls, lifespan="off", port=unused_tcp_port)
async with run_server(config):
- with pytest.raises(websockets.exceptions.InvalidStatusCode) as exc_info:
- await connect(f"ws://127.0.0.1:{unused_tcp_port}")
- assert exc_info.value.status_code == 500
+ with pytest.raises(InvalidStatus) as exc_info:
+ await open_connection(f"ws://127.0.0.1:{unused_tcp_port}")
+ assert exc_info.value.response.status_code == 500
async def test_duplicate_handshake(ws_protocol_cls: WSProtocol, http_protocol_cls: HTTPProtocol, unused_tcp_port: int):
@@ -440,10 +443,10 @@ async def test_duplicate_handshake(ws_protocol_cls: WSProtocol, http_protocol_cl
config = Config(app=app, ws=ws_protocol_cls, http=http_protocol_cls, lifespan="off", port=unused_tcp_port)
async with run_server(config):
- async with websockets.client.connect(f"ws://127.0.0.1:{unused_tcp_port}") as websocket:
- with pytest.raises(websockets.exceptions.ConnectionClosed):
+ async with connect(f"ws://127.0.0.1:{unused_tcp_port}") as websocket:
+ with pytest.raises(ConnectionClosed):
_ = await websocket.recv()
- assert websocket.close_code == 1006
+ assert websocket.protocol.close_code == 1006
async def test_asgi_return_value(ws_protocol_cls: WSProtocol, http_protocol_cls: HTTPProtocol, unused_tcp_port: int):
@@ -458,10 +461,10 @@ async def test_asgi_return_value(ws_protocol_cls: WSProtocol, http_protocol_cls:
config = Config(app=app, ws=ws_protocol_cls, http=http_protocol_cls, lifespan="off", port=unused_tcp_port)
async with run_server(config):
- async with websockets.client.connect(f"ws://127.0.0.1:{unused_tcp_port}") as websocket:
- with pytest.raises(websockets.exceptions.ConnectionClosed):
+ async with connect(f"ws://127.0.0.1:{unused_tcp_port}") as websocket:
+ with pytest.raises(ConnectionClosed):
_ = await websocket.recv()
- assert websocket.close_code == 1006
+ assert websocket.protocol.close_code == 1006
@pytest.mark.parametrize("code", [None, 1000, 1001])
@@ -493,13 +496,13 @@ async def test_app_close(
config = Config(app=app, ws=ws_protocol_cls, http=http_protocol_cls, lifespan="off", port=unused_tcp_port)
async with run_server(config):
- async with websockets.client.connect(f"ws://127.0.0.1:{unused_tcp_port}") as websocket:
+ async with connect(f"ws://127.0.0.1:{unused_tcp_port}") as websocket:
await websocket.ping()
await websocket.send("abc")
- with pytest.raises(websockets.exceptions.ConnectionClosed):
+ with pytest.raises(ConnectionClosed):
await websocket.recv()
- assert websocket.close_code == (code or 1000)
- assert websocket.close_reason == (reason or "")
+ assert websocket.protocol.close_code == (code or 1000)
+ assert websocket.protocol.close_reason == (reason or "")
async def test_client_close(ws_protocol_cls: WSProtocol, http_protocol_cls: HTTPProtocol, unused_tcp_port: int):
@@ -518,7 +521,7 @@ async def test_client_close(ws_protocol_cls: WSProtocol, http_protocol_cls: HTTP
break
async def websocket_session(url: str):
- async with websockets.client.connect(url) as websocket:
+ async with connect(url) as websocket:
await websocket.ping()
await websocket.send("abc")
await websocket.close(code=1001, reason="custom reason")
@@ -555,7 +558,7 @@ async def test_client_connection_lost(
port=unused_tcp_port,
)
async with run_server(config):
- async with websockets.client.connect(f"ws://127.0.0.1:{unused_tcp_port}") as websocket:
+ async with connect(f"ws://127.0.0.1:{unused_tcp_port}") as websocket:
websocket.transport.close()
await asyncio.sleep(0.1)
got_disconnect_event_before_shutdown = got_disconnect_event
@@ -583,7 +586,7 @@ async def test_client_connection_lost_on_send(
config = Config(app=app, ws=ws_protocol_cls, http=http_protocol_cls, lifespan="off", port=unused_tcp_port)
async with run_server(config):
url = f"ws://127.0.0.1:{unused_tcp_port}"
- async with websockets.client.connect(url):
+ async with connect(url):
await asyncio.sleep(0.1)
disconnect.set()
@@ -642,11 +645,11 @@ async def test_send_close_on_server_shutdown(
disconnect_message = message
break
- websocket: websockets.client.WebSocketClientProtocol | None = None
+ websocket: ClientConnection | None = None
async def websocket_session(uri: str):
nonlocal websocket
- async with websockets.client.connect(uri) as ws_connection:
+ async with connect(uri) as ws_connection:
websocket = ws_connection
await server_shutdown_event.wait()
@@ -676,9 +679,7 @@ async def test_subprotocols(
await self.send({"type": "websocket.accept", "subprotocol": subprotocol})
async def get_subprotocol(url: str):
- async with websockets.client.connect(
- url, subprotocols=[Subprotocol("proto1"), Subprotocol("proto2")]
- ) as websocket:
+ async with connect(url, subprotocols=[Subprotocol("proto1"), Subprotocol("proto2")]) as websocket:
return websocket.subprotocol
config = Config(app=App, ws=ws_protocol_cls, http=http_protocol_cls, lifespan="off", port=unused_tcp_port)
@@ -688,7 +689,7 @@ async def test_subprotocols(
MAX_WS_BYTES = 1024 * 1024 * 16
-MAX_WS_BYTES_PLUS1 = MAX_WS_BYTES + 1
+MAX_WS_BYTES_PLUS1 = MAX_WS_BYTES + 10
@pytest.mark.parametrize(
@@ -731,15 +732,15 @@ async def test_send_binary_data_to_server_bigger_than_default_on_websockets(
port=unused_tcp_port,
)
async with run_server(config):
- async with websockets.client.connect(f"ws://127.0.0.1:{unused_tcp_port}", max_size=client_size_sent) as ws:
+ async with connect(f"ws://127.0.0.1:{unused_tcp_port}", max_size=client_size_sent) as ws:
await ws.send(b"\x01" * client_size_sent)
if expected_result == 0:
data = await ws.recv()
assert data == b"\x01" * client_size_sent
else:
- with pytest.raises(websockets.exceptions.ConnectionClosedError):
+ with pytest.raises(ConnectionClosedError):
await ws.recv()
- assert ws.close_code == expected_result
+ assert ws.protocol.close_code == expected_result
async def test_server_reject_connection(
@@ -764,10 +765,10 @@ async def test_server_reject_connection(
disconnected_message = await receive()
async def websocket_session(url: str):
- with pytest.raises(websockets.exceptions.InvalidStatusCode) as exc_info:
- async with websockets.client.connect(url):
+ with pytest.raises(InvalidStatus) as exc_info:
+ async with connect(url):
pass # pragma: no cover
- assert exc_info.value.status_code == 403
+ assert exc_info.value.response.status_code == 403
config = Config(app=app, ws=ws_protocol_cls, http=http_protocol_cls, lifespan="off", port=unused_tcp_port)
async with run_server(config):
@@ -937,10 +938,10 @@ async def test_server_reject_connection_with_invalid_msg(
await send(message)
async def websocket_session(url: str):
- with pytest.raises(websockets.exceptions.InvalidStatusCode) as exc_info:
- async with websockets.client.connect(url):
+ with pytest.raises(InvalidStatus) as exc_info:
+ async with connect(url):
pass # pragma: no cover
- assert exc_info.value.status_code == 404
+ assert exc_info.value.response.status_code == 404
config = Config(app=app, ws=ws_protocol_cls, http=http_protocol_cls, lifespan="off", port=unused_tcp_port)
async with run_server(config):
@@ -971,10 +972,10 @@ async def test_server_reject_connection_with_missing_body(
# no further message
async def websocket_session(url: str):
- with pytest.raises(websockets.exceptions.InvalidStatusCode) as exc_info:
- async with websockets.client.connect(url):
+ with pytest.raises(InvalidStatus) as exc_info:
+ async with connect(url):
pass # pragma: no cover
- assert exc_info.value.status_code == 404
+ assert exc_info.value.response.status_code == 404
config = Config(app=app, ws=ws_protocol_cls, http=http_protocol_cls, lifespan="off", port=unused_tcp_port)
async with run_server(config):
@@ -1014,17 +1015,17 @@ async def test_server_multiple_websocket_http_response_start_events(
exception_message = str(exc)
async def websocket_session(url: str):
- with pytest.raises(websockets.exceptions.InvalidStatusCode) as exc_info:
- async with websockets.client.connect(url):
+ with pytest.raises(InvalidStatus) as exc_info:
+ async with connect(url):
pass # pragma: no cover
- assert exc_info.value.status_code == 404
+ assert exc_info.value.response.status_code == 404
config = Config(app=app, ws=ws_protocol_cls, http=http_protocol_cls, lifespan="off", port=unused_tcp_port)
async with run_server(config):
await websocket_session(f"ws://127.0.0.1:{unused_tcp_port}")
assert exception_message == (
- "Expected ASGI message 'websocket.http.response.body' but got " "'websocket.http.response.start'."
+ "Expected ASGI message 'websocket.http.response.body' but got 'websocket.http.response.start'."
)
@@ -1053,7 +1054,7 @@ async def test_server_can_read_messages_in_buffer_after_close(
config = Config(app=App, ws=ws_protocol_cls, http=http_protocol_cls, lifespan="off", port=unused_tcp_port)
async with run_server(config):
- async with websockets.client.connect(f"ws://127.0.0.1:{unused_tcp_port}") as websocket:
+ async with connect(f"ws://127.0.0.1:{unused_tcp_port}") as websocket:
await websocket.send(b"abc")
await websocket.send(b"abc")
await websocket.send(b"abc")
@@ -1070,8 +1071,9 @@ async def test_default_server_headers(
await self.send({"type": "websocket.accept"})
async def open_connection(url: str):
- async with websockets.client.connect(url) as websocket:
- return websocket.response_headers
+ async with connect(url) as websocket:
+ assert websocket.response
+ return websocket.response.headers
config = Config(app=App, ws=ws_protocol_cls, http=http_protocol_cls, lifespan="off", port=unused_tcp_port)
async with run_server(config):
@@ -1085,8 +1087,9 @@ async def test_no_server_headers(ws_protocol_cls: WSProtocol, http_protocol_cls:
await self.send({"type": "websocket.accept"})
async def open_connection(url: str):
- async with websockets.client.connect(url) as websocket:
- return websocket.response_headers
+ async with connect(url) as websocket:
+ assert websocket.response
+ return websocket.response.headers
config = Config(
app=App,
@@ -1108,8 +1111,9 @@ async def test_no_date_header_on_wsproto(http_protocol_cls: HTTPProtocol, unused
await self.send({"type": "websocket.accept"})
async def open_connection(url: str):
- async with websockets.client.connect(url) as websocket:
- return websocket.response_headers
+ async with connect(url) as websocket:
+ assert websocket.response
+ return websocket.response.headers
config = Config(
app=App,
@@ -1140,8 +1144,9 @@ async def test_multiple_server_header(
)
async def open_connection(url: str):
- async with websockets.client.connect(url) as websocket:
- return websocket.response_headers
+ async with connect(url) as websocket:
+ assert websocket.response
+ return websocket.response.headers
config = Config(app=App, ws=ws_protocol_cls, http=http_protocol_cls, lifespan="off", port=unused_tcp_port)
async with run_server(config):
@@ -1176,8 +1181,8 @@ async def test_lifespan_state(ws_protocol_cls: WSProtocol, http_protocol_cls: HT
await self.send({"type": "websocket.accept"})
async def open_connection(url: str):
- async with websockets.client.connect(url) as websocket:
- return websocket.open
+ async with connect(url):
+ return True
async def app_wrapper(scope: Scope, receive: ASGIReceiveCallable, send: ASGISendCallable):
if scope["type"] == "lifespan":
diff --git a/uvicorn/protocols/websockets/websockets_impl.py b/uvicorn/protocols/websockets/websockets_impl.py
index cd6c54f..685d6b6 100644
--- a/uvicorn/protocols/websockets/websockets_impl.py
+++ b/uvicorn/protocols/websockets/websockets_impl.py
@@ -13,8 +13,7 @@ from websockets.datastructures import Headers
from websockets.exceptions import ConnectionClosed
from websockets.extensions.base import ServerExtensionFactory
from websockets.extensions.permessage_deflate import ServerPerMessageDeflateFactory
-from websockets.legacy.server import HTTPResponse
-from websockets.server import WebSocketServerProtocol
+from websockets.legacy.server import HTTPResponse, WebSocketServerProtocol
from websockets.typing import Subprotocol
from uvicorn._types import (
diff --git a/uvicorn/protocols/websockets/wsproto_impl.py b/uvicorn/protocols/websockets/wsproto_impl.py
index 828afe5..5d84bff 100644
--- a/uvicorn/protocols/websockets/wsproto_impl.py
+++ b/uvicorn/protocols/websockets/wsproto_impl.py
@@ -149,12 +149,13 @@ class WSProtocol(asyncio.Protocol):
self.writable.set() # pragma: full coverage
def shutdown(self) -> None:
- if self.handshake_complete:
- self.queue.put_nowait({"type": "websocket.disconnect", "code": 1012})
- output = self.conn.send(wsproto.events.CloseConnection(code=1012))
- self.transport.write(output)
- else:
- self.send_500_response()
+ if not self.response_started:
+ if self.handshake_complete:
+ self.queue.put_nowait({"type": "websocket.disconnect", "code": 1012})
+ output = self.conn.send(wsproto.events.CloseConnection(code=1012))
+ self.transport.write(output)
+ else:
+ self.send_500_response()
self.transport.close()
def on_task_complete(self, task: asyncio.Task[None]) -> None:
@@ -221,13 +222,15 @@ class WSProtocol(asyncio.Protocol):
def send_500_response(self) -> None:
if self.response_started or self.handshake_complete:
return # we cannot send responses anymore
+ reject_data = b"Internal Server Error"
headers: list[tuple[bytes, bytes]] = [
(b"content-type", b"text/plain; charset=utf-8"),
+ (b"content-length", str(len(reject_data)).encode()),
(b"connection", b"close"),
(b"content-length", b"21"),
]
output = self.conn.send(wsproto.events.RejectConnection(status_code=500, headers=headers, has_body=True))
- output += self.conn.send(wsproto.events.RejectData(data=b"Internal Server Error"))
+ output += self.conn.send(wsproto.events.RejectData(data=reject_data))
self.transport.write(output)
async def run_asgi(self) -> None:

ilot/uvicorn/APKBUILD (new file)

@ -0,0 +1,59 @@
maintainer="Michał Polański <michal@polanski.me>"
pkgname=uvicorn
pkgver=0.34.0
pkgrel=0
pkgdesc="Lightning-fast ASGI server"
url="https://www.uvicorn.org/"
license="BSD-3-Clause"
# disable due to lack of support for websockets 14
# https://gitlab.alpinelinux.org/alpine/aports/-/issues/16646
arch="noarch"
depends="py3-click py3-h11"
makedepends="py3-gpep517 py3-hatchling"
checkdepends="
py3-a2wsgi
py3-dotenv
py3-httptools
py3-httpx
py3-pytest
py3-pytest-mock
py3-trustme
py3-typing-extensions
py3-watchfiles
py3-websockets
py3-wsproto
py3-yaml
"
subpackages="$pkgname-pyc"
source="https://github.com/encode/uvicorn/archive/$pkgver/uvicorn-$pkgver.tar.gz
test_multiprocess.patch
2540_add-websocketssansioprotocol.patch
2541_bump-wesockets-on-requirements.patch
fix-test-wsgi.patch
"
build() {
gpep517 build-wheel \
--wheel-dir .dist \
--output-fd 3 3>&1 >&2
}
check() {
python3 -m venv --clear --without-pip --system-site-packages .testenv
.testenv/bin/python3 -m installer .dist/*.whl
.testenv/bin/python3 -m pytest \
-k "not test_close_connection_with_multiple_requests" # a known issue
}
package() {
python3 -m installer -d "$pkgdir" \
.dist/uvicorn-$pkgver-py3-none-any.whl
}
sha512sums="
260782e385a2934049da8c474750958826afe1bfe23b38fe2f6420f355af7a537563f8fe6ac3830814c7469203703d10f4f9f3d6e53e79113bfd2fd34f7a7c72 uvicorn-0.34.0.tar.gz
cfad91dd84f8974362f52d754d7a29f09d07927a46acaa0eb490b6115a5729d84d6df94fead10ccd4cce7f5ea376f1348b0f59daede661dd8373a3851c313c46 test_multiprocess.patch
858e9a7baaf1c12e076aecd81aaaf622b35a59dcaabea4ee1bfc4cda704c9fe271b1cc616a5910d845393717e4989cecb3b04be249cb5d0df1001ec5224c293f 2540_add-websocketssansioprotocol.patch
f8a8c190981b9070232ea985880685bc801947cc7f673d59abf73d3e68bc2e13515ad200232a1de2af0808bc85da48a341f57d47caf87bcc190bfdc3c45718e0 2541_bump-wesockets-on-requirements.patch
379963f9ccbda013e4a0bc3441eee70a581c91f60206aedc15df6a8737950824b7cb8d867774fc415763449bb3e0bba66601e8551101bfc1741098acd035f0cc fix-test-wsgi.patch
"

@ -0,0 +1,13 @@
diff --git a/tests/middleware/test_wsgi.py.orig b/tests/middleware/test_wsgi.py
index 6003f27..2750487 100644
--- a/tests/middleware/test_wsgi.py.orig
+++ b/tests/middleware/test_wsgi.py
@@ -73,7 +73,7 @@ async def test_wsgi_post(wsgi_middleware: Callable) -> None:
async with httpx.AsyncClient(transport=transport, base_url="http://testserver") as client:
response = await client.post("/", json={"example": 123})
assert response.status_code == 200
- assert response.text == '{"example":123}'
+ assert response.text == '{"example": 123}'
@pytest.mark.anyio

@ -0,0 +1,14 @@
Wait a bit longer, otherwise the workers might
not have time to finish restarting.
--- a/tests/supervisors/test_multiprocess.py
+++ b/tests/supervisors/test_multiprocess.py
@@ -132,7 +132,7 @@ def test_multiprocess_sighup() -> None:
time.sleep(1)
pids = [p.pid for p in supervisor.processes]
supervisor.signal_queue.append(signal.SIGHUP)
- time.sleep(1)
+ time.sleep(3)
assert pids != [p.pid for p in supervisor.processes]
supervisor.signal_queue.append(signal.SIGINT)
supervisor.join_all()

@ -1,8 +1,8 @@
# Maintainer: Antoine Martin (ayakael) <dev@ayakael.net>
# Contributor: Antoine Martin (ayakael) <dev@ayakael.net>
pkgname=wikijs
pkgver=2.5.303
pkgrel=1
pkgver=2.5.307
pkgrel=0
pkgdesc="Wiki.js | A modern, lightweight and powerful wiki app built on Node.js"
license="AGPL-3.0"
arch="!armv7 x86_64"
@ -49,11 +49,14 @@ package() {
install -Dm644 "$builddir"/package.json -t "$pkgdir"/usr/lib/bundles/wikijs
cp -aR "$builddir"/assets "$builddir"/server "$builddir"/node_modules "$pkgdir"/usr/lib/bundles/wikijs
# remove prebuilts
rm -Rf "$pkgdir"/usr/lib/bundles/wikijs/node_modules/*/prebuilds
mkdir -p "$pkgdir"/var/lib/wikijs
chown 5494:5494 "$pkgdir"/var/lib/wikijs
}
sha512sums="
a463d79ad0d8ff15dbe568b839094d697c6de0b2e991b77a4944e2a82f9789de6840e504a4673e4e0900d61596e880ca276008de86dac4f05f5823dc0427d2fc wikijs-2.5.303.tar.gz
8bf22ae87a9e3b8dd6f7114d0cf59913ad2cb05a2ed0e9bb7ac302b546d71f34a14de64cbe6e0f8b887d5df65e9d2b065ca18fe4493d3939895b8fa7076dd567 wikijs-2.5.307.tar.gz
355131ee5617348b82681cb8543c784eea59689990a268ecd3b77d44fe9abcca9c86fb8b047f0a8faeba079c650faa7790c5dd65418d313cd7561f38bb590c03 wikijs.initd
07b536c20e370d2a926038165f0e953283259c213a80a8648419565f5359ab05f528ac310e81606914013da212270df6feddb22e514cbcb2464c8274c956e4af config.sample.yml.patch
"