Merge branch 'release-v1.82' into matrix-org-hotfixes

commit 85fc42c546

@@ -14,7 +14,7 @@ jobs:
       # There's a 'download artifact' action, but it hasn't been updated for the workflow_run action
       # (https://github.com/actions/download-artifact/issues/60) so instead we get this mess:
       - name: 📥 Download artifact
-        uses: dawidd6/action-download-artifact@7132ab516fba5f602fafae6fdd4822afa10db76f # v2.26.1
+        uses: dawidd6/action-download-artifact@246dbf436b23d7c49e21a7ab8204ca9ecd1fe615 # v2.27.0
         with:
           workflow: docs-pr.yaml
           run_id: ${{ github.event.workflow_run.id }}
CHANGES.md (69 changes)
@@ -1,3 +1,72 @@
+Synapse 1.82.0rc1 (2023-04-18)
+==============================
+
+Features
+--------
+
+- Allow loading the `/directory/room/{roomAlias}` endpoint on workers. ([\#15333](https://github.com/matrix-org/synapse/issues/15333))
+- Add some validation to `instance_map` configuration loading. ([\#15431](https://github.com/matrix-org/synapse/issues/15431))
+- Allow loading the `/capabilities` endpoint on workers. ([\#15436](https://github.com/matrix-org/synapse/issues/15436))
+
+
+Bugfixes
+--------
+
+- Delete server-side backup keys when deactivating an account. ([\#15181](https://github.com/matrix-org/synapse/issues/15181))
+- Fix and document untold assumption that `on_logged_out` module hooks will be called before the deletion of pushers. ([\#15410](https://github.com/matrix-org/synapse/issues/15410))
+- Improve robustness when handling a perspective key response by deduplicating received server keys. ([\#15423](https://github.com/matrix-org/synapse/issues/15423))
+- Synapse now correctly fails to start if the config option `app_service_config_files` is not a list. ([\#15425](https://github.com/matrix-org/synapse/issues/15425))
+- Disable loading `RefreshTokenServlet` (`/_matrix/client/(r0|v3|unstable)/refresh`) on workers. ([\#15428](https://github.com/matrix-org/synapse/issues/15428))
+
+
+Improved Documentation
+----------------------
+
+- Note that the `delete_stale_devices_after` background job always runs on the main process. ([\#15452](https://github.com/matrix-org/synapse/issues/15452))
+
+
+Deprecations and Removals
+-------------------------
+
+- Remove the broken, unspecced registration fallback. Note that the *login* fallback is unaffected by this change. ([\#15405](https://github.com/matrix-org/synapse/issues/15405))
+
+
+Internal Changes
+----------------
+
+- Bump black from 23.1.0 to 23.3.0. ([\#15372](https://github.com/matrix-org/synapse/issues/15372))
+- Bump pyopenssl from 23.1.0 to 23.1.1. ([\#15373](https://github.com/matrix-org/synapse/issues/15373))
+- Bump types-psycopg2 from 2.9.21.8 to 2.9.21.9. ([\#15374](https://github.com/matrix-org/synapse/issues/15374))
+- Bump types-netaddr from 0.8.0.6 to 0.8.0.7. ([\#15375](https://github.com/matrix-org/synapse/issues/15375))
+- Bump types-opentracing from 2.4.10.3 to 2.4.10.4. ([\#15376](https://github.com/matrix-org/synapse/issues/15376))
+- Bump dawidd6/action-download-artifact from 2.26.0 to 2.26.1. ([\#15404](https://github.com/matrix-org/synapse/issues/15404))
+- Bump parameterized from 0.8.1 to 0.9.0. ([\#15412](https://github.com/matrix-org/synapse/issues/15412))
+- Bump types-pillow from 9.4.0.17 to 9.4.0.19. ([\#15413](https://github.com/matrix-org/synapse/issues/15413))
+- Bump sentry-sdk from 1.17.0 to 1.19.1. ([\#15414](https://github.com/matrix-org/synapse/issues/15414))
+- Bump immutabledict from 2.2.3 to 2.2.4. ([\#15415](https://github.com/matrix-org/synapse/issues/15415))
+- Bump dawidd6/action-download-artifact from 2.26.1 to 2.27.0. ([\#15441](https://github.com/matrix-org/synapse/issues/15441))
+- Bump serde_json from 1.0.95 to 1.0.96. ([\#15442](https://github.com/matrix-org/synapse/issues/15442))
+- Bump serde from 1.0.159 to 1.0.160. ([\#15443](https://github.com/matrix-org/synapse/issues/15443))
+- Bump pillow from 9.4.0 to 9.5.0. ([\#15444](https://github.com/matrix-org/synapse/issues/15444))
+- Bump furo from 2023.3.23 to 2023.3.27. ([\#15445](https://github.com/matrix-org/synapse/issues/15445))
+- Bump types-pyopenssl from 23.1.0.0 to 23.1.0.2. ([\#15446](https://github.com/matrix-org/synapse/issues/15446))
+- Bump mypy from 1.0.0 to 1.0.1. ([\#15447](https://github.com/matrix-org/synapse/issues/15447))
+- Bump psycopg2 from 2.9.5 to 2.9.6. ([\#15448](https://github.com/matrix-org/synapse/issues/15448))
+- Improve DB performance of clearing out old data from `stream_ordering_to_exterm`. ([\#15382](https://github.com/matrix-org/synapse/issues/15382), [\#15429](https://github.com/matrix-org/synapse/issues/15429))
+- Implement [MSC3989](https://github.com/matrix-org/matrix-spec-proposals/pull/3989) redaction algorithm. ([\#15393](https://github.com/matrix-org/synapse/issues/15393))
+- Implement [MSC2175](https://github.com/matrix-org/matrix-doc/pull/2175) to stop adding `creator` to create events. ([\#15394](https://github.com/matrix-org/synapse/issues/15394))
+- Implement [MSC2174](https://github.com/matrix-org/matrix-spec-proposals/pull/2174) to move the `redacts` key to a `content` property. ([\#15395](https://github.com/matrix-org/synapse/issues/15395))
+- Trust dtonlay/rust-toolchain in CI. ([\#15406](https://github.com/matrix-org/synapse/issues/15406))
+- Explicitly install Synapse during typechecking in CI. ([\#15409](https://github.com/matrix-org/synapse/issues/15409))
+- Only load the SSO redirect servlet if SSO is enabled. ([\#15421](https://github.com/matrix-org/synapse/issues/15421))
+- Refactor `SimpleHttpClient` to pull out a base class. ([\#15427](https://github.com/matrix-org/synapse/issues/15427))
+- Improve type hints. ([\#15432](https://github.com/matrix-org/synapse/issues/15432))
+- Convert async to normal tests in `TestSSOHandler`. ([\#15433](https://github.com/matrix-org/synapse/issues/15433))
+- Speed up the user directory background update. ([\#15435](https://github.com/matrix-org/synapse/issues/15435))
+- Disable directory listing for static resources in `/_matrix/static/`. ([\#15438](https://github.com/matrix-org/synapse/issues/15438))
+- Move various module API callback registration methods to a dedicated class. ([\#15453](https://github.com/matrix-org/synapse/issues/15453))
+
+
 Synapse 1.81.0 (2023-04-11)
 ===========================
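Among the internal changes above, MSC2174 relocates the top-level `redacts` key of a redaction event into its `content`. This is not Synapse's implementation, just an illustrative sketch of the shape of that transformation; the function name is hypothetical:

```python
from typing import Any, Dict


def move_redacts_to_content(event: Dict[str, Any]) -> Dict[str, Any]:
    """Hypothetical sketch of the MSC2174 shape: for an m.room.redaction
    event, the top-level `redacts` key moves into `content`."""
    event = dict(event)  # shallow copy; don't mutate the caller's event
    if event.get("type") == "m.room.redaction" and "redacts" in event:
        content = dict(event.get("content", {}))
        content["redacts"] = event.pop("redacts")
        event["content"] = content
    return event


old = {"type": "m.room.redaction", "redacts": "$event_id", "content": {}}
print(move_redacts_to_content(old))
# {'type': 'm.room.redaction', 'content': {'redacts': '$event_id'}}
```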
@@ -323,18 +323,18 @@ checksum = "d29ab0c6d3fc0ee92fe66e2d99f700eab17a8d57d1c1d3b748380fb20baa78cd"

 [[package]]
 name = "serde"
-version = "1.0.159"
+version = "1.0.160"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "3c04e8343c3daeec41f58990b9d77068df31209f2af111e059e9fe9646693065"
+checksum = "bb2f3770c8bce3bcda7e149193a069a0f4365bda1fa5cd88e03bca26afc1216c"
 dependencies = [
  "serde_derive",
 ]

 [[package]]
 name = "serde_derive"
-version = "1.0.159"
+version = "1.0.160"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "4c614d17805b093df4b147b51339e7e44bf05ef59fba1e45d83500bcfb4d8585"
+checksum = "291a097c63d8497e00160b166a967a4a79c64f3facdd01cbd7502231688d77df"
 dependencies = [
  "proc-macro2",
  "quote",
@@ -343,9 +343,9 @@ dependencies = [

 [[package]]
 name = "serde_json"
-version = "1.0.95"
+version = "1.0.96"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "d721eca97ac802aa7777b701877c8004d950fc142651367300d21c1cc0194744"
+checksum = "057d394a50403bcac12672b2b18fb387ab6d289d957dab67dd201875391e52f1"
 dependencies = [
  "itoa",
  "ryu",
@@ -1 +0,0 @@
-Delete server-side backup keys when deactivating an account.

@@ -1 +0,0 @@
-Bump black from 23.1.0 to 23.3.0.

@@ -1 +0,0 @@
-Bump pyopenssl from 23.1.0 to 23.1.1.

@@ -1 +0,0 @@
-Bump types-psycopg2 from 2.9.21.8 to 2.9.21.9.

@@ -1 +0,0 @@
-Bump types-netaddr from 0.8.0.6 to 0.8.0.7.

@@ -1 +0,0 @@
-Bump types-opentracing from 2.4.10.3 to 2.4.10.4.

@@ -1 +0,0 @@
-Improve DB performance of clearing out old data from `stream_ordering_to_exterm`.

@@ -1 +0,0 @@
-Implement [MSC3989](https://github.com/matrix-org/matrix-spec-proposals/pull/3989) redaction algorithm.

@@ -1 +0,0 @@
-Implement [MSC2175](https://github.com/matrix-org/matrix-doc/pull/2175) to stop adding `creator` to create events.

@@ -1 +0,0 @@
-Implement [MSC2174](https://github.com/matrix-org/matrix-spec-proposals/pull/2174) to move the `redacts` key to a `content` property.

@@ -1 +0,0 @@
-Bump dawidd6/action-download-artifact from 2.26.0 to 2.26.1.

@@ -1 +0,0 @@
-Trust dtonlay/rust-toolchain in CI.

@@ -1 +0,0 @@
-Explicitly install Synapse during typechecking in CI.

@@ -1 +0,0 @@
-Bump parameterized from 0.8.1 to 0.9.0.

@@ -1 +0,0 @@
-Bump types-pillow from 9.4.0.17 to 9.4.0.19.

@@ -1 +0,0 @@
-Bump sentry-sdk from 1.17.0 to 1.19.1.

@@ -1 +0,0 @@
-Bump immutabledict from 2.2.3 to 2.2.4.

@@ -1 +0,0 @@
-Only load the SSO redirect servlet if SSO is enabled.

@@ -1 +0,0 @@
-Synapse now correctly fails to start if the config option `app_service_config_files` is not a list.

@@ -1 +0,0 @@
-Disable loading `RefreshTokenServlet` (`/_matrix/client/(r0|v3|unstable)/refresh`) on workers.

@@ -1 +0,0 @@
-Improve DB performance of clearing out old data from `stream_ordering_to_exterm`.
@@ -1,3 +1,9 @@
+matrix-synapse-py3 (1.82.0~rc1) stable; urgency=medium
+
+  * New Synapse release 1.82.0rc1.
+
+ -- Synapse Packaging team <packages@matrix.org>  Tue, 18 Apr 2023 09:47:30 +0100
+
 matrix-synapse-py3 (1.81.0) stable; urgency=medium

   * New Synapse release 1.81.0.
@@ -173,6 +173,8 @@ WORKERS_CONFIG: Dict[str, Dict[str, Any]] = {
         "^/_matrix/client/(api/v1|r0|v3|unstable)/search",
         "^/_matrix/client/(r0|v3|unstable)/user/.*/filter(/|$)",
         "^/_matrix/client/(r0|v3|unstable)/password_policy$",
+        "^/_matrix/client/(api/v1|r0|v3|unstable)/directory/room/.*$",
+        "^/_matrix/client/(r0|v3|unstable)/capabilities$",
     ],
     "shared_extra_conf": {},
     "worker_extra_conf": "",
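The two patterns added to `WORKERS_CONFIG` above are plain regexes matched against request paths, so they can be sanity-checked in isolation. A minimal sketch (the `routable` helper is ours, not Synapse's):

```python
import re

# The two endpoint patterns the diff adds to the worker's routable paths.
PATTERNS = [
    r"^/_matrix/client/(api/v1|r0|v3|unstable)/directory/room/.*$",
    r"^/_matrix/client/(r0|v3|unstable)/capabilities$",
]


def routable(path: str) -> bool:
    """Return True if a request path matches one of the new worker routes."""
    return any(re.match(pattern, path) for pattern in PATTERNS)


print(routable("/_matrix/client/v3/directory/room/%23room%3Aexample.org"))  # True
print(routable("/_matrix/client/v3/capabilities"))                          # True
print(routable("/_matrix/client/v3/login"))                                 # False
```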
@@ -103,6 +103,9 @@ Called during a logout request for a user. It is passed the qualified user ID, t
 deactivated device (if any: access tokens are occasionally created without an associated
 device ID), and the (now deactivated) access token.

+Deleting the related pushers is done after calling `on_logged_out`, so you can rely on them
+to still be present.
+
 If multiple modules implement this callback, Synapse runs them all in order.

 ### `get_username_for_registration`
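The ordering guarantee documented above (pushers are deleted only after `on_logged_out` returns) can be illustrated with a minimal stand-in for the logout flow. The `ModuleApi` wiring is omitted and all names here are hypothetical; only the callback signature mirrors the documented one:

```python
import asyncio
from typing import List, Optional


class FakeLogoutFlow:
    """Hypothetical sketch of the documented ordering: module callbacks
    run first, pusher deletion happens afterwards."""

    def __init__(self) -> None:
        self.pushers: List[str] = ["http-pusher-1"]
        self.seen_by_module: Optional[List[str]] = None

    async def on_logged_out(
        self, user_id: str, device_id: Optional[str], access_token: str
    ) -> None:
        # The callback can still observe the pushers: they are deleted later.
        self.seen_by_module = list(self.pushers)

    async def logout(self) -> None:
        await self.on_logged_out("@alice:example.org", "DEV1", "syt_token")
        self.pushers.clear()  # deletion happens only after the callback returns


flow = FakeLogoutFlow()
asyncio.run(flow.logout())
print(flow.seen_by_module)  # ['http-pusher-1']
print(flow.pushers)         # []
```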
@@ -577,6 +577,10 @@ delete any device that hasn't been accessed for more than the specified amount o

 Defaults to no duration, which means devices are never pruned.

+**Note:** This task will always run on the main process, regardless of the value of
+`run_background_tasks_on`. This is due to workers currently not having the ability to
+delete devices.
+
 Example configuration:
 ```yaml
 delete_stale_devices_after: 1y
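The `1y` in the example above is a duration string. As a rough illustration of how such a suffix could map to a number of milliseconds, here is a hypothetical re-implementation; Synapse's real parser lives in its config code and accepts more forms than this sketch:

```python
import re

# Millisecond multipliers for single-letter duration suffixes (assumed set).
UNITS_MS = {
    "s": 1_000,
    "m": 60_000,
    "h": 3_600_000,
    "d": 86_400_000,
    "w": 604_800_000,
    "y": 31_536_000_000,  # 365 days
}


def parse_duration(value: str) -> int:
    """Parse strings like '1y' or '30d' into milliseconds."""
    match = re.fullmatch(r"(\d+)([smhdwy])", value)
    if not match:
        raise ValueError(f"invalid duration: {value!r}")
    return int(match.group(1)) * UNITS_MS[match.group(2)]


print(parse_duration("1y"))   # 31536000000
print(parse_duration("30d"))  # 2592000000
```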
@@ -234,6 +234,8 @@ information.
     ^/_matrix/client/(api/v1|r0|v3|unstable/.*)/rooms/.*/aliases
     ^/_matrix/client/(api/v1|r0|v3|unstable)/search$
     ^/_matrix/client/(r0|v3|unstable)/user/.*/filter(/|$)
+    ^/_matrix/client/(api/v1|r0|v3|unstable)/directory/room/.*$
+    ^/_matrix/client/(r0|v3|unstable)/capabilities$

     # Encryption requests
     ^/_matrix/client/(r0|v3|unstable)/keys/query$
@@ -580,14 +580,14 @@ dev = ["Sphinx", "coverage", "flake8", "lxml", "lxml-stubs", "memory-profiler",

 [[package]]
 name = "furo"
-version = "2023.3.23"
+version = "2023.3.27"
 description = "A clean customisable Sphinx documentation theme."
 category = "dev"
 optional = false
 python-versions = ">=3.7"
 files = [
-    {file = "furo-2023.3.23-py3-none-any.whl", hash = "sha256:1cdf0730496f6ac0ecf394845fe55010539d987a3085f29d819e49a8e87da60a"},
-    {file = "furo-2023.3.23.tar.gz", hash = "sha256:6cf9a59fe2919693ecff6509a08229afa081583cbb5b81f59c3e755f3bd81d26"},
+    {file = "furo-2023.3.27-py3-none-any.whl", hash = "sha256:4ab2be254a2d5e52792d0ca793a12c35582dd09897228a6dd47885dabd5c9521"},
+    {file = "furo-2023.3.27.tar.gz", hash = "sha256:b99e7867a5cc833b2b34d7230631dd6558c7a29f93071fdbb5709634bb33c5a5"},
 ]

 [package.dependencies]
@@ -1460,38 +1460,38 @@ files = [

 [[package]]
 name = "mypy"
-version = "1.0.0"
+version = "1.0.1"
 description = "Optional static typing for Python"
 category = "dev"
 optional = false
 python-versions = ">=3.7"
 files = [
-    {file = "mypy-1.0.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:e0626db16705ab9f7fa6c249c017c887baf20738ce7f9129da162bb3075fc1af"},
-    {file = "mypy-1.0.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:1ace23f6bb4aec4604b86c4843276e8fa548d667dbbd0cb83a3ae14b18b2db6c"},
-    {file = "mypy-1.0.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:87edfaf344c9401942883fad030909116aa77b0fa7e6e8e1c5407e14549afe9a"},
-    {file = "mypy-1.0.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:0ab090d9240d6b4e99e1fa998c2d0aa5b29fc0fb06bd30e7ad6183c95fa07593"},
-    {file = "mypy-1.0.0-cp310-cp310-win_amd64.whl", hash = "sha256:7cc2c01dfc5a3cbddfa6c13f530ef3b95292f926329929001d45e124342cd6b7"},
-    {file = "mypy-1.0.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:14d776869a3e6c89c17eb943100f7868f677703c8a4e00b3803918f86aafbc52"},
-    {file = "mypy-1.0.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:bb2782a036d9eb6b5a6efcdda0986774bf798beef86a62da86cb73e2a10b423d"},
-    {file = "mypy-1.0.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5cfca124f0ac6707747544c127880893ad72a656e136adc935c8600740b21ff5"},
-    {file = "mypy-1.0.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:8845125d0b7c57838a10fd8925b0f5f709d0e08568ce587cc862aacce453e3dd"},
-    {file = "mypy-1.0.0-cp311-cp311-win_amd64.whl", hash = "sha256:01b1b9e1ed40544ef486fa8ac022232ccc57109f379611633ede8e71630d07d2"},
-    {file = "mypy-1.0.0-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:c7cf862aef988b5fbaa17764ad1d21b4831436701c7d2b653156a9497d92c83c"},
-    {file = "mypy-1.0.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5cd187d92b6939617f1168a4fe68f68add749902c010e66fe574c165c742ed88"},
-    {file = "mypy-1.0.0-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:4e5175026618c178dfba6188228b845b64131034ab3ba52acaffa8f6c361f805"},
-    {file = "mypy-1.0.0-cp37-cp37m-win_amd64.whl", hash = "sha256:2f6ac8c87e046dc18c7d1d7f6653a66787a4555085b056fe2d599f1f1a2a2d21"},
-    {file = "mypy-1.0.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:7306edca1c6f1b5fa0bc9aa645e6ac8393014fa82d0fa180d0ebc990ebe15964"},
-    {file = "mypy-1.0.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:3cfad08f16a9c6611e6143485a93de0e1e13f48cfb90bcad7d5fde1c0cec3d36"},
-    {file = "mypy-1.0.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:67cced7f15654710386e5c10b96608f1ee3d5c94ca1da5a2aad5889793a824c1"},
-    {file = "mypy-1.0.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:a86b794e8a56ada65c573183756eac8ac5b8d3d59daf9d5ebd72ecdbb7867a43"},
-    {file = "mypy-1.0.0-cp38-cp38-win_amd64.whl", hash = "sha256:50979d5efff8d4135d9db293c6cb2c42260e70fb010cbc697b1311a4d7a39ddb"},
-    {file = "mypy-1.0.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:3ae4c7a99e5153496243146a3baf33b9beff714464ca386b5f62daad601d87af"},
-    {file = "mypy-1.0.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:5e398652d005a198a7f3c132426b33c6b85d98aa7dc852137a2a3be8890c4072"},
-    {file = "mypy-1.0.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:be78077064d016bc1b639c2cbcc5be945b47b4261a4f4b7d8923f6c69c5c9457"},
-    {file = "mypy-1.0.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:92024447a339400ea00ac228369cd242e988dd775640755fa4ac0c126e49bb74"},
-    {file = "mypy-1.0.0-cp39-cp39-win_amd64.whl", hash = "sha256:fe523fcbd52c05040c7bee370d66fee8373c5972171e4fbc323153433198592d"},
-    {file = "mypy-1.0.0-py3-none-any.whl", hash = "sha256:2efa963bdddb27cb4a0d42545cd137a8d2b883bd181bbc4525b568ef6eca258f"},
-    {file = "mypy-1.0.0.tar.gz", hash = "sha256:f34495079c8d9da05b183f9f7daec2878280c2ad7cc81da686ef0b484cea2ecf"},
+    {file = "mypy-1.0.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:71a808334d3f41ef011faa5a5cd8153606df5fc0b56de5b2e89566c8093a0c9a"},
+    {file = "mypy-1.0.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:920169f0184215eef19294fa86ea49ffd4635dedfdea2b57e45cb4ee85d5ccaf"},
+    {file = "mypy-1.0.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:27a0f74a298769d9fdc8498fcb4f2beb86f0564bcdb1a37b58cbbe78e55cf8c0"},
+    {file = "mypy-1.0.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:65b122a993d9c81ea0bfde7689b3365318a88bde952e4dfa1b3a8b4ac05d168b"},
+    {file = "mypy-1.0.1-cp310-cp310-win_amd64.whl", hash = "sha256:5deb252fd42a77add936b463033a59b8e48eb2eaec2976d76b6878d031933fe4"},
+    {file = "mypy-1.0.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:2013226d17f20468f34feddd6aae4635a55f79626549099354ce641bc7d40262"},
+    {file = "mypy-1.0.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:48525aec92b47baed9b3380371ab8ab6e63a5aab317347dfe9e55e02aaad22e8"},
+    {file = "mypy-1.0.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c96b8a0c019fe29040d520d9257d8c8f122a7343a8307bf8d6d4a43f5c5bfcc8"},
+    {file = "mypy-1.0.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:448de661536d270ce04f2d7dddaa49b2fdba6e3bd8a83212164d4174ff43aa65"},
+    {file = "mypy-1.0.1-cp311-cp311-win_amd64.whl", hash = "sha256:d42a98e76070a365a1d1c220fcac8aa4ada12ae0db679cb4d910fabefc88b994"},
+    {file = "mypy-1.0.1-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:e64f48c6176e243ad015e995de05af7f22bbe370dbb5b32bd6988438ec873919"},
+    {file = "mypy-1.0.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5fdd63e4f50e3538617887e9aee91855368d9fc1dea30da743837b0df7373bc4"},
+    {file = "mypy-1.0.1-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:dbeb24514c4acbc78d205f85dd0e800f34062efcc1f4a4857c57e4b4b8712bff"},
+    {file = "mypy-1.0.1-cp37-cp37m-win_amd64.whl", hash = "sha256:a2948c40a7dd46c1c33765718936669dc1f628f134013b02ff5ac6c7ef6942bf"},
+    {file = "mypy-1.0.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:5bc8d6bd3b274dd3846597855d96d38d947aedba18776aa998a8d46fabdaed76"},
+    {file = "mypy-1.0.1-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:17455cda53eeee0a4adb6371a21dd3dbf465897de82843751cf822605d152c8c"},
+    {file = "mypy-1.0.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e831662208055b006eef68392a768ff83596035ffd6d846786578ba1714ba8f6"},
+    {file = "mypy-1.0.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:e60d0b09f62ae97a94605c3f73fd952395286cf3e3b9e7b97f60b01ddfbbda88"},
+    {file = "mypy-1.0.1-cp38-cp38-win_amd64.whl", hash = "sha256:0af4f0e20706aadf4e6f8f8dc5ab739089146b83fd53cb4a7e0e850ef3de0bb6"},
+    {file = "mypy-1.0.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:24189f23dc66f83b839bd1cce2dfc356020dfc9a8bae03978477b15be61b062e"},
+    {file = "mypy-1.0.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:93a85495fb13dc484251b4c1fd7a5ac370cd0d812bbfc3b39c1bafefe95275d5"},
+    {file = "mypy-1.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5f546ac34093c6ce33f6278f7c88f0f147a4849386d3bf3ae193702f4fe31407"},
+    {file = "mypy-1.0.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:c6c2ccb7af7154673c591189c3687b013122c5a891bb5651eca3db8e6c6c55bd"},
+    {file = "mypy-1.0.1-cp39-cp39-win_amd64.whl", hash = "sha256:15b5a824b58c7c822c51bc66308e759243c32631896743f030daf449fe3677f3"},
+    {file = "mypy-1.0.1-py3-none-any.whl", hash = "sha256:eda5c8b9949ed411ff752b9a01adda31afe7eae1e53e946dbdf9db23865e66c4"},
+    {file = "mypy-1.0.1.tar.gz", hash = "sha256:28cea5a6392bb43d266782983b5a4216c25544cd7d80be681a155ddcdafd152d"},
 ]

 [package.dependencies]
@@ -1644,93 +1644,82 @@ files = [

 [[package]]
 name = "pillow"
-version = "9.4.0"
+version = "9.5.0"
 description = "Python Imaging Library (Fork)"
 category = "main"
 optional = false
 python-versions = ">=3.7"
 files = [
-    {file = "Pillow-9.4.0-1-cp310-cp310-macosx_10_10_x86_64.whl", hash = "sha256:1b4b4e9dda4f4e4c4e6896f93e84a8f0bcca3b059de9ddf67dac3c334b1195e1"},
-    {file = "Pillow-9.4.0-1-cp311-cp311-macosx_10_10_x86_64.whl", hash = "sha256:fb5c1ad6bad98c57482236a21bf985ab0ef42bd51f7ad4e4538e89a997624e12"},
-    {file = "Pillow-9.4.0-1-cp37-cp37m-macosx_10_10_x86_64.whl", hash = "sha256:f0caf4a5dcf610d96c3bd32932bfac8aee61c96e60481c2a0ea58da435e25acd"},
-    {file = "Pillow-9.4.0-1-cp38-cp38-macosx_10_10_x86_64.whl", hash = "sha256:3f4cc516e0b264c8d4ccd6b6cbc69a07c6d582d8337df79be1e15a5056b258c9"},
-    {file = "Pillow-9.4.0-1-cp39-cp39-macosx_10_10_x86_64.whl", hash = "sha256:b8c2f6eb0df979ee99433d8b3f6d193d9590f735cf12274c108bd954e30ca858"},
-    {file = "Pillow-9.4.0-1-pp38-pypy38_pp73-macosx_10_10_x86_64.whl", hash = "sha256:b70756ec9417c34e097f987b4d8c510975216ad26ba6e57ccb53bc758f490dab"},
-    {file = "Pillow-9.4.0-1-pp39-pypy39_pp73-macosx_10_10_x86_64.whl", hash = "sha256:43521ce2c4b865d385e78579a082b6ad1166ebed2b1a2293c3be1d68dd7ca3b9"},
-    {file = "Pillow-9.4.0-2-cp310-cp310-macosx_10_10_x86_64.whl", hash = "sha256:9d9a62576b68cd90f7075876f4e8444487db5eeea0e4df3ba298ee38a8d067b0"},
-    {file = "Pillow-9.4.0-2-cp311-cp311-macosx_10_10_x86_64.whl", hash = "sha256:87708d78a14d56a990fbf4f9cb350b7d89ee8988705e58e39bdf4d82c149210f"},
-    {file = "Pillow-9.4.0-2-cp37-cp37m-macosx_10_10_x86_64.whl", hash = "sha256:8a2b5874d17e72dfb80d917213abd55d7e1ed2479f38f001f264f7ce7bae757c"},
-    {file = "Pillow-9.4.0-2-cp38-cp38-macosx_10_10_x86_64.whl", hash = "sha256:83125753a60cfc8c412de5896d10a0a405e0bd88d0470ad82e0869ddf0cb3848"},
-    {file = "Pillow-9.4.0-2-cp39-cp39-macosx_10_10_x86_64.whl", hash = "sha256:9e5f94742033898bfe84c93c831a6f552bb629448d4072dd312306bab3bd96f1"},
-    {file = "Pillow-9.4.0-2-pp38-pypy38_pp73-macosx_10_10_x86_64.whl", hash = "sha256:013016af6b3a12a2f40b704677f8b51f72cb007dac785a9933d5c86a72a7fe33"},
-    {file = "Pillow-9.4.0-2-pp39-pypy39_pp73-macosx_10_10_x86_64.whl", hash = "sha256:99d92d148dd03fd19d16175b6d355cc1b01faf80dae93c6c3eb4163709edc0a9"},
-    {file = "Pillow-9.4.0-cp310-cp310-macosx_10_10_x86_64.whl", hash = "sha256:2968c58feca624bb6c8502f9564dd187d0e1389964898f5e9e1fbc8533169157"},
-    {file = "Pillow-9.4.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:c5c1362c14aee73f50143d74389b2c158707b4abce2cb055b7ad37ce60738d47"},
-    {file = "Pillow-9.4.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bd752c5ff1b4a870b7661234694f24b1d2b9076b8bf337321a814c612665f343"},
-    {file = "Pillow-9.4.0-cp310-cp310-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:9a3049a10261d7f2b6514d35bbb7a4dfc3ece4c4de14ef5876c4b7a23a0e566d"},
-    {file = "Pillow-9.4.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:16a8df99701f9095bea8a6c4b3197da105df6f74e6176c5b410bc2df2fd29a57"},
-    {file = "Pillow-9.4.0-cp310-cp310-manylinux_2_28_aarch64.whl", hash = "sha256:94cdff45173b1919350601f82d61365e792895e3c3a3443cf99819e6fbf717a5"},
-    {file = "Pillow-9.4.0-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:ed3e4b4e1e6de75fdc16d3259098de7c6571b1a6cc863b1a49e7d3d53e036070"},
-    {file = "Pillow-9.4.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:d5b2f8a31bd43e0f18172d8ac82347c8f37ef3e0b414431157718aa234991b28"},
-    {file = "Pillow-9.4.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:09b89ddc95c248ee788328528e6a2996e09eaccddeeb82a5356e92645733be35"},
-    {file = "Pillow-9.4.0-cp310-cp310-win32.whl", hash = "sha256:f09598b416ba39a8f489c124447b007fe865f786a89dbfa48bb5cf395693132a"},
-    {file = "Pillow-9.4.0-cp310-cp310-win_amd64.whl", hash = "sha256:f6e78171be3fb7941f9910ea15b4b14ec27725865a73c15277bc39f5ca4f8391"},
-    {file = "Pillow-9.4.0-cp311-cp311-macosx_10_10_x86_64.whl", hash = "sha256:3fa1284762aacca6dc97474ee9c16f83990b8eeb6697f2ba17140d54b453e133"},
-    {file = "Pillow-9.4.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:eaef5d2de3c7e9b21f1e762f289d17b726c2239a42b11e25446abf82b26ac132"},
-    {file = "Pillow-9.4.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a4dfdae195335abb4e89cc9762b2edc524f3c6e80d647a9a81bf81e17e3fb6f0"},
-    {file = "Pillow-9.4.0-cp311-cp311-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6abfb51a82e919e3933eb137e17c4ae9c0475a25508ea88993bb59faf82f3b35"},
-    {file = "Pillow-9.4.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:451f10ef963918e65b8869e17d67db5e2f4ab40e716ee6ce7129b0cde2876eab"},
-    {file = "Pillow-9.4.0-cp311-cp311-manylinux_2_28_aarch64.whl", hash = "sha256:6663977496d616b618b6cfa43ec86e479ee62b942e1da76a2c3daa1c75933ef4"},
-    {file = "Pillow-9.4.0-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:60e7da3a3ad1812c128750fc1bc14a7ceeb8d29f77e0a2356a8fb2aa8925287d"},
+    {file = "Pillow-9.5.0-cp310-cp310-macosx_10_10_x86_64.whl", hash = "sha256:ace6ca218308447b9077c14ea4ef381ba0b67ee78d64046b3f19cf4e1139ad16"},
+    {file = "Pillow-9.5.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:d3d403753c9d5adc04d4694d35cf0391f0f3d57c8e0030aac09d7678fa8030aa"},
+    {file = "Pillow-9.5.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5ba1b81ee69573fe7124881762bb4cd2e4b6ed9dd28c9c60a632902fe8db8b38"},
+    {file = "Pillow-9.5.0-cp310-cp310-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:fe7e1c262d3392afcf5071df9afa574544f28eac825284596ac6db56e6d11062"},
+    {file = "Pillow-9.5.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8f36397bf3f7d7c6a3abdea815ecf6fd14e7fcd4418ab24bae01008d8d8ca15e"},
+    {file = "Pillow-9.5.0-cp310-cp310-manylinux_2_28_aarch64.whl", hash = "sha256:252a03f1bdddce077eff2354c3861bf437c892fb1832f75ce813ee94347aa9b5"},
+    {file = "Pillow-9.5.0-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:85ec677246533e27770b0de5cf0f9d6e4ec0c212a1f89dfc941b64b21226009d"},
+    {file = "Pillow-9.5.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:b416f03d37d27290cb93597335a2f85ed446731200705b22bb927405320de903"},
+    {file = "Pillow-9.5.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:1781a624c229cb35a2ac31cc4a77e28cafc8900733a864870c49bfeedacd106a"},
+    {file = "Pillow-9.5.0-cp310-cp310-win32.whl", hash = "sha256:8507eda3cd0608a1f94f58c64817e83ec12fa93a9436938b191b80d9e4c0fc44"},
+    {file = "Pillow-9.5.0-cp310-cp310-win_amd64.whl", hash = "sha256:d3c6b54e304c60c4181da1c9dadf83e4a54fd266a99c70ba646a9baa626819eb"},
+    {file = "Pillow-9.5.0-cp311-cp311-macosx_10_10_x86_64.whl", hash = "sha256:7ec6f6ce99dab90b52da21cf0dc519e21095e332ff3b399a357c187b1a5eee32"},
+    {file = "Pillow-9.5.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:560737e70cb9c6255d6dcba3de6578a9e2ec4b573659943a5e7e4af13f298f5c"},
+    {file = "Pillow-9.5.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:96e88745a55b88a7c64fa49bceff363a1a27d9a64e04019c2281049444a571e3"},
+    {file = "Pillow-9.5.0-cp311-cp311-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d9c206c29b46cfd343ea7cdfe1232443072bbb270d6a46f59c259460db76779a"},
+    {file = "Pillow-9.5.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:cfcc2c53c06f2ccb8976fb5c71d448bdd0a07d26d8e07e321c103416444c7ad1"},
+    {file = "Pillow-9.5.0-cp311-cp311-manylinux_2_28_aarch64.whl", hash = "sha256:a0f9bb6c80e6efcde93ffc51256d5cfb2155ff8f78292f074f60f9e70b942d99"},
+    {file = "Pillow-9.5.0-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:8d935f924bbab8f0a9a28404422da8af4904e36d5c33fc6f677e4c4485515625"},
+    {file = "Pillow-9.5.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:fed1e1cf6a42577953abbe8e6cf2fe2f566daebde7c34724ec8803c4c0cda579"},
+    {file = "Pillow-9.5.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:c1170d6b195555644f0616fd6ed929dfcf6333b8675fcca044ae5ab110ded296"},
+    {file = "Pillow-9.5.0-cp311-cp311-win32.whl", hash = "sha256:54f7102ad31a3de5666827526e248c3530b3a33539dbda27c6843d19d72644ec"},
+    {file = "Pillow-9.5.0-cp311-cp311-win_amd64.whl", hash = "sha256:cfa4561277f677ecf651e2b22dc43e8f5368b74a25a8f7d1d4a3a243e573f2d4"},
+    {file = "Pillow-9.5.0-cp311-cp311-win_arm64.whl", hash = "sha256:965e4a05ef364e7b973dd17fc765f42233415974d773e82144c9bbaaaea5d089"},
+    {file = "Pillow-9.5.0-cp312-cp312-win32.whl", hash = "sha256:22baf0c3cf0c7f26e82d6e1adf118027afb325e703922c8dfc1d5d0156bb2eeb"},
+    {file = "Pillow-9.5.0-cp312-cp312-win_amd64.whl", hash = "sha256:432b975c009cf649420615388561c0ce7cc31ce9b2e374db659ee4f7d57a1f8b"},
+    {file = "Pillow-9.5.0-cp37-cp37m-macosx_10_10_x86_64.whl", hash = "sha256:5d4ebf8e1db4441a55c509c4baa7a0587a0210f7cd25fcfe74dbbce7a4bd1906"},
+    {file = "Pillow-9.5.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:375f6e5ee9620a271acb6820b3d1e94ffa8e741c0601db4c0c4d3cb0a9c224bf"},
+    {file = "Pillow-9.5.0-cp37-cp37m-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:99eb6cafb6ba90e436684e08dad8be1637efb71c4f2180ee6b8f940739406e78"},
+    {file = "Pillow-9.5.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2dfaaf10b6172697b9bceb9a3bd7b951819d1ca339a5ef294d1f1ac6d7f63270"},
+    {file = "Pillow-9.5.0-cp37-cp37m-manylinux_2_28_aarch64.whl", hash = "sha256:763782b2e03e45e2c77d7779875f4432e25121ef002a41829d8868700d119392"},
+    {file = "Pillow-9.5.0-cp37-cp37m-manylinux_2_28_x86_64.whl", hash = "sha256:35f6e77122a0c0762268216315bf239cf52b88865bba522999dc38f1c52b9b47"},
+    {file = "Pillow-9.5.0-cp37-cp37m-win32.whl", hash = "sha256:aca1c196f407ec7cf04dcbb15d19a43c507a81f7ffc45b690899d6a76ac9fda7"},
|
||||||
{file = "Pillow-9.4.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:19005a8e58b7c1796bc0167862b1f54a64d3b44ee5d48152b06bb861458bc0f8"},
|
{file = "Pillow-9.5.0-cp37-cp37m-win_amd64.whl", hash = "sha256:322724c0032af6692456cd6ed554bb85f8149214d97398bb80613b04e33769f6"},
|
||||||
{file = "Pillow-9.4.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:f715c32e774a60a337b2bb8ad9839b4abf75b267a0f18806f6f4f5f1688c4b5a"},
|
{file = "Pillow-9.5.0-cp38-cp38-macosx_10_10_x86_64.whl", hash = "sha256:a0aa9417994d91301056f3d0038af1199eb7adc86e646a36b9e050b06f526597"},
|
||||||
{file = "Pillow-9.4.0-cp311-cp311-win32.whl", hash = "sha256:b222090c455d6d1a64e6b7bb5f4035c4dff479e22455c9eaa1bdd4c75b52c80c"},
|
{file = "Pillow-9.5.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:f8286396b351785801a976b1e85ea88e937712ee2c3ac653710a4a57a8da5d9c"},
|
||||||
{file = "Pillow-9.4.0-cp311-cp311-win_amd64.whl", hash = "sha256:ba6612b6548220ff5e9df85261bddc811a057b0b465a1226b39bfb8550616aee"},
|
{file = "Pillow-9.5.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c830a02caeb789633863b466b9de10c015bded434deb3ec87c768e53752ad22a"},
|
||||||
{file = "Pillow-9.4.0-cp37-cp37m-macosx_10_10_x86_64.whl", hash = "sha256:5f532a2ad4d174eb73494e7397988e22bf427f91acc8e6ebf5bb10597b49c493"},
|
{file = "Pillow-9.5.0-cp38-cp38-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:fbd359831c1657d69bb81f0db962905ee05e5e9451913b18b831febfe0519082"},
|
||||||
{file = "Pillow-9.4.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5dd5a9c3091a0f414a963d427f920368e2b6a4c2f7527fdd82cde8ef0bc7a327"},
|
{file = "Pillow-9.5.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f8fc330c3370a81bbf3f88557097d1ea26cd8b019d6433aa59f71195f5ddebbf"},
|
||||||
{file = "Pillow-9.4.0-cp37-cp37m-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ef21af928e807f10bf4141cad4746eee692a0dd3ff56cfb25fce076ec3cc8abe"},
|
{file = "Pillow-9.5.0-cp38-cp38-manylinux_2_28_aarch64.whl", hash = "sha256:7002d0797a3e4193c7cdee3198d7c14f92c0836d6b4a3f3046a64bd1ce8df2bf"},
|
||||||
{file = "Pillow-9.4.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:847b114580c5cc9ebaf216dd8c8dbc6b00a3b7ab0131e173d7120e6deade1f57"},
|
{file = "Pillow-9.5.0-cp38-cp38-manylinux_2_28_x86_64.whl", hash = "sha256:229e2c79c00e85989a34b5981a2b67aa079fd08c903f0aaead522a1d68d79e51"},
|
||||||
{file = "Pillow-9.4.0-cp37-cp37m-manylinux_2_28_aarch64.whl", hash = "sha256:653d7fb2df65efefbcbf81ef5fe5e5be931f1ee4332c2893ca638c9b11a409c4"},
|
{file = "Pillow-9.5.0-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:9adf58f5d64e474bed00d69bcd86ec4bcaa4123bfa70a65ce72e424bfb88ed96"},
|
||||||
{file = "Pillow-9.4.0-cp37-cp37m-manylinux_2_28_x86_64.whl", hash = "sha256:46f39cab8bbf4a384ba7cb0bc8bae7b7062b6a11cfac1ca4bc144dea90d4a9f5"},
|
{file = "Pillow-9.5.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:662da1f3f89a302cc22faa9f14a262c2e3951f9dbc9617609a47521c69dd9f8f"},
|
||||||
{file = "Pillow-9.4.0-cp37-cp37m-win32.whl", hash = "sha256:7ac7594397698f77bce84382929747130765f66406dc2cd8b4ab4da68ade4c6e"},
|
{file = "Pillow-9.5.0-cp38-cp38-win32.whl", hash = "sha256:6608ff3bf781eee0cd14d0901a2b9cc3d3834516532e3bd673a0a204dc8615fc"},
|
||||||
{file = "Pillow-9.4.0-cp37-cp37m-win_amd64.whl", hash = "sha256:46c259e87199041583658457372a183636ae8cd56dbf3f0755e0f376a7f9d0e6"},
|
{file = "Pillow-9.5.0-cp38-cp38-win_amd64.whl", hash = "sha256:e49eb4e95ff6fd7c0c402508894b1ef0e01b99a44320ba7d8ecbabefddcc5569"},
|
||||||
{file = "Pillow-9.4.0-cp38-cp38-macosx_10_10_x86_64.whl", hash = "sha256:0e51f608da093e5d9038c592b5b575cadc12fd748af1479b5e858045fff955a9"},
|
{file = "Pillow-9.5.0-cp39-cp39-macosx_10_10_x86_64.whl", hash = "sha256:482877592e927fd263028c105b36272398e3e1be3269efda09f6ba21fd83ec66"},
|
||||||
{file = "Pillow-9.4.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:765cb54c0b8724a7c12c55146ae4647e0274a839fb6de7bcba841e04298e1011"},
|
{file = "Pillow-9.5.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:3ded42b9ad70e5f1754fb7c2e2d6465a9c842e41d178f262e08b8c85ed8a1d8e"},
|
||||||
{file = "Pillow-9.4.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:519e14e2c49fcf7616d6d2cfc5c70adae95682ae20f0395e9280db85e8d6c4df"},
|
{file = "Pillow-9.5.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c446d2245ba29820d405315083d55299a796695d747efceb5717a8b450324115"},
|
||||||
{file = "Pillow-9.4.0-cp38-cp38-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d197df5489004db87d90b918033edbeee0bd6df3848a204bca3ff0a903bef837"},
|
{file = "Pillow-9.5.0-cp39-cp39-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:8aca1152d93dcc27dc55395604dcfc55bed5f25ef4c98716a928bacba90d33a3"},
|
||||||
{file = "Pillow-9.4.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0845adc64fe9886db00f5ab68c4a8cd933ab749a87747555cec1c95acea64b0b"},
|
{file = "Pillow-9.5.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:608488bdcbdb4ba7837461442b90ea6f3079397ddc968c31265c1e056964f1ef"},
|
||||||
{file = "Pillow-9.4.0-cp38-cp38-manylinux_2_28_aarch64.whl", hash = "sha256:e1339790c083c5a4de48f688b4841f18df839eb3c9584a770cbd818b33e26d5d"},
|
{file = "Pillow-9.5.0-cp39-cp39-manylinux_2_28_aarch64.whl", hash = "sha256:60037a8db8750e474af7ffc9faa9b5859e6c6d0a50e55c45576bf28be7419705"},
|
||||||
{file = "Pillow-9.4.0-cp38-cp38-manylinux_2_28_x86_64.whl", hash = "sha256:a96e6e23f2b79433390273eaf8cc94fec9c6370842e577ab10dabdcc7ea0a66b"},
|
{file = "Pillow-9.5.0-cp39-cp39-manylinux_2_28_x86_64.whl", hash = "sha256:07999f5834bdc404c442146942a2ecadd1cb6292f5229f4ed3b31e0a108746b1"},
|
||||||
{file = "Pillow-9.4.0-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:7cfc287da09f9d2a7ec146ee4d72d6ea1342e770d975e49a8621bf54eaa8f30f"},
|
{file = "Pillow-9.5.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:a127ae76092974abfbfa38ca2d12cbeddcdeac0fb71f9627cc1135bedaf9d51a"},
|
||||||
{file = "Pillow-9.4.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:d7081c084ceb58278dd3cf81f836bc818978c0ccc770cbbb202125ddabec6628"},
|
{file = "Pillow-9.5.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:489f8389261e5ed43ac8ff7b453162af39c3e8abd730af8363587ba64bb2e865"},
|
||||||
{file = "Pillow-9.4.0-cp38-cp38-win32.whl", hash = "sha256:df41112ccce5d47770a0c13651479fbcd8793f34232a2dd9faeccb75eb5d0d0d"},
|
{file = "Pillow-9.5.0-cp39-cp39-win32.whl", hash = "sha256:9b1af95c3a967bf1da94f253e56b6286b50af23392a886720f563c547e48e964"},
|
||||||
{file = "Pillow-9.4.0-cp38-cp38-win_amd64.whl", hash = "sha256:7a21222644ab69ddd9967cfe6f2bb420b460dae4289c9d40ff9a4896e7c35c9a"},
|
{file = "Pillow-9.5.0-cp39-cp39-win_amd64.whl", hash = "sha256:77165c4a5e7d5a284f10a6efaa39a0ae8ba839da344f20b111d62cc932fa4e5d"},
|
||||||
{file = "Pillow-9.4.0-cp39-cp39-macosx_10_10_x86_64.whl", hash = "sha256:0f3269304c1a7ce82f1759c12ce731ef9b6e95b6df829dccd9fe42912cc48569"},
|
{file = "Pillow-9.5.0-pp38-pypy38_pp73-macosx_10_10_x86_64.whl", hash = "sha256:833b86a98e0ede388fa29363159c9b1a294b0905b5128baf01db683672f230f5"},
|
||||||
{file = "Pillow-9.4.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:cb362e3b0976dc994857391b776ddaa8c13c28a16f80ac6522c23d5257156bed"},
|
{file = "Pillow-9.5.0-pp38-pypy38_pp73-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:aaf305d6d40bd9632198c766fb64f0c1a83ca5b667f16c1e79e1661ab5060140"},
|
||||||
{file = "Pillow-9.4.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a2e0f87144fcbbe54297cae708c5e7f9da21a4646523456b00cc956bd4c65815"},
|
{file = "Pillow-9.5.0-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0852ddb76d85f127c135b6dd1f0bb88dbb9ee990d2cd9aa9e28526c93e794fba"},
|
||||||
{file = "Pillow-9.4.0-cp39-cp39-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:28676836c7796805914b76b1837a40f76827ee0d5398f72f7dcc634bae7c6264"},
|
{file = "Pillow-9.5.0-pp38-pypy38_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:91ec6fe47b5eb5a9968c79ad9ed78c342b1f97a091677ba0e012701add857829"},
|
||||||
{file = "Pillow-9.4.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0884ba7b515163a1a05440a138adeb722b8a6ae2c2b33aea93ea3118dd3a899e"},
|
{file = "Pillow-9.5.0-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:cb841572862f629b99725ebaec3287fc6d275be9b14443ea746c1dd325053cbd"},
|
||||||
{file = "Pillow-9.4.0-cp39-cp39-manylinux_2_28_aarch64.whl", hash = "sha256:53dcb50fbdc3fb2c55431a9b30caeb2f7027fcd2aeb501459464f0214200a503"},
|
{file = "Pillow-9.5.0-pp39-pypy39_pp73-macosx_10_10_x86_64.whl", hash = "sha256:c380b27d041209b849ed246b111b7c166ba36d7933ec6e41175fd15ab9eb1572"},
|
||||||
{file = "Pillow-9.4.0-cp39-cp39-manylinux_2_28_x86_64.whl", hash = "sha256:e8c5cf126889a4de385c02a2c3d3aba4b00f70234bfddae82a5eaa3ee6d5e3e6"},
|
{file = "Pillow-9.5.0-pp39-pypy39_pp73-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:7c9af5a3b406a50e313467e3565fc99929717f780164fe6fbb7704edba0cebbe"},
|
||||||
{file = "Pillow-9.4.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:6c6b1389ed66cdd174d040105123a5a1bc91d0aa7059c7261d20e583b6d8cbd2"},
|
{file = "Pillow-9.5.0-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5671583eab84af046a397d6d0ba25343c00cd50bce03787948e0fff01d4fd9b1"},
|
||||||
{file = "Pillow-9.4.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:0dd4c681b82214b36273c18ca7ee87065a50e013112eea7d78c7a1b89a739153"},
|
{file = "Pillow-9.5.0-pp39-pypy39_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:84a6f19ce086c1bf894644b43cd129702f781ba5751ca8572f08aa40ef0ab7b7"},
|
||||||
{file = "Pillow-9.4.0-cp39-cp39-win32.whl", hash = "sha256:6d9dfb9959a3b0039ee06c1a1a90dc23bac3b430842dcb97908ddde05870601c"},
|
{file = "Pillow-9.5.0-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:1e7723bd90ef94eda669a3c2c19d549874dd5badaeefabefd26053304abe5799"},
|
||||||
{file = "Pillow-9.4.0-cp39-cp39-win_amd64.whl", hash = "sha256:54614444887e0d3043557d9dbc697dbb16cfb5a35d672b7a0fcc1ed0cf1c600b"},
|
{file = "Pillow-9.5.0.tar.gz", hash = "sha256:bf548479d336726d7a0eceb6e767e179fbde37833ae42794602631a070d630f1"},
|
||||||
{file = "Pillow-9.4.0-pp38-pypy38_pp73-macosx_10_10_x86_64.whl", hash = "sha256:b9b752ab91e78234941e44abdecc07f1f0d8f51fb62941d32995b8161f68cfe5"},
|
|
||||||
{file = "Pillow-9.4.0-pp38-pypy38_pp73-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d3b56206244dc8711f7e8b7d6cad4663917cd5b2d950799425076681e8766286"},
|
|
||||||
{file = "Pillow-9.4.0-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:aabdab8ec1e7ca7f1434d042bf8b1e92056245fb179790dc97ed040361f16bfd"},
|
|
||||||
{file = "Pillow-9.4.0-pp38-pypy38_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:db74f5562c09953b2c5f8ec4b7dfd3f5421f31811e97d1dbc0a7c93d6e3a24df"},
|
|
||||||
{file = "Pillow-9.4.0-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:e9d7747847c53a16a729b6ee5e737cf170f7a16611c143d95aa60a109a59c336"},
|
|
||||||
{file = "Pillow-9.4.0-pp39-pypy39_pp73-macosx_10_10_x86_64.whl", hash = "sha256:b52ff4f4e002f828ea6483faf4c4e8deea8d743cf801b74910243c58acc6eda3"},
|
|
||||||
{file = "Pillow-9.4.0-pp39-pypy39_pp73-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:575d8912dca808edd9acd6f7795199332696d3469665ef26163cd090fa1f8bfa"},
|
|
||||||
{file = "Pillow-9.4.0-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c3c4ed2ff6760e98d262e0cc9c9a7f7b8a9f61aa4d47c58835cdaf7b0b8811bb"},
|
|
||||||
{file = "Pillow-9.4.0-pp39-pypy39_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:e621b0246192d3b9cb1dc62c78cfa4c6f6d2ddc0ec207d43c0dedecb914f152a"},
|
|
||||||
{file = "Pillow-9.4.0-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:8f127e7b028900421cad64f51f75c051b628db17fb00e099eb148761eed598c9"},
|
|
||||||
{file = "Pillow-9.4.0.tar.gz", hash = "sha256:a1c2d7780448eb93fbcc3789bf3916aa5720d942e37945f4056680317f1cd23e"},
|
|
||||||
]
|
]
|
||||||
|
|
||||||
 [package.extras]
-docs = ["furo", "olefile", "sphinx (>=2.4)", "sphinx-copybutton", "sphinx-inline-tabs", "sphinx-issues (>=3.0.1)", "sphinx-removed-in", "sphinxext-opengraph"]
+docs = ["furo", "olefile", "sphinx (>=2.4)", "sphinx-copybutton", "sphinx-inline-tabs", "sphinx-removed-in", "sphinxext-opengraph"]
 tests = ["check-manifest", "coverage", "defusedxml", "markdown2", "olefile", "packaging", "pyroma", "pytest", "pytest-cov", "pytest-timeout"]

 [[package]]
@@ -1796,25 +1785,25 @@ twisted = ["twisted"]

 [[package]]
 name = "psycopg2"
-version = "2.9.5"
+version = "2.9.6"
 description = "psycopg2 - Python-PostgreSQL Database Adapter"
 category = "main"
 optional = true
 python-versions = ">=3.6"
 files = [
-    {file = "psycopg2-2.9.5-cp310-cp310-win32.whl", hash = "sha256:d3ef67e630b0de0779c42912fe2cbae3805ebaba30cda27fea2a3de650a9414f"},
+    {file = "psycopg2-2.9.6-cp310-cp310-win32.whl", hash = "sha256:f7a7a5ee78ba7dc74265ba69e010ae89dae635eea0e97b055fb641a01a31d2b1"},
-    {file = "psycopg2-2.9.5-cp310-cp310-win_amd64.whl", hash = "sha256:4cb9936316d88bfab614666eb9e32995e794ed0f8f6b3b718666c22819c1d7ee"},
+    {file = "psycopg2-2.9.6-cp310-cp310-win_amd64.whl", hash = "sha256:f75001a1cbbe523e00b0ef896a5a1ada2da93ccd752b7636db5a99bc57c44494"},
-    {file = "psycopg2-2.9.5-cp311-cp311-win32.whl", hash = "sha256:093e3894d2d3c592ab0945d9eba9d139c139664dcf83a1c440b8a7aa9bb21955"},
+    {file = "psycopg2-2.9.6-cp311-cp311-win32.whl", hash = "sha256:53f4ad0a3988f983e9b49a5d9765d663bbe84f508ed655affdb810af9d0972ad"},
-    {file = "psycopg2-2.9.5-cp311-cp311-win_amd64.whl", hash = "sha256:920bf418000dd17669d2904472efeab2b20546efd0548139618f8fa305d1d7ad"},
+    {file = "psycopg2-2.9.6-cp311-cp311-win_amd64.whl", hash = "sha256:b81fcb9ecfc584f661b71c889edeae70bae30d3ef74fa0ca388ecda50b1222b7"},
-    {file = "psycopg2-2.9.5-cp36-cp36m-win32.whl", hash = "sha256:b9ac1b0d8ecc49e05e4e182694f418d27f3aedcfca854ebd6c05bb1cffa10d6d"},
+    {file = "psycopg2-2.9.6-cp36-cp36m-win32.whl", hash = "sha256:11aca705ec888e4f4cea97289a0bf0f22a067a32614f6ef64fcf7b8bfbc53744"},
-    {file = "psycopg2-2.9.5-cp36-cp36m-win_amd64.whl", hash = "sha256:fc04dd5189b90d825509caa510f20d1d504761e78b8dfb95a0ede180f71d50e5"},
+    {file = "psycopg2-2.9.6-cp36-cp36m-win_amd64.whl", hash = "sha256:36c941a767341d11549c0fbdbb2bf5be2eda4caf87f65dfcd7d146828bd27f39"},
-    {file = "psycopg2-2.9.5-cp37-cp37m-win32.whl", hash = "sha256:922cc5f0b98a5f2b1ff481f5551b95cd04580fd6f0c72d9b22e6c0145a4840e0"},
+    {file = "psycopg2-2.9.6-cp37-cp37m-win32.whl", hash = "sha256:869776630c04f335d4124f120b7fb377fe44b0a7645ab3c34b4ba42516951889"},
-    {file = "psycopg2-2.9.5-cp37-cp37m-win_amd64.whl", hash = "sha256:1e5a38aa85bd660c53947bd28aeaafb6a97d70423606f1ccb044a03a1203fe4a"},
+    {file = "psycopg2-2.9.6-cp37-cp37m-win_amd64.whl", hash = "sha256:a8ad4a47f42aa6aec8d061fdae21eaed8d864d4bb0f0cade5ad32ca16fcd6258"},
-    {file = "psycopg2-2.9.5-cp38-cp38-win32.whl", hash = "sha256:f5b6320dbc3cf6cfb9f25308286f9f7ab464e65cfb105b64cc9c52831748ced2"},
+    {file = "psycopg2-2.9.6-cp38-cp38-win32.whl", hash = "sha256:2362ee4d07ac85ff0ad93e22c693d0f37ff63e28f0615a16b6635a645f4b9214"},
-    {file = "psycopg2-2.9.5-cp38-cp38-win_amd64.whl", hash = "sha256:1a5c7d7d577e0eabfcf15eb87d1e19314c8c4f0e722a301f98e0e3a65e238b4e"},
+    {file = "psycopg2-2.9.6-cp38-cp38-win_amd64.whl", hash = "sha256:d24ead3716a7d093b90b27b3d73459fe8cd90fd7065cf43b3c40966221d8c394"},
-    {file = "psycopg2-2.9.5-cp39-cp39-win32.whl", hash = "sha256:322fd5fca0b1113677089d4ebd5222c964b1760e361f151cbb2706c4912112c5"},
+    {file = "psycopg2-2.9.6-cp39-cp39-win32.whl", hash = "sha256:1861a53a6a0fd248e42ea37c957d36950da00266378746588eab4f4b5649e95f"},
-    {file = "psycopg2-2.9.5-cp39-cp39-win_amd64.whl", hash = "sha256:190d51e8c1b25a47484e52a79638a8182451d6f6dff99f26ad9bd81e5359a0fa"},
+    {file = "psycopg2-2.9.6-cp39-cp39-win_amd64.whl", hash = "sha256:ded2faa2e6dfb430af7713d87ab4abbfc764d8d7fb73eafe96a24155f906ebf5"},
-    {file = "psycopg2-2.9.5.tar.gz", hash = "sha256:a5246d2e683a972e2187a8714b5c2cf8156c064629f9a9b1a873c1730d9e245a"},
+    {file = "psycopg2-2.9.6.tar.gz", hash = "sha256:f15158418fd826831b28585e2ab48ed8df2d0d98f502a2b4fe619e7d5ca29011"},
 ]

 [[package]]
@@ -3082,14 +3071,14 @@ files = [

 [[package]]
 name = "types-pyopenssl"
-version = "23.1.0.0"
+version = "23.1.0.2"
 description = "Typing stubs for pyOpenSSL"
 category = "dev"
 optional = false
 python-versions = "*"
 files = [
-    {file = "types-pyOpenSSL-23.1.0.0.tar.gz", hash = "sha256:acc153718bff497e8f6ca3beecb5ea7a3087c796e40d569fded8bafbfca73605"},
+    {file = "types-pyOpenSSL-23.1.0.2.tar.gz", hash = "sha256:20b80971b86240e8432a1832bd8124cea49c3088c7bfc77dfd23be27ffe4a517"},
-    {file = "types_pyOpenSSL-23.1.0.0-py3-none-any.whl", hash = "sha256:9dacec020a3484ef5e4ea4bd9d403a981765b80821d5a40b790b2ba2f09d58db"},
+    {file = "types_pyOpenSSL-23.1.0.2-py3-none-any.whl", hash = "sha256:b050641aeff6dfebf231ad719bdac12d53b8ee818d4afb67b886333484629957"},
 ]

 [package.dependencies]
@@ -89,7 +89,7 @@ manifest-path = "rust/Cargo.toml"

 [tool.poetry]
 name = "matrix-synapse"
-version = "1.81.0"
+version = "1.82.0rc1"
 description = "Homeserver for the Matrix decentralised comms protocol"
 authors = ["Matrix.org Team and Contributors <packages@matrix.org>"]
 license = "Apache-2.0"
@@ -64,7 +64,6 @@ from synapse.config.homeserver import HomeServerConfig
 from synapse.config.server import ListenerConfig, ManholeConfig, TCPListenerConfig
 from synapse.crypto import context_factory
 from synapse.events.presence_router import load_legacy_presence_router
-from synapse.events.spamcheck import load_legacy_spam_checkers
 from synapse.events.third_party_rules import load_legacy_third_party_event_rules
 from synapse.handlers.auth import load_legacy_password_auth_providers
 from synapse.http.site import SynapseSite
@@ -73,6 +72,7 @@ from synapse.logging.opentracing import init_tracer
 from synapse.metrics import install_gc_manager, register_threadpool
 from synapse.metrics.background_process_metrics import wrap_as_background_process
 from synapse.metrics.jemalloc import setup_jemalloc_stats
+from synapse.module_api.callbacks.spamchecker_callbacks import load_legacy_spam_checkers
 from synapse.types import ISynapseReactor
 from synapse.util import SYNAPSE_VERSION
 from synapse.util.caches.lrucache import setup_expire_lru_cache_entries
@@ -11,9 +11,10 @@
 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 # See the License for the specific language governing permissions and
 # limitations under the License.
-from typing import Any, Iterable
+from typing import Any, Dict, Iterable, Type, TypeVar

 import jsonschema
+from pydantic import BaseModel, ValidationError, parse_obj_as

 from synapse.config._base import ConfigError
 from synapse.types import JsonDict
@@ -64,3 +65,28 @@ def json_error_to_config_error(
         else:
             path.append(str(p))
     return ConfigError(e.message, path)
+
+
+Model = TypeVar("Model", bound=BaseModel)
+
+
+def parse_and_validate_mapping(
+    config: Any,
+    model_type: Type[Model],
+) -> Dict[str, Model]:
+    """Parse `config` as a mapping from strings to a given `Model` type.
+
+    Args:
+        config: The configuration data to check
+        model_type: The BaseModel to validate and parse against.
+
+    Returns:
+        Fully validated and parsed Dict[str, Model].
+
+    Raises:
+        ConfigError, if given improper input.
+    """
+    try:
+        # type-ignore: mypy doesn't like constructing `Dict[str, model_type]` because
+        # `model_type` is a runtime variable. Pydantic is fine with this.
+        instances = parse_obj_as(Dict[str, model_type], config)  # type: ignore[valid-type]
+    except ValidationError as e:
+        raise ConfigError(str(e)) from e
+    return instances
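The new `parse_and_validate_mapping` helper delegates the type checking to Pydantic's `parse_obj_as` and re-raises validation failures as `ConfigError`. A stdlib-only sketch of the same validate-a-mapping idea (the `Endpoint` dataclass and the local `ConfigError` class here are illustrative stand-ins, not Synapse's real types):

```python
from dataclasses import dataclass
from typing import Any, Dict


class ConfigError(Exception):
    """Illustrative stand-in for synapse.config._base.ConfigError."""


@dataclass(frozen=True)
class Endpoint:
    # Hypothetical model standing in for a Pydantic BaseModel subclass.
    host: str
    port: int
    tls: bool = False


def parse_and_validate_mapping(config: Any, model_type: type) -> Dict[str, Any]:
    # Reject anything that is not a mapping of mappings, mirroring the
    # strictness that Pydantic enforces in the real helper.
    if not isinstance(config, dict):
        raise ConfigError("expected a mapping")
    instances: Dict[str, Any] = {}
    for name, value in config.items():
        if not isinstance(value, dict):
            raise ConfigError(f"{name}: expected a mapping")
        try:
            # Construct the model; missing/unexpected fields raise TypeError.
            instances[name] = model_type(**value)
        except TypeError as e:
            raise ConfigError(f"{name}: {e}") from e
    return instances


parsed = parse_and_validate_mapping(
    {"worker1": {"host": "localhost", "port": 8034}}, Endpoint
)
```

As in the real helper, a malformed entry surfaces as a single `ConfigError` naming the offending key rather than an opaque traceback.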
@@ -18,6 +18,7 @@ import logging
 from typing import Any, Dict, List, Union

 import attr
+from pydantic import BaseModel, Extra, StrictBool, StrictInt, StrictStr

 from synapse.config._base import (
     Config,
@@ -25,6 +26,7 @@ from synapse.config._base import (
     RoutableShardedWorkerHandlingConfig,
     ShardedWorkerHandlingConfig,
 )
+from synapse.config._util import parse_and_validate_mapping
 from synapse.config.server import (
     DIRECT_TCP_ERROR,
     TCPListenerConfig,
@@ -50,13 +52,43 @@ def _instance_to_list_converter(obj: Union[str, List[str]]) -> List[str]:
     return obj


-@attr.s(auto_attribs=True)
-class InstanceLocationConfig:
+class ConfigModel(BaseModel):
+    """A custom version of Pydantic's BaseModel which
+
+     - ignores unknown fields and
+     - does not allow fields to be overwritten after construction,
+
+    but otherwise uses Pydantic's default behaviour.
+
+    For now, ignore unknown fields. In the future, we could change this so that unknown
+    config values cause a ValidationError, provided the error messages are meaningful to
+    server operators.
+
+    Subclassing in this way is recommended by
+    https://pydantic-docs.helpmanual.io/usage/model_config/#change-behaviour-globally
+    """
+
+    class Config:
+        # By default, ignore fields that we don't recognise.
+        extra = Extra.ignore
+        # By default, don't allow fields to be reassigned after parsing.
+        allow_mutation = False
+
+
+class InstanceLocationConfig(ConfigModel):
     """The host and port to talk to an instance via HTTP replication."""

-    host: str
-    port: int
-    tls: bool = False
+    host: StrictStr
+    port: StrictInt
+    tls: StrictBool = False
+
+    def scheme(self) -> str:
+        """Hardcode a retrievable scheme based on self.tls"""
+        return "https" if self.tls else "http"
+
+    def netloc(self) -> str:
+        """Nicely format the network location data"""
+        return f"{self.host}:{self.port}"


 @attr.s
@@ -183,10 +215,12 @@ class WorkerConfig(Config):
         )

         # A map from instance name to host/port of their HTTP replication endpoint.
-        instance_map = config.get("instance_map") or {}
-        self.instance_map = {
-            name: InstanceLocationConfig(**c) for name, c in instance_map.items()
-        }
+        self.instance_map: Dict[
+            str, InstanceLocationConfig
+        ] = parse_and_validate_mapping(
+            config.get("instance_map", {}),
+            InstanceLocationConfig,
+        )

         # Map from type of streams to source, c.f. WriterLocations.
         writers = config.get("stream_writers") or {}
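The reworked `InstanceLocationConfig` pairs strictly-typed fields with the `scheme()`/`netloc()` helpers for building replication URLs. A minimal stdlib-only sketch of that shape (a plain dataclass standing in for the Pydantic model; field strictness is not reproduced here):

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class InstanceLocation:
    # Illustrative stand-in for InstanceLocationConfig; the real class is a
    # Pydantic model whose StrictStr/StrictInt/StrictBool fields reject
    # mistyped YAML values instead of silently coercing them.
    host: str
    port: int
    tls: bool = False

    def scheme(self) -> str:
        # Derive the URL scheme from the TLS flag, as the new config class does.
        return "https" if self.tls else "http"

    def netloc(self) -> str:
        # Nicely format the network location data.
        return f"{self.host}:{self.port}"


loc = InstanceLocation(host="worker1.example.com", port=8034, tls=True)
url = f"{loc.scheme()}://{loc.netloc()}"
```

Callers can then assemble a replication URL without re-deriving the scheme from `tls` at every call site.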
@@ -721,7 +721,7 @@ class PerspectivesKeyFetcher(BaseV2KeyFetcher):
         )

         keys: Dict[str, Dict[str, FetchKeyResult]] = {}
-        added_keys: List[Tuple[str, str, FetchKeyResult]] = []
+        added_keys: Dict[Tuple[str, str], FetchKeyResult] = {}

         time_now_ms = self.clock.time_msec()

@@ -752,9 +752,27 @@ class PerspectivesKeyFetcher(BaseV2KeyFetcher):
                 # we continue to process the rest of the response
                 continue

-            added_keys.extend(
-                (server_name, key_id, key) for key_id, key in processed_response.items()
-            )
+            for key_id, key in processed_response.items():
+                dict_key = (server_name, key_id)
+                if dict_key in added_keys:
+                    already_present_key = added_keys[dict_key]
+                    logger.warning(
+                        "Duplicate server keys for %s (%s) from perspective %s (%r, %r)",
+                        server_name,
+                        key_id,
+                        perspective_name,
+                        already_present_key,
+                        key,
+                    )
+
+                    if already_present_key.valid_until_ts > key.valid_until_ts:
+                        # Favour the entry with the largest valid_until_ts,
+                        # as `old_verify_keys` are also collected from this
+                        # response.
+                        continue
+
+                added_keys[dict_key] = key
+
             keys.setdefault(server_name, {}).update(processed_response)

         await self.store.store_server_verify_keys(
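The deduplication above keeps, for each `(server_name, key_id)` pair, the entry with the largest `valid_until_ts`. The same logic in isolation, with plain integers standing in for `FetchKeyResult` (names here are illustrative, not Synapse's):

```python
from typing import Dict, List, Tuple


def dedupe_keys(
    responses: List[Tuple[str, str, int]],
) -> Dict[Tuple[str, str], int]:
    # Each response entry is (server_name, key_id, valid_until_ts).
    # Keep the entry with the largest valid_until_ts per (server, key) pair,
    # mirroring the "favour the longest-lived key" rule in the fix above.
    added: Dict[Tuple[str, str], int] = {}
    for server_name, key_id, valid_until_ts in responses:
        dict_key = (server_name, key_id)
        if dict_key in added and added[dict_key] > valid_until_ts:
            continue
        added[dict_key] = valid_until_ts
    return added


deduped = dedupe_keys(
    [("a.example", "ed25519:1", 100), ("a.example", "ed25519:1", 50)]
)
```

Because the result is keyed on `(server_name, key_id)`, a perspective server that echoes the same key twice can no longer produce duplicate rows in the subsequent database write.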
@@ -51,7 +51,7 @@ class FederationBase:

         self.server_name = hs.hostname
         self.keyring = hs.get_keyring()
-        self.spam_checker = hs.get_spam_checker()
+        self._spam_checker_module_callbacks = hs.get_module_api_callbacks().spam_checker
         self.store = hs.get_datastores().main
         self._clock = hs.get_clock()
         self._storage_controllers = hs.get_storage_controllers()
@@ -137,9 +137,9 @@ class FederationBase:
             )
             return redacted_event

-        spam_check = await self.spam_checker.check_event_for_spam(pdu)
+        spam_check = await self._spam_checker_module_callbacks.check_event_for_spam(pdu)

-        if spam_check != self.spam_checker.NOT_SPAM:
+        if spam_check != self._spam_checker_module_callbacks.NOT_SPAM:
             logger.warning("Event contains spam, soft-failing %s", pdu.event_id)
             log_kv(
                 {
@@ -130,7 +130,7 @@ class FederationServer(FederationBase):
         super().__init__(hs)

         self.handler = hs.get_federation_handler()
-        self._spam_checker = hs.get_spam_checker()
+        self._spam_checker_module_callbacks = hs.get_module_api_callbacks().spam_checker
         self._federation_event_handler = hs.get_federation_event_handler()
         self.state = hs.get_state_handler()
         self._event_auth_handler = hs.get_event_auth_handler()
@@ -1129,7 +1129,7 @@ class FederationServer(FederationBase):
             logger.warning("event id %s: %s", pdu.event_id, e)
             raise FederationError("ERROR", 403, str(e), affected=pdu.event_id)

-        if await self._spam_checker.should_drop_federated_event(pdu):
+        if await self._spam_checker_module_callbacks.should_drop_federated_event(pdu):
             logger.warning(
                 "Unstaged federated event contains spam, dropping %s", pdu.event_id
             )
@@ -1174,7 +1174,9 @@ class FederationServer(FederationBase):

             origin, event = next

-            if await self._spam_checker.should_drop_federated_event(event):
+            if await self._spam_checker_module_callbacks.should_drop_federated_event(
+                event
+            ):
                 logger.warning(
                     "Staged federated event contains spam, dropping %s",
                     event.event_id,
@@ -513,8 +513,6 @@ class DeviceHandler(DeviceWorkerHandler):
             else:
                 raise

-        await self.hs.get_pusherpool().remove_pushers_by_devices(user_id, device_ids)
-
         # Delete data specific to each device. Not optimised as it is not
         # considered as part of a critical path.
         for device_id in device_ids:
@@ -533,6 +531,10 @@ class DeviceHandler(DeviceWorkerHandler):
                 f"org.matrix.msc3890.local_notification_settings.{device_id}",
             )

+        # Pushers are deleted after `delete_access_tokens_for_user` is called so that
+        # modules using `on_logged_out` hook can use them if needed.
+        await self.hs.get_pusherpool().remove_pushers_by_devices(user_id, device_ids)
+
         await self.notify_device_update(user_id, device_ids)

     async def update_device(self, user_id: str, device_id: str, content: dict) -> None:
@@ -60,7 +60,7 @@ class DirectoryHandler:
             "directory", self.on_directory_query
         )

-        self.spam_checker = hs.get_spam_checker()
+        self._spam_checker_module_callbacks = hs.get_module_api_callbacks().spam_checker

     async def _create_association(
         self,
@@ -145,10 +145,12 @@ class DirectoryHandler:
                 403, "You must be in the room to create an alias for it"
             )

-        spam_check = await self.spam_checker.user_may_create_room_alias(
-            user_id, room_alias
-        )
-        if spam_check != self.spam_checker.NOT_SPAM:
+        spam_check = (
+            await self._spam_checker_module_callbacks.user_may_create_room_alias(
+                user_id, room_alias
+            )
+        )
+        if spam_check != self._spam_checker_module_callbacks.NOT_SPAM:
             raise AuthError(
                 403,
                 "This user is not permitted to create this alias",
@@ -444,7 +446,9 @@ class DirectoryHandler:
         """
         user_id = requester.user.to_string()

-        spam_check = await self.spam_checker.user_may_publish_room(user_id, room_id)
+        spam_check = await self._spam_checker_module_callbacks.user_may_publish_room(
+            user_id, room_id
+        )
         if spam_check != NOT_SPAM:
             raise AuthError(
                 403,
@@ -141,7 +141,7 @@ class FederationHandler:
         self.server_name = hs.hostname
         self.keyring = hs.get_keyring()
         self.is_mine_id = hs.is_mine_id
-        self.spam_checker = hs.get_spam_checker()
+        self._spam_checker_module_callbacks = hs.get_module_api_callbacks().spam_checker
         self.event_creation_handler = hs.get_event_creation_handler()
         self.event_builder_factory = hs.get_event_builder_factory()
         self._event_auth_handler = hs.get_event_auth_handler()
@@ -1042,7 +1042,7 @@ class FederationHandler:
         if self.hs.config.server.block_non_admin_invites:
             raise SynapseError(403, "This server does not accept room invites")

-        spam_check = await self.spam_checker.user_may_invite(
+        spam_check = await self._spam_checker_module_callbacks.user_may_invite(
             event.sender, event.state_key, event.room_id
         )
         if spam_check != NOT_SPAM:
@@ -508,7 +508,7 @@ class EventCreationHandler:

         self._bulk_push_rule_evaluator = hs.get_bulk_push_rule_evaluator()

-        self.spam_checker = hs.get_spam_checker()
+        self._spam_checker_module_callbacks = hs.get_module_api_callbacks().spam_checker
         self.third_party_event_rules: "ThirdPartyEventRules" = (
             self.hs.get_third_party_event_rules()
         )
@@ -1035,8 +1035,12 @@ class EventCreationHandler:
                 event.sender,
             )

-        spam_check_result = await self.spam_checker.check_event_for_spam(event)
-        if spam_check_result != self.spam_checker.NOT_SPAM:
+        spam_check_result = (
+            await self._spam_checker_module_callbacks.check_event_for_spam(
+                event
+            )
+        )
+        if spam_check_result != self._spam_checker_module_callbacks.NOT_SPAM:
             if isinstance(spam_check_result, tuple):
                 try:
                     [code, dict] = spam_check_result
@@ -110,7 +110,7 @@ class RegistrationHandler:
         self._server_notices_mxid = hs.config.servernotices.server_notices_mxid
         self._server_name = hs.hostname

-        self.spam_checker = hs.get_spam_checker()
+        self._spam_checker_module_callbacks = hs.get_module_api_callbacks().spam_checker

         if hs.config.worker.worker_app:
             self._register_client = ReplicationRegisterServlet.make_client(hs)
@@ -259,7 +259,7 @@ class RegistrationHandler:

         await self.check_registration_ratelimit(address)

-        result = await self.spam_checker.check_registration_for_spam(
+        result = await self._spam_checker_module_callbacks.check_registration_for_spam(
             threepid,
             localpart,
             user_agent_ips or [],
@@ -106,7 +106,7 @@ class RoomCreationHandler:
         self.auth_blocking = hs.get_auth_blocking()
         self.clock = hs.get_clock()
         self.hs = hs
-        self.spam_checker = hs.get_spam_checker()
+        self._spam_checker_module_callbacks = hs.get_module_api_callbacks().spam_checker
         self.event_creation_handler = hs.get_event_creation_handler()
         self.room_member_handler = hs.get_room_member_handler()
         self._event_auth_handler = hs.get_event_auth_handler()
@@ -449,7 +449,9 @@ class RoomCreationHandler:
         """
         user_id = requester.user.to_string()

-        spam_check = await self.spam_checker.user_may_create_room(user_id)
+        spam_check = await self._spam_checker_module_callbacks.user_may_create_room(
+            user_id
+        )
         if spam_check != NOT_SPAM:
             raise SynapseError(
                 403,
@@ -761,7 +763,9 @@ class RoomCreationHandler:
         )

         if not is_requester_admin:
-            spam_check = await self.spam_checker.user_may_create_room(user_id)
+            spam_check = await self._spam_checker_module_callbacks.user_may_create_room(
+                user_id
+            )
             if spam_check != NOT_SPAM:
                 raise SynapseError(
                     403,
@@ -96,7 +96,7 @@ class RoomMemberHandler(metaclass=abc.ABCMeta):
         self.member_as_limiter = Linearizer(max_count=10, name="member_as_limiter")

         self.clock = hs.get_clock()
-        self.spam_checker = hs.get_spam_checker()
+        self._spam_checker_module_callbacks = hs.get_module_api_callbacks().spam_checker
         self.third_party_event_rules = hs.get_third_party_event_rules()
         self._server_notices_mxid = self.config.servernotices.server_notices_mxid
         self._enable_lookup = hs.config.registration.enable_3pid_lookup
@@ -820,7 +820,7 @@ class RoomMemberHandler(metaclass=abc.ABCMeta):
             )
             block_invite_result = (Codes.FORBIDDEN, {})

-        spam_check = await self.spam_checker.user_may_invite(
+        spam_check = await self._spam_checker_module_callbacks.user_may_invite(
             requester.user.to_string(), target_id, room_id
         )
         if spam_check != NOT_SPAM:
@@ -954,9 +954,11 @@ class RoomMemberHandler(metaclass=abc.ABCMeta):
                 # a room then they're allowed to join it.
                 and not new_room
             ):
-                spam_check = await self.spam_checker.user_may_join_room(
-                    target.to_string(), room_id, is_invited=inviter is not None
-                )
+                spam_check = (
+                    await self._spam_checker_module_callbacks.user_may_join_room(
+                        target.to_string(), room_id, is_invited=inviter is not None
+                    )
+                )
                 if spam_check != NOT_SPAM:
                     raise SynapseError(
                         403,
@@ -1564,12 +1566,14 @@ class RoomMemberHandler(metaclass=abc.ABCMeta):
             )
         else:
             # Check if the spamchecker(s) allow this invite to go through.
-            spam_check = await self.spam_checker.user_may_send_3pid_invite(
-                inviter_userid=requester.user.to_string(),
-                medium=medium,
-                address=address,
-                room_id=room_id,
-            )
+            spam_check = (
+                await self._spam_checker_module_callbacks.user_may_send_3pid_invite(
+                    inviter_userid=requester.user.to_string(),
+                    medium=medium,
+                    address=address,
+                    room_id=room_id,
+                )
+            )
             if spam_check != NOT_SPAM:
                 raise SynapseError(
                     403,
@@ -94,7 +94,7 @@ class UserDirectoryHandler(StateDeltasHandler):
         self.is_mine_id = hs.is_mine_id
         self.update_user_directory = hs.config.worker.should_update_user_directory
         self.search_all_users = hs.config.userdirectory.user_directory_search_all_users
-        self.spam_checker = hs.get_spam_checker()
+        self._spam_checker_module_callbacks = hs.get_module_api_callbacks().spam_checker
         self._hs = hs

         # The current position in the current_state_delta stream
@@ -149,7 +149,9 @@ class UserDirectoryHandler(StateDeltasHandler):
         # Remove any spammy users from the results.
         non_spammy_users = []
         for user in results["results"]:
-            if not await self.spam_checker.check_username_for_spam(user):
+            if not await self._spam_checker_module_callbacks.check_username_for_spam(
+                user
+            ):
                 non_spammy_users.append(user)
         results["results"] = non_spammy_users

@@ -312,35 +312,27 @@ class BlacklistingAgentWrapper(Agent):
         )


-class SimpleHttpClient:
+class BaseHttpClient:
     """
     A simple, no-frills HTTP client with methods that wrap up common ways of
-    using HTTP in Matrix
+    using HTTP in Matrix. Does not come with a default Agent, subclasses will need to
+    define their own.
+
+    Args:
+        hs: The HomeServer instance to pass in
+        treq_args: Extra keyword arguments to be given to treq.request.
     """

+    agent: IAgent
+
     def __init__(
         self,
         hs: "HomeServer",
         treq_args: Optional[Dict[str, Any]] = None,
-        ip_whitelist: Optional[IPSet] = None,
-        ip_blacklist: Optional[IPSet] = None,
-        use_proxy: bool = False,
     ):
-        """
-        Args:
-            hs
-            treq_args: Extra keyword arguments to be given to treq.request.
-            ip_blacklist: The IP addresses that are blacklisted that
-                we may not request.
-            ip_whitelist: The whitelisted IP addresses, that we can
-                request if it were otherwise caught in a blacklist.
-            use_proxy: Whether proxy settings should be discovered and used
-                from conventional environment variables.
-        """
         self.hs = hs
+        self.reactor = hs.get_reactor()

-        self._ip_whitelist = ip_whitelist
-        self._ip_blacklist = ip_blacklist
         self._extra_treq_args = treq_args or {}
         self.clock = hs.get_clock()
@@ -356,44 +348,6 @@ class SimpleHttpClient:
         # reactor.
         self._cooperator = Cooperator(scheduler=_make_scheduler(hs.get_reactor()))

-        if self._ip_blacklist:
-            # If we have an IP blacklist, we need to use a DNS resolver which
-            # filters out blacklisted IP addresses, to prevent DNS rebinding.
-            self.reactor: ISynapseReactor = BlacklistingReactorWrapper(
-                hs.get_reactor(), self._ip_whitelist, self._ip_blacklist
-            )
-        else:
-            self.reactor = hs.get_reactor()
-
-        # the pusher makes lots of concurrent SSL connections to sygnal, and
-        # tends to do so in batches, so we need to allow the pool to keep
-        # lots of idle connections around.
-        pool = HTTPConnectionPool(self.reactor)
-        # XXX: The justification for using the cache factor here is that larger instances
-        # will need both more cache and more connections.
-        # Still, this should probably be a separate dial
-        pool.maxPersistentPerHost = max(int(100 * hs.config.caches.global_factor), 5)
-        pool.cachedConnectionTimeout = 2 * 60
-
-        self.agent: IAgent = ProxyAgent(
-            self.reactor,
-            hs.get_reactor(),
-            connectTimeout=15,
-            contextFactory=self.hs.get_http_client_context_factory(),
-            pool=pool,
-            use_proxy=use_proxy,
-        )
-
-        if self._ip_blacklist:
-            # If we have an IP blacklist, we then install the blacklisting Agent
-            # which prevents direct access to IP addresses, that are not caught
-            # by the DNS resolution.
-            self.agent = BlacklistingAgentWrapper(
-                self.agent,
-                ip_blacklist=self._ip_blacklist,
-                ip_whitelist=self._ip_whitelist,
-            )
-
     async def request(
         self,
         method: str,
@@ -799,6 +753,72 @@ class SimpleHttpClient:
         )


+class SimpleHttpClient(BaseHttpClient):
+    """
+    An HTTP client capable of crossing a proxy and respecting a block/allow list.
+
+    This also configures a larger / longer lasting HTTP connection pool.
+
+    Args:
+        hs: The HomeServer instance to pass in
+        treq_args: Extra keyword arguments to be given to treq.request.
+        ip_blacklist: The IP addresses that are blacklisted that
+            we may not request.
+        ip_whitelist: The whitelisted IP addresses, that we can
+            request if it were otherwise caught in a blacklist.
+        use_proxy: Whether proxy settings should be discovered and used
+            from conventional environment variables.
+    """
+
+    def __init__(
+        self,
+        hs: "HomeServer",
+        treq_args: Optional[Dict[str, Any]] = None,
+        ip_whitelist: Optional[IPSet] = None,
+        ip_blacklist: Optional[IPSet] = None,
+        use_proxy: bool = False,
+    ):
+        super().__init__(hs, treq_args=treq_args)
+        self._ip_whitelist = ip_whitelist
+        self._ip_blacklist = ip_blacklist
+
+        if self._ip_blacklist:
+            # If we have an IP blacklist, we need to use a DNS resolver which
+            # filters out blacklisted IP addresses, to prevent DNS rebinding.
+            self.reactor: ISynapseReactor = BlacklistingReactorWrapper(
+                self.reactor, self._ip_whitelist, self._ip_blacklist
+            )
+
+        # the pusher makes lots of concurrent SSL connections to Sygnal, and tends to
+        # do so in batches, so we need to allow the pool to keep lots of idle
+        # connections around.
+        pool = HTTPConnectionPool(self.reactor)
+        # XXX: The justification for using the cache factor here is that larger
+        # instances will need both more cache and more connections.
+        # Still, this should probably be a separate dial
+        pool.maxPersistentPerHost = max(int(100 * hs.config.caches.global_factor), 5)
+        pool.cachedConnectionTimeout = 2 * 60
+
+        self.agent: IAgent = ProxyAgent(
+            self.reactor,
+            hs.get_reactor(),
+            connectTimeout=15,
+            contextFactory=self.hs.get_http_client_context_factory(),
+            pool=pool,
+            use_proxy=use_proxy,
+        )
+
+        if self._ip_blacklist:
+            # If we have an IP blacklist, we then install the blacklisting Agent
+            # which prevents direct access to IP addresses, that are not caught
+            # by the DNS resolution.
+            self.agent = BlacklistingAgentWrapper(
+                self.agent,
+                ip_blacklist=self._ip_blacklist,
+                ip_whitelist=self._ip_whitelist,
+            )
+
+
 def _timeout_to_request_timed_out_error(f: Failure) -> Failure:
     if f.check(twisted_error.TimeoutError, twisted_error.ConnectingCancelledError):
         # The TCP connection has its own timeout (set by the 'connectTimeout' param
@@ -46,6 +46,13 @@ from twisted.internet import defer, interfaces
 from twisted.internet.defer import CancelledError
 from twisted.python import failure
 from twisted.web import resource
+
+try:
+    from twisted.web.pages import notFound
+except ImportError:
+    from twisted.web.resource import NoResource as notFound  # type: ignore[assignment]
+
+from twisted.web.resource import IResource
 from twisted.web.server import NOT_DONE_YET, Request
 from twisted.web.static import File
 from twisted.web.util import redirectTo
@@ -569,6 +576,9 @@ class StaticResource(File):
         set_clickjacking_protection_headers(request)
         return super().render_GET(request)

+    def directoryListing(self) -> IResource:
+        return notFound()
+

 class UnrecognizedRequestResource(resource.Resource):
     """
@@ -36,7 +36,6 @@ from twisted.internet.defer import Deferred
 from twisted.internet.interfaces import IConsumer
 from twisted.protocols.basic import FileSender

-import synapse
 from synapse.api.errors import NotFoundError
 from synapse.logging.context import defer_to_thread, make_deferred_yieldable
 from synapse.util import Clock
@@ -74,7 +73,7 @@ class MediaStorage:
         self.local_media_directory = local_media_directory
         self.filepaths = filepaths
         self.storage_providers = storage_providers
-        self.spam_checker = hs.get_spam_checker()
+        self._spam_checker_module_callbacks = hs.get_module_api_callbacks().spam_checker
         self.clock = hs.get_clock()

     async def store_file(self, source: IO, file_info: FileInfo) -> str:
@@ -145,10 +144,10 @@ class MediaStorage:
             f.flush()
             f.close()

-            spam_check = await self.spam_checker.check_media_file_for_spam(
+            spam_check = await self._spam_checker_module_callbacks.check_media_file_for_spam(
                 ReadableFileWrapper(self.clock, fname), file_info
             )
-            if spam_check != synapse.module_api.NOT_SPAM:
+            if spam_check != self._spam_checker_module_callbacks.NOT_SPAM:
                 logger.info("Blocking media due to spam checker")
                 # Note that we'll delete the stored media, due to the
                 # try/except below. The media also won't be stored in
@@ -44,20 +44,6 @@ from synapse.events.presence_router import (
     GET_USERS_FOR_STATES_CALLBACK,
     PresenceRouter,
 )
-from synapse.events.spamcheck import (
-    CHECK_EVENT_FOR_SPAM_CALLBACK,
-    CHECK_MEDIA_FILE_FOR_SPAM_CALLBACK,
-    CHECK_REGISTRATION_FOR_SPAM_CALLBACK,
-    CHECK_USERNAME_FOR_SPAM_CALLBACK,
-    SHOULD_DROP_FEDERATED_EVENT_CALLBACK,
-    USER_MAY_CREATE_ROOM_ALIAS_CALLBACK,
-    USER_MAY_CREATE_ROOM_CALLBACK,
-    USER_MAY_INVITE_CALLBACK,
-    USER_MAY_JOIN_ROOM_CALLBACK,
-    USER_MAY_PUBLISH_ROOM_CALLBACK,
-    USER_MAY_SEND_3PID_INVITE_CALLBACK,
-    SpamChecker,
-)
 from synapse.events.third_party_rules import (
     CHECK_CAN_DEACTIVATE_USER_CALLBACK,
     CHECK_CAN_SHUTDOWN_ROOM_CALLBACK,
@@ -105,6 +91,20 @@ from synapse.module_api.callbacks.account_validity_callbacks import (
     ON_LEGACY_SEND_MAIL_CALLBACK,
     ON_USER_REGISTRATION_CALLBACK,
 )
+from synapse.module_api.callbacks.spamchecker_callbacks import (
+    CHECK_EVENT_FOR_SPAM_CALLBACK,
+    CHECK_MEDIA_FILE_FOR_SPAM_CALLBACK,
+    CHECK_REGISTRATION_FOR_SPAM_CALLBACK,
+    CHECK_USERNAME_FOR_SPAM_CALLBACK,
+    SHOULD_DROP_FEDERATED_EVENT_CALLBACK,
+    USER_MAY_CREATE_ROOM_ALIAS_CALLBACK,
+    USER_MAY_CREATE_ROOM_CALLBACK,
+    USER_MAY_INVITE_CALLBACK,
+    USER_MAY_JOIN_ROOM_CALLBACK,
+    USER_MAY_PUBLISH_ROOM_CALLBACK,
+    USER_MAY_SEND_3PID_INVITE_CALLBACK,
+    SpamCheckerModuleApiCallbacks,
+)
 from synapse.rest.client.login import LoginResponse
 from synapse.storage import DataStore
 from synapse.storage.background_updates import (
@@ -147,7 +147,7 @@ are loaded into Synapse.
 """

 PRESENCE_ALL_USERS = PresenceRouter.ALL_USERS
-NOT_SPAM = SpamChecker.NOT_SPAM
+NOT_SPAM = SpamCheckerModuleApiCallbacks.NOT_SPAM

 __all__ = [
     "errors",
@@ -271,7 +271,6 @@ class ModuleApi:
         self._public_room_list_manager = PublicRoomListManager(hs)
         self._account_data_manager = AccountDataManager(hs)

-        self._spam_checker = hs.get_spam_checker()
         self._third_party_event_rules = hs.get_third_party_event_rules()
         self._password_auth_provider = hs.get_password_auth_provider()
         self._presence_router = hs.get_presence_router()
@@ -305,7 +304,7 @@ class ModuleApi:

         Added in Synapse v1.37.0.
         """
-        return self._spam_checker.register_callbacks(
+        return self._callbacks.spam_checker.register_callbacks(
             check_event_for_spam=check_event_for_spam,
             should_drop_federated_event=should_drop_federated_event,
             user_may_join_room=user_may_join_room,
@@ -12,11 +12,20 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.

+from typing import TYPE_CHECKING
+
+if TYPE_CHECKING:
+    from synapse.server import HomeServer
+
 from synapse.module_api.callbacks.account_validity_callbacks import (
     AccountValidityModuleApiCallbacks,
 )
+from synapse.module_api.callbacks.spamchecker_callbacks import (
+    SpamCheckerModuleApiCallbacks,
+)


 class ModuleApiCallbacks:
-    def __init__(self) -> None:
+    def __init__(self, hs: "HomeServer") -> None:
         self.account_validity = AccountValidityModuleApiCallbacks()
+        self.spam_checker = SpamCheckerModuleApiCallbacks(hs)
@@ -286,11 +286,10 @@ def load_legacy_spam_checkers(hs: "synapse.server.HomeServer") -> None:
     api.register_spam_checker_callbacks(**hooks)


-class SpamChecker:
+class SpamCheckerModuleApiCallbacks:
     NOT_SPAM: Literal["NOT_SPAM"] = "NOT_SPAM"

     def __init__(self, hs: "synapse.server.HomeServer") -> None:
-        self.hs = hs
         self.clock = hs.get_clock()

         self._check_event_for_spam_callbacks: List[CHECK_EVENT_FOR_SPAM_CALLBACK] = []
@@ -3,7 +3,11 @@

 {% block header %}
 <script src="https://www.recaptcha.net/recaptcha/api.js" async defer></script>
-<link rel="stylesheet" href="/_matrix/static/client/register/style.css">
+<style type="text/css">
+    .g-recaptcha div {
+        margin: auto;
+    }
+</style>
 <script>
 function captchaDone() {
     document.getElementById('registrationForm').submit();
@@ -1,12 +1,8 @@
 {% extends "_base.html" %}
 {% block title %}Authentication{% endblock %}

-{% block header %}
-<link rel="stylesheet" href="/_matrix/static/client/register/style.css">
-{% endblock %}
-
 {% block body %}
-<form id="registrationForm" method="post" action="{{ myurl }}">
+<form method="post" action="{{ myurl }}">
 <div>
     {% if error is defined %}
         <p class="error"><strong>Error: {{ error }}</strong></p>
@@ -27,3 +27,7 @@ body {
     h3 { font-size: .85rem; }
     h4 { font-size: .8rem; }
 }
+
+.error {
+    color: red;
+}
@@ -2,7 +2,12 @@
 {% block title %}Authentication{% endblock %}
 
 {% block header %}
-<link rel="stylesheet" href="/_matrix/static/client/register/style.css">
+<style type="text/css">
+    #registrationForm input {
+        display: block;
+        margin: auto;
+    }
+</style>
 {% endblock %}
 
 {% block body %}
@@ -100,7 +100,6 @@ class ClientRestResource(JsonResource):
        login.register_servlets(hs, client_resource)
        profile.register_servlets(hs, client_resource)
        presence.register_servlets(hs, client_resource)
-        if is_main_process:
-            directory.register_servlets(hs, client_resource)
+        directory.register_servlets(hs, client_resource)
        voip.register_servlets(hs, client_resource)
        if is_main_process:
@@ -134,8 +133,8 @@ class ClientRestResource(JsonResource):
        if is_main_process:
            room_upgrade_rest_servlet.register_servlets(hs, client_resource)
            room_batch.register_servlets(hs, client_resource)
-        if is_main_process:
-            capabilities.register_servlets(hs, client_resource)
+        capabilities.register_servlets(hs, client_resource)
+        if is_main_process:
            account_validity.register_servlets(hs, client_resource)
            relations.register_servlets(hs, client_resource)
            password_policy.register_servlets(hs, client_resource)
@@ -33,6 +33,7 @@ class CapabilitiesRestServlet(RestServlet):
     """End point to expose the capabilities of the server."""
 
     PATTERNS = client_patterns("/capabilities$")
+    CATEGORY = "Client API requests"
 
     def __init__(self, hs: "HomeServer"):
         super().__init__()
@@ -39,12 +39,14 @@ logger = logging.getLogger(__name__)
 
 def register_servlets(hs: "HomeServer", http_server: HttpServer) -> None:
     ClientDirectoryServer(hs).register(http_server)
-    ClientDirectoryListServer(hs).register(http_server)
-    ClientAppserviceDirectoryListServer(hs).register(http_server)
+    if hs.config.worker.worker_app is None:
+        ClientDirectoryListServer(hs).register(http_server)
+        ClientAppserviceDirectoryListServer(hs).register(http_server)
 
 
 class ClientDirectoryServer(RestServlet):
     PATTERNS = client_patterns("/directory/room/(?P<room_alias>[^/]*)$", v1=True)
+    CATEGORY = "Client API requests"
 
     def __init__(self, hs: "HomeServer"):
         super().__init__()
@@ -42,7 +42,6 @@ from synapse.crypto.context_factory import RegularPolicyForHTTPS
 from synapse.crypto.keyring import Keyring
 from synapse.events.builder import EventBuilderFactory
 from synapse.events.presence_router import PresenceRouter
-from synapse.events.spamcheck import SpamChecker
 from synapse.events.third_party_rules import ThirdPartyEventRules
 from synapse.events.utils import EventClientSerializer
 from synapse.federation.federation_client import FederationClient
@@ -687,10 +686,6 @@ class HomeServer(metaclass=abc.ABCMeta):
     def get_stats_handler(self) -> StatsHandler:
         return StatsHandler(self)
 
-    @cache_in_self
-    def get_spam_checker(self) -> SpamChecker:
-        return SpamChecker(self)
-
     @cache_in_self
     def get_third_party_event_rules(self) -> ThirdPartyEventRules:
         return ThirdPartyEventRules(self)
@@ -803,7 +798,7 @@ class HomeServer(metaclass=abc.ABCMeta):
 
     @cache_in_self
     def get_module_api_callbacks(self) -> ModuleApiCallbacks:
-        return ModuleApiCallbacks()
+        return ModuleApiCallbacks(self)
 
     @cache_in_self
     def get_account_data_handler(self) -> AccountDataHandler:
@@ -1,34 +0,0 @@
-<!doctype html>
-<html>
-<head>
-<title> Registration </title>
-<meta http-equiv="X-UA-Compatible" content="IE=edge">
-<meta name="viewport" content="width=device-width, initial-scale=1.0">
-<link rel="stylesheet" href="style.css">
-<script src="js/jquery-3.4.1.min.js"></script>
-<script src="https://www.recaptcha.net/recaptcha/api/js/recaptcha_ajax.js"></script>
-<script src="register_config.js"></script>
-<script src="js/register.js"></script>
-</head>
-<body onload="matrixRegistration.onLoad()">
-<form id="registrationForm" onsubmit="matrixRegistration.signUp(); return false;">
-    <div>
-        Create account:<br/>
-
-        <div style="text-align: center">
-            <input id="desired_user_id" size="32" type="text" placeholder="Matrix ID (e.g. bob)" autocapitalize="off" autocorrect="off" />
-            <br/>
-            <input id="pwd1" size="32" type="password" placeholder="Type a password"/>
-            <br/>
-            <input id="pwd2" size="32" type="password" placeholder="Confirm your password"/>
-            <br/>
-            <span id="feedback" style="color: #f00"></span>
-            <br/>
-            <div id="regcaptcha"></div>
-
-            <button type="submit" style="margin: 10px">Sign up</button>
-        </div>
-    </div>
-</form>
-</body>
-</html>
File diff suppressed because one or more lines are too long
@@ -1,117 +0,0 @@
-window.matrixRegistration = {
-    endpoint: location.origin + "/_matrix/client/api/v1/register"
-};
-
-var setupCaptcha = function() {
-    if (!window.matrixRegistrationConfig) {
-        return;
-    }
-    $.get(matrixRegistration.endpoint, function(response) {
-        var serverExpectsCaptcha = false;
-        for (var i=0; i<response.flows.length; i++) {
-            var flow = response.flows[i];
-            if ("m.login.recaptcha" === flow.type) {
-                serverExpectsCaptcha = true;
-                break;
-            }
-        }
-        if (!serverExpectsCaptcha) {
-            console.log("This server does not require a captcha.");
-            return;
-        }
-        console.log("Setting up ReCaptcha for "+matrixRegistration.endpoint);
-        var public_key = window.matrixRegistrationConfig.recaptcha_public_key;
-        if (public_key === undefined) {
-            console.error("No public key defined for captcha!");
-            setFeedbackString("Misconfigured captcha for server. Contact server admin.");
-            return;
-        }
-        Recaptcha.create(public_key,
-        "regcaptcha",
-        {
-            theme: "red",
-            callback: Recaptcha.focus_response_field
-        });
-        window.matrixRegistration.isUsingRecaptcha = true;
-    }).fail(errorFunc);
-
-};
-
-var submitCaptcha = function(user, pwd) {
-    var challengeToken = Recaptcha.get_challenge();
-    var captchaEntry = Recaptcha.get_response();
-    var data = {
-        type: "m.login.recaptcha",
-        challenge: challengeToken,
-        response: captchaEntry
-    };
-    console.log("Submitting captcha");
-    $.post(matrixRegistration.endpoint, JSON.stringify(data), function(response) {
-        console.log("Success -> "+JSON.stringify(response));
-        submitPassword(user, pwd, response.session);
-    }).fail(function(err) {
-        Recaptcha.reload();
-        errorFunc(err);
-    });
-};
-
-var submitPassword = function(user, pwd, session) {
-    console.log("Registering...");
-    var data = {
-        type: "m.login.password",
-        user: user,
-        password: pwd,
-        session: session
-    };
-    $.post(matrixRegistration.endpoint, JSON.stringify(data), function(response) {
-        matrixRegistration.onRegistered(
-            response.home_server, response.user_id, response.access_token
-        );
-    }).fail(errorFunc);
-};
-
-var errorFunc = function(err) {
-    if (err.responseJSON && err.responseJSON.error) {
-        setFeedbackString(err.responseJSON.error + " (" + err.responseJSON.errcode + ")");
-    }
-    else {
-        setFeedbackString("Request failed: " + err.status);
-    }
-};
-
-var setFeedbackString = function(text) {
-    $("#feedback").text(text);
-};
-
-matrixRegistration.onLoad = function() {
-    setupCaptcha();
-};
-
-matrixRegistration.signUp = function() {
-    var user = $("#desired_user_id").val();
-    if (user.length == 0) {
-        setFeedbackString("Must specify a username.");
-        return;
-    }
-    var pwd1 = $("#pwd1").val();
-    var pwd2 = $("#pwd2").val();
-    if (pwd1.length < 6) {
-        setFeedbackString("Password: min. 6 characters.");
-        return;
-    }
-    if (pwd1 != pwd2) {
-        setFeedbackString("Passwords do not match.");
-        return;
-    }
-    if (window.matrixRegistration.isUsingRecaptcha) {
-        submitCaptcha(user, pwd1);
-    }
-    else {
-        submitPassword(user, pwd1);
-    }
-};
-
-matrixRegistration.onRegistered = function(hs_url, user_id, access_token) {
-    // clobber this function
-    console.warn("onRegistered - This function should be replaced to proceed.");
-};
@@ -1,3 +0,0 @@
-window.matrixRegistrationConfig = {
-    recaptcha_public_key: "YOUR_PUBLIC_KEY"
-};
@@ -1,64 +0,0 @@
-html {
-    height: 100%;
-}
-
-body {
-    height: 100%;
-    font-family: "Myriad Pro", "Myriad", Helvetica, Arial, sans-serif;
-    font-size: 12pt;
-    margin: 0px;
-}
-
-h1 {
-    font-size: 20pt;
-}
-
-a:link { color: #666; }
-a:visited { color: #666; }
-a:hover { color: #000; }
-a:active { color: #000; }
-
-input {
-    width: 100%
-}
-
-textarea, input {
-    font-family: inherit;
-    font-size: inherit;
-}
-
-.smallPrint {
-    color: #888;
-    font-size: 9pt ! important;
-    font-style: italic ! important;
-}
-
-#recaptcha_area {
-    margin: auto
-}
-
-.g-recaptcha div {
-    margin: auto;
-}
-
-#registrationForm {
-    text-align: left;
-    padding: 5px;
-    margin-bottom: 40px;
-    display: inline-block;
-
-    -webkit-border-radius: 10px;
-    -moz-border-radius: 10px;
-    border-radius: 10px;
-
-    -webkit-box-shadow: 0px 0px 20px 0px rgba(0,0,0,0.15);
-    -moz-box-shadow: 0px 0px 20px 0px rgba(0,0,0,0.15);
-    box-shadow: 0px 0px 20px 0px rgba(0,0,0,0.15);
-
-    background-color: #f8f8f8;
-    border: 1px #ccc solid;
-}
-
-.error {
-    color: red;
-}
@@ -58,7 +58,7 @@ from synapse.metrics import register_threadpool
 from synapse.metrics.background_process_metrics import run_as_background_process
 from synapse.storage.background_updates import BackgroundUpdater
 from synapse.storage.engines import BaseDatabaseEngine, PostgresEngine, Sqlite3Engine
-from synapse.storage.types import Connection, Cursor
+from synapse.storage.types import Connection, Cursor, SQLQueryParameters
 from synapse.util.async_helpers import delay_cancellation
 from synapse.util.iterutils import batch_iter
 
@@ -371,10 +371,18 @@ class LoggingTransaction:
         if isinstance(self.database_engine, PostgresEngine):
             from psycopg2.extras import execute_batch
 
+            # TODO: is it safe for values to be Iterable[Iterable[Any]] here?
+            # https://www.psycopg.org/docs/extras.html?highlight=execute_batch#psycopg2.extras.execute_batch
+            # suggests each arg in args should be a sequence or mapping
             self._do_execute(
                 lambda the_sql: execute_batch(self.txn, the_sql, args), sql
             )
         else:
+            # TODO: is it safe for values to be Iterable[Iterable[Any]] here?
+            # https://docs.python.org/3/library/sqlite3.html?highlight=sqlite3#sqlite3.Cursor.executemany
+            # suggests that the outer collection may be iterable, but
+            # https://docs.python.org/3/library/sqlite3.html?highlight=sqlite3#how-to-use-placeholders-to-bind-values-in-sql-queries
+            # suggests that the inner collection should be a sequence or dict.
             self.executemany(sql, args)
 
     def execute_values(
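The comments added above pin down the DBAPI-2 `executemany` contract: the outer argument is an iterable of parameter rows, each row a sequence (or mapping). A minimal sqlite3 sketch of that contract, with an invented table name purely for illustration:

```python
import sqlite3

# In-memory database; "server_keys" is an illustrative table, not Synapse's schema.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE server_keys (server TEXT, key_id TEXT)")

# Outer iterable of rows; each inner item is a sequence matching the placeholders.
rows = [("server1", "ed25519:a"), ("server2", "ed25519:b")]
conn.executemany("INSERT INTO server_keys VALUES (?, ?)", rows)

print(conn.execute("SELECT count(*) FROM server_keys").fetchone()[0])  # → 2
```

psycopg2's `execute_batch` accepts the same shape, which is why the hunk can route both engines through one `args` value.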
@@ -390,14 +398,20 @@ class LoggingTransaction:
         from psycopg2.extras import execute_values
 
         return self._do_execute(
+            # TODO: is it safe for values to be Iterable[Iterable[Any]] here?
+            # https://www.psycopg.org/docs/extras.html?highlight=execute_batch#psycopg2.extras.execute_values says values should be Sequence[Sequence]
             lambda the_sql: execute_values(self.txn, the_sql, values, fetch=fetch),
             sql,
         )
 
-    def execute(self, sql: str, *args: Any) -> None:
-        self._do_execute(self.txn.execute, sql, *args)
+    def execute(self, sql: str, parameters: SQLQueryParameters = ()) -> None:
+        self._do_execute(self.txn.execute, sql, parameters)
 
     def executemany(self, sql: str, *args: Any) -> None:
+        # TODO: we should add a type for *args here. Looking at Cursor.executemany
+        # and DBAPI2 it ought to be Sequence[_Parameter], but we pass in
+        # Iterable[Iterable[Any]] in execute_batch and execute_values above, which mypy
+        # complains about.
         self._do_execute(self.txn.executemany, sql, *args)
 
     def executescript(self, sql: str) -> None:
@@ -129,8 +129,6 @@ class DirectoryWorkerStore(CacheInvalidationWorkerStore):
                 409, "Room alias %s already exists" % room_alias.to_string()
             )
 
-
-class DirectoryStore(DirectoryWorkerStore):
     async def delete_room_alias(self, room_alias: RoomAlias) -> Optional[str]:
         room_id = await self.db_pool.runInteraction(
             "delete_room_alias", self._delete_room_alias_txn, room_alias
@@ -201,3 +199,7 @@ class DirectoryStore(DirectoryWorkerStore):
         await self.db_pool.runInteraction(
             "_update_aliases_for_room_txn", _update_aliases_for_room_txn
         )
+
+
+class DirectoryStore(DirectoryWorkerStore):
+    pass
@@ -114,6 +114,10 @@ class _NoChainCoverIndex(Exception):
 
 
 class EventFederationWorkerStore(SignatureWorkerStore, EventsWorkerStore, SQLBaseStore):
+    # TODO: this attribute comes from EventPushActionWorkerStore. Should we inherit from
+    # that store so that mypy can deduce this for itself?
+    stream_ordering_month_ago: Optional[int]
+
     def __init__(
         self,
         database: DatabasePool,
@@ -1182,8 +1186,8 @@ class EventFederationWorkerStore(SignatureWorkerStore, EventsWorkerStore, SQLBaseStore):
         Throws a StoreError if we have since purged the index for
         stream_orderings from that point.
         """
-        if stream_ordering <= self.stream_ordering_month_ago:  # type: ignore[attr-defined]
+        assert self.stream_ordering_month_ago is not None
+        if stream_ordering <= self.stream_ordering_month_ago:
             raise StoreError(400, f"stream_ordering too old {stream_ordering}")
 
         sql = """
@@ -1231,7 +1235,8 @@ class EventFederationWorkerStore(SignatureWorkerStore, EventsWorkerStore, SQLBaseStore):
 
         # provided the last_change is recent enough, we now clamp the requested
         # stream_ordering to it.
-        if last_change > self.stream_ordering_month_ago:  # type: ignore[attr-defined]
+        assert self.stream_ordering_month_ago is not None
+        if last_change > self.stream_ordering_month_ago:
             stream_ordering = min(last_change, stream_ordering)
 
         return await self._get_forward_extremeties_for_room(room_id, stream_ordering)
@@ -1246,8 +1251,8 @@ class EventFederationWorkerStore(SignatureWorkerStore, EventsWorkerStore, SQLBaseStore):
         Throws a StoreError if we have since purged the index for
         stream_orderings from that point.
         """
-        if stream_ordering <= self.stream_ordering_month_ago:  # type: ignore[attr-defined]
+        assert self.stream_ordering_month_ago is not None
+        if stream_ordering <= self.stream_ordering_month_ago:
             raise StoreError(400, "stream_ordering too old %s" % (stream_ordering,))
 
         sql = """
@@ -1707,9 +1712,7 @@ class EventFederationWorkerStore(SignatureWorkerStore, EventsWorkerStore, SQLBaseStore):
             DELETE FROM stream_ordering_to_exterm
             WHERE stream_ordering < ?
         """
-        txn.execute(
-            sql, (self.stream_ordering_month_ago,)  # type: ignore[attr-defined]
-        )
+        txn.execute(sql, (self.stream_ordering_month_ago,))
 
         await self.db_pool.runInteraction(
             "_delete_old_forward_extrem_cache",
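These hunks trade `# type: ignore[attr-defined]` for a declared `Optional[int]` attribute plus a runtime `assert ... is not None` before each use. A standalone sketch of why that works (class and attribute names here mirror the diff but are simplified, not Synapse's real store):

```python
from typing import Optional

class Store:
    # Declared but possibly unset until a background job populates it,
    # mirroring stream_ordering_month_ago above.
    stream_ordering_month_ago: Optional[int] = None

    def is_too_old(self, stream_ordering: int) -> bool:
        # The assert narrows Optional[int] to int for mypy, and fails fast
        # at runtime if the value was never populated.
        assert self.stream_ordering_month_ago is not None
        return stream_ordering <= self.stream_ordering_month_ago

store = Store()
store.stream_ordering_month_ago = 100
print(store.is_too_old(50))  # → True
```

The assert replaces a blanket type-ignore with a check mypy understands, at the cost of an `AssertionError` if the attribute is read before initialisation.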
@@ -15,7 +15,7 @@
 
 import itertools
 import logging
-from typing import Any, Dict, Iterable, List, Optional, Tuple
+from typing import Any, Dict, Iterable, List, Mapping, Optional, Tuple
 
 from signedjson.key import decode_verify_key_bytes
 
@@ -95,7 +95,7 @@ class KeyStore(SQLBaseStore):
         self,
         from_server: str,
         ts_added_ms: int,
-        verify_keys: Iterable[Tuple[str, str, FetchKeyResult]],
+        verify_keys: Mapping[Tuple[str, str], FetchKeyResult],
     ) -> None:
         """Stores NACL verification keys for remote servers.
         Args:
@@ -108,7 +108,7 @@ class KeyStore(SQLBaseStore):
         key_values = []
         value_values = []
         invalidations = []
-        for server_name, key_id, fetch_result in verify_keys:
+        for (server_name, key_id), fetch_result in verify_keys.items():
             key_values.append((server_name, key_id))
             value_values.append(
                 (
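Switching `verify_keys` from an iterable of triples to a `Mapping[Tuple[str, str], FetchKeyResult]` is what makes the changelog's "deduplicating received server keys" hold by construction: building a dict keyed by `(server_name, key_id)` keeps exactly one result per key. A sketch, with a plain integer standing in for `FetchKeyResult` (the helper name is invented):

```python
from typing import Dict, Iterable, Tuple

def dedupe_keys(
    results: Iterable[Tuple[str, str, int]]
) -> Dict[Tuple[str, str], int]:
    # results: (server_name, key_id, valid_until_ts) triples, possibly with
    # duplicates from a perspective-server response.
    deduped: Dict[Tuple[str, str], int] = {}
    for server_name, key_id, valid_until_ts in results:
        # A later entry for the same (server, key id) pair overwrites the earlier one.
        deduped[(server_name, key_id)] = valid_until_ts
    return deduped

keys = dedupe_keys(
    [("s1", "ed25519:a", 100), ("s1", "ed25519:a", 200), ("s2", "ed25519:b", 300)]
)
print(len(keys), keys[("s1", "ed25519:a")])  # → 2 200
```

The downstream `UNIQUE`-constrained upsert then sees each key at most once, avoiding the conflict the bugfix entry (#15423) describes.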
@@ -102,44 +102,34 @@ class UserDirectoryBackgroundUpdateStore(StateDeltasStore):
     ) -> int:
         # Get all the rooms that we want to process.
         def _make_staging_area(txn: LoggingTransaction) -> None:
-            sql = (
-                "CREATE TABLE IF NOT EXISTS "
-                + TEMP_TABLE
-                + "_rooms(room_id TEXT NOT NULL, events BIGINT NOT NULL)"
-            )
-            txn.execute(sql)
-
-            sql = (
-                "CREATE TABLE IF NOT EXISTS "
-                + TEMP_TABLE
-                + "_position(position TEXT NOT NULL)"
-            )
-            txn.execute(sql)
-
-            # Get rooms we want to process from the database
-            sql = """
-                SELECT room_id, count(*) FROM current_state_events
+            sql = f"""
+                CREATE TABLE IF NOT EXISTS {TEMP_TABLE}_rooms AS
+                SELECT room_id, count(*) AS events
+                FROM current_state_events
                 GROUP BY room_id
             """
             txn.execute(sql)
-            rooms = list(txn.fetchall())
-            self.db_pool.simple_insert_many_txn(
-                txn, TEMP_TABLE + "_rooms", keys=("room_id", "events"), values=rooms
+            txn.execute(
+                f"CREATE INDEX IF NOT EXISTS {TEMP_TABLE}_rooms_rm ON {TEMP_TABLE}_rooms (room_id)"
+            )
+            txn.execute(
+                f"CREATE INDEX IF NOT EXISTS {TEMP_TABLE}_rooms_evs ON {TEMP_TABLE}_rooms (events)"
             )
-            del rooms
 
-            sql = (
-                "CREATE TABLE IF NOT EXISTS "
-                + TEMP_TABLE
-                + "_users(user_id TEXT NOT NULL)"
-            )
+            sql = f"""
+                CREATE TABLE IF NOT EXISTS {TEMP_TABLE}_position (
+                    position TEXT NOT NULL
+                )
+            """
             txn.execute(sql)
 
-            txn.execute("SELECT name FROM users")
-            users = list(txn.fetchall())
-
-            self.db_pool.simple_insert_many_txn(
-                txn, TEMP_TABLE + "_users", keys=("user_id",), values=users
+            sql = f"""
+                CREATE TABLE IF NOT EXISTS {TEMP_TABLE}_users AS
+                SELECT name AS user_id FROM users
+            """
+            txn.execute(sql)
+            txn.execute(
+                f"CREATE INDEX IF NOT EXISTS {TEMP_TABLE}_users_idx ON {TEMP_TABLE}_users (user_id)"
             )
 
         new_pos = await self.get_max_stream_id_in_current_state_deltas()
@@ -222,6 +212,7 @@ class UserDirectoryBackgroundUpdateStore(StateDeltasStore):
             if not rooms_to_work_on:
                 return None
 
+            if "remaining" not in progress:
                 # Get how many are left to process, so we can give status on how
                 # far we are in processing
                 txn.execute("SELECT COUNT(*) FROM " + TEMP_TABLE + "_rooms")
@@ -332,7 +323,14 @@ class UserDirectoryBackgroundUpdateStore(StateDeltasStore):
 
             if processed_event_count > batch_size:
                 # Don't process any more rooms, we've hit our batch size.
-                return processed_event_count
+                break
+
+        await self.db_pool.runInteraction(
+            "populate_user_directory",
+            self.db_pool.updates._background_update_progress_txn,
+            "populate_user_directory_process_rooms",
+            progress,
+        )
 
         return processed_event_count
 
@@ -356,6 +354,7 @@ class UserDirectoryBackgroundUpdateStore(StateDeltasStore):
 
         users_to_work_on = [x[0] for x in user_result]
 
+        if "remaining" not in progress:
             # Get how many are left to process, so we can give status on how
             # far we are in processing
             sql = "SELECT COUNT(*) FROM " + TEMP_TABLE + "_users"
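The rewritten staging step above replaces fetch-all-rows-then-reinsert with a single `CREATE TABLE ... AS SELECT`, so the rows never round-trip through Python. A small sqlite3 sketch of the same pattern (table and index names are illustrative, not Synapse's):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE current_state_events (room_id TEXT, event_id TEXT)")
conn.executemany(
    "INSERT INTO current_state_events VALUES (?, ?)",
    [("!a", "$1"), ("!a", "$2"), ("!b", "$3")],
)

# Build the staging table entirely in SQL, as the rewritten migration does.
conn.execute(
    """
    CREATE TABLE IF NOT EXISTS temp_rooms AS
    SELECT room_id, count(*) AS events
    FROM current_state_events
    GROUP BY room_id
    """
)
conn.execute("CREATE INDEX IF NOT EXISTS temp_rooms_rm ON temp_rooms (room_id)")

print(dict(conn.execute("SELECT room_id, events FROM temp_rooms")))
```

Both sqlite and Postgres support `CREATE TABLE ... AS SELECT`, which is what lets one f-string query serve both engines here.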
@@ -31,14 +31,14 @@ from typing_extensions import Protocol
 Some very basic protocol definitions for the DB-API2 classes specified in PEP-249
 """
 
-_Parameters = Union[Sequence[Any], Mapping[str, Any]]
+SQLQueryParameters = Union[Sequence[Any], Mapping[str, Any]]
 
 
 class Cursor(Protocol):
-    def execute(self, sql: str, parameters: _Parameters = ...) -> Any:
+    def execute(self, sql: str, parameters: SQLQueryParameters = ...) -> Any:
         ...
 
-    def executemany(self, sql: str, parameters: Sequence[_Parameters]) -> Any:
+    def executemany(self, sql: str, parameters: Sequence[SQLQueryParameters]) -> Any:
         ...
 
     def fetchone(self) -> Optional[Tuple]:
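The now-public `SQLQueryParameters = Union[Sequence[Any], Mapping[str, Any]]` alias captures the two DBAPI-2 placeholder styles: positional parameters as a sequence and named parameters as a mapping. sqlite3 accepts both, as this sketch shows (the table is invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (a INTEGER, b INTEGER)")

# Positional placeholders take a Sequence[Any].
conn.execute("INSERT INTO t VALUES (?, ?)", (1, 2))

# Named placeholders take a Mapping[str, Any].
conn.execute("INSERT INTO t VALUES (:a, :b)", {"a": 3, "b": 4})

print(conn.execute("SELECT sum(a), sum(b) FROM t").fetchone())  # → (4, 6)
```

Typing `LoggingTransaction.execute` with this alias (earlier in the diff) documents that same either/or contract instead of an untyped `*args`.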
@@ -193,7 +193,7 @@ class KeyringTestCase(unittest.HomeserverTestCase):
         r = self.hs.get_datastores().main.store_server_verify_keys(
             "server9",
             int(time.time() * 1000),
-            [("server9", get_key_id(key1), FetchKeyResult(get_verify_key(key1), 1000))],
+            {("server9", get_key_id(key1)): FetchKeyResult(get_verify_key(key1), 1000)},
         )
         self.get_success(r)
 
@@ -291,7 +291,7 @@ class KeyringTestCase(unittest.HomeserverTestCase):
             # None is not a valid value in FetchKeyResult, but we're abusing this
             # API to insert null values into the database. The nulls get converted
             # to 0 when fetched in KeyStore.get_server_verify_keys.
-            [("server9", get_key_id(key1), FetchKeyResult(get_verify_key(key1), None))],  # type: ignore[arg-type]
+            {("server9", get_key_id(key1)): FetchKeyResult(get_verify_key(key1), None)},  # type: ignore[arg-type]
         )
         self.get_success(r)
 
@@ -35,7 +35,7 @@ class TestSSOHandler(unittest.HomeserverTestCase):
         )
         return hs

-    async def test_set_avatar(self) -> None:
+    def test_set_avatar(self) -> None:
         """Tests successfully setting the avatar of a newly created user"""
         handler = self.hs.get_sso_handler()

@@ -54,7 +54,7 @@ class TestSSOHandler(unittest.HomeserverTestCase):
         self.assertIsNot(profile["avatar_url"], None)

     @unittest.override_config({"max_avatar_size": 1})
-    async def test_set_avatar_too_big_image(self) -> None:
+    def test_set_avatar_too_big_image(self) -> None:
         """Tests that saving an avatar fails when it is too big"""
         handler = self.hs.get_sso_handler()

@@ -66,7 +66,7 @@ class TestSSOHandler(unittest.HomeserverTestCase):
         )

     @unittest.override_config({"allowed_avatar_mimetypes": ["image/jpeg"]})
-    async def test_set_avatar_incorrect_mime_type(self) -> None:
+    def test_set_avatar_incorrect_mime_type(self) -> None:
         """Tests that saving an avatar fails when its mime type is not allowed"""
         handler = self.hs.get_sso_handler()

@@ -77,7 +77,7 @@ class TestSSOHandler(unittest.HomeserverTestCase):
             self.get_success(handler.set_avatar(user_id, "http://my.server/me.png"))
         )

-    async def test_skip_saving_avatar_when_not_changed(self) -> None:
+    def test_skip_saving_avatar_when_not_changed(self) -> None:
         """Tests whether saving of avatar correctly skips if the avatar hasn't
         changed"""
         handler = self.hs.get_sso_handler()
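The `async def` → `def` changes above guard against a quiet pitfall: a plain unittest runner calls an `async def` test method, receives a coroutine object, and never awaits it, so the body never executes and the test "passes" vacuously. A minimal, self-contained sketch of the pitfall (plain `unittest`, not Synapse's `HomeserverTestCase`):

```python
import unittest
import warnings


class Example(unittest.TestCase):
    async def test_never_runs(self) -> None:
        # This body is never executed: the runner calls the method,
        # receives a coroutine object, and silently discards it.
        raise AssertionError("unreachable")


# Hide the "coroutine was never awaited" / non-None-return warnings.
warnings.simplefilter("ignore")

result = unittest.TestResult()
unittest.defaultTestLoader.loadTestsFromTestCase(Example).run(result)
print(result.testsRun, result.wasSuccessful())
```

Synapse's test bodies instead stay synchronous and drive any deferreds through helpers such as `get_success`, so the assertions above actually run.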
@@ -792,7 +792,7 @@ class UserDirectoryTestCase(unittest.HomeserverTestCase):
             return False

         # Configure a spam checker that does not filter any users.
-        spam_checker = self.hs.get_spam_checker()
+        spam_checker = self.hs.get_module_api_callbacks().spam_checker
         spam_checker._check_username_for_spam_callbacks = [allow_all]

         # The results do not change:
@@ -31,7 +31,6 @@ from twisted.test.proto_helpers import MemoryReactor

 from synapse.api.errors import Codes
 from synapse.events import EventBase
-from synapse.events.spamcheck import load_legacy_spam_checkers
 from synapse.http.types import QueryParams
 from synapse.logging.context import make_deferred_yieldable
 from synapse.media._base import FileInfo
@@ -39,6 +38,7 @@ from synapse.media.filepath import MediaFilePaths
 from synapse.media.media_storage import MediaStorage, ReadableFileWrapper
 from synapse.media.storage_provider import FileStorageProviderBackend
 from synapse.module_api import ModuleApi
+from synapse.module_api.callbacks.spamchecker_callbacks import load_legacy_spam_checkers
 from synapse.rest import admin
 from synapse.rest.client import login
 from synapse.server import HomeServer
@@ -11,7 +11,7 @@
 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 # See the License for the specific language governing permissions and
 # limitations under the License.
-from typing import Any, Dict
+from typing import Any, Dict, Optional
 from unittest.mock import Mock

 from twisted.internet import defer
@@ -21,6 +21,7 @@ from synapse.api.constants import EduTypes, EventTypes
 from synapse.api.errors import NotFoundError
 from synapse.events import EventBase
 from synapse.federation.units import Transaction
+from synapse.handlers.device import DeviceHandler
 from synapse.handlers.presence import UserPresenceState
 from synapse.handlers.push_rules import InvalidRuleException
 from synapse.module_api import ModuleApi
@@ -773,6 +774,54 @@ class ModuleApiTestCase(BaseModuleApiTestCase):
         # Check room alias.
         self.assertIsNone(room_alias)

+    def test_on_logged_out(self) -> None:
+        """Test that on_logged_out module hook is properly called when logging out
+        a device, and that related pushers are still available at this time.
+        """
+        device_id = "AAAAAAA"
+        user_id = self.register_user("test_on_logged_out", "secret")
+        self.login("test_on_logged_out", "secret", device_id)
+
+        self.get_success(
+            self.hs.get_pusherpool().add_or_update_pusher(
+                user_id=user_id,
+                device_id=device_id,
+                kind="http",
+                app_id="m.http",
+                app_display_name="HTTP Push Notifications",
+                device_display_name="pushy push",
+                pushkey="a@example.com",
+                lang=None,
+                data={"url": "http://example.com/_matrix/push/v1/notify"},
+            )
+        )
+
+        # Setup a callback counting the number of pushers.
+        number_of_pushers_in_callback: Optional[int] = None
+
+        async def _on_logged_out_mock(
+            user_id: str, device_id: Optional[str], access_token: str
+        ) -> None:
+            nonlocal number_of_pushers_in_callback
+            number_of_pushers_in_callback = len(
+                self.hs.get_pusherpool().pushers[user_id].values()
+            )
+
+        self.module_api.register_password_auth_provider_callbacks(
+            on_logged_out=_on_logged_out_mock
+        )
+
+        # Delete the device.
+        device_handler = self.hs.get_device_handler()
+        assert isinstance(device_handler, DeviceHandler)
+        self.get_success(device_handler.delete_devices(user_id, [device_id]))
+
+        # Check that the callback was called and the pushers still existed.
+        self.assertEqual(number_of_pushers_in_callback, 1)
+
+        # Ensure the pushers were deleted after the callback.
+        self.assertEqual(len(self.hs.get_pusherpool().pushers[user_id].values()), 0)
+
+
 class ModuleApiWorkerTestCase(BaseModuleApiTestCase, BaseMultiWorkerStreamTestCase):
     """For testing ModuleApi functionality in a multi-worker setup"""

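The new `test_on_logged_out` pins down an ordering contract: `on_logged_out` module callbacks fire while the device's pushers still exist, and the pushers are removed only afterwards. A toy sketch of that ordering (all class and method names here are hypothetical stand-ins, not Synapse's real objects):

```python
from typing import Callable, Dict, List


class TinyHomeserver:
    """A stand-in illustrating 'hooks before pusher deletion' ordering."""

    def __init__(self) -> None:
        self.pushers: Dict[str, List[str]] = {}
        self.on_logged_out_callbacks: List[Callable[[str, str], None]] = []

    def delete_device(self, user_id: str, device_id: str) -> None:
        # 1. Fire module hooks while the pushers are still present.
        for callback in self.on_logged_out_callbacks:
            callback(user_id, device_id)
        # 2. Only then remove the pushers tied to the logged-out device.
        self.pushers[user_id] = []


hs = TinyHomeserver()
hs.pushers["@alice:example.com"] = ["pusher-1"]

seen: List[int] = []
hs.on_logged_out_callbacks.append(
    lambda user_id, device_id: seen.append(len(hs.pushers[user_id]))
)
hs.delete_device("@alice:example.com", "AAAAAAA")
print(seen, hs.pushers["@alice:example.com"])  # [1] []
```

If step 2 ran first, the callback would observe zero pushers, which is exactly the regression the test above guards against.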
@@ -814,7 +814,9 @@ class RoomsCreateTestCase(RoomBase):
             return False

         join_mock = Mock(side_effect=user_may_join_room)
-        self.hs.get_spam_checker()._user_may_join_room_callbacks.append(join_mock)
+        self.hs.get_module_api_callbacks().spam_checker._user_may_join_room_callbacks.append(
+            join_mock
+        )

         channel = self.make_request(
             "POST",
@@ -840,7 +842,9 @@ class RoomsCreateTestCase(RoomBase):
             return Codes.CONSENT_NOT_GIVEN

         join_mock = Mock(side_effect=user_may_join_room_codes)
-        self.hs.get_spam_checker()._user_may_join_room_callbacks.append(join_mock)
+        self.hs.get_module_api_callbacks().spam_checker._user_may_join_room_callbacks.append(
+            join_mock
+        )

         channel = self.make_request(
             "POST",
@@ -1162,7 +1166,9 @@ class RoomJoinTestCase(RoomBase):
         # `spec` argument is needed for this function mock to have `__qualname__`, which
         # is needed for `Measure` metrics buried in SpamChecker.
         callback_mock = Mock(side_effect=user_may_join_room, spec=lambda *x: None)
-        self.hs.get_spam_checker()._user_may_join_room_callbacks.append(callback_mock)
+        self.hs.get_module_api_callbacks().spam_checker._user_may_join_room_callbacks.append(
+            callback_mock
+        )

         # Join a first room, without being invited to it.
         self.helper.join(self.room1, self.user2, tok=self.tok2)
@@ -1227,7 +1233,9 @@ class RoomJoinTestCase(RoomBase):
         # `spec` argument is needed for this function mock to have `__qualname__`, which
         # is needed for `Measure` metrics buried in SpamChecker.
         callback_mock = Mock(side_effect=user_may_join_room, spec=lambda *x: None)
-        self.hs.get_spam_checker()._user_may_join_room_callbacks.append(callback_mock)
+        self.hs.get_module_api_callbacks().spam_checker._user_may_join_room_callbacks.append(
+            callback_mock
+        )

         # Join a first room, without being invited to it.
         self.helper.join(self.room1, self.user2, tok=self.tok2)
@@ -1643,7 +1651,7 @@ class RoomMessagesTestCase(RoomBase):

         spam_checker = SpamCheck()

-        self.hs.get_spam_checker()._check_event_for_spam_callbacks.append(
+        self.hs.get_module_api_callbacks().spam_checker._check_event_for_spam_callbacks.append(
             spam_checker.check_event_for_spam
         )

@@ -3381,7 +3389,9 @@ class ThreepidInviteTestCase(unittest.HomeserverTestCase):
         # `spec` argument is needed for this function mock to have `__qualname__`, which
         # is needed for `Measure` metrics buried in SpamChecker.
         mock = Mock(return_value=make_awaitable(True), spec=lambda *x: None)
-        self.hs.get_spam_checker()._user_may_send_3pid_invite_callbacks.append(mock)
+        self.hs.get_module_api_callbacks().spam_checker._user_may_send_3pid_invite_callbacks.append(
+            mock
+        )

         # Send a 3PID invite into the room and check that it succeeded.
         email_to_invite = "teresa@example.com"
@@ -3446,7 +3456,9 @@ class ThreepidInviteTestCase(unittest.HomeserverTestCase):
             return_value=make_awaitable(synapse.module_api.NOT_SPAM),
             spec=lambda *x: None,
         )
-        self.hs.get_spam_checker()._user_may_send_3pid_invite_callbacks.append(mock)
+        self.hs.get_module_api_callbacks().spam_checker._user_may_send_3pid_invite_callbacks.append(
+            mock
+        )

         # Send a 3PID invite into the room and check that it succeeded.
         email_to_invite = "teresa@example.com"
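The repeated one-line edits above are a mechanical relocation: spam-checker callback registries move from the object returned by `hs.get_spam_checker()` to a container reached via `hs.get_module_api_callbacks().spam_checker`. A stripped-down sketch of the access pattern (these classes are hypothetical stand-ins, not the real Synapse objects):

```python
from typing import Callable, List


class SpamCheckerCallbacks:
    """Stand-in for the spam-checker callback registry."""

    def __init__(self) -> None:
        self._user_may_join_room_callbacks: List[Callable[[str, str], bool]] = []


class ModuleApiCallbacks:
    """Stand-in container grouping per-module-API callback registries."""

    def __init__(self) -> None:
        self.spam_checker = SpamCheckerCallbacks()


class HomeServer:
    def __init__(self) -> None:
        self._module_api_callbacks = ModuleApiCallbacks()

    def get_module_api_callbacks(self) -> ModuleApiCallbacks:
        return self._module_api_callbacks


hs = HomeServer()
# New-style registration path, mirroring the diff above:
hs.get_module_api_callbacks().spam_checker._user_may_join_room_callbacks.append(
    lambda user_id, room_id: True
)
print(len(hs.get_module_api_callbacks().spam_checker._user_may_join_room_callbacks))  # 1
```

Grouping the registries on one container keeps module-facing callbacks in a single place instead of scattering accessors across the homeserver object.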
@@ -73,11 +73,11 @@ from twisted.web.server import Request, Site
 from synapse.config.database import DatabaseConnectionConfig
 from synapse.config.homeserver import HomeServerConfig
 from synapse.events.presence_router import load_legacy_presence_router
-from synapse.events.spamcheck import load_legacy_spam_checkers
 from synapse.events.third_party_rules import load_legacy_third_party_event_rules
 from synapse.handlers.auth import load_legacy_password_auth_providers
 from synapse.http.site import SynapseRequest
 from synapse.logging.context import ContextResourceUsage
+from synapse.module_api.callbacks.spamchecker_callbacks import load_legacy_spam_checkers
 from synapse.server import HomeServer
 from synapse.storage import DataStore
 from synapse.storage.database import LoggingDatabaseConnection
@@ -46,10 +46,10 @@ class KeyStoreTestCase(tests.unittest.HomeserverTestCase):
             store.store_server_verify_keys(
                 "from_server",
                 10,
-                [
-                    ("server1", key_id_1, FetchKeyResult(KEY_1, 100)),
-                    ("server1", key_id_2, FetchKeyResult(KEY_2, 200)),
-                ],
+                {
+                    ("server1", key_id_1): FetchKeyResult(KEY_1, 100),
+                    ("server1", key_id_2): FetchKeyResult(KEY_2, 200),
+                },
             )
         )

@@ -90,10 +90,10 @@ class KeyStoreTestCase(tests.unittest.HomeserverTestCase):
             store.store_server_verify_keys(
                 "from_server",
                 0,
-                [
-                    ("srv1", key_id_1, FetchKeyResult(KEY_1, 100)),
-                    ("srv1", key_id_2, FetchKeyResult(KEY_2, 200)),
-                ],
+                {
+                    ("srv1", key_id_1): FetchKeyResult(KEY_1, 100),
+                    ("srv1", key_id_2): FetchKeyResult(KEY_2, 200),
+                },
             )
         )

@@ -119,7 +119,7 @@ class KeyStoreTestCase(tests.unittest.HomeserverTestCase):
             signedjson.key.generate_signing_key("key2")
         )
         d = store.store_server_verify_keys(
-            "from_server", 10, [("srv1", key_id_2, FetchKeyResult(new_key_2, 300))]
+            "from_server", 10, {("srv1", key_id_2): FetchKeyResult(new_key_2, 300)}
         )
         self.get_success(d)

@@ -793,16 +793,12 @@ class FederatingHomeserverTestCase(HomeserverTestCase):
             hs.get_datastores().main.store_server_verify_keys(
                 from_server=self.OTHER_SERVER_NAME,
                 ts_added_ms=clock.time_msec(),
-                verify_keys=[
-                    (
-                        self.OTHER_SERVER_NAME,
-                        verify_key_id,
-                        FetchKeyResult(
-                            verify_key=verify_key,
-                            valid_until_ts=clock.time_msec() + 10000,
-                        ),
-                    )
-                ],
+                verify_keys={
+                    (self.OTHER_SERVER_NAME, verify_key_id): FetchKeyResult(
+                        verify_key=verify_key,
+                        valid_until_ts=clock.time_msec() + 10000,
+                    ),
+                },
             )
         )
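The `store_server_verify_keys` hunks above swap a list of `(server_name, key_id, FetchKeyResult)` tuples for a dict keyed on `(server_name, key_id)`, which is what makes the deduplication mentioned in the changelog ("deduplicating received server keys") fall out naturally: a repeated key simply overwrites its earlier entry. A sketch, with a namedtuple standing in for `synapse.storage.keys.FetchKeyResult`:

```python
from collections import namedtuple

# Stand-in for synapse.storage.keys.FetchKeyResult.
FetchKeyResult = namedtuple("FetchKeyResult", ["verify_key", "valid_until_ts"])

# Old-style payload: a list can carry the same (server, key_id) twice,
# e.g. when a perspectives response repeats a key.
rows = [
    ("srv1", "ed25519:key1", FetchKeyResult("A", 100)),
    ("srv1", "ed25519:key1", FetchKeyResult("A", 150)),
    ("srv1", "ed25519:key2", FetchKeyResult("B", 200)),
]

# New-style payload: keying on (server, key_id) deduplicates for free;
# a later entry replaces the earlier one.
verify_keys = {(server, key_id): result for server, key_id, result in rows}

print(len(verify_keys))  # 2
print(verify_keys[("srv1", "ed25519:key1")].valid_until_ts)  # 150
```

The dict shape also mirrors how the keys are looked up later (by server name and key ID), so callers no longer need to scan a list for the matching tuple.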