Merge branch 'develop' into admin_api_doc_fix

commit 4889222cfa
Brian, 2023-09-20 14:34:35 -04:00, committed via GitHub
114 changed files with 809 additions and 584 deletions

@@ -64,7 +64,7 @@ if not IS_PR:
         {
             "python-version": "3.11",
             "database": "postgres",
-            "postgres-version": "15",
+            "postgres-version": "16",
             "extras": "all",
         }
     )

@@ -1,3 +1,100 @@
# Synapse 1.93.0rc1 (2023-09-19)

### Features

- Add automatic purge after all users have forgotten a room. ([\#15488](https://github.com/matrix-org/synapse/issues/15488))
- Restore room purge/shutdown after a Synapse restart. ([\#15488](https://github.com/matrix-org/synapse/issues/15488))
- Support resolving homeservers using `matrix-fed` DNS SRV records from [MSC4040](https://github.com/matrix-org/matrix-spec-proposals/pull/4040). ([\#16137](https://github.com/matrix-org/synapse/issues/16137))
- Add the ability to use `G` (GiB) and `T` (TiB) suffixes in configuration options that refer to numbers of bytes. ([\#16219](https://github.com/matrix-org/synapse/issues/16219))
- Add span information to requests sent to appservices. Contributed by MTRNord. ([\#16227](https://github.com/matrix-org/synapse/issues/16227))
- Add the ability to enable/disable registrations when using CAS. Contributed by Aurélien Grimpard. ([\#16262](https://github.com/matrix-org/synapse/issues/16262))
- Allow the `/notifications` endpoint to be routed to workers. ([\#16265](https://github.com/matrix-org/synapse/issues/16265))
- Enable users to easily unsubscribe from notification emails via the `List-Unsubscribe` header. ([\#16274](https://github.com/matrix-org/synapse/issues/16274))
- Report whether a user is `locked` in the [List Accounts admin API](https://matrix-org.github.io/synapse/latest/admin_api/user_admin_api.html#list-accounts), and exclude locked users by default. ([\#16328](https://github.com/matrix-org/synapse/issues/16328))

### Bugfixes

- Fix a long-standing bug where multi-device accounts could cause high load due to presence. ([\#16066](https://github.com/matrix-org/synapse/issues/16066), [\#16170](https://github.com/matrix-org/synapse/issues/16170), [\#16171](https://github.com/matrix-org/synapse/issues/16171), [\#16172](https://github.com/matrix-org/synapse/issues/16172), [\#16174](https://github.com/matrix-org/synapse/issues/16174))
- Fix a long-standing bug where appservices using [MSC2409](https://github.com/matrix-org/matrix-spec-proposals/pull/2409) to receive `to_device` messages would only get messages for one user. ([\#16251](https://github.com/matrix-org/synapse/issues/16251))
- Fix bug when using workers where Synapse could end up re-requesting the same remote device repeatedly. ([\#16252](https://github.com/matrix-org/synapse/issues/16252))
- Fix long-standing bug where we kept re-requesting a remote server's key repeatedly, potentially causing delays in receiving events over federation. ([\#16257](https://github.com/matrix-org/synapse/issues/16257))
- Avoid temporary storage of sensitive information. ([\#16272](https://github.com/matrix-org/synapse/issues/16272))
- Fix bug introduced in Synapse 1.49.0 when using dehydrated devices ([MSC2697](https://github.com/matrix-org/matrix-spec-proposals/pull/2697)) and refresh tokens. Contributed by Hanadi. ([\#16288](https://github.com/matrix-org/synapse/issues/16288))
- Fix a long-standing bug where invalid receipts would be accepted. ([\#16327](https://github.com/matrix-org/synapse/issues/16327))
- Use standard name for UTF-8 charset in emails. ([\#16329](https://github.com/matrix-org/synapse/issues/16329))
- Don't try refetching device lists for users on remote hosts that are marked as "down". ([\#16298](https://github.com/matrix-org/synapse/issues/16298))

### Improved Documentation

- Fix typos in the documentation. ([\#16282](https://github.com/matrix-org/synapse/issues/16282))
- Link to the Alpine Linux community package for Synapse. ([\#16304](https://github.com/matrix-org/synapse/issues/16304))
- Use string for `federation_client_minimum_tls_version` documentation examples. Contributed by @jcgruenhage. ([\#16353](https://github.com/matrix-org/synapse/issues/16353))

### Internal Changes

- Allow modules to delete rooms. ([\#15997](https://github.com/matrix-org/synapse/issues/15997))
- Add GCC and GNU Make to the Nix flake development environment so that `ruff` can be compiled. ([\#16090](https://github.com/matrix-org/synapse/issues/16090), [\#16263](https://github.com/matrix-org/synapse/issues/16263))
- Fix type checking when using the new version of Twisted. ([\#16235](https://github.com/matrix-org/synapse/issues/16235))
- Delete device messages asynchronously and in staged batches using the task scheduler. ([\#16240](https://github.com/matrix-org/synapse/issues/16240), [\#16311](https://github.com/matrix-org/synapse/issues/16311), [\#16312](https://github.com/matrix-org/synapse/issues/16312), [\#16313](https://github.com/matrix-org/synapse/issues/16313))
- Bump minimum supported Rust version to 1.61.0. ([\#16248](https://github.com/matrix-org/synapse/issues/16248))
- Update rust to version 1.71.1 in the nix development environment. ([\#16260](https://github.com/matrix-org/synapse/issues/16260))
- Simplify server key storage. ([\#16261](https://github.com/matrix-org/synapse/issues/16261))
- Reduce CPU overhead of change password endpoint. ([\#16264](https://github.com/matrix-org/synapse/issues/16264))
- Stop purging from tables slated for removal. ([\#16273](https://github.com/matrix-org/synapse/issues/16273))
- Improve type hints. ([\#16276](https://github.com/matrix-org/synapse/issues/16276), [\#16301](https://github.com/matrix-org/synapse/issues/16301), [\#16325](https://github.com/matrix-org/synapse/issues/16325), [\#16326](https://github.com/matrix-org/synapse/issues/16326))
- Raise `setuptools_rust` version cap to 1.7.0. ([\#16277](https://github.com/matrix-org/synapse/issues/16277))
- Fix using the new task scheduler causing lots of CPU to be used. ([\#16278](https://github.com/matrix-org/synapse/issues/16278))
- Upgrade CI run of Python 3.12 from rc1 to rc2. ([\#16280](https://github.com/matrix-org/synapse/issues/16280))
- Include values in SQL debug when using `execute_values` with Postgres. ([\#16281](https://github.com/matrix-org/synapse/issues/16281))
- Enable additional linting checks. ([\#16283](https://github.com/matrix-org/synapse/issues/16283))
- Refactor `receipts_graph` Postgres transactions to stop error messages. ([\#16299](https://github.com/matrix-org/synapse/issues/16299))
- Small improvements to logging in replication code. ([\#16309](https://github.com/matrix-org/synapse/issues/16309))
- Remove a reference cycle in background processes. ([\#16314](https://github.com/matrix-org/synapse/issues/16314))
- Only use literal strings for background process names. ([\#16315](https://github.com/matrix-org/synapse/issues/16315))
- Refactor `get_user_by_id`. ([\#16316](https://github.com/matrix-org/synapse/issues/16316))
- Speed up task to delete to-device messages. ([\#16318](https://github.com/matrix-org/synapse/issues/16318))
- Avoid patching code in tests. ([\#16349](https://github.com/matrix-org/synapse/issues/16349))
- Test against PostgreSQL 16. ([\#16351](https://github.com/matrix-org/synapse/issues/16351))

### Updates to locked dependencies

* Bump mypy from 1.4.1 to 1.5.1. ([\#16300](https://github.com/matrix-org/synapse/issues/16300))
* Bump black from 23.7.0 to 23.9.1. ([\#16295](https://github.com/matrix-org/synapse/issues/16295))
* Bump docker/build-push-action from 4 to 5. ([\#16336](https://github.com/matrix-org/synapse/issues/16336))
* Bump docker/login-action from 2 to 3. ([\#16339](https://github.com/matrix-org/synapse/issues/16339))
* Bump docker/metadata-action from 4 to 5. ([\#16337](https://github.com/matrix-org/synapse/issues/16337))
* Bump docker/setup-qemu-action from 2 to 3. ([\#16338](https://github.com/matrix-org/synapse/issues/16338))
* Bump furo from 2023.8.19 to 2023.9.10. ([\#16340](https://github.com/matrix-org/synapse/issues/16340))
* Bump gitpython from 3.1.32 to 3.1.35. ([\#16267](https://github.com/matrix-org/synapse/issues/16267), [\#16279](https://github.com/matrix-org/synapse/issues/16279))
* Bump mypy-zope from 1.0.0 to 1.0.1. ([\#16291](https://github.com/matrix-org/synapse/issues/16291))
* Bump pillow from 10.0.0 to 10.0.1. ([\#16344](https://github.com/matrix-org/synapse/issues/16344))
* Bump regex from 1.9.4 to 1.9.5. ([\#16233](https://github.com/matrix-org/synapse/issues/16233))
* Bump ruff from 0.0.286 to 0.0.290. ([\#16342](https://github.com/matrix-org/synapse/issues/16342))
* Bump serde_json from 1.0.105 to 1.0.107. ([\#16296](https://github.com/matrix-org/synapse/issues/16296), [\#16345](https://github.com/matrix-org/synapse/issues/16345))
* Bump twisted from 22.10.0 to 23.8.0. ([\#16235](https://github.com/matrix-org/synapse/issues/16235))
* Bump types-pillow from 10.0.0.2 to 10.0.0.3. ([\#16293](https://github.com/matrix-org/synapse/issues/16293))
* Bump types-setuptools from 68.0.0.3 to 68.2.0.0. ([\#16292](https://github.com/matrix-org/synapse/issues/16292))
* Bump typing-extensions from 4.7.1 to 4.8.0. ([\#16341](https://github.com/matrix-org/synapse/issues/16341))

# Synapse 1.92.3 (2023-09-18)

This is again a security update targeted at mitigating [CVE-2023-4863](https://cve.org/CVERecord?id=CVE-2023-4863).
It turns out that libwebp is bundled statically in Pillow wheels, so we need to update this dependency rather than
the OS-level libwebp package.

Unlike what was advertised in the 1.92.2 changelog, this release also impacts PyPI wheels and Debian packages from matrix.org.

We encourage admins to upgrade as soon as possible.
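
For admins who want to double-check that an environment picked up the fix, a small sketch like the following can help; it assumes the `packaging` helper is installed and that Pillow 10.0.1 or newer implies the patched libwebp, per the note above:

```python
# Sketch: verify the running environment has Pillow >= 10.0.1 (the first
# release bundling a libwebp patched for CVE-2023-4863, per the notes above).
import PIL
from packaging.version import Version

assert Version(PIL.__version__) >= Version("10.0.1"), "Pillow needs upgrading"
print("Pillow", PIL.__version__, "is new enough")
```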

### Internal Changes

- Pillow 10.0.1 is now mandatory because of libwebp CVE-2023-4863, since Pillow provides libwebp in the wheels. ([\#16347](https://github.com/matrix-org/synapse/issues/16347))

### Updates to locked dependencies

* Bump pillow from 10.0.0 to 10.0.1. ([\#16344](https://github.com/matrix-org/synapse/issues/16344))

# Synapse 1.92.2 (2023-09-15)

This is a Docker-only update to mitigate [CVE-2023-4863](https://cve.org/CVERecord?id=CVE-2023-4863), a critical vulnerability in `libwebp`. Server admins not using Docker should ensure that their `libwebp` is up to date (if installed). We encourage admins to upgrade as soon as possible.

Cargo.lock

@@ -352,9 +352,9 @@ dependencies = [
 [[package]]
 name = "serde_json"
-version = "1.0.106"
+version = "1.0.107"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "2cc66a619ed80bf7a0f6b17dd063a84b88f6dea1813737cf469aef1d081142c2"
+checksum = "6b420ce6e3d8bd882e9b243c6eed35dbc9a6110c9769e74b584e0d68d1f20c65"
 dependencies = [
  "itoa",
  "ryu",

@@ -1 +0,0 @@
-Add automatic purge after all users forgotten a room. Also add restore of purge/shutdown rooms after a synapse restart.

@@ -1 +0,0 @@
-Allow modules to delete rooms.

@@ -1 +0,0 @@
-Fix a long-standing bug where multi-device accounts could cause high load due to presence.

@@ -1 +0,0 @@
-Add GCC and GNU Make to the Nix flake development environment so that `ruff` can be compiled.

@@ -1 +0,0 @@
-Support resolving homeservers using `matrix-fed` DNS SRV records from [MSC4040](https://github.com/matrix-org/matrix-spec-proposals/pull/4040).

@@ -1 +0,0 @@
-Fix a long-standing bug where multi-device accounts could cause high load due to presence.

@@ -1 +0,0 @@
-Fix a long-standing bug where multi-device accounts could cause high load due to presence.

@@ -1 +0,0 @@
-Fix a long-standing bug where multi-device accounts could cause high load due to presence.

@@ -1 +0,0 @@
-Fix a long-standing bug where multi-device accounts could cause high load due to presence.

@@ -1 +0,0 @@
-Add the ability to use `G` (GiB) and `T` (TiB) suffixes in configuration options that refer to numbers of bytes.

@@ -1 +0,0 @@
-Add span information to requests sent to appservices. Contributed by MTRNord.

@@ -1 +0,0 @@
-Fix type checking when using the new version of Twisted.

@@ -1 +0,0 @@
-Delete device messages asynchronously and in staged batches using the task scheduler.

@@ -1 +0,0 @@
-Bump minimum supported Rust version to 1.61.0.

@@ -1 +0,0 @@
-Fix a long-standing bug where appservices using MSC2409 to receive to_device messages, would only get messages for one user.

@@ -1 +0,0 @@
-Fix bug when using workers where Synapse could end up re-requesting the same remote device repeatedly.

@@ -1 +0,0 @@
-Fix long-standing bug where we kept re-requesting a remote server's key repeatedly, potentially causing delays in receiving events over federation.

@@ -1 +0,0 @@
-Update rust to version 1.71.1 in the nix development environment.

@@ -1 +0,0 @@
-Simplify server key storage.

@@ -1 +0,0 @@
-Add the ability to enable/disable registrations when in the CAS flow. Contributed by Aurélien Grimpard.

@@ -1 +0,0 @@
-Add GCC and GNU Make to the Nix flake development environment so that `ruff` can be compiled.

@@ -1 +0,0 @@
-Reduce CPU overhead of change password endpoint.

@@ -1 +0,0 @@
-Allow `/notifications` endpoint to be routed to workers.

@@ -1 +0,0 @@
-Avoid temporary storage of sensitive information.

@@ -1 +0,0 @@
-Stop purging from tables slated for removal.

@@ -1 +0,0 @@
-Enable users to easily unsubscribe to notifications emails via the `List-Unsubscribe` header.

@@ -1 +0,0 @@
-Raise setuptools_rust version cap to 1.7.0.

@@ -1 +0,0 @@
-Fix using the new task scheduler causing lots of CPU to be used.

@@ -1 +0,0 @@
-Upgrade CI run of Python 3.12 from rc1 to rc2.

@@ -1 +0,0 @@
-Include values in SQL debug when using `execute_values` with Postgres.

@@ -1 +0,0 @@
-Fix typos in the documentation.

@@ -1 +0,0 @@
-Enable additional linting checks.

@@ -1 +0,0 @@
-Fix bug introduced in Synapse 1.49.0 when using dehydrated devices ([MSC2697](https://github.com/matrix-org/matrix-spec-proposals/pull/2697)) and refresh tokens. Contributed by Hanadi.

@@ -1 +0,0 @@
-Don't try refetching device lists for users on remote hosts that are marked as "down".

@@ -1 +0,0 @@
-Refactor `receipts_graph` Postgres transactions to stop error messages.

@@ -1 +0,0 @@
-Bump mypy from 1.4.1 to 1.5.1.

@@ -1 +0,0 @@
-Link to the Alpine Linux community package for Synapse.

@@ -1 +0,0 @@
-Small improvements to logging in replication code.

@@ -1 +0,0 @@
-Delete device messages asynchronously and in staged batches using the task scheduler.

@@ -1 +0,0 @@
-Delete device messages asynchronously and in staged batches using the task scheduler.

@@ -1 +0,0 @@
-Delete device messages asynchronously and in staged batches using the task scheduler.

@@ -1 +0,0 @@
-Remove a reference cycle for in background processes.

@@ -1 +0,0 @@
-Only use literal strings for background process names.

@@ -1 +0,0 @@
-Refactor `get_user_by_id`.

@@ -1 +0,0 @@
-Speed up task to delete to-device messages.

changelog.d/16355.doc (new file)

@@ -0,0 +1 @@
+Fix rendering of user admin API documentation around deactivation. This was broken in Synapse 1.91.0.

debian/changelog

@@ -1,3 +1,15 @@
+matrix-synapse-py3 (1.93.0~rc1) stable; urgency=medium
+
+  * New Synapse release 1.93.0rc1.
+
+ -- Synapse Packaging team <packages@matrix.org>  Tue, 19 Sep 2023 11:55:00 +0000
+
+matrix-synapse-py3 (1.92.3) stable; urgency=medium
+
+  * New Synapse release 1.92.3.
+
+ -- Synapse Packaging team <packages@matrix.org>  Mon, 18 Sep 2023 15:05:04 +0200
+
 matrix-synapse-py3 (1.92.2) stable; urgency=medium
 
   * New Synapse release 1.92.2.

@@ -54,7 +54,8 @@ It returns a JSON body like the following:
             "external_id": "<user_id_provider_2>"
         }
     ],
-    "user_type": null
+    "user_type": null,
+    "locked": false
 }
 ```
@@ -103,7 +104,8 @@ with a body of:
     ],
     "admin": false,
     "deactivated": false,
-    "user_type": null
+    "user_type": null,
+    "locked": false
 }
 ```
@@ -146,7 +148,6 @@ Body parameters:
 - `admin` - **bool**, optional, defaults to `false`. Whether the user is a homeserver administrator,
   granting them access to the Admin API, among other things.
 - `deactivated` - **bool**, optional. If unspecified, deactivation state will be left unchanged.
-- `locked` - **bool**, optional. If unspecified, locked state will be left unchanged.
 
 Note: the `password` field must also be set if both of the following are true:
 - `deactivated` is set to `false` and the user was previously deactivated (you are reactivating this user)
@@ -156,6 +157,7 @@ Body parameters:
 Note: a user cannot be erased with this API. For more details on
 deactivating and erasing users see [Deactivate Account](#deactivate-account).
 
+- `locked` - **bool**, optional. If unspecified, locked state will be left unchanged.
 - `user_type` - **string** or null, optional. If not provided, the user type will
   not be changed. If `null` is given, the user type will be cleared.
   Other allowed options are: `bot` and `support`.
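
To make the `locked` body parameter concrete, here is a hedged sketch of locking an account over this API; the homeserver URL, user ID, and token are placeholders, and the `requests` client is our assumption rather than anything the documentation mandates:

```python
# Sketch: lock a user via the Create/Modify Account admin API.
# Omitting "locked" leaves the locked state unchanged, per the parameter
# description above. The URL, user ID, and token are placeholders.
import requests

resp = requests.put(
    "https://homeserver.example/_synapse/admin/v2/users/@alice:example.com",
    headers={"Authorization": "Bearer <admin_access_token>"},
    json={"locked": True},  # set to False to unlock
)
resp.raise_for_status()
```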
@@ -184,7 +186,8 @@ A response body like the following is returned:
         "shadow_banned": 0,
         "displayname": "<User One>",
         "avatar_url": null,
-        "creation_ts": 1560432668000
+        "creation_ts": 1560432668000,
+        "locked": false
     }, {
         "name": "<user_id2>",
         "is_guest": 0,
@@ -195,7 +198,8 @@ A response body like the following is returned:
         "shadow_banned": 0,
         "displayname": "<User Two>",
         "avatar_url": "<avatar_url>",
-        "creation_ts": 1561550621000
+        "creation_ts": 1561550621000,
+        "locked": false
     }
   ],
   "next_token": "100",
@@ -249,6 +253,8 @@ The following parameters should be set in the URL:
 - `not_user_type` - Exclude certain user types, such as bot users, from the request.
   Can be provided multiple times. Possible values are `bot`, `support` or "empty string".
   "empty string" here means to exclude users without a type.
+- `locked` - string representing a bool - Optional. If `true`, **includes** locked users.
+  Defaults to `false`, excluding locked users. Note: Introduced in v1.93.
 
 Caution. The database only has indexes on the columns `name` and `creation_ts`.
 This means that if a different sort order is used (`is_guest`, `admin`,
@@ -274,10 +280,11 @@ The following fields are returned in the JSON response body:
 - `avatar_url` - string - The user's avatar URL if they have set one.
 - `creation_ts` - integer - The user's creation timestamp in ms.
 - `last_seen_ts` - integer - The user's last activity timestamp in ms.
+- `locked` - bool - Whether the user has been marked as locked. Note: Introduced in v1.93.
 - `next_token`: string representing a positive integer - Indication for pagination. See above.
 - `total` - integer - Total number of users.
 
+*Added in Synapse 1.93:* the `locked` query parameter and response field.
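
And a matching sketch for the new List Accounts query parameter and response field (same caveats: the URL and token are placeholders, and `requests` is our choice of client, not part of this commit):

```python
# Sketch: list accounts including locked users (Synapse 1.93+).
# The `locked` query parameter and response field are as documented above.
import requests

resp = requests.get(
    "https://homeserver.example/_synapse/admin/v2/users",
    headers={"Authorization": "Bearer <admin_access_token>"},
    params={"locked": "true"},  # default "false" excludes locked users
)
resp.raise_for_status()
for user in resp.json()["users"]:
    print(user["name"], "locked:", user["locked"])
```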
 ## Query current sessions for a user

@@ -1133,14 +1133,14 @@ federation_verify_certificates: false
 The minimum TLS version that will be used for outbound federation requests.
 
-Defaults to `1`. Configurable to `1`, `1.1`, `1.2`, or `1.3`. Note
-that setting this value higher than `1.2` will prevent federation to most
-of the public Matrix network: only configure it to `1.3` if you have an
+Defaults to `"1"`. Configurable to `"1"`, `"1.1"`, `"1.2"`, or `"1.3"`. Note
+that setting this value higher than `"1.2"` will prevent federation to most
+of the public Matrix network: only configure it to `"1.3"` if you have an
 entirely private federation setup and you can ensure TLS 1.3 support.
 
 Example configuration:
 ```yaml
-federation_client_minimum_tls_version: 1.2
+federation_client_minimum_tls_version: "1.2"
 ```
 ---
 ### `federation_certificate_verification_whitelist`

poetry.lock

@@ -555,13 +555,13 @@ dev = ["Sphinx", "coverage", "flake8", "lxml", "lxml-stubs", "memory-profiler",
 [[package]]
 name = "furo"
-version = "2023.8.19"
+version = "2023.9.10"
 description = "A clean customisable Sphinx documentation theme."
 optional = false
 python-versions = ">=3.8"
 files = [
-    {file = "furo-2023.8.19-py3-none-any.whl", hash = "sha256:12f99f87a1873b6746228cfde18f77244e6c1ffb85d7fed95e638aae70d80590"},
-    {file = "furo-2023.8.19.tar.gz", hash = "sha256:e671ee638ab3f1b472f4033b0167f502ab407830e0db0f843b1c1028119c9cd1"},
+    {file = "furo-2023.9.10-py3-none-any.whl", hash = "sha256:513092538537dc5c596691da06e3c370714ec99bc438680edc1debffb73e5bfc"},
+    {file = "furo-2023.9.10.tar.gz", hash = "sha256:5707530a476d2a63b8cad83b4f961f3739a69f4b058bcf38a03a39fa537195b2"},
 ]
 
 [package.dependencies]
@@ -1618,67 +1618,65 @@
 [[package]]
 name = "pillow"
-version = "10.0.0"
+version = "10.0.1"
 description = "Python Imaging Library (Fork)"
 optional = false
 python-versions = ">=3.8"
 files = [
-    {file = "Pillow-10.0.0-cp310-cp310-macosx_10_10_x86_64.whl", hash = "sha256:1f62406a884ae75fb2f818694469519fb685cc7eaff05d3451a9ebe55c646891"},
-    {file = "Pillow-10.0.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:d5db32e2a6ccbb3d34d87c87b432959e0db29755727afb37290e10f6e8e62614"},
-    {file = "Pillow-10.0.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:edf4392b77bdc81f36e92d3a07a5cd072f90253197f4a52a55a8cec48a12483b"},
-    {file = "Pillow-10.0.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:520f2a520dc040512699f20fa1c363eed506e94248d71f85412b625026f6142c"},
-    {file = "Pillow-10.0.0-cp310-cp310-manylinux_2_28_aarch64.whl", hash = "sha256:8c11160913e3dd06c8ffdb5f233a4f254cb449f4dfc0f8f4549eda9e542c93d1"},
-    {file = "Pillow-10.0.0-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:a74ba0c356aaa3bb8e3eb79606a87669e7ec6444be352870623025d75a14a2bf"},
-    {file = "Pillow-10.0.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:d5d0dae4cfd56969d23d94dc8e89fb6a217be461c69090768227beb8ed28c0a3"},
-    {file = "Pillow-10.0.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:22c10cc517668d44b211717fd9775799ccec4124b9a7f7b3635fc5386e584992"},
-    {file = "Pillow-10.0.0-cp310-cp310-win_amd64.whl", hash = "sha256:dffe31a7f47b603318c609f378ebcd57f1554a3a6a8effbc59c3c69f804296de"},
-    {file = "Pillow-10.0.0-cp311-cp311-macosx_10_10_x86_64.whl", hash = "sha256:9fb218c8a12e51d7ead2a7c9e101a04982237d4855716af2e9499306728fb485"},
-    {file = "Pillow-10.0.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:d35e3c8d9b1268cbf5d3670285feb3528f6680420eafe35cccc686b73c1e330f"},
-    {file = "Pillow-10.0.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3ed64f9ca2f0a95411e88a4efbd7a29e5ce2cea36072c53dd9d26d9c76f753b3"},
-    {file = "Pillow-10.0.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0b6eb5502f45a60a3f411c63187db83a3d3107887ad0d036c13ce836f8a36f1d"},
-    {file = "Pillow-10.0.0-cp311-cp311-manylinux_2_28_aarch64.whl", hash = "sha256:c1fbe7621c167ecaa38ad29643d77a9ce7311583761abf7836e1510c580bf3dd"},
-    {file = "Pillow-10.0.0-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:cd25d2a9d2b36fcb318882481367956d2cf91329f6892fe5d385c346c0649629"},
-    {file = "Pillow-10.0.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:3b08d4cc24f471b2c8ca24ec060abf4bebc6b144cb89cba638c720546b1cf538"},
-    {file = "Pillow-10.0.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:d737a602fbd82afd892ca746392401b634e278cb65d55c4b7a8f48e9ef8d008d"},
-    {file = "Pillow-10.0.0-cp311-cp311-win_amd64.whl", hash = "sha256:3a82c40d706d9aa9734289740ce26460a11aeec2d9c79b7af87bb35f0073c12f"},
-    {file = "Pillow-10.0.0-cp311-cp311-win_arm64.whl", hash = "sha256:bc2ec7c7b5d66b8ec9ce9f720dbb5fa4bace0f545acd34870eff4a369b44bf37"},
-    {file = "Pillow-10.0.0-cp312-cp312-macosx_10_10_x86_64.whl", hash = "sha256:d80cf684b541685fccdd84c485b31ce73fc5c9b5d7523bf1394ce134a60c6883"},
-    {file = "Pillow-10.0.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:76de421f9c326da8f43d690110f0e79fe3ad1e54be811545d7d91898b4c8493e"},
-    {file = "Pillow-10.0.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:81ff539a12457809666fef6624684c008e00ff6bf455b4b89fd00a140eecd640"},
-    {file = "Pillow-10.0.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ce543ed15570eedbb85df19b0a1a7314a9c8141a36ce089c0a894adbfccb4568"},
-    {file = "Pillow-10.0.0-cp312-cp312-manylinux_2_28_aarch64.whl", hash = "sha256:685ac03cc4ed5ebc15ad5c23bc555d68a87777586d970c2c3e216619a5476223"},
-    {file = "Pillow-10.0.0-cp312-cp312-manylinux_2_28_x86_64.whl", hash = "sha256:d72e2ecc68a942e8cf9739619b7f408cc7b272b279b56b2c83c6123fcfa5cdff"},
-    {file = "Pillow-10.0.0-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:d50b6aec14bc737742ca96e85d6d0a5f9bfbded018264b3b70ff9d8c33485551"},
-    {file = "Pillow-10.0.0-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:00e65f5e822decd501e374b0650146063fbb30a7264b4d2744bdd7b913e0cab5"},
-    {file = "Pillow-10.0.0-cp312-cp312-win_amd64.whl", hash = "sha256:f31f9fdbfecb042d046f9d91270a0ba28368a723302786c0009ee9b9f1f60199"},
-    {file = "Pillow-10.0.0-cp312-cp312-win_arm64.whl", hash = "sha256:1ce91b6ec08d866b14413d3f0bbdea7e24dfdc8e59f562bb77bc3fe60b6144ca"},
-    {file = "Pillow-10.0.0-cp38-cp38-macosx_10_10_x86_64.whl", hash = "sha256:349930d6e9c685c089284b013478d6f76e3a534e36ddfa912cde493f235372f3"},
-    {file = "Pillow-10.0.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:3a684105f7c32488f7153905a4e3015a3b6c7182e106fe3c37fbb5ef3e6994c3"},
-    {file = "Pillow-10.0.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b4f69b3700201b80bb82c3a97d5e9254084f6dd5fb5b16fc1a7b974260f89f43"},
-    {file = "Pillow-10.0.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3f07ea8d2f827d7d2a49ecf1639ec02d75ffd1b88dcc5b3a61bbb37a8759ad8d"},
-    {file = "Pillow-10.0.0-cp38-cp38-manylinux_2_28_aarch64.whl", hash = "sha256:040586f7d37b34547153fa383f7f9aed68b738992380ac911447bb78f2abe530"},
-    {file = "Pillow-10.0.0-cp38-cp38-manylinux_2_28_x86_64.whl", hash = "sha256:f88a0b92277de8e3ca715a0d79d68dc82807457dae3ab8699c758f07c20b3c51"},
-    {file = "Pillow-10.0.0-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:c7cf14a27b0d6adfaebb3ae4153f1e516df54e47e42dcc073d7b3d76111a8d86"},
-    {file = "Pillow-10.0.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:3400aae60685b06bb96f99a21e1ada7bc7a413d5f49bce739828ecd9391bb8f7"},
-    {file = "Pillow-10.0.0-cp38-cp38-win_amd64.whl", hash = "sha256:dbc02381779d412145331789b40cc7b11fdf449e5d94f6bc0b080db0a56ea3f0"},
-    {file = "Pillow-10.0.0-cp39-cp39-macosx_10_10_x86_64.whl", hash = "sha256:9211e7ad69d7c9401cfc0e23d49b69ca65ddd898976d660a2fa5904e3d7a9baa"},
-    {file = "Pillow-10.0.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:faaf07ea35355b01a35cb442dd950d8f1bb5b040a7787791a535de13db15ed90"},
-    {file = "Pillow-10.0.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c9f72a021fbb792ce98306ffb0c348b3c9cb967dce0f12a49aa4c3d3fdefa967"},
-    {file = "Pillow-10.0.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9f7c16705f44e0504a3a2a14197c1f0b32a95731d251777dcb060aa83022cb2d"},
-    {file = "Pillow-10.0.0-cp39-cp39-manylinux_2_28_aarch64.whl", hash = "sha256:76edb0a1fa2b4745fb0c99fb9fb98f8b180a1bbceb8be49b087e0b21867e77d3"},
-    {file = "Pillow-10.0.0-cp39-cp39-manylinux_2_28_x86_64.whl", hash = "sha256:368ab3dfb5f49e312231b6f27b8820c823652b7cd29cfbd34090565a015e99ba"},
-    {file = "Pillow-10.0.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:608bfdee0d57cf297d32bcbb3c728dc1da0907519d1784962c5f0c68bb93e5a3"},
-    {file = "Pillow-10.0.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:5c6e3df6bdd396749bafd45314871b3d0af81ff935b2d188385e970052091017"},
-    {file = "Pillow-10.0.0-cp39-cp39-win_amd64.whl", hash = "sha256:7be600823e4c8631b74e4a0d38384c73f680e6105a7d3c6824fcf226c178c7e6"},
-    {file = "Pillow-10.0.0-pp310-pypy310_pp73-macosx_10_10_x86_64.whl", hash = "sha256:92be919bbc9f7d09f7ae343c38f5bb21c973d2576c1d45600fce4b74bafa7ac0"},
-    {file = "Pillow-10.0.0-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8f8182b523b2289f7c415f589118228d30ac8c355baa2f3194ced084dac2dbba"},
-    {file = "Pillow-10.0.0-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:38250a349b6b390ee6047a62c086d3817ac69022c127f8a5dc058c31ccef17f3"},
-    {file = "Pillow-10.0.0-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:88af2003543cc40c80f6fca01411892ec52b11021b3dc22ec3bc9d5afd1c5334"},
-    {file = "Pillow-10.0.0-pp39-pypy39_pp73-macosx_10_10_x86_64.whl", hash = "sha256:c189af0545965fa8d3b9613cfdb0cd37f9d71349e0f7750e1fd704648d475ed2"},
-    {file = "Pillow-10.0.0-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ce7b031a6fc11365970e6a5686d7ba8c63e4c1cf1ea143811acbb524295eabed"},
-    {file = "Pillow-10.0.0-pp39-pypy39_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:db24668940f82321e746773a4bc617bfac06ec831e5c88b643f91f122a785684"},
-    {file = "Pillow-10.0.0-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:efe8c0681042536e0d06c11f48cebe759707c9e9abf880ee213541c5b46c5bf3"},
-    {file = "Pillow-10.0.0.tar.gz", hash = "sha256:9c82b5b3e043c7af0d95792d0d20ccf68f61a1fec6b3530e718b688422727396"},
+    {file = "Pillow-10.0.1-cp310-cp310-macosx_10_10_x86_64.whl", hash = "sha256:8f06be50669087250f319b706decf69ca71fdecd829091a37cc89398ca4dc17a"},
+    {file = "Pillow-10.0.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:50bd5f1ebafe9362ad622072a1d2f5850ecfa44303531ff14353a4059113b12d"},
+    {file = "Pillow-10.0.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e6a90167bcca1216606223a05e2cf991bb25b14695c518bc65639463d7db722d"},
+    {file = "Pillow-10.0.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f11c9102c56ffb9ca87134bd025a43d2aba3f1155f508eff88f694b33a9c6d19"},
+    {file = "Pillow-10.0.1-cp310-cp310-manylinux_2_28_aarch64.whl", hash = "sha256:186f7e04248103482ea6354af6d5bcedb62941ee08f7f788a1c7707bc720c66f"},
+    {file = "Pillow-10.0.1-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:0462b1496505a3462d0f35dc1c4d7b54069747d65d00ef48e736acda2c8cbdff"},
+    {file = "Pillow-10.0.1-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:d889b53ae2f030f756e61a7bff13684dcd77e9af8b10c6048fb2c559d6ed6eaf"},
+    {file = "Pillow-10.0.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:552912dbca585b74d75279a7570dd29fa43b6d93594abb494ebb31ac19ace6bd"},
+    {file = "Pillow-10.0.1-cp310-cp310-win_amd64.whl", hash = "sha256:787bb0169d2385a798888e1122c980c6eff26bf941a8ea79747d35d8f9210ca0"},
+    {file = "Pillow-10.0.1-cp311-cp311-macosx_10_10_x86_64.whl", hash = "sha256:fd2a5403a75b54661182b75ec6132437a181209b901446ee5724b589af8edef1"},
+    {file = "Pillow-10.0.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:2d7e91b4379f7a76b31c2dda84ab9e20c6220488e50f7822e59dac36b0cd92b1"},
+    {file = "Pillow-10.0.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:19e9adb3f22d4c416e7cd79b01375b17159d6990003633ff1d8377e21b7f1b21"},
+    {file = "Pillow-10.0.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:93139acd8109edcdeffd85e3af8ae7d88b258b3a1e13a038f542b79b6d255c54"},
+    {file = "Pillow-10.0.1-cp311-cp311-manylinux_2_28_aarch64.whl", hash = "sha256:92a23b0431941a33242b1f0ce6c88a952e09feeea9af4e8be48236a68ffe2205"},
+    {file = "Pillow-10.0.1-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:cbe68deb8580462ca0d9eb56a81912f59eb4542e1ef8f987405e35a0179f4ea2"},
+    {file = "Pillow-10.0.1-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:522ff4ac3aaf839242c6f4e5b406634bfea002469656ae8358644fc6c4856a3b"},
+    {file = "Pillow-10.0.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:84efb46e8d881bb06b35d1d541aa87f574b58e87f781cbba8d200daa835b42e1"},
+    {file = "Pillow-10.0.1-cp311-cp311-win_amd64.whl", hash = "sha256:898f1d306298ff40dc1b9ca24824f0488f6f039bc0e25cfb549d3195ffa17088"},
+    {file = "Pillow-10.0.1-cp312-cp312-macosx_10_10_x86_64.whl", hash = "sha256:bcf1207e2f2385a576832af02702de104be71301c2696d0012b1b93fe34aaa5b"},
+    {file = "Pillow-10.0.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:5d6c9049c6274c1bb565021367431ad04481ebb54872edecfcd6088d27edd6ed"},
+    {file = "Pillow-10.0.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:28444cb6ad49726127d6b340217f0627abc8732f1194fd5352dec5e6a0105635"},
+    {file = "Pillow-10.0.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:de596695a75496deb3b499c8c4f8e60376e0516e1a774e7bc046f0f48cd620ad"},
+    {file = "Pillow-10.0.1-cp312-cp312-manylinux_2_28_aarch64.whl", hash = "sha256:2872f2d7846cf39b3dbff64bc1104cc48c76145854256451d33c5faa55c04d1a"},
+    {file = "Pillow-10.0.1-cp312-cp312-manylinux_2_28_x86_64.whl", hash = "sha256:4ce90f8a24e1c15465048959f1e94309dfef93af272633e8f37361b824532e91"},
+    {file = "Pillow-10.0.1-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:ee7810cf7c83fa227ba9125de6084e5e8b08c59038a7b2c9045ef4dde61663b4"},
+    {file = "Pillow-10.0.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:b1be1c872b9b5fcc229adeadbeb51422a9633abd847c0ff87dc4ef9bb184ae08"},
+    {file = "Pillow-10.0.1-cp312-cp312-win_amd64.whl", hash = "sha256:98533fd7fa764e5f85eebe56c8e4094db912ccbe6fbf3a58778d543cadd0db08"},
+    {file = "Pillow-10.0.1-cp38-cp38-macosx_10_10_x86_64.whl", hash = "sha256:764d2c0daf9c4d40ad12fbc0abd5da3af7f8aa11daf87e4fa1b834000f4b6b0a"},
+    {file = "Pillow-10.0.1-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:fcb59711009b0168d6ee0bd8fb5eb259c4ab1717b2f538bbf36bacf207ef7a68"},
+    {file = "Pillow-10.0.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:697a06bdcedd473b35e50a7e7506b1d8ceb832dc238a336bd6f4f5aa91a4b500"},
+    {file = "Pillow-10.0.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9f665d1e6474af9f9da5e86c2a3a2d2d6204e04d5af9c06b9d42afa6ebde3f21"},
+    {file = "Pillow-10.0.1-cp38-cp38-manylinux_2_28_aarch64.whl", hash = "sha256:2fa6dd2661838c66f1a5473f3b49ab610c98a128fc08afbe81b91a1f0bf8c51d"},
+    {file = "Pillow-10.0.1-cp38-cp38-manylinux_2_28_x86_64.whl", hash = "sha256:3a04359f308ebee571a3127fdb1bd01f88ba6f6fb6d087f8dd2e0d9bff43f2a7"},
+    {file = "Pillow-10.0.1-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:723bd25051454cea9990203405fa6b74e043ea76d4968166dfd2569b0210886a"},
+    {file = "Pillow-10.0.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:71671503e3015da1b50bd18951e2f9daf5b6ffe36d16f1eb2c45711a301521a7"},
+    {file = "Pillow-10.0.1-cp38-cp38-win_amd64.whl", hash = "sha256:44e7e4587392953e5e251190a964675f61e4dae88d1e6edbe9f36d6243547ff3"},
+    {file = "Pillow-10.0.1-cp39-cp39-macosx_10_10_x86_64.whl", hash = "sha256:3855447d98cced8670aaa63683808df905e956f00348732448b5a6df67ee5849"},
+    {file = "Pillow-10.0.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:ed2d9c0704f2dc4fa980b99d565c0c9a543fe5101c25b3d60488b8ba80f0cce1"},
+    {file = "Pillow-10.0.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f5bb289bb835f9fe1a1e9300d011eef4d69661bb9b34d5e196e5e82c4cb09b37"},
+    {file = "Pillow-10.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3a0d3e54ab1df9df51b914b2233cf779a5a10dfd1ce339d0421748232cea9876"},
+    {file = "Pillow-10.0.1-cp39-cp39-manylinux_2_28_aarch64.whl", hash = "sha256:2cc6b86ece42a11f16f55fe8903595eff2b25e0358dec635d0a701ac9586588f"},
+    {file = "Pillow-10.0.1-cp39-cp39-manylinux_2_28_x86_64.whl", hash = "sha256:ca26ba5767888c84bf5a0c1a32f069e8204ce8c21d00a49c90dabeba00ce0145"},
+    {file = "Pillow-10.0.1-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:f0b4b06da13275bc02adfeb82643c4a6385bd08d26f03068c2796f60d125f6f2"},
+    {file = "Pillow-10.0.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:bc2e3069569ea9dbe88d6b8ea38f439a6aad8f6e7a6283a38edf61ddefb3a9bf"},
+    {file = "Pillow-10.0.1-cp39-cp39-win_amd64.whl", hash = "sha256:8b451d6ead6e3500b6ce5c7916a43d8d8d25ad74b9102a629baccc0808c54971"},
+    {file = "Pillow-10.0.1-pp310-pypy310_pp73-macosx_10_10_x86_64.whl", hash = "sha256:32bec7423cdf25c9038fef614a853c9d25c07590e1a870ed471f47fb80b244db"},
+    {file = "Pillow-10.0.1-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b7cf63d2c6928b51d35dfdbda6f2c1fddbe51a6bc4a9d4ee6ea0e11670dd981e"},
+    {file = "Pillow-10.0.1-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:f6d3d4c905e26354e8f9d82548475c46d8e0889538cb0657aa9c6f0872a37aa4"},
+    {file = "Pillow-10.0.1-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:847e8d1017c741c735d3cd1883fa7b03ded4f825a6e5fcb9378fd813edee995f"},
+    {file = "Pillow-10.0.1-pp39-pypy39_pp73-macosx_10_10_x86_64.whl", hash = "sha256:7f771e7219ff04b79e231d099c0a28ed83aa82af91fd5fa9fdb28f5b8d5addaf"},
+    {file = "Pillow-10.0.1-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:459307cacdd4138edee3875bbe22a2492519e060660eaf378ba3b405d1c66317"},
+    {file = "Pillow-10.0.1-pp39-pypy39_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:b059ac2c4c7a97daafa7dc850b43b2d3667def858a4f112d1aa082e5c3d6cf7d"},
+    {file = "Pillow-10.0.1-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:d6caf3cd38449ec3cd8a68b375e0c6fe4b6fd04edb6c9766b55ef84a6e8ddf2d"},
+    {file = "Pillow-10.0.1.tar.gz", hash = "sha256:d72967b06be9300fed5cfbc8b5bafceec48bf7cdc7dab66b1d2549035287191d"},
 ]
 
 [package.extras]
@@ -2334,28 +2332,28 @@ files = [
 [[package]]
 name = "ruff"
-version = "0.0.286"
+version = "0.0.290"
 description = "An extremely fast Python linter, written in Rust."
 optional = false
 python-versions = ">=3.7"
 files = [
-    {file = "ruff-0.0.286-py3-none-macosx_10_7_x86_64.whl", hash = "sha256:8e22cb557e7395893490e7f9cfea1073d19a5b1dd337f44fd81359b2767da4e9"},
-    {file = "ruff-0.0.286-py3-none-macosx_10_9_x86_64.macosx_11_0_arm64.macosx_10_9_universal2.whl", hash = "sha256:68ed8c99c883ae79a9133cb1a86d7130feee0397fdf5ba385abf2d53e178d3fa"},
-    {file = "ruff-0.0.286-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8301f0bb4ec1a5b29cfaf15b83565136c47abefb771603241af9d6038f8981e8"},
-    {file = "ruff-0.0.286-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:acc4598f810bbc465ce0ed84417ac687e392c993a84c7eaf3abf97638701c1ec"},
-    {file = "ruff-0.0.286-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:88c8e358b445eb66d47164fa38541cfcc267847d1e7a92dd186dddb1a0a9a17f"},
-    {file = "ruff-0.0.286-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:0433683d0c5dbcf6162a4beb2356e820a593243f1fa714072fec15e2e4f4c939"},
-    {file = "ruff-0.0.286-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ddb61a0c4454cbe4623f4a07fef03c5ae921fe04fede8d15c6e36703c0a73b07"},
-    {file = "ruff-0.0.286-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:47549c7c0be24c8ae9f2bce6f1c49fbafea83bca80142d118306f08ec7414041"},
-    {file = "ruff-0.0.286-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:559aa793149ac23dc4310f94f2c83209eedb16908a0343663be19bec42233d25"},
-    {file = "ruff-0.0.286-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:d73cfb1c3352e7aa0ce6fb2321f36fa1d4a2c48d2ceac694cb03611ddf0e4db6"},
-    {file = "ruff-0.0.286-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:3dad93b1f973c6d1db4b6a5da8690c5625a3fa32bdf38e543a6936e634b83dc3"},
-    {file = "ruff-0.0.286-py3-none-musllinux_1_2_i686.whl", hash = "sha256:26afc0851f4fc3738afcf30f5f8b8612a31ac3455cb76e611deea80f5c0bf3ce"},
-    {file = "ruff-0.0.286-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:9b6b116d1c4000de1b9bf027131dbc3b8a70507788f794c6b09509d28952c512"},
-    {file = "ruff-0.0.286-py3-none-win32.whl", hash = "sha256:556e965ac07c1e8c1c2d759ac512e526ecff62c00fde1a046acb088d3cbc1a6c"},
-    {file = "ruff-0.0.286-py3-none-win_amd64.whl", hash = "sha256:5d295c758961376c84aaa92d16e643d110be32add7465e197bfdaec5a431a107"},
-    {file = "ruff-0.0.286-py3-none-win_arm64.whl", hash = "sha256:1d6142d53ab7f164204b3133d053c4958d4d11ec3a39abf23a40b13b0784e3f0"},
-    {file = "ruff-0.0.286.tar.gz", hash = "sha256:f1e9d169cce81a384a26ee5bb8c919fe9ae88255f39a1a69fd1ebab233a85ed2"},
+    {file = "ruff-0.0.290-py3-none-macosx_10_7_x86_64.whl", hash = "sha256:0e2b09ac4213b11a3520221083866a5816616f3ae9da123037b8ab275066fbac"},
+    {file = "ruff-0.0.290-py3-none-macosx_10_9_x86_64.macosx_11_0_arm64.macosx_10_9_universal2.whl", hash = "sha256:4ca6285aa77b3d966be32c9a3cd531655b3d4a0171e1f9bf26d66d0372186767"},
+    {file = "ruff-0.0.290-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:35e3550d1d9f2157b0fcc77670f7bb59154f223bff281766e61bdd1dd854e0c5"},
+    {file = "ruff-0.0.290-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:d748c8bd97874f5751aed73e8dde379ce32d16338123d07c18b25c9a2796574a"},
+    {file = "ruff-0.0.290-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:982af5ec67cecd099e2ef5e238650407fb40d56304910102d054c109f390bf3c"},
+    {file = "ruff-0.0.290-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:bbd37352cea4ee007c48a44c9bc45a21f7ba70a57edfe46842e346651e2b995a"},
+    {file = "ruff-0.0.290-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1d9be6351b7889462912e0b8185a260c0219c35dfd920fb490c7f256f1d8313e"},
+    {file = "ruff-0.0.290-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:75cdc7fe32dcf33b7cec306707552dda54632ac29402775b9e212a3c16aad5e6"},
+    {file = "ruff-0.0.290-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:eb07f37f7aecdbbc91d759c0c09870ce0fb3eed4025eebedf9c4b98c69abd527"},
+    {file = "ruff-0.0.290-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:2ab41bc0ba359d3f715fc7b705bdeef19c0461351306b70a4e247f836b9350ed"},
+    {file = "ruff-0.0.290-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:150bf8050214cea5b990945b66433bf9a5e0cef395c9bc0f50569e7de7540c86"},
+    {file = "ruff-0.0.290-py3-none-musllinux_1_2_i686.whl", hash = "sha256:75386ebc15fe5467248c039f5bf6a0cfe7bfc619ffbb8cd62406cd8811815fca"},
+    {file = "ruff-0.0.290-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:ac93eadf07bc4ab4c48d8bb4e427bf0f58f3a9c578862eb85d99d704669f5da0"},
+    {file = "ruff-0.0.290-py3-none-win32.whl", hash = "sha256:461fbd1fb9ca806d4e3d5c745a30e185f7cf3ca77293cdc17abb2f2a990ad3f7"},
+    {file = "ruff-0.0.290-py3-none-win_amd64.whl", hash = "sha256:f1f49f5ec967fd5778813780b12a5650ab0ebcb9ddcca28d642c689b36920796"},
+    {file = "ruff-0.0.290-py3-none-win_arm64.whl", hash = "sha256:ae5a92dfbdf1f0c689433c223f8dac0782c2b2584bd502dfdbc76475669f1ba1"},
+    {file = "ruff-0.0.290.tar.gz", hash = "sha256:949fecbc5467bb11b8db810a7fa53c7e02633856ee6bd1302b2f43adcd71b88d"},
 ]
 
 [[package]]
@@ -3349,4 +3347,4 @@ user-search = ["pyicu"]
 [metadata]
 lock-version = "2.0"
 python-versions = "^3.8.0"
-content-hash = "4a3a82becd89b91e76e2bc2f8ba72123f665c517d9b841d9a34cd01b83a1adc3"
+content-hash = "104f108b3c966be05e17cf9975b4061942b354fe9a57cbf7372371fd56b1bf24"

@@ -95,7 +95,7 @@ manifest-path = "rust/Cargo.toml"
 [tool.poetry]
 name = "matrix-synapse"
-version = "1.92.2"
+version = "1.93.0rc1"
 description = "Homeserver for the Matrix decentralised comms protocol"
 authors = ["Matrix.org Team and Contributors <packages@matrix.org>"]
 license = "Apache-2.0"
@@ -180,7 +180,9 @@ PyYAML = ">=3.13"
 pyasn1 = ">=0.1.9"
 pyasn1-modules = ">=0.0.7"
 bcrypt = ">=3.1.7"
-Pillow = ">=5.4.0"
+# 10.0.1 minimum is mandatory here because of libwebp CVE-2023-4863.
+# Packagers that already took care of libwebp can lower that down to 5.4.0.
+Pillow = ">=10.0.1"
 # We use SortedDict.peekitem(), which was added in sortedcontainers 1.5.2.
 sortedcontainers = ">=1.5.2"
 pymacaroons = ">=0.13.0"
@@ -318,7 +320,7 @@ all = [
 # This helps prevents merge conflicts when running a batch of dependabot updates.
 isort = ">=5.10.1"
 black = ">=22.7.0"
-ruff = "0.0.286"
+ruff = "0.0.290"
 
 # Typechecking
 lxml-stubs = ">=0.4.0"

@@ -37,7 +37,7 @@ from synapse.api.constants import EduTypes, EventContentFields
 from synapse.api.errors import SynapseError
 from synapse.api.presence import UserPresenceState
 from synapse.events import EventBase, relation_from_event
-from synapse.types import JsonDict, RoomID, UserID
+from synapse.types import JsonDict, JsonMapping, RoomID, UserID
 
 if TYPE_CHECKING:
     from synapse.server import HomeServer
@@ -191,7 +191,7 @@ FilterEvent = TypeVar("FilterEvent", EventBase, UserPresenceState, JsonDict)
 class FilterCollection:
-    def __init__(self, hs: "HomeServer", filter_json: JsonDict):
+    def __init__(self, hs: "HomeServer", filter_json: JsonMapping):
         self._filter_json = filter_json
 
         room_filter_json = self._filter_json.get("room", {})
@@ -219,7 +219,7 @@
     def __repr__(self) -> str:
         return "<FilterCollection %s>" % (json.dumps(self._filter_json),)
 
-    def get_filter_json(self) -> JsonDict:
+    def get_filter_json(self) -> JsonMapping:
         return self._filter_json
 
     def timeline_limit(self) -> int:
@@ -313,7 +313,7 @@
 class Filter:
-    def __init__(self, hs: "HomeServer", filter_json: JsonDict):
+    def __init__(self, hs: "HomeServer", filter_json: JsonMapping):
         self._hs = hs
         self._store = hs.get_datastores().main
         self.filter_json = filter_json
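
The `JsonDict` to `JsonMapping` swap in this file (and in the appservice files below) tightens signatures from a mutable dict to a read-only mapping. A minimal sketch of the effect, assuming the aliases are defined roughly as in `synapse.types` (the real definitions may differ in detail):

```python
# Assumed shapes of the two aliases (see synapse.types for the real ones).
from typing import Any, Dict, Mapping

JsonDict = Dict[str, Any]        # mutable: the callee may write to it
JsonMapping = Mapping[str, Any]  # read-only: no __setitem__ at type-check time

def get_filter_json(filter_json: JsonMapping) -> JsonMapping:
    # filter_json["room"] = {}  # mypy would flag this: Mapping is read-only
    return filter_json

# Plain dicts still satisfy JsonMapping, so existing call sites keep working:
get_filter_json({"room": {"timeline": {"limit": 10}}})
```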

@@ -17,7 +17,7 @@ import logging
 import os
 import sys
 import tempfile
-from typing import List, Mapping, Optional
+from typing import List, Mapping, Optional, Sequence
 
 from twisted.internet import defer, task
@@ -57,7 +57,7 @@ from synapse.storage.databases.main.state import StateGroupWorkerStore
 from synapse.storage.databases.main.stream import StreamWorkerStore
 from synapse.storage.databases.main.tags import TagsWorkerStore
 from synapse.storage.databases.main.user_erasure_store import UserErasureWorkerStore
-from synapse.types import JsonDict, StateMap
+from synapse.types import JsonMapping, StateMap
 from synapse.util import SYNAPSE_VERSION
 from synapse.util.logcontext import LoggingContext
@@ -198,7 +198,7 @@ class FileExfiltrationWriter(ExfiltrationWriter):
         for event in state.values():
             json.dump(event, fp=f)
 
-    def write_profile(self, profile: JsonDict) -> None:
+    def write_profile(self, profile: JsonMapping) -> None:
         user_directory = os.path.join(self.base_directory, "user_data")
         os.makedirs(user_directory, exist_ok=True)
         profile_file = os.path.join(user_directory, "profile")
@@ -206,7 +206,7 @@ class FileExfiltrationWriter(ExfiltrationWriter):
         with open(profile_file, "a") as f:
             json.dump(profile, fp=f)
 
-    def write_devices(self, devices: List[JsonDict]) -> None:
+    def write_devices(self, devices: Sequence[JsonMapping]) -> None:
         user_directory = os.path.join(self.base_directory, "user_data")
         os.makedirs(user_directory, exist_ok=True)
         device_file = os.path.join(user_directory, "devices")
@@ -215,7 +215,7 @@ class FileExfiltrationWriter(ExfiltrationWriter):
         with open(device_file, "a") as f:
             json.dump(device, fp=f)
 
-    def write_connections(self, connections: List[JsonDict]) -> None:
+    def write_connections(self, connections: Sequence[JsonMapping]) -> None:
         user_directory = os.path.join(self.base_directory, "user_data")
         os.makedirs(user_directory, exist_ok=True)
         connection_file = os.path.join(user_directory, "connections")
@@ -225,7 +225,7 @@ class FileExfiltrationWriter(ExfiltrationWriter):
             json.dump(connection, fp=f)
 
     def write_account_data(
-        self, file_name: str, account_data: Mapping[str, JsonDict]
+        self, file_name: str, account_data: Mapping[str, JsonMapping]
     ) -> None:
         account_data_directory = os.path.join(
             self.base_directory, "user_data", "account_data"
@@ -237,7 +237,7 @@ class FileExfiltrationWriter(ExfiltrationWriter):
         with open(account_data_file, "a") as f:
             json.dump(account_data, fp=f)
 
-    def write_media_id(self, media_id: str, media_metadata: JsonDict) -> None:
+    def write_media_id(self, media_id: str, media_metadata: JsonMapping) -> None:
         file_directory = os.path.join(self.base_directory, "media_ids")
         os.makedirs(file_directory, exist_ok=True)
         media_id_file = os.path.join(file_directory, media_id)

@@ -23,7 +23,7 @@ from netaddr import IPSet
 from synapse.api.constants import EventTypes
 from synapse.events import EventBase
-from synapse.types import DeviceListUpdates, JsonDict, UserID
+from synapse.types import DeviceListUpdates, JsonDict, JsonMapping, UserID
 from synapse.util.caches.descriptors import _CacheContext, cached
 
 if TYPE_CHECKING:
@@ -379,8 +379,8 @@
         service: ApplicationService,
         id: int,
         events: Sequence[EventBase],
-        ephemeral: List[JsonDict],
-        to_device_messages: List[JsonDict],
+        ephemeral: List[JsonMapping],
+        to_device_messages: List[JsonMapping],
         one_time_keys_count: TransactionOneTimeKeysCount,
         unused_fallback_keys: TransactionUnusedFallbackKeys,
         device_list_summary: DeviceListUpdates,

@@ -41,7 +41,7 @@ from synapse.events import EventBase
 from synapse.events.utils import SerializeEventConfig, serialize_event
 from synapse.http.client import SimpleHttpClient, is_unknown_endpoint
 from synapse.logging import opentracing
-from synapse.types import DeviceListUpdates, JsonDict, ThirdPartyInstanceID
+from synapse.types import DeviceListUpdates, JsonDict, JsonMapping, ThirdPartyInstanceID
 from synapse.util.caches.response_cache import ResponseCache
 
 if TYPE_CHECKING:
@@ -306,8 +306,8 @@
         self,
         service: "ApplicationService",
         events: Sequence[EventBase],
-        ephemeral: List[JsonDict],
-        to_device_messages: List[JsonDict],
+        ephemeral: List[JsonMapping],
+        to_device_messages: List[JsonMapping],
         one_time_keys_count: TransactionOneTimeKeysCount,
         unused_fallback_keys: TransactionUnusedFallbackKeys,
         device_list_summary: DeviceListUpdates,

@@ -73,7 +73,7 @@ from synapse.events import EventBase
 from synapse.logging.context import run_in_background
 from synapse.metrics.background_process_metrics import run_as_background_process
 from synapse.storage.databases.main import DataStore
-from synapse.types import DeviceListUpdates, JsonDict
+from synapse.types import DeviceListUpdates, JsonMapping
 from synapse.util import Clock
 
 if TYPE_CHECKING:
@@ -121,8 +121,8 @@
         self,
         appservice: ApplicationService,
         events: Optional[Collection[EventBase]] = None,
-        ephemeral: Optional[Collection[JsonDict]] = None,
-        to_device_messages: Optional[Collection[JsonDict]] = None,
+        ephemeral: Optional[Collection[JsonMapping]] = None,
+        to_device_messages: Optional[Collection[JsonMapping]] = None,
         device_list_summary: Optional[DeviceListUpdates] = None,
     ) -> None:
         """
@@ -180,9 +180,9 @@
         # dict of {service_id: [events]}
         self.queued_events: Dict[str, List[EventBase]] = {}
         # dict of {service_id: [events]}
-        self.queued_ephemeral: Dict[str, List[JsonDict]] = {}
+        self.queued_ephemeral: Dict[str, List[JsonMapping]] = {}
         # dict of {service_id: [to_device_message_json]}
-        self.queued_to_device_messages: Dict[str, List[JsonDict]] = {}
+        self.queued_to_device_messages: Dict[str, List[JsonMapping]] = {}
         # dict of {service_id: [device_list_summary]}
         self.queued_device_list_summaries: Dict[str, List[DeviceListUpdates]] = {}
@@ -293,8 +293,8 @@
         self,
         service: ApplicationService,
         events: Iterable[EventBase],
-        ephemerals: Iterable[JsonDict],
-        to_device_messages: Iterable[JsonDict],
+        ephemerals: Iterable[JsonMapping],
+        to_device_messages: Iterable[JsonMapping],
     ) -> Tuple[TransactionOneTimeKeysCount, TransactionUnusedFallbackKeys]:
         """
         Given a list of the events, ephemeral messages and to-device messages,
@@ -364,8 +364,8 @@
         self,
         service: ApplicationService,
         events: Sequence[EventBase],
-        ephemeral: Optional[List[JsonDict]] = None,
-        to_device_messages: Optional[List[JsonDict]] = None,
+        ephemeral: Optional[List[JsonMapping]] = None,
+        to_device_messages: Optional[List[JsonMapping]] = None,
         one_time_keys_count: Optional[TransactionOneTimeKeysCount] = None,
         unused_fallback_keys: Optional[TransactionUnusedFallbackKeys] = None,
         device_list_summary: Optional[DeviceListUpdates] = None,

View File

@@ -103,7 +103,7 @@ class EventBuilder:
     async def build(
         self,
-        prev_event_ids: StrCollection,
+        prev_event_ids: List[str],
         auth_event_ids: Optional[List[str]],
         depth: Optional[int] = None,
     ) -> EventBase:

View File

@@ -64,7 +64,7 @@ from synapse.federation.transport.client import SendJoinResponse
 from synapse.http.client import is_unknown_endpoint
 from synapse.http.types import QueryParams
 from synapse.logging.opentracing import SynapseTags, log_kv, set_tag, tag_args, trace
-from synapse.types import JsonDict, UserID, get_domain_from_id
+from synapse.types import JsonDict, StrCollection, UserID, get_domain_from_id
 from synapse.util.async_helpers import concurrently_execute
 from synapse.util.caches.expiringcache import ExpiringCache
 from synapse.util.retryutils import NotRetryingDestination
@@ -1704,7 +1704,7 @@ class FederationClient(FederationBase):
     async def timestamp_to_event(
         self,
         *,
-        destinations: List[str],
+        destinations: StrCollection,
         room_id: str,
         timestamp: int,
         direction: Direction,
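
The `destinations` parameter above now takes Synapse's `StrCollection` alias rather than `List[str]`. A sketch of the motivation for such an alias (assumed definition; the actual members in `synapse.types` may differ): a plain `Collection[str]` would also accept a bare `str`, because a string is itself a sequence of one-character strings.

    from typing import AbstractSet, List, Tuple, Union

    # Assumed shape of the alias: concrete string collections, deliberately
    # excluding `str` itself so a lone server name cannot be passed by mistake.
    StrCollection = Union[Tuple[str, ...], List[str], AbstractSet[str]]

    def query(destinations: StrCollection) -> None:
        for destination in destinations:
            print(destination)

    query(("matrix.org", "example.com"))  # ok: tuple, list, or set
    # query("matrix.org")                 # rejected by a type checker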

View File

@@ -14,11 +14,11 @@
 import abc
 import logging
-from typing import TYPE_CHECKING, Any, Dict, List, Mapping, Optional, Set
+from typing import TYPE_CHECKING, Any, Dict, List, Mapping, Optional, Sequence, Set

 from synapse.api.constants import Direction, Membership
 from synapse.events import EventBase
-from synapse.types import JsonDict, RoomStreamToken, StateMap, UserID, UserInfo
+from synapse.types import JsonMapping, RoomStreamToken, StateMap, UserID, UserInfo
 from synapse.visibility import filter_events_for_client

 if TYPE_CHECKING:
@@ -35,7 +35,7 @@ class AdminHandler:
         self._state_storage_controller = self._storage_controllers.state
         self._msc3866_enabled = hs.config.experimental.msc3866.enabled

-    async def get_whois(self, user: UserID) -> JsonDict:
+    async def get_whois(self, user: UserID) -> JsonMapping:
         connections = []

         sessions = await self._store.get_user_ip_and_agents(user)
@@ -55,7 +55,7 @@ class AdminHandler:

         return ret

-    async def get_user(self, user: UserID) -> Optional[JsonDict]:
+    async def get_user(self, user: UserID) -> Optional[JsonMapping]:
         """Function to get user details"""
         user_info: Optional[UserInfo] = await self._store.get_user_by_id(
             user.to_string()
@@ -344,7 +344,7 @@ class ExfiltrationWriter(metaclass=abc.ABCMeta):
         raise NotImplementedError()

     @abc.abstractmethod
-    def write_profile(self, profile: JsonDict) -> None:
+    def write_profile(self, profile: JsonMapping) -> None:
         """Write the profile of a user.

         Args:
@@ -353,7 +353,7 @@ class ExfiltrationWriter(metaclass=abc.ABCMeta):
         raise NotImplementedError()

     @abc.abstractmethod
-    def write_devices(self, devices: List[JsonDict]) -> None:
+    def write_devices(self, devices: Sequence[JsonMapping]) -> None:
         """Write the devices of a user.

         Args:
@@ -362,7 +362,7 @@ class ExfiltrationWriter(metaclass=abc.ABCMeta):
         raise NotImplementedError()

     @abc.abstractmethod
-    def write_connections(self, connections: List[JsonDict]) -> None:
+    def write_connections(self, connections: Sequence[JsonMapping]) -> None:
         """Write the connections of a user.

         Args:
@@ -372,7 +372,7 @@ class ExfiltrationWriter(metaclass=abc.ABCMeta):
     @abc.abstractmethod
     def write_account_data(
-        self, file_name: str, account_data: Mapping[str, JsonDict]
+        self, file_name: str, account_data: Mapping[str, JsonMapping]
     ) -> None:
         """Write the account data of a user.

@@ -383,7 +383,7 @@ class ExfiltrationWriter(metaclass=abc.ABCMeta):
         raise NotImplementedError()

     @abc.abstractmethod
-    def write_media_id(self, media_id: str, media_metadata: JsonDict) -> None:
+    def write_media_id(self, media_id: str, media_metadata: JsonMapping) -> None:
         """Write the media's metadata of a user.

         Exports only the metadata, as this can be fetched from the database via
         read only. In order to access the files, a connection to the correct

View File

@@ -46,6 +46,7 @@ from synapse.storage.databases.main.directory import RoomAliasMapping
 from synapse.types import (
     DeviceListUpdates,
     JsonDict,
+    JsonMapping,
     RoomAlias,
     RoomStreamToken,
     StreamKeyType,
@@ -397,7 +398,7 @@ class ApplicationServicesHandler:
     async def _handle_typing(
         self, service: ApplicationService, new_token: int
-    ) -> List[JsonDict]:
+    ) -> List[JsonMapping]:
         """
         Return the typing events since the given stream token that the given application
         service should receive.
@@ -432,7 +433,7 @@ class ApplicationServicesHandler:
     async def _handle_receipts(
         self, service: ApplicationService, new_token: int
-    ) -> List[JsonDict]:
+    ) -> List[JsonMapping]:
         """
         Return the latest read receipts that the given application service should receive.
@@ -471,7 +472,7 @@ class ApplicationServicesHandler:
         service: ApplicationService,
         users: Collection[Union[str, UserID]],
         new_token: Optional[int],
-    ) -> List[JsonDict]:
+    ) -> List[JsonMapping]:
         """
         Return the latest presence updates that the given application service should receive.
@@ -491,7 +492,7 @@ class ApplicationServicesHandler:
             A list of json dictionaries containing data derived from the presence events
             that should be sent to the given application service.
         """
-        events: List[JsonDict] = []
+        events: List[JsonMapping] = []
         presence_source = self.event_sources.sources.presence
         from_key = await self.store.get_type_stream_id_for_appservice(
             service, "presence"

View File

@@ -14,7 +14,7 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 import logging
-from typing import TYPE_CHECKING, Any, Dict, Iterable, List, Mapping, Optional, Tuple
+from typing import TYPE_CHECKING, Dict, Iterable, List, Mapping, Optional, Tuple

 import attr
 from canonicaljson import encode_canonical_json
@@ -31,6 +31,7 @@ from synapse.logging.context import make_deferred_yieldable, run_in_background
 from synapse.logging.opentracing import log_kv, set_tag, tag_args, trace
 from synapse.types import (
     JsonDict,
+    JsonMapping,
     UserID,
     get_domain_from_id,
     get_verify_key_from_cross_signing_key,
@@ -272,11 +273,7 @@ class E2eKeysHandler:
             delay_cancellation=True,
         )

-        ret = {"device_keys": results, "failures": failures}
-
-        ret.update(cross_signing_keys)
-
-        return ret
+        return {"device_keys": results, "failures": failures, **cross_signing_keys}

     @trace
     async def _query_devices_for_destination(
@@ -408,7 +405,7 @@ class E2eKeysHandler:
     @cancellable
     async def get_cross_signing_keys_from_cache(
         self, query: Iterable[str], from_user_id: Optional[str]
-    ) -> Dict[str, Dict[str, dict]]:
+    ) -> Dict[str, Dict[str, JsonMapping]]:
         """Get cross-signing keys for users from the database

         Args:
@@ -551,16 +548,13 @@ class E2eKeysHandler:
                 self.config.federation.allow_device_name_lookup_over_federation
             ),
         )

-        ret = {"device_keys": res}
-
         # add in the cross-signing keys
         cross_signing_keys = await self.get_cross_signing_keys_from_cache(
             device_keys_query, None
         )

-        ret.update(cross_signing_keys)
-
-        return ret
+        return {"device_keys": res, **cross_signing_keys}

     async def claim_local_one_time_keys(
         self,
@@ -1127,7 +1121,7 @@ class E2eKeysHandler:
         user_id: str,
         master_key_id: str,
         signed_master_key: JsonDict,
-        stored_master_key: JsonDict,
+        stored_master_key: JsonMapping,
         devices: Dict[str, Dict[str, JsonDict]],
     ) -> List["SignatureListItem"]:
         """Check signatures of a user's master key made by their devices.
@@ -1278,7 +1272,7 @@ class E2eKeysHandler:
     async def _get_e2e_cross_signing_verify_key(
         self, user_id: str, key_type: str, from_user_id: Optional[str] = None
-    ) -> Tuple[JsonDict, str, VerifyKey]:
+    ) -> Tuple[JsonMapping, str, VerifyKey]:
         """Fetch locally or remotely query for a cross-signing public key.

         First, attempt to fetch the cross-signing public key from storage.
@@ -1333,7 +1327,7 @@ class E2eKeysHandler:
         self,
         user: UserID,
         desired_key_type: str,
-    ) -> Optional[Tuple[Dict[str, Any], str, VerifyKey]]:
+    ) -> Optional[Tuple[JsonMapping, str, VerifyKey]]:
         """Queries cross-signing keys for a remote user and saves them to the database

         Only the key specified by `key_type` will be returned, while all retrieved keys
@@ -1474,7 +1468,7 @@ def _check_device_signature(
     user_id: str,
     verify_key: VerifyKey,
     signed_device: JsonDict,
-    stored_device: JsonDict,
+    stored_device: JsonMapping,
 ) -> None:
     """Check that a signature on a device or cross-signing key is correct and
     matches the copy of the device/key that we have stored. Throws an
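
The two `return {..., **cross_signing_keys}` refactors above collapse a build-then-mutate pattern into one dict display. A standalone illustration (not Synapse code) showing the result is identical, since `**` unpacking, like `dict.update()`, lets later entries override earlier keys:

    # Standalone illustration of the refactor: same merged response,
    # no intermediate mutation of a temporary dict.
    results = {"@alice:example.org": {}}
    failures: dict = {}
    cross_signing_keys = {"master_keys": {}, "self_signing_keys": {}}

    ret = {"device_keys": results, "failures": failures}
    ret.update(cross_signing_keys)

    merged = {"device_keys": results, "failures": failures, **cross_signing_keys}
    assert merged == ret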

View File

@@ -723,12 +723,11 @@ class FederationEventHandler:
         if not prevs - seen:
             return

-        latest_list = await self._store.get_latest_event_ids_in_room(room_id)
+        latest_frozen = await self._store.get_latest_event_ids_in_room(room_id)

         # We add the prev events that we have seen to the latest
         # list to ensure the remote server doesn't give them to us
-        latest = set(latest_list)
-        latest |= seen
+        latest = seen | latest_frozen

         logger.info(
             "Requesting missing events between %s and %s",
@@ -1539,7 +1538,7 @@ class FederationEventHandler:
             logger.exception("Failed to resync device for %s", sender)

     async def backfill_event_id(
-        self, destinations: List[str], room_id: str, event_id: str
+        self, destinations: StrCollection, room_id: str, event_id: str
     ) -> PulledPduInfo:
         """Backfill a single event and persist it as a non-outlier which means
         we also pull in all of the state and auth events necessary for it.
@@ -1976,8 +1975,7 @@ class FederationEventHandler:
             # partial and full state and may not be accurate.
             return

-        extrem_ids_list = await self._store.get_latest_event_ids_in_room(event.room_id)
-        extrem_ids = set(extrem_ids_list)
+        extrem_ids = await self._store.get_latest_event_ids_in_room(event.room_id)

         prev_event_ids = set(event.prev_event_ids())

         if extrem_ids == prev_event_ids:
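
Both hunks above rely on `get_latest_event_ids_in_room` now returning a frozen set of event IDs, so callers stop copying it into a fresh `set` first. A standalone sketch of the union step:

    # Union of a plain set with a frozenset yields a plain set (the type
    # of the left operand), so no explicit copy of the frozen extremities
    # is needed before combining.
    seen = {"$event_a", "$event_b"}
    latest_frozen = frozenset({"$event_b", "$event_c"})

    latest = seen | latest_frozen
    assert latest == {"$event_a", "$event_b", "$event_c"}
    assert isinstance(latest, set) and not isinstance(latest, frozenset)

Equality also compares by value across `set` and `frozenset`, which is why the `extrem_ids == prev_event_ids` comparison works unchanged.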

View File

@@ -32,6 +32,7 @@ from synapse.storage.roommember import RoomsForUser
 from synapse.streams.config import PaginationConfig
 from synapse.types import (
     JsonDict,
+    JsonMapping,
     Requester,
     RoomStreamToken,
     StreamKeyType,
@@ -454,7 +455,7 @@ class InitialSyncHandler:
             for s in states
         ]

-        async def get_receipts() -> List[JsonDict]:
+        async def get_receipts() -> List[JsonMapping]:
             receipts = await self.store.get_linearized_receipts_for_room(
                 room_id, to_key=now_token.receipt_key
             )

View File

@@ -19,6 +19,7 @@ from synapse.appservice import ApplicationService
 from synapse.streams import EventSource
 from synapse.types import (
     JsonDict,
+    JsonMapping,
     ReadReceipt,
     StreamKeyType,
     UserID,
@@ -37,6 +38,8 @@ class ReceiptsHandler:
         self.server_name = hs.config.server.server_name
         self.store = hs.get_datastores().main
         self.event_auth_handler = hs.get_event_auth_handler()
+        self.event_handler = hs.get_event_handler()
+        self._storage_controllers = hs.get_storage_controllers()

         self.hs = hs
@@ -81,6 +84,20 @@ class ReceiptsHandler:
                 )
                 continue

+            # Let's check that the origin server is in the room before accepting the receipt.
+            # We don't want to block waiting on a partial state so take an
+            # approximation if needed.
+            domains = await self._storage_controllers.state.get_current_hosts_in_room_or_partial_state_approximation(
+                room_id
+            )
+            if origin not in domains:
+                logger.info(
+                    "Ignoring receipt for room %r from server %s as they're not in the room",
+                    room_id,
+                    origin,
+                )
+                continue
+
             for receipt_type, users in room_values.items():
                 for user_id, user_values in users.items():
                     if get_domain_from_id(user_id) != origin:
@@ -158,17 +175,23 @@ class ReceiptsHandler:
         self,
         room_id: str,
         receipt_type: str,
-        user_id: str,
+        user_id: UserID,
         event_id: str,
         thread_id: Optional[str],
     ) -> None:
         """Called when a client tells us a local user has read up to the given
         event_id in the room.
         """
+        # Ensure the room/event exists, this will raise an error if the user
+        # cannot view the event.
+        if not await self.event_handler.get_event(user_id, room_id, event_id):
+            return
+
         receipt = ReadReceipt(
             room_id=room_id,
             receipt_type=receipt_type,
-            user_id=user_id,
+            user_id=user_id.to_string(),
             event_ids=[event_id],
             thread_id=thread_id,
             data={"ts": int(self.clock.time_msec())},
@@ -182,15 +205,15 @@ class ReceiptsHandler:
         await self.federation_sender.send_read_receipt(receipt)


-class ReceiptEventSource(EventSource[int, JsonDict]):
+class ReceiptEventSource(EventSource[int, JsonMapping]):
     def __init__(self, hs: "HomeServer"):
         self.store = hs.get_datastores().main
         self.config = hs.config

     @staticmethod
     def filter_out_private_receipts(
-        rooms: Sequence[JsonDict], user_id: str
-    ) -> List[JsonDict]:
+        rooms: Sequence[JsonMapping], user_id: str
+    ) -> List[JsonMapping]:
         """
         Filters a list of serialized receipts (as returned by /sync and /initialSync)
         and removes private read receipts of other users.
@@ -207,7 +230,7 @@ class ReceiptEventSource(EventSource[int, JsonDict]):
             The same as rooms, but filtered.
         """

-        result = []
+        result: List[JsonMapping] = []

         # Iterate through each room's receipt content.
         for room in rooms:
@@ -260,7 +283,7 @@ class ReceiptEventSource(EventSource[int, JsonDict]):
         room_ids: Iterable[str],
         is_guest: bool,
         explicit_room_id: Optional[str] = None,
-    ) -> Tuple[List[JsonDict], int]:
+    ) -> Tuple[List[JsonMapping], int]:
         from_key = int(from_key)
         to_key = self.get_current_key()
@@ -279,7 +302,7 @@ class ReceiptEventSource(EventSource[int, JsonDict]):
     async def get_new_events_as(
         self, from_key: int, to_key: int, service: ApplicationService
-    ) -> Tuple[List[JsonDict], int]:
+    ) -> Tuple[List[JsonMapping], int]:
         """Returns a set of new read receipt events that an appservice
         may be interested in.

View File

@@ -13,7 +13,17 @@
 # limitations under the License.
 import enum
 import logging
-from typing import TYPE_CHECKING, Collection, Dict, FrozenSet, Iterable, List, Optional
+from typing import (
+    TYPE_CHECKING,
+    Collection,
+    Dict,
+    FrozenSet,
+    Iterable,
+    List,
+    Mapping,
+    Optional,
+    Sequence,
+)

 import attr
@@ -245,7 +255,7 @@ class RelationsHandler:
     async def get_references_for_events(
         self, event_ids: Collection[str], ignored_users: FrozenSet[str] = frozenset()
-    ) -> Dict[str, List[_RelatedEvent]]:
+    ) -> Mapping[str, Sequence[_RelatedEvent]]:
         """Get a list of references to the given events.

         Args:

View File

@@ -174,8 +174,8 @@ class SendEmailHandler:
         if raw_to == "":
             raise RuntimeError("Invalid 'to' address")

-        html_part = MIMEText(html, "html", "utf8")
-        text_part = MIMEText(text, "plain", "utf8")
+        html_part = MIMEText(html, "html", "utf-8")
+        text_part = MIMEText(text, "plain", "utf-8")

         multipart_msg = MIMEMultipart("alternative")
         multipart_msg["Subject"] = subject

View File

@@ -57,6 +57,7 @@ from synapse.storage.roommember import MemberSummary
 from synapse.types import (
     DeviceListUpdates,
     JsonDict,
+    JsonMapping,
     MutableStateMap,
     Requester,
     RoomStreamToken,
@@ -234,7 +235,7 @@ class SyncResult:
     archived: List[ArchivedSyncResult]
     to_device: List[JsonDict]
     device_lists: DeviceListUpdates
-    device_one_time_keys_count: JsonDict
+    device_one_time_keys_count: JsonMapping
     device_unused_fallback_key_types: List[str]

     def __bool__(self) -> bool:
@@ -1557,7 +1558,7 @@ class SyncHandler:
         logger.debug("Fetching OTK data")
         device_id = sync_config.device_id
-        one_time_keys_count: JsonDict = {}
+        one_time_keys_count: JsonMapping = {}
         unused_fallback_key_types: List[str] = []
         if device_id:
             # TODO: We should have a way to let clients differentiate between the states of:
@@ -1793,19 +1794,23 @@ class SyncHandler:
             )
             if push_rules_changed:
-                global_account_data = dict(global_account_data)
-                global_account_data[
-                    AccountDataTypes.PUSH_RULES
-                ] = await self._push_rules_handler.push_rules_for_user(sync_config.user)
+                global_account_data = {
+                    AccountDataTypes.PUSH_RULES: await self._push_rules_handler.push_rules_for_user(
+                        sync_config.user
+                    ),
+                    **global_account_data,
+                }
         else:
             all_global_account_data = await self.store.get_global_account_data_for_user(
                 user_id
             )

-            global_account_data = dict(all_global_account_data)
-            global_account_data[
-                AccountDataTypes.PUSH_RULES
-            ] = await self._push_rules_handler.push_rules_for_user(sync_config.user)
+            global_account_data = {
+                AccountDataTypes.PUSH_RULES: await self._push_rules_handler.push_rules_for_user(
+                    sync_config.user
+                ),
+                **all_global_account_data,
+            }

         account_data_for_user = (
             await sync_config.filter_collection.filter_global_account_data(
@@ -1909,7 +1914,7 @@ class SyncHandler:
             blocks_all_rooms
             or sync_result_builder.sync_config.filter_collection.blocks_all_room_account_data()
         ):
-            account_data_by_room: Mapping[str, Mapping[str, JsonDict]] = {}
+            account_data_by_room: Mapping[str, Mapping[str, JsonMapping]] = {}
         elif since_token and not sync_result_builder.full_state:
             account_data_by_room = (
                 await self.store.get_updated_room_account_data_for_user(
@@ -2349,8 +2354,8 @@ class SyncHandler:
         sync_result_builder: "SyncResultBuilder",
         room_builder: "RoomSyncResultBuilder",
         ephemeral: List[JsonDict],
-        tags: Optional[Mapping[str, Mapping[str, Any]]],
-        account_data: Mapping[str, JsonDict],
+        tags: Optional[Mapping[str, JsonMapping]],
+        account_data: Mapping[str, JsonMapping],
         always_include: bool = False,
     ) -> None:
         """Populates the `joined` and `archived` section of `sync_result_builder`

View File

@@ -26,7 +26,14 @@ from synapse.metrics.background_process_metrics import (
 )
 from synapse.replication.tcp.streams import TypingStream
 from synapse.streams import EventSource
-from synapse.types import JsonDict, Requester, StrCollection, StreamKeyType, UserID
+from synapse.types import (
+    JsonDict,
+    JsonMapping,
+    Requester,
+    StrCollection,
+    StreamKeyType,
+    UserID,
+)
 from synapse.util.caches.stream_change_cache import StreamChangeCache
 from synapse.util.metrics import Measure
 from synapse.util.retryutils import filter_destinations_by_retry_limiter
@@ -487,7 +494,7 @@ class TypingWriterHandler(FollowerTypingHandler):
         raise Exception("Typing writer instance got typing info over replication")


-class TypingNotificationEventSource(EventSource[int, JsonDict]):
+class TypingNotificationEventSource(EventSource[int, JsonMapping]):
     def __init__(self, hs: "HomeServer"):
         self._main_store = hs.get_datastores().main
         self.clock = hs.get_clock()
@@ -497,7 +504,7 @@ class TypingNotificationEventSource(EventSource[int, JsonDict]):
         #
         self.get_typing_handler = hs.get_typing_handler

-    def _make_event_for(self, room_id: str) -> JsonDict:
+    def _make_event_for(self, room_id: str) -> JsonMapping:
         typing = self.get_typing_handler()._room_typing[room_id]
         return {
             "type": EduTypes.TYPING,
@@ -507,7 +514,7 @@ class TypingNotificationEventSource(EventSource[int, JsonDict]):
     async def get_new_events_as(
         self, from_key: int, service: ApplicationService
-    ) -> Tuple[List[JsonDict], int]:
+    ) -> Tuple[List[JsonMapping], int]:
         """Returns a set of new typing events that an appservice
         may be interested in.
@@ -551,7 +558,7 @@ class TypingNotificationEventSource(EventSource[int, JsonDict]):
         room_ids: Iterable[str],
         is_guest: bool,
         explicit_room_id: Optional[str] = None,
-    ) -> Tuple[List[JsonDict], int]:
+    ) -> Tuple[List[JsonMapping], int]:
         with Measure(self.clock, "typing.get_new_events"):
             from_key = int(from_key)
             handler = self.get_typing_handler()

View File

@@ -131,7 +131,7 @@ class BulkPushRuleEvaluator:
     async def _get_rules_for_event(
         self,
         event: EventBase,
-    ) -> Dict[str, FilteredPushRules]:
+    ) -> Mapping[str, FilteredPushRules]:
         """Get the push rules for all users who may need to be notified about
         the event.

View File

@@ -39,7 +39,7 @@ from synapse.rest.admin._base import (
 from synapse.rest.client._base import client_patterns
 from synapse.storage.databases.main.registration import ExternalIDReuseException
 from synapse.storage.databases.main.stats import UserSortOrder
-from synapse.types import JsonDict, UserID
+from synapse.types import JsonDict, JsonMapping, UserID

 if TYPE_CHECKING:
     from synapse.server import HomeServer
@@ -66,6 +66,7 @@ class UsersRestServletV2(RestServlet):
     The parameter `deactivated` can be used to include deactivated users.
     The parameter `order_by` can be used to order the result.
     The parameter `not_user_type` can be used to exclude certain user types.
+    The parameter `locked` can be used to include locked users.
     Possible values are `bot`, `support` or "empty string".
     "empty string" here means to exclude users without a type.
     """
@@ -107,8 +108,9 @@ class UsersRestServletV2(RestServlet):
                 "The guests parameter is not supported when MSC3861 is enabled.",
                 errcode=Codes.INVALID_PARAM,
             )
-        deactivated = parse_boolean(request, "deactivated", default=False)

+        deactivated = parse_boolean(request, "deactivated", default=False)
+        locked = parse_boolean(request, "locked", default=False)
         admins = parse_boolean(request, "admins")

         # If support for MSC3866 is not enabled, apply no filtering based on the
@@ -133,6 +135,7 @@ class UsersRestServletV2(RestServlet):
                 UserSortOrder.SHADOW_BANNED.value,
                 UserSortOrder.CREATION_TS.value,
                 UserSortOrder.LAST_SEEN_TS.value,
+                UserSortOrder.LOCKED.value,
             ),
         )
@@ -154,6 +157,7 @@ class UsersRestServletV2(RestServlet):
             direction,
             approved,
             not_user_types,
+            locked,
         )

         # If support for MSC3866 is not enabled, don't show the approval flag.
@@ -211,7 +215,7 @@ class UserRestServletV2(RestServlet):
     async def on_GET(
         self, request: SynapseRequest, user_id: str
-    ) -> Tuple[int, JsonDict]:
+    ) -> Tuple[int, JsonMapping]:
         await assert_requester_is_admin(self.auth, request)

         target_user = UserID.from_string(user_id)
@@ -226,7 +230,7 @@ class UserRestServletV2(RestServlet):
     async def on_PUT(
         self, request: SynapseRequest, user_id: str
-    ) -> Tuple[int, JsonDict]:
+    ) -> Tuple[int, JsonMapping]:
         requester = await self.auth.get_user_by_req(request)
         await assert_user_is_admin(self.auth, requester)
@@ -658,7 +662,7 @@ class WhoisRestServlet(RestServlet):
     async def on_GET(
         self, request: SynapseRequest, user_id: str
-    ) -> Tuple[int, JsonDict]:
+    ) -> Tuple[int, JsonMapping]:
         target_user = UserID.from_string(user_id)
         requester = await self.auth.get_user_by_req(request)
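
A hedged usage sketch for the new `locked` filter on the List Accounts admin API (endpoint path per the admin API docs; exact response fields may vary):

    import requests  # third-party HTTP client, used here for brevity

    # Hypothetical call against a local Synapse, assuming an admin access
    # token. Locked users are excluded by default; locked=true includes
    # them, and each row now reports its `locked` flag.
    resp = requests.get(
        "http://localhost:8008/_synapse/admin/v2/users",
        params={"locked": "true"},
        headers={"Authorization": "Bearer <admin_access_token>"},
    )
    for user in resp.json().get("users", []):
        print(user["name"], user.get("locked"))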

View File

@@ -20,7 +20,7 @@ from synapse.api.errors import AuthError, Codes, NotFoundError, SynapseError
 from synapse.http.server import HttpServer
 from synapse.http.servlet import RestServlet, parse_json_object_from_request
 from synapse.http.site import SynapseRequest
-from synapse.types import JsonDict, RoomID
+from synapse.types import JsonDict, JsonMapping, RoomID

 from ._base import client_patterns
@@ -95,7 +95,7 @@ class AccountDataServlet(RestServlet):
     async def on_GET(
         self, request: SynapseRequest, user_id: str, account_data_type: str
-    ) -> Tuple[int, JsonDict]:
+    ) -> Tuple[int, JsonMapping]:
         requester = await self.auth.get_user_by_req(request)
         if user_id != requester.user.to_string():
             raise AuthError(403, "Cannot get account data for other users.")
@@ -106,7 +106,7 @@ class AccountDataServlet(RestServlet):
             and account_data_type == AccountDataTypes.PUSH_RULES
         ):
             account_data: Optional[
-                JsonDict
+                JsonMapping
             ] = await self._push_rules_handler.push_rules_for_user(requester.user)
         else:
             account_data = await self.store.get_global_account_data_by_type_for_user(
@@ -236,7 +236,7 @@ class RoomAccountDataServlet(RestServlet):
         user_id: str,
         room_id: str,
         account_data_type: str,
-    ) -> Tuple[int, JsonDict]:
+    ) -> Tuple[int, JsonMapping]:
         requester = await self.auth.get_user_by_req(request)
         if user_id != requester.user.to_string():
             raise AuthError(403, "Cannot get account data for other users.")
@@ -253,7 +253,7 @@ class RoomAccountDataServlet(RestServlet):
             self._hs.config.experimental.msc4010_push_rules_account_data
             and account_data_type == AccountDataTypes.PUSH_RULES
         ):
-            account_data: Optional[JsonDict] = {}
+            account_data: Optional[JsonMapping] = {}
         else:
             account_data = await self.store.get_account_data_for_room_and_type(
                 user_id, room_id, account_data_type

View File

@@ -19,7 +19,7 @@ from synapse.api.errors import AuthError, NotFoundError, StoreError, SynapseError
 from synapse.http.server import HttpServer
 from synapse.http.servlet import RestServlet, parse_json_object_from_request
 from synapse.http.site import SynapseRequest
-from synapse.types import JsonDict, UserID
+from synapse.types import JsonDict, JsonMapping, UserID

 from ._base import client_patterns, set_timeline_upper_limit
@@ -41,7 +41,7 @@ class GetFilterRestServlet(RestServlet):
     async def on_GET(
         self, request: SynapseRequest, user_id: str, filter_id: str
-    ) -> Tuple[int, JsonDict]:
+    ) -> Tuple[int, JsonMapping]:
         target_user = UserID.from_string(user_id)
         requester = await self.auth.get_user_by_req(request)

View File

@@ -84,7 +84,7 @@ class ReadMarkerRestServlet(RestServlet):
             await self.receipts_handler.received_client_receipt(
                 room_id,
                 receipt_type,
-                user_id=requester.user.to_string(),
+                user_id=requester.user,
                 event_id=event_id,
                 # Setting the thread ID is not possible with the /read_markers endpoint.
                 thread_id=None,

View File

@@ -108,7 +108,7 @@ class ReceiptRestServlet(RestServlet):
         await self.receipts_handler.received_client_receipt(
             room_id,
             receipt_type,
-            user_id=requester.user.to_string(),
+            user_id=requester.user,
            event_id=event_id,
             thread_id=thread_id,
         )
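
These two servlet hunks follow from the receipts-handler change earlier in the commit: `received_client_receipt` now takes a structured `UserID` (which it needs for the event-visibility check) and stringifies it only when building the `ReadReceipt`. A simplified stand-in for the structured type (the real `synapse.types.UserID` has more to it):

    import attr

    @attr.s(frozen=True, auto_attribs=True)
    class UserID:
        # Parse once at the edge, pass the structured value inward,
        # serialize back to a string only at boundaries.
        localpart: str
        domain: str

        @classmethod
        def from_string(cls, s: str) -> "UserID":
            localpart, _, domain = s[1:].partition(":")
            return cls(localpart, domain)

        def to_string(self) -> str:
            return f"@{self.localpart}:{self.domain}"

    user = UserID.from_string("@alice:example.org")
    assert user.to_string() == "@alice:example.org"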

View File

@@ -19,6 +19,7 @@ import logging
 from collections import deque
 from typing import (
     TYPE_CHECKING,
+    AbstractSet,
     Any,
     Awaitable,
     Callable,
@@ -618,7 +619,7 @@ class EventsPersistenceStorageController:
             )

             for room_id, ev_ctx_rm in events_by_room.items():
-                latest_event_ids = set(
+                latest_event_ids = (
                     await self.main_store.get_latest_event_ids_in_room(room_id)
                 )
                 new_latest_event_ids = await self._calculate_new_extremities(
@@ -740,7 +741,7 @@ class EventsPersistenceStorageController:
         self,
         room_id: str,
         event_contexts: List[Tuple[EventBase, EventContext]],
-        latest_event_ids: Collection[str],
+        latest_event_ids: AbstractSet[str],
     ) -> Set[str]:
         """Calculates the new forward extremities for a room given events to
         persist.
@@ -758,8 +759,6 @@ class EventsPersistenceStorageController:
             and not event.internal_metadata.is_soft_failed()
         ]

-        latest_event_ids = set(latest_event_ids)
-
         # start with the existing forward extremities
         result = set(latest_event_ids)
@@ -798,7 +797,7 @@ class EventsPersistenceStorageController:
         self,
         room_id: str,
         events_context: List[Tuple[EventBase, EventContext]],
-        old_latest_event_ids: Set[str],
+        old_latest_event_ids: AbstractSet[str],
         new_latest_event_ids: Set[str],
     ) -> Tuple[Optional[StateMap[str]], Optional[StateMap[str]], Set[str]]:
         """Calculate the current state dict after adding some new events to

View File

@@ -582,7 +582,7 @@ class StateStorageController:
     @trace
     @tag_args
-    async def get_current_hosts_in_room_ordered(self, room_id: str) -> List[str]:
+    async def get_current_hosts_in_room_ordered(self, room_id: str) -> Tuple[str, ...]:
         """Get current hosts in room based on current state.

         Blocks until we have full state for the given room. This only happens for rooms

View File

@@ -175,6 +175,7 @@ class DataStore(
         direction: Direction = Direction.FORWARDS,
         approved: bool = True,
         not_user_types: Optional[List[str]] = None,
+        locked: bool = False,
     ) -> Tuple[List[JsonDict], int]:
         """Function to retrieve a paginated list of users from
         users list. This will return a json list of users and the
@@ -194,6 +195,7 @@ class DataStore(
             direction: sort ascending or descending
             approved: whether to include approved users
             not_user_types: list of user types to exclude
+            locked: whether to include locked users
         Returns:
             A tuple of a list of mappings from user to information and a count of total users.
         """
@@ -226,6 +228,9 @@ class DataStore(
             if not deactivated:
                 filters.append("deactivated = 0")

+            if not locked:
+                filters.append("locked IS FALSE")
+
             if admins is not None:
                 if admins:
                     filters.append("admin = 1")
@@ -290,7 +295,7 @@ class DataStore(
             sql = f"""
                 SELECT name, user_type, is_guest, admin, deactivated, shadow_banned,
                 displayname, avatar_url, creation_ts * 1000 as creation_ts, approved,
-                eu.user_id is not null as erased, last_seen_ts
+                eu.user_id is not null as erased, last_seen_ts, locked
                 {sql_base}
                 ORDER BY {order_by_column} {order}, u.name ASC
                 LIMIT ? OFFSET ?

View File

@@ -43,7 +43,7 @@ from synapse.storage.util.id_generators import (
     MultiWriterIdGenerator,
     StreamIdGenerator,
 )
-from synapse.types import JsonDict
+from synapse.types import JsonDict, JsonMapping
 from synapse.util import json_encoder
 from synapse.util.caches.descriptors import cached
 from synapse.util.caches.stream_change_cache import StreamChangeCache
@@ -119,7 +119,7 @@ class AccountDataWorkerStore(PushRulesWorkerStore, CacheInvalidationWorkerStore)
     @cached()
     async def get_global_account_data_for_user(
         self, user_id: str
-    ) -> Mapping[str, JsonDict]:
+    ) -> Mapping[str, JsonMapping]:
         """
         Get all the global client account_data for a user.
@@ -164,7 +164,7 @@ class AccountDataWorkerStore(PushRulesWorkerStore, CacheInvalidationWorkerStore)
     @cached()
     async def get_room_account_data_for_user(
         self, user_id: str
-    ) -> Mapping[str, Mapping[str, JsonDict]]:
+    ) -> Mapping[str, Mapping[str, JsonMapping]]:
         """
         Get all of the per-room client account_data for a user.
@@ -213,7 +213,7 @@ class AccountDataWorkerStore(PushRulesWorkerStore, CacheInvalidationWorkerStore)
     @cached(num_args=2, max_entries=5000, tree=True)
     async def get_global_account_data_by_type_for_user(
         self, user_id: str, data_type: str
-    ) -> Optional[JsonDict]:
+    ) -> Optional[JsonMapping]:
         """
         Returns:
             The account data.
@@ -265,7 +265,7 @@ class AccountDataWorkerStore(PushRulesWorkerStore, CacheInvalidationWorkerStore)
     @cached(num_args=2, tree=True)
     async def get_account_data_for_room(
         self, user_id: str, room_id: str
-    ) -> Mapping[str, JsonDict]:
+    ) -> Mapping[str, JsonMapping]:
         """Get all the client account_data for a user for a room.

         Args:
@@ -296,7 +296,7 @@ class AccountDataWorkerStore(PushRulesWorkerStore, CacheInvalidationWorkerStore)
     @cached(num_args=3, max_entries=5000, tree=True)
     async def get_account_data_for_room_and_type(
         self, user_id: str, room_id: str, account_data_type: str
-    ) -> Optional[JsonDict]:
+    ) -> Optional[JsonMapping]:
         """Get the client account_data of given type for a user for a room.

         Args:
@@ -394,7 +394,7 @@ class AccountDataWorkerStore(PushRulesWorkerStore, CacheInvalidationWorkerStore)
     async def get_updated_global_account_data_for_user(
         self, user_id: str, stream_id: int
-    ) -> Dict[str, JsonDict]:
+    ) -> Mapping[str, JsonMapping]:
         """Get all the global account_data that's changed for a user.

         Args:

View File

@@ -45,7 +45,7 @@ from synapse.storage.databases.main.events_worker import EventsWorkerStore
 from synapse.storage.databases.main.roommember import RoomMemberWorkerStore
 from synapse.storage.types import Cursor
 from synapse.storage.util.sequence import build_sequence_generator
-from synapse.types import DeviceListUpdates, JsonDict
+from synapse.types import DeviceListUpdates, JsonMapping
 from synapse.util import json_encoder
 from synapse.util.caches.descriptors import _CacheContext, cached
@@ -268,8 +268,8 @@ class ApplicationServiceTransactionWorkerStore(
         self,
         service: ApplicationService,
         events: Sequence[EventBase],
-        ephemeral: List[JsonDict],
-        to_device_messages: List[JsonDict],
+        ephemeral: List[JsonMapping],
+        to_device_messages: List[JsonMapping],
         one_time_keys_count: TransactionOneTimeKeysCount,
         unused_fallback_keys: TransactionUnusedFallbackKeys,
         device_list_summary: DeviceListUpdates,

View File

@@ -55,7 +55,12 @@ from synapse.storage.util.id_generators import (
     AbstractStreamIdGenerator,
     StreamIdGenerator,
 )
-from synapse.types import JsonDict, StrCollection, get_verify_key_from_cross_signing_key
+from synapse.types import (
+    JsonDict,
+    JsonMapping,
+    StrCollection,
+    get_verify_key_from_cross_signing_key,
+)
 from synapse.util import json_decoder, json_encoder
 from synapse.util.caches.descriptors import cached, cachedList
 from synapse.util.caches.lrucache import LruCache
@@ -746,7 +751,7 @@ class DeviceWorkerStore(RoomMemberWorkerStore, EndToEndKeyWorkerStore):
     @cancellable
     async def get_user_devices_from_cache(
         self, user_ids: Set[str], user_and_device_ids: List[Tuple[str, str]]
-    ) -> Tuple[Set[str], Dict[str, Mapping[str, JsonDict]]]:
+    ) -> Tuple[Set[str], Dict[str, Mapping[str, JsonMapping]]]:
         """Get the devices (and keys if any) for remote users from the cache.

         Args:
@@ -766,13 +771,13 @@ class DeviceWorkerStore(RoomMemberWorkerStore, EndToEndKeyWorkerStore):
         user_ids_not_in_cache = unique_user_ids - user_ids_in_cache

         # First fetch all the users which all devices are to be returned.
-        results: Dict[str, Mapping[str, JsonDict]] = {}
+        results: Dict[str, Mapping[str, JsonMapping]] = {}
         for user_id in user_ids:
             if user_id in user_ids_in_cache:
                 results[user_id] = await self.get_cached_devices_for_user(user_id)
         # Then fetch all device-specific requests, but skip users we've already
         # fetched all devices for.
-        device_specific_results: Dict[str, Dict[str, JsonDict]] = {}
+        device_specific_results: Dict[str, Dict[str, JsonMapping]] = {}
         for user_id, device_id in user_and_device_ids:
             if user_id in user_ids_in_cache and user_id not in user_ids:
                 device = await self._get_cached_user_device(user_id, device_id)
@@ -801,7 +806,9 @@ class DeviceWorkerStore(RoomMemberWorkerStore, EndToEndKeyWorkerStore):
         return user_ids_in_cache

     @cached(num_args=2, tree=True)
-    async def _get_cached_user_device(self, user_id: str, device_id: str) -> JsonDict:
+    async def _get_cached_user_device(
+        self, user_id: str, device_id: str
+    ) -> JsonMapping:
         content = await self.db_pool.simple_select_one_onecol(
             table="device_lists_remote_cache",
             keyvalues={"user_id": user_id, "device_id": device_id},
@@ -811,7 +818,9 @@ class DeviceWorkerStore(RoomMemberWorkerStore, EndToEndKeyWorkerStore):
         return db_to_json(content)

     @cached()
-    async def get_cached_devices_for_user(self, user_id: str) -> Mapping[str, JsonDict]:
+    async def get_cached_devices_for_user(
+        self, user_id: str
+    ) -> Mapping[str, JsonMapping]:
         devices = await self.db_pool.simple_select_list(
             table="device_lists_remote_cache",
             keyvalues={"user_id": user_id},
@@ -1042,7 +1051,7 @@ class DeviceWorkerStore(RoomMemberWorkerStore, EndToEndKeyWorkerStore):
     )
     async def get_device_list_last_stream_id_for_remotes(
         self, user_ids: Iterable[str]
-    ) -> Dict[str, Optional[str]]:
+    ) -> Mapping[str, Optional[str]]:
         rows = await self.db_pool.simple_select_many_batch(
             table="device_lists_remote_extremeties",
             column="user_id",

View File

@@ -52,7 +52,7 @@ from synapse.storage.database import (
 from synapse.storage.databases.main.cache import CacheInvalidationWorkerStore
 from synapse.storage.engines import PostgresEngine
 from synapse.storage.util.id_generators import StreamIdGenerator
-from synapse.types import JsonDict
+from synapse.types import JsonDict, JsonMapping
 from synapse.util import json_decoder, json_encoder
 from synapse.util.caches.descriptors import cached, cachedList
 from synapse.util.cancellation import cancellable
@@ -125,7 +125,7 @@ class EndToEndKeyWorkerStore(EndToEndKeyBackgroundStore, CacheInvalidationWorker
     async def get_e2e_device_keys_for_federation_query(
         self, user_id: str
-    ) -> Tuple[int, List[JsonDict]]:
+    ) -> Tuple[int, Sequence[JsonMapping]]:
         """Get all devices (with any device keys) for a user
         Returns:
@@ -174,7 +174,7 @@ class EndToEndKeyWorkerStore(EndToEndKeyBackgroundStore, CacheInvalidationWorker
     @cached(iterable=True)
     async def _get_e2e_device_keys_for_federation_query_inner(
         self, user_id: str
-    ) -> List[JsonDict]:
+    ) -> Sequence[JsonMapping]:
         """Get all devices (with any device keys) for a user"""
         devices = await self.get_e2e_device_keys_and_signatures([(user_id, None)])
@@ -578,7 +578,7 @@ class EndToEndKeyWorkerStore(EndToEndKeyBackgroundStore, CacheInvalidationWorker
     @cached(max_entries=10000)
     async def count_e2e_one_time_keys(
         self, user_id: str, device_id: str
-    ) -> Dict[str, int]:
+    ) -> Mapping[str, int]:
         """Count the number of one time keys the server has for a device
         Returns:
             A mapping from algorithm to number of keys for that algorithm.
@@ -812,7 +812,7 @@ class EndToEndKeyWorkerStore(EndToEndKeyBackgroundStore, CacheInvalidationWorker
     async def get_e2e_cross_signing_key(
         self, user_id: str, key_type: str, from_user_id: Optional[str] = None
-    ) -> Optional[JsonDict]:
+    ) -> Optional[JsonMapping]:
         """Returns a user's cross-signing key.
         Args:
@@ -833,7 +833,9 @@ class EndToEndKeyWorkerStore(EndToEndKeyBackgroundStore, CacheInvalidationWorker
         return user_keys.get(key_type)
     @cached(num_args=1)
-    def _get_bare_e2e_cross_signing_keys(self, user_id: str) -> Mapping[str, JsonDict]:
+    def _get_bare_e2e_cross_signing_keys(
+        self, user_id: str
+    ) -> Mapping[str, JsonMapping]:
         """Dummy function. Only used to make a cache for
         _get_bare_e2e_cross_signing_keys_bulk.
         """
@@ -846,7 +848,7 @@ class EndToEndKeyWorkerStore(EndToEndKeyBackgroundStore, CacheInvalidationWorker
     )
     async def _get_bare_e2e_cross_signing_keys_bulk(
         self, user_ids: Iterable[str]
-    ) -> Dict[str, Optional[Mapping[str, JsonDict]]]:
+    ) -> Mapping[str, Optional[Mapping[str, JsonMapping]]]:
         """Returns the cross-signing keys for a set of users. The output of this
         function should be passed to _get_e2e_cross_signing_signatures_txn if
         the signatures for the calling user need to be fetched.
@@ -860,15 +862,12 @@ class EndToEndKeyWorkerStore(EndToEndKeyBackgroundStore, CacheInvalidationWorker
             their user ID will map to None.
         """
-        result = await self.db_pool.runInteraction(
+        return await self.db_pool.runInteraction(
             "get_bare_e2e_cross_signing_keys_bulk",
             self._get_bare_e2e_cross_signing_keys_bulk_txn,
             user_ids,
         )
-        # The `Optional` comes from the `@cachedList` decorator.
-        return cast(Dict[str, Optional[Mapping[str, JsonDict]]], result)
     def _get_bare_e2e_cross_signing_keys_bulk_txn(
         self,
         txn: LoggingTransaction,
@@ -1026,7 +1025,7 @@ class EndToEndKeyWorkerStore(EndToEndKeyBackgroundStore, CacheInvalidationWorker
     @cancellable
     async def get_e2e_cross_signing_keys_bulk(
         self, user_ids: List[str], from_user_id: Optional[str] = None
-    ) -> Dict[str, Optional[Mapping[str, JsonDict]]]:
+    ) -> Mapping[str, Optional[Mapping[str, JsonMapping]]]:
         """Returns the cross-signing keys for a set of users.
         Args:
@@ -1043,7 +1042,7 @@ class EndToEndKeyWorkerStore(EndToEndKeyBackgroundStore, CacheInvalidationWorker
         if from_user_id:
             result = cast(
-                Dict[str, Optional[Mapping[str, JsonDict]]],
+                Dict[str, Optional[Mapping[str, JsonMapping]]],
                 await self.db_pool.runInteraction(
                     "get_e2e_cross_signing_signatures",
                     self._get_e2e_cross_signing_signatures_txn,
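
Note: most of the changes in this file swap mutable return annotations (`JsonDict`, `Dict`) for read-only ones (`JsonMapping`, `Mapping`) on cached methods, so callers cannot mutate values that live in a shared cache. A minimal sketch of the idea follows; the aliases are paraphrased locally so the example is self-contained, and the mutation is only caught by mypy, not at runtime:

```python
from typing import Any, Dict, Mapping

# Paraphrased versions of the synapse.types aliases used above:
JsonDict = Dict[str, Any]        # callers may mutate
JsonMapping = Mapping[str, Any]  # read-only view

_cache: Dict[str, JsonDict] = {"@alice:example.org": {"key_type": "master"}}

def get_key_mutable(user_id: str) -> JsonDict:
    return _cache[user_id]

def get_key(user_id: str) -> JsonMapping:
    return _cache[user_id]

# A caller mutating the mutable variant silently corrupts the cached entry:
get_key_mutable("@alice:example.org")["key_type"] = "oops"

# With JsonMapping, the same mutation is rejected by mypy:
# get_key("@alice:example.org")["key_type"] = "oops"
#   error: Unsupported target for indexed assignment ("Mapping[str, Any]")
```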

View File

@@ -19,6 +19,7 @@ from typing import (
     TYPE_CHECKING,
     Collection,
     Dict,
+    FrozenSet,
     Iterable,
     List,
     Optional,
@@ -47,7 +48,7 @@ from synapse.storage.database import (
 from synapse.storage.databases.main.events_worker import EventsWorkerStore
 from synapse.storage.databases.main.signatures import SignatureWorkerStore
 from synapse.storage.engines import PostgresEngine, Sqlite3Engine
-from synapse.types import JsonDict, StrCollection, StrSequence
+from synapse.types import JsonDict, StrCollection
 from synapse.util import json_encoder
 from synapse.util.caches.descriptors import cached
 from synapse.util.caches.lrucache import LruCache
@@ -1179,13 +1180,14 @@ class EventFederationWorkerStore(SignatureWorkerStore, EventsWorkerStore, SQLBas
     )
     @cached(max_entries=5000, iterable=True)
-    async def get_latest_event_ids_in_room(self, room_id: str) -> StrSequence:
-        return await self.db_pool.simple_select_onecol(
+    async def get_latest_event_ids_in_room(self, room_id: str) -> FrozenSet[str]:
+        event_ids = await self.db_pool.simple_select_onecol(
             table="event_forward_extremities",
             keyvalues={"room_id": room_id},
             retcol="event_id",
             desc="get_latest_event_ids_in_room",
         )
+        return frozenset(event_ids)
     async def get_min_depth(self, room_id: str) -> Optional[int]:
         """For the given room, get the minimum depth we have seen for it."""

View File

@@ -1599,10 +1599,7 @@ class EventPushActionsWorkerStore(ReceiptsWorkerStore, StreamWorkerStore, SQLBas
             txn,
             table="event_push_summary",
             key_names=("user_id", "room_id", "thread_id"),
-            key_values=[
-                (user_id, room_id, thread_id)
-                for user_id, room_id, thread_id in summaries
-            ],
+            key_values=list(summaries),
             value_names=("notif_count", "unread_count", "stream_ordering"),
             value_values=[
                 (
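
Note: the replaced comprehension unpacked each key tuple only to rebuild it, so `key_values=list(summaries)` is behavior-preserving — iterating a dict yields its keys, which here are already `(user_id, room_id, thread_id)` tuples. A quick demonstration with made-up contents:

```python
# Iterating a dict yields its keys; here the keys are already the tuples we need.
summaries = {
    ("@alice:example.org", "!room:example.org", "$thread1"): (3, 1, 42),
    ("@bob:example.org", "!room:example.org", None): (0, 2, 43),
}  # made-up contents, keyed by (user_id, room_id, thread_id)

rebuilt = [
    (user_id, room_id, thread_id)
    for user_id, room_id, thread_id in summaries
]
assert rebuilt == list(summaries)  # the comprehension was a no-op
```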

View File

@@ -222,7 +222,7 @@ class PersistEventsStore:
         for room_id, latest_event_ids in new_forward_extremities.items():
             self.store.get_latest_event_ids_in_room.prefill(
-                (room_id,), list(latest_event_ids)
+                (room_id,), frozenset(latest_event_ids)
             )
     async def _get_events_which_are_prevs(self, event_ids: Iterable[str]) -> List[str]:
@@ -827,15 +827,7 @@ class PersistEventsStore:
                 "target_chain_id",
                 "target_sequence_number",
             ),
-            values=[
-                (source_id, source_seq, target_id, target_seq)
-                for (
-                    source_id,
-                    source_seq,
-                    target_id,
-                    target_seq,
-                ) in chain_links.get_additions()
-            ],
+            values=list(chain_links.get_additions()),
         )
     @staticmethod
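
Note: prefilling with `frozenset(latest_event_ids)` keeps the cached value consistent with the new `FrozenSet[str]` return type of `get_latest_event_ids_in_room`; prefilling a list would reintroduce a mutable value behind the immutable annotation. A sketch of the shape with a hypothetical cache object (Synapse's real prefill API may differ):

```python
from typing import Dict, FrozenSet, Tuple

class TinyCache:
    """A dict-backed stand-in for a cached method's backing cache."""
    def __init__(self) -> None:
        self._entries: Dict[Tuple[str, ...], FrozenSet[str]] = {}

    def prefill(self, key: Tuple[str, ...], value: FrozenSet[str]) -> None:
        # Storing an immutable value means later readers cannot corrupt it.
        self._entries[key] = value

    def get(self, key: Tuple[str, ...]) -> FrozenSet[str]:
        return self._entries[key]

cache = TinyCache()
new_forward_extremities = {"!room:example.org": {"$event1", "$event2"}}
for room_id, latest_event_ids in new_forward_extremities.items():
    cache.prefill((room_id,), frozenset(latest_event_ids))

assert cache.get(("!room:example.org",)) == {"$event1", "$event2"}
```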

View File

@@ -24,6 +24,7 @@ from typing import (
     Dict,
     Iterable,
     List,
+    Mapping,
     MutableMapping,
     Optional,
     Set,
@@ -1633,7 +1634,7 @@ class EventsWorkerStore(SQLBaseStore):
         self,
         room_id: str,
         event_ids: Collection[str],
-    ) -> Dict[str, bool]:
+    ) -> Mapping[str, bool]:
         """Helper for have_seen_events
         Returns:
@@ -2325,7 +2326,7 @@ class EventsWorkerStore(SQLBaseStore):
     @cachedList(cached_method_name="is_partial_state_event", list_name="event_ids")
     async def get_partial_state_events(
         self, event_ids: Collection[str]
-    ) -> Dict[str, bool]:
+    ) -> Mapping[str, bool]:
         """Checks which of the given events have partial state
         Args:

View File

@@ -12,11 +12,10 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
-from typing import TYPE_CHECKING, Dict
+from typing import TYPE_CHECKING, Dict, FrozenSet
 from synapse.storage.database import DatabasePool, LoggingDatabaseConnection
 from synapse.storage.databases.main import CacheInvalidationWorkerStore
-from synapse.types import StrCollection
 from synapse.util.caches.descriptors import cached
 if TYPE_CHECKING:
@@ -34,7 +33,7 @@ class ExperimentalFeaturesStore(CacheInvalidationWorkerStore):
         super().__init__(database, db_conn, hs)
     @cached()
-    async def list_enabled_features(self, user_id: str) -> StrCollection:
+    async def list_enabled_features(self, user_id: str) -> FrozenSet[str]:
         """
         Checks to see what features are enabled for a given user
         Args:
@@ -49,7 +48,7 @@ class ExperimentalFeaturesStore(CacheInvalidationWorkerStore):
             ["feature"],
         )
-        return [feature["feature"] for feature in enabled]
+        return frozenset(feature["feature"] for feature in enabled)
     async def set_features_for_user(
         self,

View File

@@ -25,7 +25,7 @@ from synapse.storage.database import (
     LoggingTransaction,
 )
 from synapse.storage.engines import PostgresEngine
-from synapse.types import JsonDict, UserID
+from synapse.types import JsonDict, JsonMapping, UserID
 from synapse.util.caches.descriptors import cached
 if TYPE_CHECKING:
@@ -145,7 +145,7 @@ class FilteringWorkerStore(SQLBaseStore):
     @cached(num_args=2)
     async def get_user_filter(
         self, user_id: UserID, filter_id: Union[int, str]
-    ) -> JsonDict:
+    ) -> JsonMapping:
         # filter_id is BIGINT UNSIGNED, so if it isn't a number, fail
         # with a coherent error message rather than 500 M_UNKNOWN.
         try:

View File

@@ -16,7 +16,7 @@
 import itertools
 import json
 import logging
-from typing import Dict, Iterable, Optional, Tuple
+from typing import Dict, Iterable, Mapping, Optional, Tuple
 from canonicaljson import encode_canonical_json
 from signedjson.key import decode_verify_key_bytes
@@ -130,7 +130,7 @@ class KeyStore(CacheInvalidationWorkerStore):
     )
     async def get_server_keys_json(
         self, server_name_and_key_ids: Iterable[Tuple[str, str]]
-    ) -> Dict[Tuple[str, str], FetchKeyResult]:
+    ) -> Mapping[Tuple[str, str], FetchKeyResult]:
         """
         Args:
             server_name_and_key_ids:
@@ -200,7 +200,7 @@ class KeyStore(CacheInvalidationWorkerStore):
     )
     async def get_server_keys_json_for_remote(
         self, server_name: str, key_ids: Iterable[str]
-    ) -> Dict[str, Optional[FetchKeyResultForRemote]]:
+    ) -> Mapping[str, Optional[FetchKeyResultForRemote]]:
         """Fetch the cached keys for the given server/key IDs.
         If we have multiple entries for a given key ID, returns the most recent.

View File

@@ -11,7 +11,17 @@
 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 # See the License for the specific language governing permissions and
 # limitations under the License.
-from typing import TYPE_CHECKING, Any, Dict, Iterable, List, Optional, Tuple, cast
+from typing import (
+    TYPE_CHECKING,
+    Any,
+    Dict,
+    Iterable,
+    List,
+    Mapping,
+    Optional,
+    Tuple,
+    cast,
+)
 from synapse.api.presence import PresenceState, UserPresenceState
 from synapse.replication.tcp.streams import PresenceStream
@@ -249,7 +259,7 @@ class PresenceStore(PresenceBackgroundUpdateStore, CacheInvalidationWorkerStore)
     )
     async def get_presence_for_users(
         self, user_ids: Iterable[str]
-    ) -> Dict[str, UserPresenceState]:
+    ) -> Mapping[str, UserPresenceState]:
         rows = await self.db_pool.simple_select_many_batch(
             table="presence_stream",
             column="user_id",

View File

@@ -216,7 +216,7 @@ class PushRulesWorkerStore(
     @cachedList(cached_method_name="get_push_rules_for_user", list_name="user_ids")
     async def bulk_get_push_rules(
         self, user_ids: Collection[str]
-    ) -> Dict[str, FilteredPushRules]:
+    ) -> Mapping[str, FilteredPushRules]:
         if not user_ids:
             return {}

View File

@@ -43,7 +43,7 @@ from synapse.storage.util.id_generators import (
     MultiWriterIdGenerator,
     StreamIdGenerator,
 )
-from synapse.types import JsonDict
+from synapse.types import JsonDict, JsonMapping
 from synapse.util import json_encoder
 from synapse.util.caches.descriptors import cached, cachedList
 from synapse.util.caches.stream_change_cache import StreamChangeCache
@@ -218,7 +218,7 @@ class ReceiptsWorkerStore(SQLBaseStore):
     @cached()
     async def _get_receipts_for_user_with_orderings(
         self, user_id: str, receipt_type: str
-    ) -> JsonDict:
+    ) -> JsonMapping:
         """
         Fetch receipts for all rooms that the given user is joined to.
@@ -258,7 +258,7 @@ class ReceiptsWorkerStore(SQLBaseStore):
     async def get_linearized_receipts_for_rooms(
         self, room_ids: Iterable[str], to_key: int, from_key: Optional[int] = None
-    ) -> List[dict]:
+    ) -> List[JsonMapping]:
         """Get receipts for multiple rooms for sending to clients.
         Args:
@@ -287,7 +287,7 @@ class ReceiptsWorkerStore(SQLBaseStore):
     async def get_linearized_receipts_for_room(
         self, room_id: str, to_key: int, from_key: Optional[int] = None
-    ) -> Sequence[JsonDict]:
+    ) -> Sequence[JsonMapping]:
         """Get receipts for a single room for sending to clients.
         Args:
@@ -310,7 +310,7 @@ class ReceiptsWorkerStore(SQLBaseStore):
     @cached(tree=True)
     async def _get_linearized_receipts_for_room(
         self, room_id: str, to_key: int, from_key: Optional[int] = None
-    ) -> Sequence[JsonDict]:
+    ) -> Sequence[JsonMapping]:
         """See get_linearized_receipts_for_room"""
         def f(txn: LoggingTransaction) -> List[Dict[str, Any]]:
@@ -353,7 +353,7 @@ class ReceiptsWorkerStore(SQLBaseStore):
     )
     async def _get_linearized_receipts_for_rooms(
         self, room_ids: Collection[str], to_key: int, from_key: Optional[int] = None
-    ) -> Dict[str, Sequence[JsonDict]]:
+    ) -> Mapping[str, Sequence[JsonMapping]]:
         if not room_ids:
             return {}
@@ -415,7 +415,7 @@ class ReceiptsWorkerStore(SQLBaseStore):
     )
     async def get_linearized_receipts_for_all_rooms(
         self, to_key: int, from_key: Optional[int] = None
-    ) -> Mapping[str, JsonDict]:
+    ) -> Mapping[str, JsonMapping]:
         """Get receipts for all rooms between two stream_ids, up
         to a limit of the latest 100 read receipts.

View File

@@ -465,7 +465,7 @@ class RelationsWorkerStore(SQLBaseStore):
     @cachedList(cached_method_name="get_references_for_event", list_name="event_ids")
     async def get_references_for_events(
         self, event_ids: Collection[str]
-    ) -> Mapping[str, Optional[List[_RelatedEvent]]]:
+    ) -> Mapping[str, Optional[Sequence[_RelatedEvent]]]:
         """Get a list of references to the given events.
         Args:
@@ -519,7 +519,7 @@ class RelationsWorkerStore(SQLBaseStore):
     @cachedList(cached_method_name="get_applicable_edit", list_name="event_ids")
     async def get_applicable_edits(
         self, event_ids: Collection[str]
-    ) -> Dict[str, Optional[EventBase]]:
+    ) -> Mapping[str, Optional[EventBase]]:
         """Get the most recent edit (if any) that has happened for the given
         events.
@@ -605,7 +605,7 @@ class RelationsWorkerStore(SQLBaseStore):
     @cachedList(cached_method_name="get_thread_summary", list_name="event_ids")
     async def get_thread_summaries(
         self, event_ids: Collection[str]
-    ) -> Dict[str, Optional[Tuple[int, EventBase]]]:
+    ) -> Mapping[str, Optional[Tuple[int, EventBase]]]:
         """Get the number of threaded replies and the latest reply (if any) for the given events.
         Args:
@@ -779,7 +779,7 @@ class RelationsWorkerStore(SQLBaseStore):
     @cachedList(cached_method_name="get_thread_participated", list_name="event_ids")
     async def get_threads_participated(
         self, event_ids: Collection[str], user_id: str
-    ) -> Dict[str, bool]:
+    ) -> Mapping[str, bool]:
         """Get whether the requesting user participated in the given threads.
         This is separate from get_thread_summaries since that can be cached across
@@ -931,7 +931,7 @@ class RelationsWorkerStore(SQLBaseStore):
         room_id: str,
         limit: int = 5,
         from_token: Optional[ThreadsNextBatch] = None,
-    ) -> Tuple[List[str], Optional[ThreadsNextBatch]]:
+    ) -> Tuple[Sequence[str], Optional[ThreadsNextBatch]]:
         """Get a list of thread IDs, ordered by topological ordering of their
         latest reply.
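
Note: several of these `@cachedList` methods now declare `Mapping[str, Optional[...]]` return types; a bulk lookup returns an entry for every requested ID, with `None` marking IDs that had no row. A minimal sketch of that shape, with the batched query faked by a dict:

```python
from typing import Collection, Mapping, Optional

_edits = {"$event1": "$edit-of-event1"}  # made-up stored edits

def get_applicable_edits(
    event_ids: Collection[str],
) -> Mapping[str, Optional[str]]:
    # Every requested ID appears in the result; misses map to None.
    return {event_id: _edits.get(event_id) for event_id in event_ids}

result = get_applicable_edits(["$event1", "$event2"])
assert result == {"$event1": "$edit-of-event1", "$event2": None}
```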

View File

@@ -191,7 +191,7 @@ class RoomMemberWorkerStore(EventsWorkerStore, CacheInvalidationWorkerStore):
     )
     async def get_subset_users_in_room_with_profiles(
         self, room_id: str, user_ids: Collection[str]
-    ) -> Dict[str, ProfileInfo]:
+    ) -> Mapping[str, ProfileInfo]:
         """Get a mapping from user ID to profile information for a list of users
         in a given room.
@@ -676,7 +676,7 @@ class RoomMemberWorkerStore(EventsWorkerStore, CacheInvalidationWorkerStore):
     )
     async def _get_rooms_for_users(
         self, user_ids: Collection[str]
-    ) -> Dict[str, FrozenSet[str]]:
+    ) -> Mapping[str, FrozenSet[str]]:
         """A batched version of `get_rooms_for_user`.
         Returns:
@@ -881,7 +881,7 @@ class RoomMemberWorkerStore(EventsWorkerStore, CacheInvalidationWorkerStore):
     )
     async def _get_user_ids_from_membership_event_ids(
         self, event_ids: Iterable[str]
-    ) -> Dict[str, Optional[str]]:
+    ) -> Mapping[str, Optional[str]]:
         """For given set of member event_ids check if they point to a join
         event.
@@ -984,7 +984,7 @@ class RoomMemberWorkerStore(EventsWorkerStore, CacheInvalidationWorkerStore):
     )
     @cached(iterable=True, max_entries=10000)
-    async def get_current_hosts_in_room_ordered(self, room_id: str) -> List[str]:
+    async def get_current_hosts_in_room_ordered(self, room_id: str) -> Tuple[str, ...]:
         """
         Get current hosts in room based on current state.
@@ -1013,12 +1013,14 @@ class RoomMemberWorkerStore(EventsWorkerStore, CacheInvalidationWorkerStore):
             # `get_users_in_room` rather than funky SQL.
             domains = await self.get_current_hosts_in_room(room_id)
-            return list(domains)
+            return tuple(domains)
         # For PostgreSQL we can use a regex to pull out the domains from the
         # joined users in `current_state_events` via regex.
-        def get_current_hosts_in_room_ordered_txn(txn: LoggingTransaction) -> List[str]:
+        def get_current_hosts_in_room_ordered_txn(
+            txn: LoggingTransaction,
+        ) -> Tuple[str, ...]:
             # Returns a list of servers currently joined in the room sorted by
             # longest in the room first (aka. with the lowest depth). The
             # heuristic of sorting by servers who have been in the room the
@@ -1043,7 +1045,7 @@ class RoomMemberWorkerStore(EventsWorkerStore, CacheInvalidationWorkerStore):
             """
             txn.execute(sql, (room_id,))
             # `server_domain` will be `NULL` for malformed MXIDs with no colons.
-            return [d for d, in txn if d is not None]
+            return tuple(d for d, in txn if d is not None)
         return await self.db_pool.runInteraction(
             "get_current_hosts_in_room_ordered", get_current_hosts_in_room_ordered_txn
@@ -1191,7 +1193,7 @@ class RoomMemberWorkerStore(EventsWorkerStore, CacheInvalidationWorkerStore):
     )
     async def get_membership_from_event_ids(
         self, member_event_ids: Iterable[str]
-    ) -> Dict[str, Optional[EventIdMembership]]:
+    ) -> Mapping[str, Optional[EventIdMembership]]:
         """Get user_id and membership of a set of event IDs.
         Returns:
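
Note: `get_current_hosts_in_room_ordered` is also cached, but unlike the frozenset cases its result is ordered (longest-joined server first), so the immutable type has to be `Tuple[str, ...]` rather than `FrozenSet[str]`. A small illustration of the difference:

```python
# tuple preserves order and is immutable; frozenset is immutable but unordered.
hosts_ordered = ("matrix.org", "example.com", "newest.example.net")  # made-up hosts

assert hosts_ordered[0] == "matrix.org"  # "longest in the room first" survives

# frozenset(hosts_ordered) would discard that ordering, so a tuple is the
# right immutable type for this cache entry.
try:
    hosts_ordered[0] = "other.example.org"  # type: ignore[index]
except TypeError:
    pass  # tuples also reject item assignment at runtime
```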

View File

@@ -14,7 +14,17 @@
 # limitations under the License.
 import collections.abc
 import logging
-from typing import TYPE_CHECKING, Any, Collection, Dict, Iterable, Optional, Set, Tuple
+from typing import (
+    TYPE_CHECKING,
+    Any,
+    Collection,
+    Dict,
+    Iterable,
+    Mapping,
+    Optional,
+    Set,
+    Tuple,
+)
 import attr
@@ -372,7 +382,7 @@ class StateGroupWorkerStore(EventsWorkerStore, SQLBaseStore):
     )
     async def _get_state_group_for_events(
         self, event_ids: Collection[str]
-    ) -> Dict[str, int]:
+    ) -> Mapping[str, int]:
         """Returns mapping event_id -> state_group.
         Raises:

View File

@@ -108,6 +108,7 @@ class UserSortOrder(Enum):
     SHADOW_BANNED = "shadow_banned"
     CREATION_TS = "creation_ts"
     LAST_SEEN_TS = "last_seen_ts"
+    LOCKED = "locked"
 class StatsStore(StateDeltasStore):

Some files were not shown because too many files have changed in this diff.