Merge branch 'develop' of github.com:matrix-org/synapse into erikj/simplify_streams
commit 1410dab764
|
@ -6,12 +6,7 @@
|
||||||
set -ex
|
set -ex
|
||||||
|
|
||||||
apt-get update
|
apt-get update
|
||||||
apt-get install -y python3.5 python3.5-dev python3-pip libxml2-dev libxslt-dev zlib1g-dev
|
apt-get install -y python3.5 python3.5-dev python3-pip libxml2-dev libxslt-dev zlib1g-dev tox
|
||||||
|
|
||||||
# workaround for https://github.com/jaraco/zipp/issues/40
|
|
||||||
python3.5 -m pip install 'setuptools>=34.4.0'
|
|
||||||
|
|
||||||
python3.5 -m pip install tox
|
|
||||||
|
|
||||||
export LANG="C.UTF-8"
|
export LANG="C.UTF-8"
|
||||||
|
|
||||||
|
|
CHANGES.md (92 lines changed)
|
@ -1,3 +1,95 @@
|
||||||
|
Synapse 1.12.0rc1 (2020-03-19)
|
||||||
|
==============================
|
||||||
|
|
||||||
|
Features
|
||||||
|
--------
|
||||||
|
|
||||||
|
- Changes related to room alias management ([MSC2432](https://github.com/matrix-org/matrix-doc/pull/2432)):
|
||||||
|
- Publishing/removing a room from the room directory now requires the user to have a power level capable of modifying the canonical alias, instead of the room aliases. ([\#6965](https://github.com/matrix-org/synapse/issues/6965))
|
||||||
|
- Validate the `alt_aliases` property of canonical alias events. ([\#6971](https://github.com/matrix-org/synapse/issues/6971))
|
||||||
|
- Users with a power level sufficient to modify the canonical alias of a room can now delete room aliases. ([\#6986](https://github.com/matrix-org/synapse/issues/6986))
|
||||||
|
- Implement updated authorization rules and redaction rules for aliases events, from [MSC2261](https://github.com/matrix-org/matrix-doc/pull/2261) and [MSC2432](https://github.com/matrix-org/matrix-doc/pull/2432). ([\#7037](https://github.com/matrix-org/synapse/issues/7037))
|
||||||
|
- Stop sending m.room.aliases events during room creation and upgrade. ([\#6941](https://github.com/matrix-org/synapse/issues/6941))
|
||||||
|
- Synapse no longer uses room alias events to calculate room names for push notifications. ([\#6966](https://github.com/matrix-org/synapse/issues/6966))
|
||||||
|
- The room list endpoint no longer returns a list of aliases. ([\#6970](https://github.com/matrix-org/synapse/issues/6970))
|
||||||
|
- Remove special handling of aliases events from [MSC2260](https://github.com/matrix-org/matrix-doc/pull/2260) added in v1.10.0rc1. ([\#7034](https://github.com/matrix-org/synapse/issues/7034))
|
||||||
|
- Expose the `synctl`, `hash_password` and `generate_config` commands in the snapcraft package. Contributed by @devec0. ([\#6315](https://github.com/matrix-org/synapse/issues/6315))
|
||||||
|
- Check that server_name is correctly set before running database updates. ([\#6982](https://github.com/matrix-org/synapse/issues/6982))
|
||||||
|
- Break down monthly active users by `appservice_id` and emit via Prometheus. ([\#7030](https://github.com/matrix-org/synapse/issues/7030))
|
||||||
|
- Render a configurable and comprehensible error page if something goes wrong during the SAML2 authentication process. ([\#7058](https://github.com/matrix-org/synapse/issues/7058), [\#7067](https://github.com/matrix-org/synapse/issues/7067))
|
||||||
|
- Add an optional parameter to control whether other sessions are logged out when a user's password is modified. ([\#7085](https://github.com/matrix-org/synapse/issues/7085))
|
||||||
|
- Add prometheus metrics for the number of active pushers. ([\#7103](https://github.com/matrix-org/synapse/issues/7103), [\#7106](https://github.com/matrix-org/synapse/issues/7106))
|
||||||
|
- Improve performance when making HTTPS requests to sygnal, sydent, etc., by sharing the SSL context object between connections. ([\#7094](https://github.com/matrix-org/synapse/issues/7094))
|
||||||
|
|
||||||
|
|
||||||
|
Bugfixes
|
||||||
|
--------
|
||||||
|
|
||||||
|
- When a user's profile is updated via the admin API, also generate a displayname/avatar update for that user in each room. ([\#6572](https://github.com/matrix-org/synapse/issues/6572))
|
||||||
|
- Fix a couple of bugs in email configuration handling. ([\#6962](https://github.com/matrix-org/synapse/issues/6962))
|
||||||
|
- Fix an issue affecting worker-based deployments where replication would stop working, necessitating a full restart, after joining a large room. ([\#6967](https://github.com/matrix-org/synapse/issues/6967))
|
||||||
|
- Fix `duplicate key` error which was logged when rejoining a room over federation. ([\#6968](https://github.com/matrix-org/synapse/issues/6968))
|
||||||
|
- Prevent a user from setting 'deactivated' to anything other than a bool on the v2 PUT /users Admin API. ([\#6990](https://github.com/matrix-org/synapse/issues/6990))
|
||||||
|
- Fix py35-old CI by using native tox package. ([\#7018](https://github.com/matrix-org/synapse/issues/7018))
|
||||||
|
- Fix a bug causing `org.matrix.dummy_event` to be included in responses from `/sync`. ([\#7035](https://github.com/matrix-org/synapse/issues/7035))
|
||||||
|
- Fix a bug that renders UTF-8 text files incorrectly when loaded from media. Contributed by @TheStranjer. ([\#7044](https://github.com/matrix-org/synapse/issues/7044))
|
||||||
|
- Fix a bug that would cause Synapse to respond with an error about event visibility if a client tried to request the state of a room at a given token. ([\#7066](https://github.com/matrix-org/synapse/issues/7066))
|
||||||
|
- Repair a data-corruption issue which was introduced in Synapse 1.10, and fixed in Synapse 1.11, and which could cause `/sync` to return with 404 errors about missing events and unknown rooms. ([\#7070](https://github.com/matrix-org/synapse/issues/7070))
|
||||||
|
- Fix a bug causing account validity renewal emails to be sent even if the feature is turned off in some cases. ([\#7074](https://github.com/matrix-org/synapse/issues/7074))
|
||||||
|
|
||||||
|
|
||||||
|
Improved Documentation
|
||||||
|
----------------------
|
||||||
|
|
||||||
|
- Updated CentOS8 install instructions. Contributed by Richard Kellner. ([\#6925](https://github.com/matrix-org/synapse/issues/6925))
|
||||||
|
- Fix `POSTGRES_INITDB_ARGS` in the `contrib/docker/docker-compose.yml` example docker-compose configuration. ([\#6984](https://github.com/matrix-org/synapse/issues/6984))
|
||||||
|
- Change the date in [INSTALL.md](./INSTALL.md#tls-certificates) for the last date of getting TLS certificates to November 2019. ([\#7015](https://github.com/matrix-org/synapse/issues/7015))
|
||||||
|
- Document that the fallback auth endpoints must be routed to the same worker node as the register endpoints. ([\#7048](https://github.com/matrix-org/synapse/issues/7048))
|
||||||
|
|
||||||
|
|
||||||
|
Deprecations and Removals
|
||||||
|
-------------------------
|
||||||
|
|
||||||
|
- Remove the unused query_auth federation endpoint per [MSC2451](https://github.com/matrix-org/matrix-doc/pull/2451). ([\#7026](https://github.com/matrix-org/synapse/issues/7026))
|
||||||
|
|
||||||
|
|
||||||
|
Internal Changes
|
||||||
|
----------------
|
||||||
|
|
||||||
|
- Add type hints to `logging/context.py`. ([\#6309](https://github.com/matrix-org/synapse/issues/6309))
|
||||||
|
- Add some clarifications to `README.md` in the database schema directory. ([\#6615](https://github.com/matrix-org/synapse/issues/6615))
|
||||||
|
- Refactoring work in preparation for changing the event redaction algorithm. ([\#6874](https://github.com/matrix-org/synapse/issues/6874), [\#6875](https://github.com/matrix-org/synapse/issues/6875), [\#6983](https://github.com/matrix-org/synapse/issues/6983), [\#7003](https://github.com/matrix-org/synapse/issues/7003))
|
||||||
|
- Improve performance of v2 state resolution for large rooms. ([\#6952](https://github.com/matrix-org/synapse/issues/6952), [\#7095](https://github.com/matrix-org/synapse/issues/7095))
|
||||||
|
- Reduce time spent doing GC, by freezing objects on startup. ([\#6953](https://github.com/matrix-org/synapse/issues/6953))
|
||||||
|
- Minor performance fixes to `get_auth_chain_ids`. ([\#6954](https://github.com/matrix-org/synapse/issues/6954))
|
||||||
|
- Don't record remote cross-signing keys in the `devices` table. ([\#6956](https://github.com/matrix-org/synapse/issues/6956))
|
||||||
|
- Use flake8-comprehensions to enforce good hygiene of list/set/dict comprehensions. ([\#6957](https://github.com/matrix-org/synapse/issues/6957))
|
||||||
|
- Merge worker apps together. ([\#6964](https://github.com/matrix-org/synapse/issues/6964), [\#7002](https://github.com/matrix-org/synapse/issues/7002), [\#7055](https://github.com/matrix-org/synapse/issues/7055), [\#7104](https://github.com/matrix-org/synapse/issues/7104))
|
||||||
|
- Remove redundant `store_room` call from `FederationHandler._process_received_pdu`. ([\#6979](https://github.com/matrix-org/synapse/issues/6979))
|
||||||
|
- Update warning for incorrect database collation/ctype to include link to documentation. ([\#6985](https://github.com/matrix-org/synapse/issues/6985))
|
||||||
|
- Add some type annotations to the database storage classes. ([\#6987](https://github.com/matrix-org/synapse/issues/6987))
|
||||||
|
- Port `synapse.handlers.presence` to async/await. ([\#6991](https://github.com/matrix-org/synapse/issues/6991), [\#7019](https://github.com/matrix-org/synapse/issues/7019))
|
||||||
|
- Add some type annotations to the federation base & client classes. ([\#6995](https://github.com/matrix-org/synapse/issues/6995))
|
||||||
|
- Port `synapse.rest.keys` to async/await. ([\#7020](https://github.com/matrix-org/synapse/issues/7020))
|
||||||
|
- Add a type check to `is_verified` when processing room keys. ([\#7045](https://github.com/matrix-org/synapse/issues/7045))
|
||||||
|
- Add type annotations and comments to the auth handler. ([\#7063](https://github.com/matrix-org/synapse/issues/7063))
|
||||||
|
|
||||||
|
|
||||||
|
Synapse 1.11.1 (2020-03-03)
|
||||||
|
===========================
|
||||||
|
|
||||||
|
This release includes a security fix impacting installations using Single Sign-On (i.e. SAML2 or CAS) for authentication. Administrators of such installations are encouraged to upgrade as soon as possible.
|
||||||
|
|
||||||
|
The release also includes fixes for a couple of other bugs.
|
||||||
|
|
||||||
|
Bugfixes
|
||||||
|
--------
|
||||||
|
|
||||||
|
- Add a confirmation step to the SSO login flow before redirecting users to the redirect URL. ([b2bd54a2](https://github.com/matrix-org/synapse/commit/b2bd54a2e31d9a248f73fadb184ae9b4cbdb49f9), [65c73cdf](https://github.com/matrix-org/synapse/commit/65c73cdfec1876a9fec2fd2c3a74923cd146fe0b), [a0178df1](https://github.com/matrix-org/synapse/commit/a0178df10422a76fd403b82d2b2a4ed28a9a9d1e))
|
||||||
|
- Fixed setting a user as an admin with the admin API `PUT /_synapse/admin/v2/users/<user_id>`. Contributed by @dklimpel. ([\#6910](https://github.com/matrix-org/synapse/issues/6910))
|
||||||
|
- Fix bug introduced in Synapse 1.11.0 which sometimes caused errors when joining rooms over federation, with `'coroutine' object has no attribute 'event_id'`. ([\#6996](https://github.com/matrix-org/synapse/issues/6996))
|
||||||
|
|
||||||
|
|
||||||
Synapse 1.11.0 (2020-02-21)
|
Synapse 1.11.0 (2020-02-21)
|
||||||
===========================
|
===========================
|
||||||
|
|
||||||
|
|
INSTALL.md (15 lines changed)
|
@ -124,12 +124,21 @@ sudo pacman -S base-devel python python-pip \
|
||||||
|
|
||||||
#### CentOS/Fedora
|
#### CentOS/Fedora
|
||||||
|
|
||||||
Installing prerequisites on CentOS 7 or Fedora 25:
|
Installing prerequisites on CentOS 8 or Fedora > 26:
|
||||||
|
|
||||||
|
```
|
||||||
|
sudo dnf install libtiff-devel libjpeg-devel libzip-devel freetype-devel \
|
||||||
|
libwebp-devel tk-devel redhat-rpm-config \
|
||||||
|
python3-virtualenv libffi-devel openssl-devel
|
||||||
|
sudo dnf groupinstall "Development Tools"
|
||||||
|
```
|
||||||
|
|
||||||
|
Installing prerequisites on CentOS 7 or Fedora <= 25:
|
||||||
|
|
||||||
```
|
```
|
||||||
sudo yum install libtiff-devel libjpeg-devel libzip-devel freetype-devel \
|
sudo yum install libtiff-devel libjpeg-devel libzip-devel freetype-devel \
|
||||||
lcms2-devel libwebp-devel tcl-devel tk-devel redhat-rpm-config \
|
lcms2-devel libwebp-devel tcl-devel tk-devel redhat-rpm-config \
|
||||||
python-virtualenv libffi-devel openssl-devel
|
python3-virtualenv libffi-devel openssl-devel
|
||||||
sudo yum groupinstall "Development Tools"
|
sudo yum groupinstall "Development Tools"
|
||||||
```
|
```
|
||||||
|
|
||||||
|
@ -418,7 +427,7 @@ so, you will need to edit `homeserver.yaml`, as follows:
|
||||||
for having Synapse automatically provision and renew federation
|
for having Synapse automatically provision and renew federation
|
||||||
certificates through ACME can be found at [ACME.md](docs/ACME.md).
|
certificates through ACME can be found at [ACME.md](docs/ACME.md).
|
||||||
Note that, as pointed out in that document, this feature will not
|
Note that, as pointed out in that document, this feature will not
|
||||||
work with installs set up after November 2020.
|
work with installs set up after November 2019.
|
||||||
|
|
||||||
If you are using your own certificate, be sure to use a `.pem` file that
|
If you are using your own certificate, be sure to use a `.pem` file that
|
||||||
includes the full certificate chain including any intermediate certificates
|
includes the full certificate chain including any intermediate certificates
|
||||||
|
|
|
@ -1 +0,0 @@
|
||||||
Expose the `synctl`, `hash_password` and `generate_config` commands in the snapcraft package. Contributed by @devec0.
|
|
|
@ -1 +0,0 @@
|
||||||
When a user's profile is updated via the admin API, also generate a displayname/avatar update for that user in each room.
|
|
|
@ -1 +0,0 @@
|
||||||
Add some clarifications to `README.md` in the database schema directory.
|
|
|
@ -1 +0,0 @@
|
||||||
Fixed setting a user as an admin with the admin API `PUT /_synapse/admin/v2/users/<user_id>`. Contributed by @dklimpel.
|
|
|
@ -1 +0,0 @@
|
||||||
Stop sending m.room.aliases events during room creation and upgrade.
|
|
|
@ -1 +0,0 @@
|
||||||
Improve perf of v2 state res for large rooms.
|
|
|
@ -1 +0,0 @@
|
||||||
Reduce time spent doing GC by freezing objects on startup.
|
|
|
@ -1 +0,0 @@
|
||||||
Minor perf fixes to `get_auth_chain_ids`.
|
|
|
@ -1 +0,0 @@
|
||||||
Don't record remote cross-signing keys in the `devices` table.
|
|
|
@ -1 +0,0 @@
|
||||||
Use flake8-comprehensions to enforce good hygiene of list/set/dict comprehensions.
|
|
|
@ -1 +0,0 @@
|
||||||
Fix a couple of bugs in email configuration handling.
|
|
|
@ -1 +0,0 @@
|
||||||
Merge worker apps together.
|
|
|
@ -1 +0,0 @@
|
||||||
Publishing/removing a room from the room directory now requires the user to have a power level capable of modifying the canonical alias, instead of the room aliases.
|
|
|
@ -1 +0,0 @@
|
||||||
Synapse no longer uses room alias events to calculate room names for email notifications.
|
|
|
@ -1 +0,0 @@
|
||||||
Fix an issue affecting worker-based deployments where replication would stop working, necessitating a full restart, after joining a large room.
|
|
|
@ -1 +0,0 @@
|
||||||
Fix `duplicate key` error which was logged when rejoining a room over federation.
|
|
|
@ -1 +0,0 @@
|
||||||
The room list endpoint no longer returns a list of aliases.
|
|
|
@ -1 +0,0 @@
|
||||||
Remove redundant `store_room` call from `FederationHandler._process_received_pdu`.
|
|
|
@ -1 +0,0 @@
|
||||||
Check that server_name is correctly set before running database updates.
|
|
|
@ -1 +0,0 @@
|
||||||
Refactoring work in preparation for changing the event redaction algorithm.
|
|
|
@ -1 +0,0 @@
|
||||||
Fix `POSTGRES_INITDB_ARGS` in the `contrib/docker/docker-compose.yml` example docker-compose configuration.
|
|
|
@ -1 +0,0 @@
|
||||||
Update warning for incorrect database collation/ctype to include link to documentation.
|
|
|
@ -1 +0,0 @@
|
||||||
Add some type annotations to the database storage classes.
|
|
|
@ -1 +0,0 @@
|
||||||
Prevent a user from setting 'deactivated' to anything other than a bool on the v2 PUT /users Admin API.
|
|
|
@ -1 +0,0 @@
|
||||||
Port `synapse.handlers.presence` to async/await.
|
|
|
@ -1 +0,0 @@
|
||||||
Fix bug which caused an error when joining a room, with `'coroutine' object has no attribute 'event_id'`.
|
|
|
@ -1 +0,0 @@
|
||||||
Merge worker apps together.
|
|
|
@ -1 +0,0 @@
|
||||||
Refactoring work in preparation for changing the event redaction algorithm.
|
|
|
@ -15,10 +15,9 @@ services:
|
||||||
restart: unless-stopped
|
restart: unless-stopped
|
||||||
# See the readme for a full documentation of the environment settings
|
# See the readme for a full documentation of the environment settings
|
||||||
environment:
|
environment:
|
||||||
- SYNAPSE_CONFIG_PATH=/etc/homeserver.yaml
|
- SYNAPSE_CONFIG_PATH=/data/homeserver.yaml
|
||||||
volumes:
|
volumes:
|
||||||
# You may either store all the files in a local folder
|
# You may either store all the files in a local folder
|
||||||
- ./matrix-config/homeserver.yaml:/etc/homeserver.yaml
|
|
||||||
- ./files:/data
|
- ./files:/data
|
||||||
# .. or you may split this between different storage points
|
# .. or you may split this between different storage points
|
||||||
# - ./files:/data
|
# - ./files:/data
|
||||||
|
|
|
@ -1,6 +1,6 @@
|
||||||
# Using the Synapse Grafana dashboard
|
# Using the Synapse Grafana dashboard
|
||||||
|
|
||||||
0. Set up Prometheus and Grafana. Out of scope for this readme. Useful documentation about using Grafana with Prometheus: http://docs.grafana.org/features/datasources/prometheus/
|
0. Set up Prometheus and Grafana. Out of scope for this readme. Useful documentation about using Grafana with Prometheus: http://docs.grafana.org/features/datasources/prometheus/
|
||||||
1. Have your Prometheus scrape your Synapse. https://github.com/matrix-org/synapse/blob/master/docs/metrics-howto.rst
|
1. Have your Prometheus scrape your Synapse. https://github.com/matrix-org/synapse/blob/master/docs/metrics-howto.md
|
||||||
2. Import dashboard into Grafana. Download `synapse.json`. Import it to Grafana and select the correct Prometheus datasource. http://docs.grafana.org/reference/export_import/
|
2. Import dashboard into Grafana. Download `synapse.json`. Import it to Grafana and select the correct Prometheus datasource. http://docs.grafana.org/reference/export_import/
|
||||||
3. Set up additional recording rules
|
3. Set up additional recording rules
|
||||||
|
|
|
@ -18,7 +18,7 @@
|
||||||
"gnetId": null,
|
"gnetId": null,
|
||||||
"graphTooltip": 0,
|
"graphTooltip": 0,
|
||||||
"id": 1,
|
"id": 1,
|
||||||
"iteration": 1561447718159,
|
"iteration": 1584612489167,
|
||||||
"links": [
|
"links": [
|
||||||
{
|
{
|
||||||
"asDropdown": true,
|
"asDropdown": true,
|
||||||
|
@ -34,6 +34,7 @@
|
||||||
"panels": [
|
"panels": [
|
||||||
{
|
{
|
||||||
"collapsed": false,
|
"collapsed": false,
|
||||||
|
"datasource": null,
|
||||||
"gridPos": {
|
"gridPos": {
|
||||||
"h": 1,
|
"h": 1,
|
||||||
"w": 24,
|
"w": 24,
|
||||||
|
@ -52,12 +53,14 @@
|
||||||
"dashes": false,
|
"dashes": false,
|
||||||
"datasource": "$datasource",
|
"datasource": "$datasource",
|
||||||
"fill": 1,
|
"fill": 1,
|
||||||
|
"fillGradient": 0,
|
||||||
"gridPos": {
|
"gridPos": {
|
||||||
"h": 9,
|
"h": 9,
|
||||||
"w": 12,
|
"w": 12,
|
||||||
"x": 0,
|
"x": 0,
|
||||||
"y": 1
|
"y": 1
|
||||||
},
|
},
|
||||||
|
"hiddenSeries": false,
|
||||||
"id": 75,
|
"id": 75,
|
||||||
"legend": {
|
"legend": {
|
||||||
"avg": false,
|
"avg": false,
|
||||||
|
@ -72,7 +75,9 @@
|
||||||
"linewidth": 1,
|
"linewidth": 1,
|
||||||
"links": [],
|
"links": [],
|
||||||
"nullPointMode": "null",
|
"nullPointMode": "null",
|
||||||
"options": {},
|
"options": {
|
||||||
|
"dataLinks": []
|
||||||
|
},
|
||||||
"paceLength": 10,
|
"paceLength": 10,
|
||||||
"percentage": false,
|
"percentage": false,
|
||||||
"pointradius": 5,
|
"pointradius": 5,
|
||||||
|
@ -151,6 +156,7 @@
|
||||||
"editable": true,
|
"editable": true,
|
||||||
"error": false,
|
"error": false,
|
||||||
"fill": 1,
|
"fill": 1,
|
||||||
|
"fillGradient": 0,
|
||||||
"grid": {},
|
"grid": {},
|
||||||
"gridPos": {
|
"gridPos": {
|
||||||
"h": 9,
|
"h": 9,
|
||||||
|
@ -158,6 +164,7 @@
|
||||||
"x": 12,
|
"x": 12,
|
||||||
"y": 1
|
"y": 1
|
||||||
},
|
},
|
||||||
|
"hiddenSeries": false,
|
||||||
"id": 33,
|
"id": 33,
|
||||||
"legend": {
|
"legend": {
|
||||||
"avg": false,
|
"avg": false,
|
||||||
|
@ -172,7 +179,9 @@
|
||||||
"linewidth": 2,
|
"linewidth": 2,
|
||||||
"links": [],
|
"links": [],
|
||||||
"nullPointMode": "null",
|
"nullPointMode": "null",
|
||||||
"options": {},
|
"options": {
|
||||||
|
"dataLinks": []
|
||||||
|
},
|
||||||
"paceLength": 10,
|
"paceLength": 10,
|
||||||
"percentage": false,
|
"percentage": false,
|
||||||
"pointradius": 5,
|
"pointradius": 5,
|
||||||
|
@ -302,12 +311,14 @@
|
||||||
"dashes": false,
|
"dashes": false,
|
||||||
"datasource": "$datasource",
|
"datasource": "$datasource",
|
||||||
"fill": 0,
|
"fill": 0,
|
||||||
|
"fillGradient": 0,
|
||||||
"gridPos": {
|
"gridPos": {
|
||||||
"h": 9,
|
"h": 9,
|
||||||
"w": 12,
|
"w": 12,
|
||||||
"x": 12,
|
"x": 12,
|
||||||
"y": 10
|
"y": 10
|
||||||
},
|
},
|
||||||
|
"hiddenSeries": false,
|
||||||
"id": 107,
|
"id": 107,
|
||||||
"legend": {
|
"legend": {
|
||||||
"avg": false,
|
"avg": false,
|
||||||
|
@ -322,7 +333,9 @@
|
||||||
"linewidth": 1,
|
"linewidth": 1,
|
||||||
"links": [],
|
"links": [],
|
||||||
"nullPointMode": "null",
|
"nullPointMode": "null",
|
||||||
"options": {},
|
"options": {
|
||||||
|
"dataLinks": []
|
||||||
|
},
|
||||||
"paceLength": 10,
|
"paceLength": 10,
|
||||||
"percentage": false,
|
"percentage": false,
|
||||||
"pointradius": 5,
|
"pointradius": 5,
|
||||||
|
@ -425,12 +438,14 @@
|
||||||
"dashes": false,
|
"dashes": false,
|
||||||
"datasource": "$datasource",
|
"datasource": "$datasource",
|
||||||
"fill": 0,
|
"fill": 0,
|
||||||
|
"fillGradient": 0,
|
||||||
"gridPos": {
|
"gridPos": {
|
||||||
"h": 9,
|
"h": 9,
|
||||||
"w": 12,
|
"w": 12,
|
||||||
"x": 0,
|
"x": 0,
|
||||||
"y": 19
|
"y": 19
|
||||||
},
|
},
|
||||||
|
"hiddenSeries": false,
|
||||||
"id": 118,
|
"id": 118,
|
||||||
"legend": {
|
"legend": {
|
||||||
"avg": false,
|
"avg": false,
|
||||||
|
@ -445,7 +460,9 @@
|
||||||
"linewidth": 1,
|
"linewidth": 1,
|
||||||
"links": [],
|
"links": [],
|
||||||
"nullPointMode": "null",
|
"nullPointMode": "null",
|
||||||
"options": {},
|
"options": {
|
||||||
|
"dataLinks": []
|
||||||
|
},
|
||||||
"paceLength": 10,
|
"paceLength": 10,
|
||||||
"percentage": false,
|
"percentage": false,
|
||||||
"pointradius": 5,
|
"pointradius": 5,
|
||||||
|
@ -542,6 +559,7 @@
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
"collapsed": true,
|
"collapsed": true,
|
||||||
|
"datasource": null,
|
||||||
"gridPos": {
|
"gridPos": {
|
||||||
"h": 1,
|
"h": 1,
|
||||||
"w": 24,
|
"w": 24,
|
||||||
|
@ -1361,6 +1379,7 @@
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
"collapsed": true,
|
"collapsed": true,
|
||||||
|
"datasource": null,
|
||||||
"gridPos": {
|
"gridPos": {
|
||||||
"h": 1,
|
"h": 1,
|
||||||
"w": 24,
|
"w": 24,
|
||||||
|
@ -1732,6 +1751,7 @@
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
"collapsed": true,
|
"collapsed": true,
|
||||||
|
"datasource": null,
|
||||||
"gridPos": {
|
"gridPos": {
|
||||||
"h": 1,
|
"h": 1,
|
||||||
"w": 24,
|
"w": 24,
|
||||||
|
@ -2439,6 +2459,7 @@
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
"collapsed": true,
|
"collapsed": true,
|
||||||
|
"datasource": null,
|
||||||
"gridPos": {
|
"gridPos": {
|
||||||
"h": 1,
|
"h": 1,
|
||||||
"w": 24,
|
"w": 24,
|
||||||
|
@ -2635,6 +2656,7 @@
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
"collapsed": true,
|
"collapsed": true,
|
||||||
|
"datasource": null,
|
||||||
"gridPos": {
|
"gridPos": {
|
||||||
"h": 1,
|
"h": 1,
|
||||||
"w": 24,
|
"w": 24,
|
||||||
|
@ -2650,11 +2672,12 @@
|
||||||
"dashes": false,
|
"dashes": false,
|
||||||
"datasource": "$datasource",
|
"datasource": "$datasource",
|
||||||
"fill": 1,
|
"fill": 1,
|
||||||
|
"fillGradient": 0,
|
||||||
"gridPos": {
|
"gridPos": {
|
||||||
"h": 9,
|
"h": 9,
|
||||||
"w": 12,
|
"w": 12,
|
||||||
"x": 0,
|
"x": 0,
|
||||||
"y": 61
|
"y": 33
|
||||||
},
|
},
|
||||||
"id": 79,
|
"id": 79,
|
||||||
"legend": {
|
"legend": {
|
||||||
|
@ -2670,6 +2693,9 @@
|
||||||
"linewidth": 1,
|
"linewidth": 1,
|
||||||
"links": [],
|
"links": [],
|
||||||
"nullPointMode": "null",
|
"nullPointMode": "null",
|
||||||
|
"options": {
|
||||||
|
"dataLinks": []
|
||||||
|
},
|
||||||
"paceLength": 10,
|
"paceLength": 10,
|
||||||
"percentage": false,
|
"percentage": false,
|
||||||
"pointradius": 5,
|
"pointradius": 5,
|
||||||
|
@ -2684,8 +2710,13 @@
|
||||||
"expr": "sum(rate(synapse_federation_client_sent_transactions{instance=\"$instance\"}[$bucket_size]))",
|
"expr": "sum(rate(synapse_federation_client_sent_transactions{instance=\"$instance\"}[$bucket_size]))",
|
||||||
"format": "time_series",
|
"format": "time_series",
|
||||||
"intervalFactor": 1,
|
"intervalFactor": 1,
|
||||||
"legendFormat": "txn rate",
|
"legendFormat": "successful txn rate",
|
||||||
"refId": "A"
|
"refId": "A"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"expr": "sum(rate(synapse_util_metrics_block_count{block_name=\"_send_new_transaction\",instance=\"$instance\"}[$bucket_size]) - ignoring (block_name) rate(synapse_federation_client_sent_transactions{instance=\"$instance\"}[$bucket_size]))",
|
||||||
|
"legendFormat": "failed txn rate",
|
||||||
|
"refId": "B"
|
||||||
}
|
}
|
||||||
],
|
],
|
||||||
"thresholds": [],
|
"thresholds": [],
|
||||||
|
@ -2736,11 +2767,12 @@
|
||||||
"dashes": false,
|
"dashes": false,
|
||||||
"datasource": "$datasource",
|
"datasource": "$datasource",
|
||||||
"fill": 1,
|
"fill": 1,
|
||||||
|
"fillGradient": 0,
|
||||||
"gridPos": {
|
"gridPos": {
|
||||||
"h": 9,
|
"h": 9,
|
||||||
"w": 12,
|
"w": 12,
|
||||||
"x": 12,
|
"x": 12,
|
||||||
"y": 61
|
"y": 33
|
||||||
},
|
},
|
||||||
"id": 83,
|
"id": 83,
|
||||||
"legend": {
|
"legend": {
|
||||||
|
@ -2756,6 +2788,9 @@
|
||||||
"linewidth": 1,
|
"linewidth": 1,
|
||||||
"links": [],
|
"links": [],
|
||||||
"nullPointMode": "null",
|
"nullPointMode": "null",
|
||||||
|
"options": {
|
||||||
|
"dataLinks": []
|
||||||
|
},
|
||||||
"paceLength": 10,
|
"paceLength": 10,
|
||||||
"percentage": false,
|
"percentage": false,
|
||||||
"pointradius": 5,
|
"pointradius": 5,
|
||||||
|
@ -2829,11 +2864,12 @@
|
||||||
"dashes": false,
|
"dashes": false,
|
||||||
"datasource": "$datasource",
|
"datasource": "$datasource",
|
||||||
"fill": 1,
|
"fill": 1,
|
||||||
|
"fillGradient": 0,
|
||||||
"gridPos": {
|
"gridPos": {
|
||||||
"h": 9,
|
"h": 9,
|
||||||
"w": 12,
|
"w": 12,
|
||||||
"x": 0,
|
"x": 0,
|
||||||
"y": 70
|
"y": 42
|
||||||
},
|
},
|
||||||
"id": 109,
|
"id": 109,
|
||||||
"legend": {
|
"legend": {
|
||||||
|
@ -2849,6 +2885,9 @@
|
||||||
"linewidth": 1,
|
"linewidth": 1,
|
||||||
"links": [],
|
"links": [],
|
||||||
"nullPointMode": "null",
|
"nullPointMode": "null",
|
||||||
|
"options": {
|
||||||
|
"dataLinks": []
|
||||||
|
},
|
||||||
"paceLength": 10,
|
"paceLength": 10,
|
||||||
"percentage": false,
|
"percentage": false,
|
||||||
"pointradius": 5,
|
"pointradius": 5,
|
||||||
|
@ -2923,11 +2962,12 @@
|
||||||
"dashes": false,
|
"dashes": false,
|
||||||
"datasource": "$datasource",
|
"datasource": "$datasource",
|
||||||
"fill": 1,
|
"fill": 1,
|
||||||
|
"fillGradient": 0,
|
||||||
"gridPos": {
|
"gridPos": {
|
||||||
"h": 9,
|
"h": 9,
|
||||||
"w": 12,
|
"w": 12,
|
||||||
"x": 12,
|
"x": 12,
|
||||||
"y": 70
|
"y": 42
|
||||||
},
|
},
|
||||||
"id": 111,
|
"id": 111,
|
||||||
"legend": {
|
"legend": {
|
||||||
|
@ -2943,6 +2983,9 @@
|
||||||
"linewidth": 1,
|
"linewidth": 1,
|
||||||
"links": [],
|
"links": [],
|
||||||
"nullPointMode": "null",
|
"nullPointMode": "null",
|
||||||
|
"options": {
|
||||||
|
"dataLinks": []
|
||||||
|
},
|
||||||
"paceLength": 10,
|
"paceLength": 10,
|
||||||
"percentage": false,
|
"percentage": false,
|
||||||
"pointradius": 5,
|
"pointradius": 5,
|
||||||
|
@ -3009,6 +3052,7 @@
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
"collapsed": true,
|
"collapsed": true,
|
||||||
|
"datasource": null,
|
||||||
"gridPos": {
|
"gridPos": {
|
||||||
"h": 1,
|
"h": 1,
|
||||||
"w": 24,
|
"w": 24,
|
||||||
|
@ -3024,12 +3068,14 @@
|
||||||
"dashes": false,
|
"dashes": false,
|
||||||
"datasource": "$datasource",
|
"datasource": "$datasource",
|
||||||
"fill": 1,
|
"fill": 1,
|
||||||
|
"fillGradient": 0,
|
||||||
"gridPos": {
|
"gridPos": {
|
||||||
"h": 7,
|
"h": 8,
|
||||||
"w": 12,
|
"w": 12,
|
||||||
"x": 0,
|
"x": 0,
|
||||||
"y": 62
|
"y": 34
|
||||||
},
|
},
|
||||||
|
"hiddenSeries": false,
|
||||||
"id": 51,
|
"id": 51,
|
||||||
"legend": {
|
"legend": {
|
||||||
"avg": false,
|
"avg": false,
|
||||||
|
@ -3044,6 +3090,9 @@
|
||||||
"linewidth": 1,
|
"linewidth": 1,
|
||||||
"links": [],
|
"links": [],
|
||||||
"nullPointMode": "null",
|
"nullPointMode": "null",
|
||||||
|
"options": {
|
||||||
|
"dataLinks": []
|
||||||
|
},
|
||||||
"paceLength": 10,
|
"paceLength": 10,
|
||||||
"percentage": false,
|
"percentage": false,
|
||||||
"pointradius": 5,
|
"pointradius": 5,
|
||||||
|
@ -3112,6 +3161,95 @@
|
||||||
"align": false,
|
"align": false,
|
||||||
"alignLevel": null
|
"alignLevel": null
|
||||||
}
|
}
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"aliasColors": {},
|
||||||
|
"bars": false,
|
||||||
|
"dashLength": 10,
|
||||||
|
"dashes": false,
|
||||||
|
"datasource": "$datasource",
|
||||||
|
"description": "",
|
||||||
|
"fill": 1,
|
||||||
|
"fillGradient": 0,
|
||||||
|
"gridPos": {
|
||||||
|
"h": 8,
|
||||||
|
"w": 12,
|
||||||
|
"x": 12,
|
||||||
|
"y": 34
|
||||||
|
},
|
||||||
|
"hiddenSeries": false,
|
||||||
|
"id": 134,
|
||||||
|
"legend": {
|
||||||
|
"avg": false,
|
||||||
|
"current": false,
|
||||||
|
"hideZero": false,
|
||||||
|
"max": false,
|
||||||
|
"min": false,
|
||||||
|
"show": true,
|
||||||
|
"total": false,
|
||||||
|
"values": false
|
||||||
|
},
|
||||||
|
"lines": true,
|
||||||
|
"linewidth": 1,
|
||||||
|
"nullPointMode": "null",
|
||||||
|
"options": {
|
||||||
|
"dataLinks": []
|
||||||
|
},
|
||||||
|
"percentage": false,
|
||||||
|
"pointradius": 2,
|
||||||
|
"points": false,
|
||||||
|
"renderer": "flot",
|
||||||
|
"seriesOverrides": [],
|
||||||
|
"spaceLength": 10,
|
||||||
|
"stack": false,
|
||||||
|
"steppedLine": false,
|
||||||
|
"targets": [
|
||||||
|
{
|
||||||
|
"expr": "topk(10,synapse_pushers{job=~\"$job\",index=~\"$index\", instance=\"$instance\"})",
|
||||||
|
"legendFormat": "{{kind}} {{app_id}}",
|
||||||
|
"refId": "A"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"thresholds": [],
|
||||||
|
"timeFrom": null,
|
||||||
|
"timeRegions": [],
|
||||||
|
"timeShift": null,
|
||||||
|
"title": "Active pusher instances by app",
|
||||||
|
"tooltip": {
|
||||||
|
"shared": false,
|
||||||
|
"sort": 2,
|
||||||
|
"value_type": "individual"
|
||||||
|
},
|
||||||
|
"type": "graph",
|
||||||
|
"xaxis": {
|
||||||
|
"buckets": null,
|
||||||
|
"mode": "time",
|
||||||
|
"name": null,
|
||||||
|
"show": true,
|
||||||
|
"values": []
|
||||||
|
},
|
||||||
|
"yaxes": [
|
||||||
|
{
|
||||||
|
"format": "short",
|
||||||
|
"label": null,
|
||||||
|
"logBase": 1,
|
||||||
|
"max": null,
|
||||||
|
"min": null,
|
||||||
|
"show": true
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"format": "short",
|
||||||
|
"label": null,
|
||||||
|
"logBase": 1,
|
||||||
|
"max": null,
|
||||||
|
"min": null,
|
||||||
|
"show": true
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"yaxis": {
|
||||||
|
"align": false,
|
||||||
|
"alignLevel": null
|
||||||
|
}
|
||||||
}
|
}
|
||||||
],
|
],
|
||||||
"repeat": null,
|
"repeat": null,
|
||||||
|
@ -3120,6 +3258,7 @@
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
"collapsed": true,
|
"collapsed": true,
|
||||||
|
"datasource": null,
|
||||||
"gridPos": {
|
"gridPos": {
|
||||||
"h": 1,
|
"h": 1,
|
||||||
"w": 24,
|
"w": 24,
|
||||||
|
@ -3523,6 +3662,7 @@
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
"collapsed": true,
|
"collapsed": true,
|
||||||
|
"datasource": null,
|
||||||
"gridPos": {
|
"gridPos": {
|
||||||
"h": 1,
|
"h": 1,
|
||||||
"w": 24,
|
"w": 24,
|
||||||
|
@ -3540,6 +3680,7 @@
|
||||||
"editable": true,
|
"editable": true,
|
||||||
"error": false,
|
"error": false,
|
||||||
"fill": 1,
|
"fill": 1,
|
||||||
|
"fillGradient": 0,
|
||||||
"grid": {},
|
"grid": {},
|
||||||
"gridPos": {
|
"gridPos": {
|
||||||
"h": 13,
|
"h": 13,
|
||||||
|
@ -3562,6 +3703,9 @@
|
||||||
"linewidth": 2,
|
"linewidth": 2,
|
||||||
"links": [],
|
"links": [],
|
||||||
"nullPointMode": "null",
|
"nullPointMode": "null",
|
||||||
|
"options": {
|
||||||
|
"dataLinks": []
|
||||||
|
},
|
||||||
"paceLength": 10,
|
"paceLength": 10,
|
||||||
"percentage": false,
|
"percentage": false,
|
||||||
"pointradius": 5,
|
"pointradius": 5,
|
||||||
|
@ -3630,6 +3774,7 @@
|
||||||
"editable": true,
|
"editable": true,
|
||||||
"error": false,
|
"error": false,
|
||||||
"fill": 1,
|
"fill": 1,
|
||||||
|
"fillGradient": 0,
|
||||||
"grid": {},
|
"grid": {},
|
||||||
"gridPos": {
|
"gridPos": {
|
||||||
"h": 13,
|
"h": 13,
|
||||||
|
@ -3652,6 +3797,9 @@
|
||||||
"linewidth": 2,
|
"linewidth": 2,
|
||||||
"links": [],
|
"links": [],
|
||||||
"nullPointMode": "null",
|
"nullPointMode": "null",
|
||||||
|
"options": {
|
||||||
|
"dataLinks": []
|
||||||
|
},
|
||||||
"paceLength": 10,
|
"paceLength": 10,
|
||||||
"percentage": false,
|
"percentage": false,
|
||||||
"pointradius": 5,
|
"pointradius": 5,
|
||||||
|
@ -3720,6 +3868,7 @@
|
||||||
"editable": true,
|
"editable": true,
|
||||||
"error": false,
|
"error": false,
|
||||||
"fill": 1,
|
"fill": 1,
|
||||||
|
"fillGradient": 0,
|
||||||
"grid": {},
|
"grid": {},
|
||||||
"gridPos": {
|
"gridPos": {
|
||||||
"h": 13,
|
"h": 13,
|
||||||
|
@ -3742,6 +3891,9 @@
|
||||||
"linewidth": 2,
|
"linewidth": 2,
|
||||||
"links": [],
|
"links": [],
|
||||||
"nullPointMode": "null",
|
"nullPointMode": "null",
|
||||||
|
"options": {
|
||||||
|
"dataLinks": []
|
||||||
|
},
|
||||||
"paceLength": 10,
|
"paceLength": 10,
|
||||||
"percentage": false,
|
"percentage": false,
|
||||||
"pointradius": 5,
|
"pointradius": 5,
|
||||||
|
@ -3810,6 +3962,7 @@
|
||||||
"editable": true,
|
"editable": true,
|
||||||
"error": false,
|
"error": false,
|
||||||
"fill": 1,
|
"fill": 1,
|
||||||
|
"fillGradient": 0,
|
||||||
"grid": {},
|
"grid": {},
|
||||||
"gridPos": {
|
"gridPos": {
|
||||||
"h": 13,
|
"h": 13,
|
||||||
|
@ -3832,6 +3985,9 @@
|
||||||
"linewidth": 2,
|
"linewidth": 2,
|
||||||
"links": [],
|
"links": [],
|
||||||
"nullPointMode": "null",
|
"nullPointMode": "null",
|
||||||
|
"options": {
|
||||||
|
"dataLinks": []
|
||||||
|
},
|
||||||
"paceLength": 10,
|
"paceLength": 10,
|
||||||
"percentage": false,
|
"percentage": false,
|
||||||
"pointradius": 5,
|
"pointradius": 5,
|
||||||
|
@ -3921,6 +4077,7 @@
|
||||||
"linewidth": 2,
|
"linewidth": 2,
|
||||||
"links": [],
|
"links": [],
|
||||||
"nullPointMode": "null",
|
"nullPointMode": "null",
|
||||||
|
"options": {},
|
||||||
"paceLength": 10,
|
"paceLength": 10,
|
||||||
"percentage": false,
|
"percentage": false,
|
||||||
"pointradius": 5,
|
"pointradius": 5,
|
||||||
|
@ -4010,6 +4167,7 @@
|
||||||
"linewidth": 2,
|
"linewidth": 2,
|
||||||
"links": [],
|
"links": [],
|
||||||
"nullPointMode": "null",
|
"nullPointMode": "null",
|
||||||
|
"options": {},
|
||||||
"paceLength": 10,
|
"paceLength": 10,
|
||||||
"percentage": false,
|
"percentage": false,
|
||||||
"pointradius": 5,
|
"pointradius": 5,
|
||||||
|
@ -4076,6 +4234,7 @@
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
"collapsed": true,
|
"collapsed": true,
|
||||||
|
"datasource": null,
|
||||||
"gridPos": {
|
"gridPos": {
|
||||||
"h": 1,
|
"h": 1,
|
||||||
"w": 24,
|
"w": 24,
|
||||||
|
@ -4540,6 +4699,7 @@
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
"collapsed": true,
|
"collapsed": true,
|
||||||
|
"datasource": null,
|
||||||
"gridPos": {
|
"gridPos": {
|
||||||
"h": 1,
|
"h": 1,
|
||||||
"w": 24,
|
"w": 24,
|
||||||
|
@ -5060,6 +5220,7 @@
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
"collapsed": true,
|
"collapsed": true,
|
||||||
|
"datasource": null,
|
||||||
"gridPos": {
|
"gridPos": {
|
||||||
"h": 1,
|
"h": 1,
|
||||||
"w": 24,
|
"w": 24,
|
||||||
|
@ -5079,7 +5240,7 @@
|
||||||
"h": 7,
|
"h": 7,
|
||||||
"w": 12,
|
"w": 12,
|
||||||
"x": 0,
|
"x": 0,
|
||||||
"y": 67
|
"y": 39
|
||||||
},
|
},
|
||||||
"id": 2,
|
"id": 2,
|
||||||
"legend": {
|
"legend": {
|
||||||
|
@ -5095,6 +5256,7 @@
|
||||||
"linewidth": 1,
|
"linewidth": 1,
|
||||||
"links": [],
|
"links": [],
|
||||||
"nullPointMode": "null",
|
"nullPointMode": "null",
|
||||||
|
"options": {},
|
||||||
"paceLength": 10,
|
"paceLength": 10,
|
||||||
"percentage": false,
|
"percentage": false,
|
||||||
"pointradius": 5,
|
"pointradius": 5,
|
||||||
|
@ -5198,7 +5360,7 @@
|
||||||
"h": 7,
|
"h": 7,
|
||||||
"w": 12,
|
"w": 12,
|
||||||
"x": 12,
|
"x": 12,
|
||||||
"y": 67
|
"y": 39
|
||||||
},
|
},
|
||||||
"id": 41,
|
"id": 41,
|
||||||
"legend": {
|
"legend": {
|
||||||
|
@ -5214,6 +5376,7 @@
|
||||||
"linewidth": 1,
|
"linewidth": 1,
|
||||||
"links": [],
|
"links": [],
|
||||||
"nullPointMode": "null",
|
"nullPointMode": "null",
|
||||||
|
"options": {},
|
||||||
"paceLength": 10,
|
"paceLength": 10,
|
||||||
"percentage": false,
|
"percentage": false,
|
||||||
"pointradius": 5,
|
"pointradius": 5,
|
||||||
|
@ -5286,7 +5449,7 @@
|
||||||
"h": 7,
|
"h": 7,
|
||||||
"w": 12,
|
"w": 12,
|
||||||
"x": 0,
|
"x": 0,
|
||||||
"y": 74
|
"y": 46
|
||||||
},
|
},
|
||||||
"id": 42,
|
"id": 42,
|
||||||
"legend": {
|
"legend": {
|
||||||
|
@ -5302,6 +5465,7 @@
|
||||||
"linewidth": 1,
|
"linewidth": 1,
|
||||||
"links": [],
|
"links": [],
|
||||||
"nullPointMode": "null",
|
"nullPointMode": "null",
|
||||||
|
"options": {},
|
||||||
"paceLength": 10,
|
"paceLength": 10,
|
||||||
"percentage": false,
|
"percentage": false,
|
||||||
"pointradius": 5,
|
"pointradius": 5,
|
||||||
|
@ -5373,7 +5537,7 @@
|
||||||
"h": 7,
|
"h": 7,
|
||||||
"w": 12,
|
"w": 12,
|
||||||
"x": 12,
|
"x": 12,
|
||||||
"y": 74
|
"y": 46
|
||||||
},
|
},
|
||||||
"id": 43,
|
"id": 43,
|
||||||
"legend": {
|
"legend": {
|
||||||
|
@ -5389,6 +5553,7 @@
|
||||||
"linewidth": 1,
|
"linewidth": 1,
|
||||||
"links": [],
|
"links": [],
|
||||||
"nullPointMode": "null",
|
"nullPointMode": "null",
|
||||||
|
"options": {},
|
||||||
"paceLength": 10,
|
"paceLength": 10,
|
||||||
"percentage": false,
|
"percentage": false,
|
||||||
"pointradius": 5,
|
"pointradius": 5,
|
||||||
|
@ -5460,7 +5625,7 @@
|
||||||
"h": 7,
|
"h": 7,
|
||||||
"w": 12,
|
"w": 12,
|
||||||
"x": 0,
|
"x": 0,
|
||||||
"y": 81
|
"y": 53
|
||||||
},
|
},
|
||||||
"id": 113,
|
"id": 113,
|
||||||
"legend": {
|
"legend": {
|
||||||
|
@ -5476,6 +5641,7 @@
|
||||||
"linewidth": 1,
|
"linewidth": 1,
|
||||||
"links": [],
|
"links": [],
|
||||||
"nullPointMode": "null",
|
"nullPointMode": "null",
|
||||||
|
"options": {},
|
||||||
"paceLength": 10,
|
"paceLength": 10,
|
||||||
"percentage": false,
|
"percentage": false,
|
||||||
"pointradius": 5,
|
"pointradius": 5,
|
||||||
|
@ -5546,7 +5712,7 @@
|
||||||
"h": 7,
|
"h": 7,
|
||||||
"w": 12,
|
"w": 12,
|
||||||
"x": 12,
|
"x": 12,
|
||||||
"y": 81
|
"y": 53
|
||||||
},
|
},
|
||||||
"id": 115,
|
"id": 115,
|
||||||
"legend": {
|
"legend": {
|
||||||
|
@ -5562,6 +5728,7 @@
|
||||||
"linewidth": 1,
|
"linewidth": 1,
|
||||||
"links": [],
|
"links": [],
|
||||||
"nullPointMode": "null",
|
"nullPointMode": "null",
|
||||||
|
"options": {},
|
||||||
"paceLength": 10,
|
"paceLength": 10,
|
||||||
"percentage": false,
|
"percentage": false,
|
||||||
"pointradius": 5,
|
"pointradius": 5,
|
||||||
|
@ -5573,7 +5740,7 @@
|
||||||
"steppedLine": false,
|
"steppedLine": false,
|
||||||
"targets": [
|
"targets": [
|
||||||
{
|
{
|
||||||
"expr": "rate(synapse_replication_tcp_protocol_close_reason{job=\"$job\",index=~\"$index\",instance=\"$instance\"}[$bucket_size])",
|
"expr": "rate(synapse_replication_tcp_protocol_close_reason{job=~\"$job\",index=~\"$index\",instance=\"$instance\"}[$bucket_size])",
|
||||||
"format": "time_series",
|
"format": "time_series",
|
||||||
"intervalFactor": 1,
|
"intervalFactor": 1,
|
||||||
"legendFormat": "{{job}}-{{index}} {{reason_type}}",
|
"legendFormat": "{{job}}-{{index}} {{reason_type}}",
|
||||||
|
@ -5628,6 +5795,7 @@
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
"collapsed": true,
|
"collapsed": true,
|
||||||
|
"datasource": null,
|
||||||
"gridPos": {
|
"gridPos": {
|
||||||
"h": 1,
|
"h": 1,
|
||||||
"w": 24,
|
"w": 24,
|
||||||
|
@ -5643,11 +5811,12 @@
|
||||||
"dashes": false,
|
"dashes": false,
|
||||||
"datasource": "$datasource",
|
"datasource": "$datasource",
|
||||||
"fill": 1,
|
"fill": 1,
|
||||||
|
"fillGradient": 0,
|
||||||
"gridPos": {
|
"gridPos": {
|
||||||
"h": 9,
|
"h": 9,
|
||||||
"w": 12,
|
"w": 12,
|
||||||
"x": 0,
|
"x": 0,
|
||||||
"y": 13
|
"y": 40
|
||||||
},
|
},
|
||||||
"id": 67,
|
"id": 67,
|
||||||
"legend": {
|
"legend": {
|
||||||
|
@ -5663,7 +5832,9 @@
|
||||||
"linewidth": 1,
|
"linewidth": 1,
|
||||||
"links": [],
|
"links": [],
|
||||||
"nullPointMode": "connected",
|
"nullPointMode": "connected",
|
||||||
"options": {},
|
"options": {
|
||||||
|
"dataLinks": []
|
||||||
|
},
|
||||||
"paceLength": 10,
|
"paceLength": 10,
|
||||||
"percentage": false,
|
"percentage": false,
|
||||||
"pointradius": 5,
|
"pointradius": 5,
|
||||||
|
@ -5679,7 +5850,7 @@
|
||||||
"format": "time_series",
|
"format": "time_series",
|
||||||
"interval": "",
|
"interval": "",
|
||||||
"intervalFactor": 1,
|
"intervalFactor": 1,
|
||||||
"legendFormat": "{{job}}-{{index}} ",
|
"legendFormat": "{{job}}-{{index}} {{name}}",
|
||||||
"refId": "A"
|
"refId": "A"
|
||||||
}
|
}
|
||||||
],
|
],
|
||||||
|
@ -5731,11 +5902,12 @@
|
||||||
"dashes": false,
|
"dashes": false,
|
||||||
"datasource": "$datasource",
|
"datasource": "$datasource",
|
||||||
"fill": 1,
|
"fill": 1,
|
||||||
|
"fillGradient": 0,
|
||||||
"gridPos": {
|
"gridPos": {
|
||||||
"h": 9,
|
"h": 9,
|
||||||
"w": 12,
|
"w": 12,
|
||||||
"x": 12,
|
"x": 12,
|
||||||
"y": 13
|
"y": 40
|
||||||
},
|
},
|
||||||
"id": 71,
|
"id": 71,
|
||||||
"legend": {
|
"legend": {
|
||||||
|
@ -5751,7 +5923,9 @@
|
||||||
"linewidth": 1,
|
"linewidth": 1,
|
||||||
"links": [],
|
"links": [],
|
||||||
"nullPointMode": "connected",
|
"nullPointMode": "connected",
|
||||||
"options": {},
|
"options": {
|
||||||
|
"dataLinks": []
|
||||||
|
},
|
||||||
"paceLength": 10,
|
"paceLength": 10,
|
||||||
"percentage": false,
|
"percentage": false,
|
||||||
"pointradius": 5,
|
"pointradius": 5,
|
||||||
|
@ -5819,11 +5993,12 @@
|
||||||
"dashes": false,
|
"dashes": false,
|
||||||
"datasource": "$datasource",
|
"datasource": "$datasource",
|
||||||
"fill": 1,
|
"fill": 1,
|
||||||
|
"fillGradient": 0,
|
||||||
"gridPos": {
|
"gridPos": {
|
||||||
"h": 9,
|
"h": 9,
|
||||||
"w": 12,
|
"w": 12,
|
||||||
"x": 0,
|
"x": 0,
|
||||||
"y": 22
|
"y": 49
|
||||||
},
|
},
|
||||||
"id": 121,
|
"id": 121,
|
||||||
"interval": "",
|
"interval": "",
|
||||||
|
@ -5840,7 +6015,9 @@
|
||||||
"linewidth": 1,
|
"linewidth": 1,
|
||||||
"links": [],
|
"links": [],
|
||||||
"nullPointMode": "connected",
|
"nullPointMode": "connected",
|
||||||
"options": {},
|
"options": {
|
||||||
|
"dataLinks": []
|
||||||
|
},
|
||||||
"paceLength": 10,
|
"paceLength": 10,
|
||||||
"percentage": false,
|
"percentage": false,
|
||||||
"pointradius": 5,
|
"pointradius": 5,
|
||||||
|
@ -5909,6 +6086,7 @@
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
"collapsed": true,
|
"collapsed": true,
|
||||||
|
"datasource": null,
|
||||||
"gridPos": {
|
"gridPos": {
|
||||||
"h": 1,
|
"h": 1,
|
||||||
"w": 24,
|
"w": 24,
|
||||||
|
@ -6607,7 +6785,7 @@
|
||||||
}
|
}
|
||||||
],
|
],
|
||||||
"refresh": "5m",
|
"refresh": "5m",
|
||||||
"schemaVersion": 18,
|
"schemaVersion": 22,
|
||||||
"style": "dark",
|
"style": "dark",
|
||||||
"tags": [
|
"tags": [
|
||||||
"matrix"
|
"matrix"
|
||||||
|
@ -6616,7 +6794,7 @@
|
||||||
"list": [
|
"list": [
|
||||||
{
|
{
|
||||||
"current": {
|
"current": {
|
||||||
"tags": [],
|
"selected": true,
|
||||||
"text": "Prometheus",
|
"text": "Prometheus",
|
||||||
"value": "Prometheus"
|
"value": "Prometheus"
|
||||||
},
|
},
|
||||||
|
@ -6638,6 +6816,7 @@
|
||||||
"auto_count": 100,
|
"auto_count": 100,
|
||||||
"auto_min": "30s",
|
"auto_min": "30s",
|
||||||
"current": {
|
"current": {
|
||||||
|
"selected": false,
|
||||||
"text": "auto",
|
"text": "auto",
|
||||||
"value": "$__auto_interval_bucket_size"
|
"value": "$__auto_interval_bucket_size"
|
||||||
},
|
},
|
||||||
|
@ -6719,9 +6898,9 @@
|
||||||
"allFormat": "regex wildcard",
|
"allFormat": "regex wildcard",
|
||||||
"allValue": "",
|
"allValue": "",
|
||||||
"current": {
|
"current": {
|
||||||
"text": "All",
|
"text": "synapse",
|
||||||
"value": [
|
"value": [
|
||||||
"$__all"
|
"synapse"
|
||||||
]
|
]
|
||||||
},
|
},
|
||||||
"datasource": "$datasource",
|
"datasource": "$datasource",
|
||||||
|
@ -6751,7 +6930,9 @@
|
||||||
"allValue": ".*",
|
"allValue": ".*",
|
||||||
"current": {
|
"current": {
|
||||||
"text": "All",
|
"text": "All",
|
||||||
"value": "$__all"
|
"value": [
|
||||||
|
"$__all"
|
||||||
|
]
|
||||||
},
|
},
|
||||||
"datasource": "$datasource",
|
"datasource": "$datasource",
|
||||||
"definition": "",
|
"definition": "",
|
||||||
|
@ -6810,5 +6991,5 @@
|
||||||
"timezone": "",
|
"timezone": "",
|
||||||
"title": "Synapse",
|
"title": "Synapse",
|
||||||
"uid": "000000012",
|
"uid": "000000012",
|
||||||
"version": 10
|
"version": 19
|
||||||
}
|
}
|
|
@ -1,3 +1,9 @@
|
||||||
|
matrix-synapse-py3 (1.11.1) stable; urgency=medium
|
||||||
|
|
||||||
|
* New synapse release 1.11.1.
|
||||||
|
|
||||||
|
-- Synapse Packaging team <packages@matrix.org> Tue, 03 Mar 2020 15:01:22 +0000
|
||||||
|
|
||||||
matrix-synapse-py3 (1.11.0) stable; urgency=medium
|
matrix-synapse-py3 (1.11.0) stable; urgency=medium
|
||||||
|
|
||||||
* New synapse release 1.11.0.
|
* New synapse release 1.11.0.
|
||||||
|
|
|
@ -38,6 +38,7 @@ The parameter ``threepids`` is optional.
|
||||||
The parameter ``avatar_url`` is optional.
|
The parameter ``avatar_url`` is optional.
|
||||||
The parameter ``admin`` is optional and defaults to 'false'.
|
The parameter ``admin`` is optional and defaults to 'false'.
|
||||||
The parameter ``deactivated`` is optional and defaults to 'false'.
|
The parameter ``deactivated`` is optional and defaults to 'false'.
|
||||||
|
The parameter ``password`` is optional. If provided, the user's password is updated and all devices are logged out.
|
||||||
If the user already exists then optional parameters default to the current value.
|
If the user already exists then optional parameters default to the current value.
|
||||||
|
|
||||||
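A hedged sketch of a request against this endpoint (the URL is the `PUT /_synapse/admin/v2/users/<user_id>` path named elsewhere in this change; the header and field values are illustrative assumptions):

.. code:: python

    import requests

    # PUT /_synapse/admin/v2/users/<user_id> - create or modify an account.
    # All body fields shown are the optional parameters described above.
    resp = requests.put(
        "https://homeserver.example.com/_synapse/admin/v2/users/@alice:example.com",
        headers={"Authorization": "Bearer <admin_access_token>"},
        json={"password": "<secret>", "admin": False, "deactivated": False},
    )
    resp.raise_for_status()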
List Accounts
|
List Accounts
|
||||||
|
@ -168,11 +169,14 @@ with a body of:
|
||||||
.. code:: json
|
.. code:: json
|
||||||
|
|
||||||
{
|
{
|
||||||
"new_password": "<secret>"
|
"new_password": "<secret>",
|
||||||
|
"logout_devices": true,
|
||||||
}
|
}
|
||||||
|
|
||||||
including an ``access_token`` of a server admin.
|
including an ``access_token`` of a server admin.
|
||||||
|
|
||||||
|
The parameter ``new_password`` is required.
|
||||||
|
The parameter ``logout_devices`` is optional and defaults to ``true``.
|
||||||
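For concreteness, a hedged example of the full request (the ``/_synapse/admin/v1/reset_password/<user_id>`` path is an assumption based on the usual admin API layout; only the body and the two parameters above are confirmed by this diff):

.. code:: python

    import requests

    # POST the password reset, keeping other sessions alive by setting
    # logout_devices to false (it defaults to true per the doc above).
    resp = requests.post(
        "https://homeserver.example.com/_synapse/admin/v1/reset_password/@user:example.com",
        headers={"Authorization": "Bearer <admin_access_token>"},
        json={"new_password": "<secret>", "logout_devices": False},
    )
    resp.raise_for_status()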
|
|
||||||
Get whether a user is a server administrator or not
|
Get whether a user is a server administrator or not
|
||||||
===================================================
|
===================================================
|
||||||
|
|
|
@ -1347,6 +1347,25 @@ saml2_config:
|
||||||
#
|
#
|
||||||
#grandfathered_mxid_source_attribute: upn
|
#grandfathered_mxid_source_attribute: upn
|
||||||
|
|
||||||
|
# Directory in which Synapse will try to find the template files below.
|
||||||
|
# If not set, default templates from within the Synapse package will be used.
|
||||||
|
#
|
||||||
|
# DO NOT UNCOMMENT THIS SETTING unless you want to customise the templates.
|
||||||
|
# If you *do* uncomment it, you will need to make sure that all the templates
|
||||||
|
# below are in the directory.
|
||||||
|
#
|
||||||
|
# Synapse will look for the following templates in this directory:
|
||||||
|
#
|
||||||
|
# * HTML page to display to users if something goes wrong during the
|
||||||
|
# authentication process: 'saml_error.html'.
|
||||||
|
#
|
||||||
|
# This template doesn't currently need any variable to render.
|
||||||
|
#
|
||||||
|
# You can see the default templates at:
|
||||||
|
# https://github.com/matrix-org/synapse/tree/master/synapse/res/templates
|
||||||
|
#
|
||||||
|
#template_dir: "res/templates"
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
# Enable CAS for registration and login.
|
# Enable CAS for registration and login.
|
||||||
|
@ -1360,6 +1379,56 @@ saml2_config:
|
||||||
# # name: value
|
# # name: value
|
||||||
|
|
||||||
|
|
||||||
|
# Additional settings to use with single-sign on systems such as SAML2 and CAS.
|
||||||
|
#
|
||||||
|
sso:
|
||||||
|
# A list of client URLs which are whitelisted so that the user does not
|
||||||
|
# have to confirm giving access to their account to the URL. Any client
|
||||||
|
# whose URL starts with an entry in the following list will not be subject
|
||||||
|
# to an additional confirmation step after the SSO login is completed.
|
||||||
|
#
|
||||||
|
# WARNING: An entry such as "https://my.client" is insecure, because it
|
||||||
|
# will also match "https://my.client.evil.site", exposing your users to
|
||||||
|
# phishing attacks from evil.site. To avoid this, include a slash after the
|
||||||
|
# hostname: "https://my.client/".
|
||||||
|
#
|
||||||
|
# By default, this list is empty.
|
||||||
|
#
|
||||||
|
#client_whitelist:
|
||||||
|
# - https://riot.im/develop
|
||||||
|
# - https://my.custom.client/
|
||||||
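The warning above is about plain prefix matching. A minimal sketch of the behaviour it describes (the function and list are hypothetical; only the "URL starts with an entry" semantics come from the comment):

```python
whitelist = ["https://my.client"]  # no trailing slash - insecure per the warning

def is_whitelisted(url: str) -> bool:
    # Hypothetical check mirroring "whose URL starts with an entry".
    return any(url.startswith(entry) for entry in whitelist)

print(is_whitelisted("https://my.client/redirect"))            # True
print(is_whitelisted("https://my.client.evil.site/redirect"))  # True - the phishing hole
# Using "https://my.client/" as the entry closes the hole:
print("https://my.client.evil.site/redirect".startswith("https://my.client/"))  # False
```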
|
|
||||||
|
# Directory in which Synapse will try to find the template files below.
|
||||||
|
# If not set, default templates from within the Synapse package will be used.
|
||||||
|
#
|
||||||
|
# DO NOT UNCOMMENT THIS SETTING unless you want to customise the templates.
|
||||||
|
# If you *do* uncomment it, you will need to make sure that all the templates
|
||||||
|
# below are in the directory.
|
||||||
|
#
|
||||||
|
# Synapse will look for the following templates in this directory:
|
||||||
|
#
|
||||||
|
# * HTML page for a confirmation step before redirecting back to the client
|
||||||
|
# with the login token: 'sso_redirect_confirm.html'.
|
||||||
|
#
|
||||||
|
# When rendering, this template is given three variables:
|
||||||
|
# * redirect_url: the URL the user is about to be redirected to. Needs
|
||||||
|
# manual escaping (see
|
||||||
|
# https://jinja.palletsprojects.com/en/2.11.x/templates/#html-escaping).
|
||||||
|
#
|
||||||
|
# * display_url: the same as `redirect_url`, but with the query
|
||||||
|
# parameters stripped. The intention is to have a
|
||||||
|
# human-readable URL to show to users, not to use it as
|
||||||
|
# the final address to redirect to. Needs manual escaping
|
||||||
|
# (see https://jinja.palletsprojects.com/en/2.11.x/templates/#html-escaping).
|
||||||
|
#
|
||||||
|
# * server_name: the homeserver's name.
|
||||||
|
#
|
||||||
|
# You can see the default templates at:
|
||||||
|
# https://github.com/matrix-org/synapse/tree/master/synapse/res/templates
|
||||||
|
#
|
||||||
|
#template_dir: "res/templates"
|
||||||
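As an illustration of the three variables documented above, a hedged Jinja2 rendering sketch (the template markup is invented; the variable names match the docs, and autoescaping covers the HTML-escaping requirement they call out):

```python
from jinja2 import Environment

env = Environment(autoescape=True)
# Invented markup using the documented variables.
template = env.from_string(
    '<p>Continue to <a href="{{ redirect_url }}">{{ display_url }}</a> '
    "on {{ server_name }}?</p>"
)
html = template.render(
    redirect_url="https://riot.im/develop/?loginToken=secret",
    display_url="https://riot.im/develop/",
    server_name="example.com",
)
print(html)
```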
|
|
||||||
|
|
||||||
# The JWT needs to contain a globally unique "sub" (subject) claim.
|
# The JWT needs to contain a globally unique "sub" (subject) claim.
|
||||||
#
|
#
|
||||||
#jwt_config:
|
#jwt_config:
|
||||||
|
|
|
@ -273,6 +273,7 @@ Additionally, the following REST endpoints can be handled, but all requests must
|
||||||
be routed to the same instance:
|
be routed to the same instance:
|
||||||
|
|
||||||
^/_matrix/client/(r0|unstable)/register$
|
^/_matrix/client/(r0|unstable)/register$
|
||||||
|
^/_matrix/client/(r0|unstable)/auth/.*/fallback/web$
|
||||||
|
|
||||||
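A quick way to see which requests the two patterns above capture, so they can be pinned to the same instance (hedged sketch; the recaptcha fallback path is an invented example):

```python
import re

patterns = [
    r"^/_matrix/client/(r0|unstable)/register$",
    r"^/_matrix/client/(r0|unstable)/auth/.*/fallback/web$",
]

def must_pin_to_register_worker(path: str) -> bool:
    # True when the request must be routed to the same instance as /register.
    return any(re.match(p, path) for p in patterns)

assert must_pin_to_register_worker("/_matrix/client/r0/register")
assert must_pin_to_register_worker("/_matrix/client/unstable/auth/m.login.recaptcha/fallback/web")
assert not must_pin_to_register_worker("/_matrix/client/r0/sync")
```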
Pagination requests can also be handled, but all requests with the same path
|
Pagination requests can also be handled, but all requests with the same path
|
||||||
room must be routed to the same instance. Additionally, care must be taken to
|
room must be routed to the same instance. Additionally, care must be taken to
|
||||||
|
|
|
@ -36,7 +36,7 @@ try:
|
||||||
except ImportError:
|
except ImportError:
|
||||||
pass
|
pass
|
||||||
|
|
||||||
__version__ = "1.11.0"
|
__version__ = "1.12.0rc1"
|
||||||
|
|
||||||
if bool(os.environ.get("SYNAPSE_TEST_PATCH_LOG_CONTEXTS", False)):
|
if bool(os.environ.get("SYNAPSE_TEST_PATCH_LOG_CONTEXTS", False)):
|
||||||
# We import here so that we don't have to install a bunch of deps when
|
# We import here so that we don't have to install a bunch of deps when
|
||||||
|
|
|
@ -539,7 +539,7 @@ class Auth(object):
|
||||||
|
|
||||||
@defer.inlineCallbacks
|
@defer.inlineCallbacks
|
||||||
def check_can_change_room_list(self, room_id: str, user: UserID):
|
def check_can_change_room_list(self, room_id: str, user: UserID):
|
||||||
"""Check if the user is allowed to edit the room's entry in the
|
"""Determine whether the user is allowed to edit the room's entry in the
|
||||||
published room list.
|
published room list.
|
||||||
|
|
||||||
Args:
|
Args:
|
||||||
|
@ -570,12 +570,7 @@ class Auth(object):
|
||||||
)
|
)
|
||||||
user_level = event_auth.get_user_power_level(user_id, auth_events)
|
user_level = event_auth.get_user_power_level(user_id, auth_events)
|
||||||
|
|
||||||
if user_level < send_level:
|
return user_level >= send_level
|
||||||
raise AuthError(
|
|
||||||
403,
|
|
||||||
"This server requires you to be a moderator in the room to"
|
|
||||||
" edit its room list entry",
|
|
||||||
)
|
|
||||||
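The method now reports the outcome instead of raising, so any caller that previously relied on the raised ``AuthError`` must raise it itself. A hedged sketch of such a call site (the caller is hypothetical and assumed to live in an ``@defer.inlineCallbacks`` method; the error text is the one removed above):

```python
can_change = yield self.auth.check_can_change_room_list(room_id, requester.user)
if not can_change:
    raise AuthError(
        403,
        "This server requires you to be a moderator in the room to"
        " edit its room list entry",
    )
```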
|
|
||||||
@staticmethod
|
@staticmethod
|
||||||
def has_access_token(request):
|
def has_access_token(request):
|
||||||
|
|
|
@ -66,6 +66,7 @@ class Codes(object):
|
||||||
EXPIRED_ACCOUNT = "ORG_MATRIX_EXPIRED_ACCOUNT"
|
EXPIRED_ACCOUNT = "ORG_MATRIX_EXPIRED_ACCOUNT"
|
||||||
INVALID_SIGNATURE = "M_INVALID_SIGNATURE"
|
INVALID_SIGNATURE = "M_INVALID_SIGNATURE"
|
||||||
USER_DEACTIVATED = "M_USER_DEACTIVATED"
|
USER_DEACTIVATED = "M_USER_DEACTIVATED"
|
||||||
|
BAD_ALIAS = "M_BAD_ALIAS"
|
||||||
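A hedged example of how a new error code like this is typically surfaced (``SynapseError``'s signature is an assumption from the codebase's conventions, not shown in this diff):

```python
raise SynapseError(
    400,
    "This alias does not point to this room",
    Codes.BAD_ALIAS,  # the new code added above
)
```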
|
|
||||||
|
|
||||||
class CodeMessageException(RuntimeError):
|
class CodeMessageException(RuntimeError):
|
||||||
|
|
|
@ -57,7 +57,7 @@ class RoomVersion(object):
|
||||||
state_res = attr.ib() # int; one of the StateResolutionVersions
|
state_res = attr.ib() # int; one of the StateResolutionVersions
|
||||||
enforce_key_validity = attr.ib() # bool
|
enforce_key_validity = attr.ib() # bool
|
||||||
|
|
||||||
# bool: before MSC2260, anyone was allowed to send an aliases event
|
# bool: before MSC2261/MSC2432, m.room.aliases had special auth rules and redaction rules
|
||||||
special_case_aliases_auth = attr.ib(type=bool, default=False)
|
special_case_aliases_auth = attr.ib(type=bool, default=False)
|
||||||
|
|
||||||
|
|
||||||
|
@ -102,12 +102,13 @@ class RoomVersions(object):
|
||||||
enforce_key_validity=True,
|
enforce_key_validity=True,
|
||||||
special_case_aliases_auth=True,
|
special_case_aliases_auth=True,
|
||||||
)
|
)
|
||||||
MSC2260_DEV = RoomVersion(
|
MSC2432_DEV = RoomVersion(
|
||||||
"org.matrix.msc2260",
|
"org.matrix.msc2432",
|
||||||
RoomDisposition.UNSTABLE,
|
RoomDisposition.UNSTABLE,
|
||||||
EventFormatVersions.V3,
|
EventFormatVersions.V3,
|
||||||
StateResolutionVersions.V2,
|
StateResolutionVersions.V2,
|
||||||
enforce_key_validity=True,
|
enforce_key_validity=True,
|
||||||
|
special_case_aliases_auth=False,
|
||||||
)
|
)
|
||||||
|
|
||||||
|
|
||||||
|
@ -119,6 +120,6 @@ KNOWN_ROOM_VERSIONS = {
|
||||||
RoomVersions.V3,
|
RoomVersions.V3,
|
||||||
RoomVersions.V4,
|
RoomVersions.V4,
|
||||||
RoomVersions.V5,
|
RoomVersions.V5,
|
||||||
RoomVersions.MSC2260_DEV,
|
RoomVersions.MSC2432_DEV,
|
||||||
)
|
)
|
||||||
} # type: Dict[str, RoomVersion]
|
} # type: Dict[str, RoomVersion]
|
||||||
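A small sketch of how the renamed version is looked up and how its flag gates behaviour (field access follows the attrs definitions above; the branch body is a placeholder):

```python
version = KNOWN_ROOM_VERSIONS.get("org.matrix.msc2432")  # MSC2432_DEV
if version is not None and not version.special_case_aliases_auth:
    # MSC2432 semantics: m.room.aliases gets no special auth/redaction rules.
    pass
```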
|
|
|
@ -276,6 +276,7 @@ def start(hs, listeners=None):
|
||||||
# It is now safe to start your Synapse.
|
# It is now safe to start your Synapse.
|
||||||
hs.start_listening(listeners)
|
hs.start_listening(listeners)
|
||||||
hs.get_datastore().db.start_profiling()
|
hs.get_datastore().db.start_profiling()
|
||||||
|
hs.get_pusherpool().start()
|
||||||
|
|
||||||
setup_sentry(hs)
|
setup_sentry(hs)
|
||||||
setup_sdnotify(hs)
|
setup_sdnotify(hs)
|
||||||
|
|
|
@@ -676,8 +676,9 @@ class GenericWorkerReplicationHandler(ReplicationClientHandler):
         elif stream_name == "device_lists":
             all_room_ids = set()
             for row in rows:
-                room_ids = await self.store.get_rooms_for_user(row.user_id)
-                all_room_ids.update(room_ids)
+                if row.entity.startswith("@"):
+                    room_ids = await self.store.get_rooms_for_user(row.entity)
+                    all_room_ids.update(room_ids)
             self.notifier.on_new_event("device_list_key", token, rooms=all_room_ids)
         elif stream_name == "presence":
             await self.presence_handler.process_replication_rows(token, rows)
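The worker now follows the convention used in the next hunk as well: a `device_lists` stream row's `entity` is a user ID when it starts with `@`, and a remote server name otherwise; only user IDs map to rooms. A standalone sketch of that split (the helper name is made up):

    def split_device_list_entities(rows):
        """Split device-list stream rows into local user IDs and remote hosts."""
        users, hosts = [], []
        for row in rows:
            if row.entity.startswith("@"):
                users.append(row.entity)
            else:
                hosts.append(row.entity)
        return users, hosts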
@@ -774,6 +775,9 @@ class FederationSenderHandler(object):

         # ... as well as device updates and messages
         elif stream_name == DeviceListsStream.NAME:
+            # The entities are either user IDs (starting with '@') whose devices
+            # have changed, or remote servers that we need to tell about
+            # changes.
             hosts = {row.entity for row in rows if not row.entity.startswith("@")}
             for host in hosts:
                 self.federation_sender.send_device_messages(host)
@@ -298,6 +298,11 @@ class SynapseHomeServer(HomeServer):

 # Gauges to expose monthly active user control metrics
 current_mau_gauge = Gauge("synapse_admin_mau:current", "Current MAU")
+current_mau_by_service_gauge = Gauge(
+    "synapse_admin_mau_current_mau_by_service",
+    "Current MAU by service",
+    ["app_service"],
+)
 max_mau_gauge = Gauge("synapse_admin_mau:max", "MAU Limit")
 registered_reserved_users_mau_gauge = Gauge(
     "synapse_admin_mau:registered_reserved_users",
@@ -403,7 +408,6 @@ def setup(config_options):

         _base.start(hs, config.listeners)

-        hs.get_pusherpool().start()
         hs.get_datastore().db.updates.start_doing_background_updates()
     except Exception:
         # Print the exception and bail out.
@@ -585,12 +589,20 @@ def run(hs):
     @defer.inlineCallbacks
     def generate_monthly_active_users():
         current_mau_count = 0
+        current_mau_count_by_service = {}
         reserved_users = ()
         store = hs.get_datastore()
         if hs.config.limit_usage_by_mau or hs.config.mau_stats_only:
             current_mau_count = yield store.get_monthly_active_count()
+            current_mau_count_by_service = (
+                yield store.get_monthly_active_count_by_service()
+            )
             reserved_users = yield store.get_registered_reserved_users()
         current_mau_gauge.set(float(current_mau_count))
+
+        for app_service, count in current_mau_count_by_service.items():
+            current_mau_by_service_gauge.labels(app_service).set(float(count))
+
         registered_reserved_users_mau_gauge.set(float(len(reserved_users)))
         max_mau_gauge.set(float(hs.config.max_mau_value))
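The per-service breakdown uses a labelled Prometheus gauge; a self-contained sketch of the same pattern, with a made-up metric name and counts:

    from prometheus_client import Gauge

    mau_by_service = Gauge(
        "example_mau_by_service", "Example MAU broken down by appservice", ["app_service"]
    )

    # e.g. the shape returned by get_monthly_active_count_by_service()
    counts = {"native": 120, "my_bridge": 30}
    for app_service, count in counts.items():
        # one time series per label value; .labels() selects the child gauge
        mau_by_service.labels(app_service).set(float(count))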
@@ -24,6 +24,7 @@ from synapse.config import (
     server,
     server_notices_config,
     spam_checker,
+    sso,
     stats,
     third_party_event_rules,
     tls,

@@ -57,6 +58,7 @@ class RootConfig:
     key: key.KeyConfig
     saml2: saml2_config.SAML2Config
     cas: cas.CasConfig
+    sso: sso.SSOConfig
     jwt: jwt_config.JWTConfig
     password: password.PasswordConfig
     email: emailconfig.EmailConfig
@@ -38,6 +38,7 @@ from .saml2_config import SAML2Config
 from .server import ServerConfig
 from .server_notices_config import ServerNoticesConfig
 from .spam_checker import SpamCheckerConfig
+from .sso import SSOConfig
 from .stats import StatsConfig
 from .third_party_event_rules import ThirdPartyRulesConfig
 from .tls import TlsConfig

@@ -65,6 +66,7 @@ class HomeServerConfig(RootConfig):
         KeyConfig,
         SAML2Config,
         CasConfig,
+        SSOConfig,
         JWTConfig,
         PasswordConfig,
         EmailConfig,
@@ -15,6 +15,9 @@
 # limitations under the License.

 import logging
+import os
+
+import pkg_resources

 from synapse.python_dependencies import DependencyException, check_requirements
 from synapse.util.module_loader import load_module, load_python_module

@@ -160,6 +163,14 @@ class SAML2Config(Config):
             saml2_config.get("saml_session_lifetime", "5m")
         )

+        template_dir = saml2_config.get("template_dir")
+        if not template_dir:
+            template_dir = pkg_resources.resource_filename("synapse", "res/templates",)
+
+        self.saml2_error_html_content = self.read_file(
+            os.path.join(template_dir, "saml_error.html"), "saml2_config.saml_error",
+        )
+
     def _default_saml_config_dict(
         self, required_attributes: set, optional_attributes: set
     ):

@@ -325,6 +336,25 @@ class SAML2Config(Config):
          #  The default is 'uid'.
          #
          #grandfathered_mxid_source_attribute: upn
+
+         # Directory in which Synapse will try to find the template files below.
+         # If not set, default templates from within the Synapse package will be used.
+         #
+         # DO NOT UNCOMMENT THIS SETTING unless you want to customise the templates.
+         # If you *do* uncomment it, you will need to make sure that all the templates
+         # below are in the directory.
+         #
+         # Synapse will look for the following templates in this directory:
+         #
+         # * HTML page to display to users if something goes wrong during the
+         #   authentication process: 'saml_error.html'.
+         #
+         #   This template doesn't currently need any variable to render.
+         #
+         # You can see the default templates at:
+         # https://github.com/matrix-org/synapse/tree/master/synapse/res/templates
+         #
+         #template_dir: "res/templates"
     """ % {
         "config_dir_path": config_dir_path
     }
@@ -0,0 +1,92 @@
+# -*- coding: utf-8 -*-
+# Copyright 2020 The Matrix.org Foundation C.I.C.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+from typing import Any, Dict
+
+import pkg_resources
+
+from ._base import Config
+
+
+class SSOConfig(Config):
+    """SSO Configuration
+    """
+
+    section = "sso"
+
+    def read_config(self, config, **kwargs):
+        sso_config = config.get("sso") or {}  # type: Dict[str, Any]
+
+        # Pick a template directory in order of:
+        # * The sso-specific template_dir
+        # * /path/to/synapse/install/res/templates
+        template_dir = sso_config.get("template_dir")
+        if not template_dir:
+            template_dir = pkg_resources.resource_filename("synapse", "res/templates",)
+
+        self.sso_redirect_confirm_template_dir = template_dir
+
+        self.sso_client_whitelist = sso_config.get("client_whitelist") or []
+
+    def generate_config_section(self, **kwargs):
+        return """\
+        # Additional settings to use with single-sign on systems such as SAML2 and CAS.
+        #
+        sso:
+            # A list of client URLs which are whitelisted so that the user does not
+            # have to confirm giving access to their account to the URL. Any client
+            # whose URL starts with an entry in the following list will not be subject
+            # to an additional confirmation step after the SSO login is completed.
+            #
+            # WARNING: An entry such as "https://my.client" is insecure, because it
+            # will also match "https://my.client.evil.site", exposing your users to
+            # phishing attacks from evil.site. To avoid this, include a slash after the
+            # hostname: "https://my.client/".
+            #
+            # By default, this list is empty.
+            #
+            #client_whitelist:
+            #  - https://riot.im/develop
+            #  - https://my.custom.client/
+
+            # Directory in which Synapse will try to find the template files below.
+            # If not set, default templates from within the Synapse package will be used.
+            #
+            # DO NOT UNCOMMENT THIS SETTING unless you want to customise the templates.
+            # If you *do* uncomment it, you will need to make sure that all the templates
+            # below are in the directory.
+            #
+            # Synapse will look for the following templates in this directory:
+            #
+            # * HTML page for a confirmation step before redirecting back to the client
+            #   with the login token: 'sso_redirect_confirm.html'.
+            #
+            #   When rendering, this template is given three variables:
+            #     * redirect_url: the URL the user is about to be redirected to. Needs
+            #                     manual escaping (see
+            #                     https://jinja.palletsprojects.com/en/2.11.x/templates/#html-escaping).
+            #
+            #     * display_url: the same as `redirect_url`, but with the query
+            #                    parameters stripped. The intention is to have a
+            #                    human-readable URL to show to users, not to use it as
+            #                    the final address to redirect to. Needs manual escaping
+            #                    (see https://jinja.palletsprojects.com/en/2.11.x/templates/#html-escaping).
+            #
+            #     * server_name: the homeserver's name.
+            #
+            # You can see the default templates at:
+            # https://github.com/matrix-org/synapse/tree/master/synapse/res/templates
+            #
+            #template_dir: "res/templates"
+        """
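The `client_whitelist` entries are prefix-matched against the client URL, which is why the sample config warns about trailing slashes. A tiny sketch of the matching rule, with illustrative values:

    whitelist = ("https://riot.im/develop", "https://my.custom.client/")

    def is_whitelisted(client_redirect_url: str) -> bool:
        # str.startswith accepts a tuple of prefixes, which is why the auth
        # handler later in this commit casts the configured list to a tuple
        return client_redirect_url.startswith(whitelist)

    assert is_whitelisted("https://my.custom.client/?loginToken=abc")
    assert not is_whitelisted("https://my.custom.client.evil.site/")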
@@ -75,7 +75,7 @@ class ServerContextFactory(ContextFactory):


 @implementer(IPolicyForHTTPS)
-class ClientTLSOptionsFactory(object):
+class FederationPolicyForHTTPS(object):
     """Factory for Twisted SSLClientConnectionCreators that are used to make connections
     to remote servers for federation.
@@ -103,15 +103,15 @@ class ClientTLSOptionsFactory(object):
         # let us do).
         minTLS = _TLS_VERSION_MAP[config.federation_client_minimum_tls_version]

-        self._verify_ssl = CertificateOptions(
+        _verify_ssl = CertificateOptions(
             trustRoot=trust_root, insecurelyLowerMinimumTo=minTLS
         )
-        self._verify_ssl_context = self._verify_ssl.getContext()
-        self._verify_ssl_context.set_info_callback(self._context_info_cb)
+        self._verify_ssl_context = _verify_ssl.getContext()
+        self._verify_ssl_context.set_info_callback(_context_info_cb)

-        self._no_verify_ssl = CertificateOptions(insecurelyLowerMinimumTo=minTLS)
-        self._no_verify_ssl_context = self._no_verify_ssl.getContext()
-        self._no_verify_ssl_context.set_info_callback(self._context_info_cb)
+        _no_verify_ssl = CertificateOptions(insecurelyLowerMinimumTo=minTLS)
+        self._no_verify_ssl_context = _no_verify_ssl.getContext()
+        self._no_verify_ssl_context.set_info_callback(_context_info_cb)

     def get_options(self, host: bytes):
@@ -136,23 +136,6 @@ class ClientTLSOptionsFactory(object):

         return SSLClientConnectionCreator(host, ssl_context, should_verify)

-    @staticmethod
-    def _context_info_cb(ssl_connection, where, ret):
-        """The 'information callback' for our openssl context object."""
-        # we assume that the app_data on the connection object has been set to
-        # a TLSMemoryBIOProtocol object. (This is done by SSLClientConnectionCreator)
-        tls_protocol = ssl_connection.get_app_data()
-        try:
-            # ... we further assume that SSLClientConnectionCreator has set the
-            # '_synapse_tls_verifier' attribute to a ConnectionVerifier object.
-            tls_protocol._synapse_tls_verifier.verify_context_info_cb(
-                ssl_connection, where
-            )
-        except:  # noqa: E722, taken from the twisted implementation
-            logger.exception("Error during info_callback")
-            f = Failure()
-            tls_protocol.failVerification(f)
-
     def creatorForNetloc(self, hostname, port):
         """Implements the IPolicyForHTTPS interace so that this can be passed
         directly to agents.
@@ -160,6 +143,43 @@ class ClientTLSOptionsFactory(object):
         return self.get_options(hostname)


+@implementer(IPolicyForHTTPS)
+class RegularPolicyForHTTPS(object):
+    """Factory for Twisted SSLClientConnectionCreators that are used to make connections
+    to remote servers, for other than federation.
+
+    Always uses the same OpenSSL context object, which uses the default OpenSSL CA
+    trust root.
+    """
+
+    def __init__(self):
+        trust_root = platformTrust()
+        self._ssl_context = CertificateOptions(trustRoot=trust_root).getContext()
+        self._ssl_context.set_info_callback(_context_info_cb)
+
+    def creatorForNetloc(self, hostname, port):
+        return SSLClientConnectionCreator(hostname, self._ssl_context, True)
+
+
+def _context_info_cb(ssl_connection, where, ret):
+    """The 'information callback' for our openssl context objects.
+
+    Note: Once this is set as the info callback on a Context object, the Context should
+    only be used with the SSLClientConnectionCreator.
+    """
+    # we assume that the app_data on the connection object has been set to
+    # a TLSMemoryBIOProtocol object. (This is done by SSLClientConnectionCreator)
+    tls_protocol = ssl_connection.get_app_data()
+    try:
+        # ... we further assume that SSLClientConnectionCreator has set the
+        # '_synapse_tls_verifier' attribute to a ConnectionVerifier object.
+        tls_protocol._synapse_tls_verifier.verify_context_info_cb(ssl_connection, where)
+    except:  # noqa: E722, taken from the twisted implementation
+        logger.exception("Error during info_callback")
+        f = Failure()
+        tls_protocol.failVerification(f)
+
+
 @implementer(IOpenSSLClientConnectionCreator)
 class SSLClientConnectionCreator(object):
     """Creates openssl connection objects for client connections.
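Since both policies implement `IPolicyForHTTPS`, either can be handed straight to a Twisted `Agent` as its TLS policy. A hedged usage sketch, not part of this diff:

    from twisted.internet import reactor
    from twisted.web.client import Agent

    from synapse.crypto.context_factory import RegularPolicyForHTTPS

    # RegularPolicyForHTTPS verifies against the platform trust root, so it
    # suits ordinary outbound HTTPS (non-federation) requests.
    agent = Agent(reactor, contextFactory=RegularPolicyForHTTPS())
    d = agent.request(b"GET", b"https://example.com/")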
@@ -140,7 +140,7 @@ def compute_event_signature(
     Returns:
         a dictionary in the same format of an event's signatures field.
     """
-    redact_json = prune_event_dict(event_dict)
+    redact_json = prune_event_dict(room_version, event_dict)
     redact_json.pop("age_ts", None)
     redact_json.pop("unsigned", None)
     if logger.isEnabledFor(logging.DEBUG):
@@ -137,7 +137,7 @@ def check(
         raise AuthError(403, "This room has been marked as unfederatable.")

     # 4. If type is m.room.aliases
-    if event.type == EventTypes.Aliases:
+    if event.type == EventTypes.Aliases and room_version_obj.special_case_aliases_auth:
         # 4a. If event has no state_key, reject
         if not event.is_state():
             raise AuthError(403, "Alias event must be a state event")

@@ -152,10 +152,8 @@ def check(
             )

         # 4c. Otherwise, allow.
-        # This is removed by https://github.com/matrix-org/matrix-doc/pull/2260
-        if room_version_obj.special_case_aliases_auth:
-            logger.debug("Allowing! %s", event)
-            return
+        logger.debug("Allowing! %s", event)
+        return

     if logger.isEnabledFor(logging.DEBUG):
         logger.debug("Auth events: %s", [a.event_id for a in auth_events.values()])
@@ -15,9 +15,10 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.

+import abc
 import os
 from distutils.util import strtobool
-from typing import Optional, Type
+from typing import Dict, Optional, Type

 import six
@@ -199,15 +200,25 @@ class _EventInternalMetadata(object):
         return self._dict.get("redacted", False)


-class EventBase(object):
+class EventBase(metaclass=abc.ABCMeta):
+    @property
+    @abc.abstractmethod
+    def format_version(self) -> int:
+        """The EventFormatVersion implemented by this event"""
+        ...
+
     def __init__(
         self,
-        event_dict,
-        signatures={},
-        unsigned={},
-        internal_metadata_dict={},
-        rejected_reason=None,
+        event_dict: JsonDict,
+        room_version: RoomVersion,
+        signatures: Dict[str, Dict[str, str]],
+        unsigned: JsonDict,
+        internal_metadata_dict: JsonDict,
+        rejected_reason: Optional[str],
     ):
+        assert room_version.event_format == self.format_version
+
+        self.room_version = room_version
         self.signatures = signatures
         self.unsigned = unsigned
         self.rejected_reason = rejected_reason
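The `@property` plus `@abc.abstractmethod` pairing above is satisfied in each subclass by a plain class attribute, which is what the `FrozenEvent` classes below do. A minimal standalone demonstration of the pattern (names are illustrative):

    import abc

    class Base(metaclass=abc.ABCMeta):
        @property
        @abc.abstractmethod
        def format_version(self) -> int:
            """Subclasses must provide this, e.g. as a class attribute."""
            ...

    class V1(Base):
        format_version = 1  # a class attribute clears the abstract requirement

    V1()      # instantiable
    # Base()  # would raise TypeError: can't instantiate abstract class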
@@ -303,7 +314,13 @@ class EventBase(object):
 class FrozenEvent(EventBase):
     format_version = EventFormatVersions.V1  # All events of this type are V1

-    def __init__(self, event_dict, internal_metadata_dict={}, rejected_reason=None):
+    def __init__(
+        self,
+        event_dict: JsonDict,
+        room_version: RoomVersion,
+        internal_metadata_dict: JsonDict = {},
+        rejected_reason: Optional[str] = None,
+    ):
         event_dict = dict(event_dict)

         # Signatures is a dict of dicts, and this is faster than doing a
@@ -326,8 +343,9 @@ class FrozenEvent(EventBase):

         self._event_id = event_dict["event_id"]

-        super(FrozenEvent, self).__init__(
+        super().__init__(
             frozen_dict,
+            room_version=room_version,
             signatures=signatures,
             unsigned=unsigned,
             internal_metadata_dict=internal_metadata_dict,
@@ -352,7 +370,13 @@ class FrozenEvent(EventBase):
 class FrozenEventV2(EventBase):
     format_version = EventFormatVersions.V2  # All events of this type are V2

-    def __init__(self, event_dict, internal_metadata_dict={}, rejected_reason=None):
+    def __init__(
+        self,
+        event_dict: JsonDict,
+        room_version: RoomVersion,
+        internal_metadata_dict: JsonDict = {},
+        rejected_reason: Optional[str] = None,
+    ):
         event_dict = dict(event_dict)

         # Signatures is a dict of dicts, and this is faster than doing a
@@ -377,8 +401,9 @@ class FrozenEventV2(EventBase):

         self._event_id = None

-        super(FrozenEventV2, self).__init__(
+        super().__init__(
             frozen_dict,
+            room_version=room_version,
             signatures=signatures,
             unsigned=unsigned,
             internal_metadata_dict=internal_metadata_dict,
@@ -445,7 +470,7 @@ class FrozenEventV3(FrozenEventV2):
         return self._event_id


-def event_type_from_format_version(format_version: int) -> Type[EventBase]:
+def _event_type_from_format_version(format_version: int) -> Type[EventBase]:
     """Returns the python type to use to construct an Event object for the
     given event format version.

@@ -474,5 +499,5 @@ def make_event_from_dict(
     rejected_reason: Optional[str] = None,
 ) -> EventBase:
     """Construct an EventBase from the given event dict"""
-    event_type = event_type_from_format_version(room_version.event_format)
-    return event_type(event_dict, internal_metadata_dict, rejected_reason)
+    event_type = _event_type_from_format_version(room_version.event_format)
+    return event_type(event_dict, room_version, internal_metadata_dict, rejected_reason)
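With `event_type_from_format_version` made private, `make_event_from_dict` becomes the single public entry point, picking the right frozen event class from `room_version.event_format` and threading the room version into the event. A hedged construction sketch; the event dict below is made up and kept minimal, real events carry more fields:

    from synapse.api.room_versions import RoomVersions
    from synapse.events import make_event_from_dict

    event_dict = {
        "event_id": "$abc123:example.com",
        "type": "m.room.message",
        "room_id": "!room:example.com",
        "sender": "@alice:example.com",
        "content": {"msgtype": "m.text", "body": "hi"},
    }

    # v1 room versions use the v1 event format, so this yields a FrozenEvent
    event = make_event_from_dict(event_dict, RoomVersions.V1)
    assert event.room_version is RoomVersions.V1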
@@ -23,6 +23,7 @@ from frozendict import frozendict
 from twisted.internet import defer

 from synapse.api.constants import EventTypes, RelationTypes
+from synapse.api.room_versions import RoomVersion
 from synapse.util.async_helpers import yieldable_gather_results

 from . import EventBase
@@ -35,26 +36,20 @@ from . import EventBase
 SPLIT_FIELD_REGEX = re.compile(r"(?<!\\)\.")


-def prune_event(event):
+def prune_event(event: EventBase) -> EventBase:
     """ Returns a pruned version of the given event, which removes all keys we
     don't know about or think could potentially be dodgy.

     This is used when we "redact" an event. We want to remove all fields that
     the user has specified, but we do want to keep necessary information like
     type, state_key etc.
-
-    Args:
-        event (FrozenEvent)
-
-    Returns:
-        FrozenEvent
     """
-    pruned_event_dict = prune_event_dict(event.get_dict())
+    pruned_event_dict = prune_event_dict(event.room_version, event.get_dict())

-    from . import event_type_from_format_version
+    from . import make_event_from_dict

-    pruned_event = event_type_from_format_version(event.format_version)(
-        pruned_event_dict, event.internal_metadata.get_dict()
+    pruned_event = make_event_from_dict(
+        pruned_event_dict, event.room_version, event.internal_metadata.get_dict()
     )

     # Mark the event as redacted
@@ -63,15 +58,12 @@ def prune_event(event):
     return pruned_event


-def prune_event_dict(event_dict):
+def prune_event_dict(room_version: RoomVersion, event_dict: dict) -> dict:
     """Redacts the event_dict in the same way as `prune_event`, except it
     operates on dicts rather than event objects

-    Args:
-        event_dict (dict)
-
     Returns:
-        dict: A copy of the pruned event dict
+        A copy of the pruned event dict
     """

     allowed_keys = [
@@ -118,7 +110,7 @@ def prune_event_dict(event_dict):
             "kick",
             "redact",
         )
-    elif event_type == EventTypes.Aliases:
+    elif event_type == EventTypes.Aliases and room_version.special_case_aliases_auth:
         add_fields("aliases")
     elif event_type == EventTypes.RoomHistoryVisibility:
         add_fields("history_visibility")
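A quick illustration of the gate, using the names from the hunks above; the minimal event dict is made up, and this is a sketch of the intended behaviour rather than a test from the commit. In pre-MSC2432 room versions the redacted copy of an aliases event keeps `content["aliases"]`; in the new test version it does not:

    from synapse.api.room_versions import RoomVersions
    from synapse.events.utils import prune_event_dict

    aliases_event = {
        "type": "m.room.aliases",
        "state_key": "example.com",
        "content": {"aliases": ["#room:example.com"]},
    }

    kept = prune_event_dict(RoomVersions.V5, aliases_event)
    assert kept["content"] == {"aliases": ["#room:example.com"]}

    dropped = prune_event_dict(RoomVersions.MSC2432_DEV, aliases_event)
    assert dropped["content"] == {}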
@@ -15,11 +15,13 @@
 # limitations under the License.
 import logging
 from collections import namedtuple
+from typing import Iterable, List

 import six

 from twisted.internet import defer
-from twisted.internet.defer import DeferredList
+from twisted.internet.defer import Deferred, DeferredList
+from twisted.python.failure import Failure

 from synapse.api.constants import MAX_DEPTH, EventTypes, Membership
 from synapse.api.errors import Codes, SynapseError
@@ -29,6 +31,7 @@ from synapse.api.room_versions import (
     RoomVersion,
 )
 from synapse.crypto.event_signing import check_event_content_hash
+from synapse.crypto.keyring import Keyring
 from synapse.events import EventBase, make_event_from_dict
 from synapse.events.utils import prune_event
 from synapse.http.servlet import assert_params_in_dict
@@ -36,10 +39,8 @@ from synapse.logging.context import (
     LoggingContext,
     PreserveLoggingContext,
     make_deferred_yieldable,
-    preserve_fn,
 )
 from synapse.types import JsonDict, get_domain_from_id
-from synapse.util import unwrapFirstError

 logger = logging.getLogger(__name__)
@@ -54,94 +55,23 @@ class FederationBase(object):
         self.store = hs.get_datastore()
         self._clock = hs.get_clock()

-    @defer.inlineCallbacks
-    def _check_sigs_and_hash_and_fetch(
-        self, origin, pdus, room_version, outlier=False, include_none=False
-    ):
-        """Takes a list of PDUs and checks the signatures and hashs of each
-        one. If a PDU fails its signature check then we check if we have it in
-        the database and if not then request if from the originating server of
-        that PDU.
-
-        If a PDU fails its content hash check then it is redacted.
-
-        The given list of PDUs are not modified, instead the function returns
-        a new list.
-
-        Args:
-            origin (str)
-            pdu (list)
-            room_version (str)
-            outlier (bool): Whether the events are outliers or not
-            include_none (str): Whether to include None in the returned list
-                for events that have failed their checks
-
-        Returns:
-            Deferred : A list of PDUs that have valid signatures and hashes.
-        """
-        deferreds = self._check_sigs_and_hashes(room_version, pdus)
-
-        @defer.inlineCallbacks
-        def handle_check_result(pdu, deferred):
-            try:
-                res = yield make_deferred_yieldable(deferred)
-            except SynapseError:
-                res = None
-
-            if not res:
-                # Check local db.
-                res = yield self.store.get_event(
-                    pdu.event_id, allow_rejected=True, allow_none=True
-                )
-
-            if not res and pdu.origin != origin:
-                try:
-                    res = yield defer.ensureDeferred(
-                        self.get_pdu(
-                            destinations=[pdu.origin],
-                            event_id=pdu.event_id,
-                            room_version=room_version,
-                            outlier=outlier,
-                            timeout=10000,
-                        )
-                    )
-                except SynapseError:
-                    pass
-
-            if not res:
-                logger.warning(
-                    "Failed to find copy of %s with valid signature", pdu.event_id
-                )
-
-            return res
-
-        handle = preserve_fn(handle_check_result)
-        deferreds2 = [handle(pdu, deferred) for pdu, deferred in zip(pdus, deferreds)]
-
-        valid_pdus = yield make_deferred_yieldable(
-            defer.gatherResults(deferreds2, consumeErrors=True)
-        ).addErrback(unwrapFirstError)
-
-        if include_none:
-            return valid_pdus
-        else:
-            return [p for p in valid_pdus if p]
-
-    def _check_sigs_and_hash(self, room_version, pdu):
+    def _check_sigs_and_hash(self, room_version: str, pdu: EventBase) -> Deferred:
         return make_deferred_yieldable(
             self._check_sigs_and_hashes(room_version, [pdu])[0]
         )

-    def _check_sigs_and_hashes(self, room_version, pdus):
+    def _check_sigs_and_hashes(
+        self, room_version: str, pdus: List[EventBase]
+    ) -> List[Deferred]:
         """Checks that each of the received events is correctly signed by the
         sending server.

         Args:
-            room_version (str): The room version of the PDUs
-            pdus (list[FrozenEvent]): the events to be checked
+            room_version: The room version of the PDUs
+            pdus: the events to be checked

         Returns:
-            list[Deferred]: for each input event, a deferred which:
+            For each input event, a deferred which:
             * returns the original event if the checks pass
             * returns a redacted version of the event (if the signature
               matched but the hash did not)
@@ -152,7 +82,7 @@ class FederationBase(object):

         ctx = LoggingContext.current_context()

-        def callback(_, pdu):
+        def callback(_, pdu: EventBase):
             with PreserveLoggingContext(ctx):
                 if not check_event_content_hash(pdu):
                     # let's try to distinguish between failures because the event was
@@ -189,7 +119,7 @@ class FederationBase(object):

             return pdu

-        def errback(failure, pdu):
+        def errback(failure: Failure, pdu: EventBase):
             failure.trap(SynapseError)
             with PreserveLoggingContext(ctx):
                 logger.warning(
@@ -215,16 +145,18 @@ class PduToCheckSig(
     pass


-def _check_sigs_on_pdus(keyring, room_version, pdus):
+def _check_sigs_on_pdus(
+    keyring: Keyring, room_version: str, pdus: Iterable[EventBase]
+) -> List[Deferred]:
     """Check that the given events are correctly signed

     Args:
-        keyring (synapse.crypto.Keyring): keyring object to do the checks
-        room_version (str): the room version of the PDUs
-        pdus (Collection[EventBase]): the events to be checked
+        keyring: keyring object to do the checks
+        room_version: the room version of the PDUs
+        pdus: the events to be checked

     Returns:
-        List[Deferred]: a Deferred for each event in pdus, which will either succeed if
+        A Deferred for each event in pdus, which will either succeed if
         the signatures are valid, or fail (with a SynapseError) if not.
     """
@@ -329,7 +261,7 @@ def _check_sigs_on_pdus(keyring, room_version, pdus):
     return [_flatten_deferred_list(p.deferreds) for p in pdus_to_check]


-def _flatten_deferred_list(deferreds):
+def _flatten_deferred_list(deferreds: List[Deferred]) -> Deferred:
     """Given a list of deferreds, either return the single deferred,
     combine into a DeferredList, or return an already resolved deferred.
     """
@@ -341,7 +273,7 @@ def _flatten_deferred_list(deferreds):
     return defer.succeed(None)


-def _is_invite_via_3pid(event):
+def _is_invite_via_3pid(event: EventBase) -> bool:
     return (
         event.type == EventTypes.Member
         and event.membership == Membership.INVITE
@@ -33,6 +33,7 @@ from typing import (
 from prometheus_client import Counter

 from twisted.internet import defer
+from twisted.internet.defer import Deferred

 from synapse.api.constants import EventTypes, Membership
 from synapse.api.errors import (
@@ -51,7 +52,7 @@ from synapse.api.room_versions import (
 )
 from synapse.events import EventBase, builder
 from synapse.federation.federation_base import FederationBase, event_from_pdu_json
-from synapse.logging.context import make_deferred_yieldable
+from synapse.logging.context import make_deferred_yieldable, preserve_fn
 from synapse.logging.utils import log_function
 from synapse.types import JsonDict
 from synapse.util import unwrapFirstError
@@ -187,7 +188,7 @@ class FederationClient(FederationBase):

     async def backfill(
         self, dest: str, room_id: str, limit: int, extremities: Iterable[str]
-    ) -> List[EventBase]:
+    ) -> Optional[List[EventBase]]:
         """Requests some more historic PDUs for the given room from the
         given destination server.
@@ -199,9 +200,9 @@ class FederationClient(FederationBase):
         """
         logger.debug("backfill extrem=%s", extremities)

-        # If there are no extremeties then we've (probably) reached the start.
+        # If there are no extremities then we've (probably) reached the start.
         if not extremities:
-            return
+            return None

         transaction_data = await self.transport_layer.backfill(
             dest, room_id, extremities, limit
@@ -284,7 +285,7 @@ class FederationClient(FederationBase):
         pdu_list = [
             event_from_pdu_json(p, room_version, outlier=outlier)
             for p in transaction_data["pdus"]
-        ]
+        ]  # type: List[EventBase]

         if pdu_list and pdu_list[0]:
             pdu = pdu_list[0]
@@ -345,6 +346,83 @@ class FederationClient(FederationBase):

         return state_event_ids, auth_event_ids

+    async def _check_sigs_and_hash_and_fetch(
+        self,
+        origin: str,
+        pdus: List[EventBase],
+        room_version: str,
+        outlier: bool = False,
+        include_none: bool = False,
+    ) -> List[EventBase]:
+        """Takes a list of PDUs and checks the signatures and hashs of each
+        one. If a PDU fails its signature check then we check if we have it in
+        the database and if not then request if from the originating server of
+        that PDU.
+
+        If a PDU fails its content hash check then it is redacted.
+
+        The given list of PDUs are not modified, instead the function returns
+        a new list.
+
+        Args:
+            origin
+            pdu
+            room_version
+            outlier: Whether the events are outliers or not
+            include_none: Whether to include None in the returned list
+                for events that have failed their checks
+
+        Returns:
+            Deferred : A list of PDUs that have valid signatures and hashes.
+        """
+        deferreds = self._check_sigs_and_hashes(room_version, pdus)
+
+        @defer.inlineCallbacks
+        def handle_check_result(pdu: EventBase, deferred: Deferred):
+            try:
+                res = yield make_deferred_yieldable(deferred)
+            except SynapseError:
+                res = None
+
+            if not res:
+                # Check local db.
+                res = yield self.store.get_event(
+                    pdu.event_id, allow_rejected=True, allow_none=True
+                )
+
+            if not res and pdu.origin != origin:
+                try:
+                    res = yield defer.ensureDeferred(
+                        self.get_pdu(
+                            destinations=[pdu.origin],
+                            event_id=pdu.event_id,
+                            room_version=room_version,  # type: ignore
+                            outlier=outlier,
+                            timeout=10000,
+                        )
+                    )
+                except SynapseError:
+                    pass
+
+            if not res:
+                logger.warning(
+                    "Failed to find copy of %s with valid signature", pdu.event_id
+                )
+
+            return res
+
+        handle = preserve_fn(handle_check_result)
+        deferreds2 = [handle(pdu, deferred) for pdu, deferred in zip(pdus, deferreds)]
+
+        valid_pdus = await make_deferred_yieldable(
+            defer.gatherResults(deferreds2, consumeErrors=True)
+        ).addErrback(unwrapFirstError)
+
+        if include_none:
+            return valid_pdus
+        else:
+            return [p for p in valid_pdus if p]
+
     async def get_event_auth(self, destination, room_id, event_id):
         res = await self.transport_layer.get_event_auth(destination, room_id, event_id)
@@ -615,7 +693,7 @@ class FederationClient(FederationBase):
         ]
         if auth_chain_create_events != [create_event.event_id]:
             raise InvalidResponseError(
-                "Unexpected create event(s) in auth chain"
+                "Unexpected create event(s) in auth chain: %s"
                 % (auth_chain_create_events,)
             )
@@ -470,57 +470,6 @@ class FederationServer(FederationBase):
         res = {"auth_chain": [a.get_pdu_json(time_now) for a in auth_pdus]}
         return 200, res

-    async def on_query_auth_request(self, origin, content, room_id, event_id):
-        """
-        Content is a dict with keys::
-            auth_chain (list): A list of events that give the auth chain.
-            missing (list): A list of event_ids indicating what the other
-                side (`origin`) think we're missing.
-            rejects (dict): A mapping from event_id to a 2-tuple of reason
-                string and a proof (or None) of why the event was rejected.
-                The keys of this dict give the list of events the `origin` has
-                rejected.
-
-        Args:
-            origin (str)
-            content (dict)
-            event_id (str)
-
-        Returns:
-            Deferred: Results in `dict` with the same format as `content`
-        """
-        with (await self._server_linearizer.queue((origin, room_id))):
-            origin_host, _ = parse_server_name(origin)
-            await self.check_server_matches_acl(origin_host, room_id)
-
-            room_version = await self.store.get_room_version(room_id)
-
-            auth_chain = [
-                event_from_pdu_json(e, room_version) for e in content["auth_chain"]
-            ]
-
-            signed_auth = await self._check_sigs_and_hash_and_fetch(
-                origin, auth_chain, outlier=True, room_version=room_version.identifier
-            )
-
-            ret = await self.handler.on_query_auth(
-                origin,
-                event_id,
-                room_id,
-                signed_auth,
-                content.get("rejects", []),
-                content.get("missing", []),
-            )
-
-            time_now = self._clock.time_msec()
-            send_content = {
-                "auth_chain": [e.get_pdu_json(time_now) for e in ret["auth_chain"]],
-                "rejects": ret.get("rejects", []),
-                "missing": ret.get("missing", []),
-            }
-
-        return 200, send_content
-
     @log_function
     def on_query_client_keys(self, origin, content):
         return self.on_query_request("client_keys", content)
@@ -643,17 +643,6 @@ class FederationClientKeysClaimServlet(BaseFederationServlet):
         return 200, response


-class FederationQueryAuthServlet(BaseFederationServlet):
-    PATH = "/query_auth/(?P<context>[^/]*)/(?P<event_id>[^/]*)"
-
-    async def on_POST(self, origin, content, query, context, event_id):
-        new_content = await self.handler.on_query_auth_request(
-            origin, content, context, event_id
-        )
-
-        return 200, new_content
-
-
 class FederationGetMissingEventsServlet(BaseFederationServlet):
     # TODO(paul): Why does this path alone end with "/?" optional?
     PATH = "/get_missing_events/(?P<room_id>[^/]*)/?"
@@ -1412,7 +1401,6 @@ FEDERATION_SERVLET_CLASSES = (
     FederationV2SendLeaveServlet,
     FederationV1InviteServlet,
     FederationV2InviteServlet,
-    FederationQueryAuthServlet,
     FederationGetMissingEventsServlet,
     FederationEventAuthServlet,
     FederationClientKeysQueryServlet,
@@ -44,7 +44,11 @@ class AccountValidityHandler(object):

         self._account_validity = self.hs.config.account_validity

-        if self._account_validity.renew_by_email_enabled and load_jinja2_templates:
+        if (
+            self._account_validity.enabled
+            and self._account_validity.renew_by_email_enabled
+            and load_jinja2_templates
+        ):
             # Don't do email-specific configuration if renewal by email is disabled.
             try:
                 app_name = self.hs.config.email_app_name
@@ -17,9 +17,11 @@
 import logging
 import time
 import unicodedata
+import urllib.parse
+from typing import Any, Dict, Iterable, List, Optional

 import attr
-import bcrypt
+import bcrypt  # type: ignore[import]
 import pymacaroons

 from twisted.internet import defer
@@ -38,9 +40,12 @@ from synapse.api.errors import (
 from synapse.api.ratelimiting import Ratelimiter
 from synapse.handlers.ui_auth import INTERACTIVE_AUTH_CHECKERS
 from synapse.handlers.ui_auth.checkers import UserInteractiveAuthChecker
+from synapse.http.server import finish_request
+from synapse.http.site import SynapseRequest
 from synapse.logging.context import defer_to_thread
 from synapse.module_api import ModuleApi
-from synapse.types import UserID
+from synapse.push.mailer import load_jinja2_templates
+from synapse.types import Requester, UserID
 from synapse.util.caches.expiringcache import ExpiringCache

 from ._base import BaseHandler
@@ -58,11 +63,11 @@ class AuthHandler(BaseHandler):
         """
         super(AuthHandler, self).__init__(hs)

-        self.checkers = {}  # type: dict[str, UserInteractiveAuthChecker]
+        self.checkers = {}  # type: Dict[str, UserInteractiveAuthChecker]
         for auth_checker_class in INTERACTIVE_AUTH_CHECKERS:
             inst = auth_checker_class(hs)
             if inst.is_enabled():
-                self.checkers[inst.AUTH_TYPE] = inst
+                self.checkers[inst.AUTH_TYPE] = inst  # type: ignore

         self.bcrypt_rounds = hs.config.bcrypt_rounds
@@ -108,8 +113,20 @@ class AuthHandler(BaseHandler):

         self._clock = self.hs.get_clock()

+        # Load the SSO redirect confirmation page HTML template
+        self._sso_redirect_confirm_template = load_jinja2_templates(
+            hs.config.sso_redirect_confirm_template_dir, ["sso_redirect_confirm.html"],
+        )[0]
+
+        self._server_name = hs.config.server_name
+
+        # cast to tuple for use with str.startswith
+        self._whitelisted_sso_clients = tuple(hs.config.sso_client_whitelist)
+
     @defer.inlineCallbacks
-    def validate_user_via_ui_auth(self, requester, request_body, clientip):
+    def validate_user_via_ui_auth(
+        self, requester: Requester, request_body: Dict[str, Any], clientip: str
+    ):
         """
         Checks that the user is who they claim to be, via a UI auth.
@@ -118,11 +135,11 @@ class AuthHandler(BaseHandler):
         that it isn't stolen by re-authenticating them.

         Args:
-            requester (Requester): The user, as given by the access token
+            requester: The user, as given by the access token

-            request_body (dict): The body of the request sent by the client
+            request_body: The body of the request sent by the client

-            clientip (str): The IP address of the client.
+            clientip: The IP address of the client.

         Returns:
             defer.Deferred[dict]: the parameters for this request (which may
@@ -193,7 +210,9 @@ class AuthHandler(BaseHandler):
         return self.checkers.keys()

     @defer.inlineCallbacks
-    def check_auth(self, flows, clientdict, clientip):
+    def check_auth(
+        self, flows: List[List[str]], clientdict: Dict[str, Any], clientip: str
+    ):
         """
         Takes a dictionary sent by the client in the login / registration
         protocol and handles the User-Interactive Auth flow.
@@ -208,14 +227,14 @@ class AuthHandler(BaseHandler):
             decorator.

         Args:
-            flows (list): A list of login flows. Each flow is an ordered list of
+            flows: A list of login flows. Each flow is an ordered list of
                           strings representing auth-types. At least one full
                           flow must be completed in order for auth to be successful.

             clientdict: The dictionary from the client root level, not the
                         'auth' key: this method prompts for auth if none is sent.

-            clientip (str): The IP address of the client.
+            clientip: The IP address of the client.

         Returns:
             defer.Deferred[dict, dict, str]: a deferred tuple of
@@ -235,7 +254,7 @@ class AuthHandler(BaseHandler):
         """

         authdict = None
-        sid = None
+        sid = None  # type: Optional[str]
         if clientdict and "auth" in clientdict:
             authdict = clientdict["auth"]
             del clientdict["auth"]
@@ -268,9 +287,9 @@ class AuthHandler(BaseHandler):
         creds = session["creds"]

         # check auth type currently being presented
-        errordict = {}
+        errordict = {}  # type: Dict[str, Any]
         if "type" in authdict:
-            login_type = authdict["type"]
+            login_type = authdict["type"]  # type: str
             try:
                 result = yield self._check_auth_dict(authdict, clientip)
                 if result:
@ -311,7 +330,7 @@ class AuthHandler(BaseHandler):
|
||||||
raise InteractiveAuthIncompleteError(ret)
|
raise InteractiveAuthIncompleteError(ret)
|
||||||
|
|
||||||
@defer.inlineCallbacks
|
@defer.inlineCallbacks
|
||||||
def add_oob_auth(self, stagetype, authdict, clientip):
|
def add_oob_auth(self, stagetype: str, authdict: Dict[str, Any], clientip: str):
|
||||||
"""
|
"""
|
||||||
Adds the result of out-of-band authentication into an existing auth
|
Adds the result of out-of-band authentication into an existing auth
|
||||||
session. Currently used for adding the result of fallback auth.
|
session. Currently used for adding the result of fallback auth.
|
||||||
|
@ -333,7 +352,7 @@ class AuthHandler(BaseHandler):
|
||||||
return True
|
return True
|
||||||
return False
|
return False
|
||||||
|
|
||||||
def get_session_id(self, clientdict):
|
def get_session_id(self, clientdict: Dict[str, Any]) -> Optional[str]:
|
||||||
"""
|
"""
|
||||||
Gets the session ID for a client given the client dictionary
|
Gets the session ID for a client given the client dictionary
|
||||||
|
|
||||||
|
@ -341,7 +360,7 @@ class AuthHandler(BaseHandler):
|
||||||
clientdict: The dictionary sent by the client in the request
|
clientdict: The dictionary sent by the client in the request
|
||||||
|
|
||||||
Returns:
|
Returns:
|
||||||
str|None: The string session ID the client sent. If the client did
|
The string session ID the client sent. If the client did
|
||||||
not send a session ID, returns None.
|
not send a session ID, returns None.
|
||||||
"""
|
"""
|
||||||
sid = None
|
sid = None
|
||||||
|
@ -351,40 +370,42 @@ class AuthHandler(BaseHandler):
|
||||||
sid = authdict["session"]
|
sid = authdict["session"]
|
||||||
return sid
|
return sid
|
||||||
|
|
||||||
def set_session_data(self, session_id, key, value):
|
def set_session_data(self, session_id: str, key: str, value: Any) -> None:
|
||||||
"""
|
"""
|
||||||
Store a key-value pair into the sessions data associated with this
|
Store a key-value pair into the sessions data associated with this
|
||||||
request. This data is stored server-side and cannot be modified by
|
request. This data is stored server-side and cannot be modified by
|
||||||
the client.
|
the client.
|
||||||
|
|
||||||
Args:
|
Args:
|
||||||
session_id (string): The ID of this session as returned from check_auth
|
session_id: The ID of this session as returned from check_auth
|
||||||
key (string): The key to store the data under
|
key: The key to store the data under
|
||||||
value (any): The data to store
|
value: The data to store
|
||||||
"""
|
"""
|
||||||
sess = self._get_session_info(session_id)
|
sess = self._get_session_info(session_id)
|
||||||
sess.setdefault("serverdict", {})[key] = value
|
sess.setdefault("serverdict", {})[key] = value
|
||||||
self._save_session(sess)
|
self._save_session(sess)
|
||||||
|
|
||||||
def get_session_data(self, session_id, key, default=None):
|
def get_session_data(
|
||||||
|
self, session_id: str, key: str, default: Optional[Any] = None
|
||||||
|
) -> Any:
|
||||||
"""
|
"""
|
||||||
Retrieve data stored with set_session_data
|
Retrieve data stored with set_session_data
|
||||||
|
|
||||||
Args:
|
Args:
|
||||||
session_id (string): The ID of this session as returned from check_auth
|
session_id: The ID of this session as returned from check_auth
|
||||||
key (string): The key to store the data under
|
key: The key to store the data under
|
||||||
default (any): Value to return if the key has not been set
|
default: Value to return if the key has not been set
|
||||||
"""
|
"""
|
||||||
sess = self._get_session_info(session_id)
|
sess = self._get_session_info(session_id)
|
||||||
return sess.setdefault("serverdict", {}).get(key, default)
|
return sess.setdefault("serverdict", {}).get(key, default)
|
||||||
|
|
||||||
@defer.inlineCallbacks
|
@defer.inlineCallbacks
|
||||||
def _check_auth_dict(self, authdict, clientip):
|
def _check_auth_dict(self, authdict: Dict[str, Any], clientip: str):
|
||||||
"""Attempt to validate the auth dict provided by a client
|
"""Attempt to validate the auth dict provided by a client
|
||||||
|
|
||||||
Args:
|
Args:
|
||||||
authdict (object): auth dict provided by the client
|
authdict: auth dict provided by the client
|
||||||
clientip (str): IP address of the client
|
clientip: IP address of the client
|
||||||
|
|
||||||
Returns:
|
Returns:
|
||||||
Deferred: result of the stage verification.
|
Deferred: result of the stage verification.
|
||||||
|
@ -410,10 +431,10 @@ class AuthHandler(BaseHandler):
|
||||||
(canonical_id, callback) = yield self.validate_login(user_id, authdict)
|
(canonical_id, callback) = yield self.validate_login(user_id, authdict)
|
||||||
return canonical_id
|
return canonical_id
|
||||||
|
|
||||||
def _get_params_recaptcha(self):
|
def _get_params_recaptcha(self) -> dict:
|
||||||
return {"public_key": self.hs.config.recaptcha_public_key}
|
return {"public_key": self.hs.config.recaptcha_public_key}
|
||||||
|
|
||||||
def _get_params_terms(self):
|
def _get_params_terms(self) -> dict:
|
||||||
return {
|
return {
|
||||||
"policies": {
|
"policies": {
|
||||||
"privacy_policy": {
|
"privacy_policy": {
|
||||||
|
@ -430,7 +451,9 @@ class AuthHandler(BaseHandler):
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
def _auth_dict_for_flows(self, flows, session):
|
def _auth_dict_for_flows(
|
||||||
|
self, flows: List[List[str]], session: Dict[str, Any]
|
||||||
|
) -> Dict[str, Any]:
|
||||||
public_flows = []
|
public_flows = []
|
||||||
for f in flows:
|
for f in flows:
|
||||||
public_flows.append(f)
|
public_flows.append(f)
|
||||||
|
@ -440,7 +463,7 @@ class AuthHandler(BaseHandler):
|
||||||
LoginType.TERMS: self._get_params_terms,
|
LoginType.TERMS: self._get_params_terms,
|
||||||
}
|
}
|
||||||
|
|
||||||
params = {}
|
params = {} # type: Dict[str, Any]
|
||||||
|
|
||||||
for f in public_flows:
|
for f in public_flows:
|
||||||
for stage in f:
|
for stage in f:
|
||||||
|
@ -453,7 +476,13 @@ class AuthHandler(BaseHandler):
|
||||||
"params": params,
|
"params": params,
|
||||||
}
|
}
|
||||||
|
|
||||||
def _get_session_info(self, session_id):
|
def _get_session_info(self, session_id: Optional[str]) -> dict:
|
||||||
|
"""
|
||||||
|
Gets or creates a session given a session ID.
|
||||||
|
|
||||||
|
The session can be used to track data across multiple requests, e.g. for
|
||||||
|
interactive authentication.
|
||||||
|
"""
|
||||||
if session_id not in self.sessions:
|
if session_id not in self.sessions:
|
||||||
session_id = None
|
session_id = None
|
||||||
|
|
||||||
|
@ -466,7 +495,9 @@ class AuthHandler(BaseHandler):
|
||||||
return self.sessions[session_id]
|
return self.sessions[session_id]
|
||||||
|
|
||||||
@defer.inlineCallbacks
|
@defer.inlineCallbacks
|
||||||
def get_access_token_for_user_id(self, user_id, device_id, valid_until_ms):
|
def get_access_token_for_user_id(
|
||||||
|
self, user_id: str, device_id: Optional[str], valid_until_ms: Optional[int]
|
||||||
|
):
|
||||||
"""
|
"""
|
||||||
Creates a new access token for the user with the given user ID.
|
Creates a new access token for the user with the given user ID.
|
||||||
|
|
||||||
|
@ -476,11 +507,11 @@ class AuthHandler(BaseHandler):
|
||||||
The device will be recorded in the table if it is not there already.
|
The device will be recorded in the table if it is not there already.
|
||||||
|
|
||||||
Args:
|
Args:
|
||||||
user_id (str): canonical User ID
|
user_id: canonical User ID
|
||||||
device_id (str|None): the device ID to associate with the tokens.
|
device_id: the device ID to associate with the tokens.
|
||||||
None to leave the tokens unassociated with a device (deprecated:
|
None to leave the tokens unassociated with a device (deprecated:
|
||||||
we should always have a device ID)
|
we should always have a device ID)
|
||||||
valid_until_ms (int|None): when the token is valid until. None for
|
valid_until_ms: when the token is valid until. None for
|
||||||
no expiry.
|
no expiry.
|
||||||
Returns:
|
Returns:
|
||||||
The access token for the user's session.
|
The access token for the user's session.
|
||||||
|
@ -515,13 +546,13 @@ class AuthHandler(BaseHandler):
|
||||||
return access_token
|
return access_token
|
||||||
|
|
||||||
@defer.inlineCallbacks
|
@defer.inlineCallbacks
|
||||||
def check_user_exists(self, user_id):
|
def check_user_exists(self, user_id: str):
|
||||||
"""
|
"""
|
||||||
Checks to see if a user with the given id exists. Will check case
|
Checks to see if a user with the given id exists. Will check case
|
||||||
insensitively, but return None if there are multiple inexact matches.
|
insensitively, but return None if there are multiple inexact matches.
|
||||||
|
|
||||||
Args:
|
Args:
|
||||||
(unicode|bytes) user_id: complete @user:id
|
user_id: complete @user:id
|
||||||
|
|
||||||
Returns:
|
Returns:
|
||||||
defer.Deferred: (unicode) canonical_user_id, or None if zero or
|
defer.Deferred: (unicode) canonical_user_id, or None if zero or
|
||||||
|
@ -536,7 +567,7 @@ class AuthHandler(BaseHandler):
|
||||||
return None
|
return None
|
||||||
|
|
||||||
@defer.inlineCallbacks
|
@defer.inlineCallbacks
|
||||||
def _find_user_id_and_pwd_hash(self, user_id):
|
def _find_user_id_and_pwd_hash(self, user_id: str):
|
||||||
"""Checks to see if a user with the given id exists. Will check case
|
"""Checks to see if a user with the given id exists. Will check case
|
||||||
insensitively, but will return None if there are multiple inexact
|
insensitively, but will return None if there are multiple inexact
|
||||||
matches.
|
matches.
|
||||||
|
@ -566,7 +597,7 @@ class AuthHandler(BaseHandler):
|
||||||
)
|
)
|
||||||
return result
|
return result
|
||||||
|
|
||||||
def get_supported_login_types(self):
|
def get_supported_login_types(self) -> Iterable[str]:
|
||||||
"""Get a the login types supported for the /login API
|
"""Get a the login types supported for the /login API
|
||||||
|
|
||||||
By default this is just 'm.login.password' (unless password_enabled is
|
By default this is just 'm.login.password' (unless password_enabled is
|
||||||
|
@ -574,20 +605,20 @@ class AuthHandler(BaseHandler):
|
||||||
other login types.
|
other login types.
|
||||||
|
|
||||||
Returns:
|
Returns:
|
||||||
Iterable[str]: login types
|
login types
|
||||||
"""
|
"""
|
||||||
return self._supported_login_types
|
return self._supported_login_types
|
||||||
|
|
||||||
@defer.inlineCallbacks
|
@defer.inlineCallbacks
|
||||||
def validate_login(self, username, login_submission):
|
def validate_login(self, username: str, login_submission: Dict[str, Any]):
|
||||||
"""Authenticates the user for the /login API
|
"""Authenticates the user for the /login API
|
||||||
|
|
||||||
Also used by the user-interactive auth flow to validate
|
Also used by the user-interactive auth flow to validate
|
||||||
m.login.password auth types.
|
m.login.password auth types.
|
||||||
|
|
||||||
Args:
|
Args:
|
||||||
username (str): username supplied by the user
|
username: username supplied by the user
|
||||||
login_submission (dict): the whole of the login submission
|
login_submission: the whole of the login submission
|
||||||
(including 'type' and other relevant fields)
|
(including 'type' and other relevant fields)
|
||||||
Returns:
|
Returns:
|
||||||
Deferred[str, func]: canonical user id, and optional callback
|
Deferred[str, func]: canonical user id, and optional callback
|
||||||
|
@ -675,13 +706,13 @@ class AuthHandler(BaseHandler):
|
||||||
raise LoginError(403, "Invalid password", errcode=Codes.FORBIDDEN)
|
raise LoginError(403, "Invalid password", errcode=Codes.FORBIDDEN)
|
||||||
|
|
||||||
@defer.inlineCallbacks
|
@defer.inlineCallbacks
|
||||||
def check_password_provider_3pid(self, medium, address, password):
|
def check_password_provider_3pid(self, medium: str, address: str, password: str):
|
||||||
"""Check if a password provider is able to validate a thirdparty login
|
"""Check if a password provider is able to validate a thirdparty login
|
||||||
|
|
||||||
Args:
|
Args:
|
||||||
medium (str): The medium of the 3pid (ex. email).
|
medium: The medium of the 3pid (ex. email).
|
||||||
address (str): The address of the 3pid (ex. jdoe@example.com).
|
address: The address of the 3pid (ex. jdoe@example.com).
|
||||||
password (str): The password of the user.
|
password: The password of the user.
|
||||||
|
|
||||||
Returns:
|
Returns:
|
||||||
Deferred[(str|None, func|None)]: A tuple of `(user_id,
|
Deferred[(str|None, func|None)]: A tuple of `(user_id,
|
||||||
|
@ -709,15 +740,15 @@ class AuthHandler(BaseHandler):
|
||||||
return None, None
|
return None, None
|
||||||
|
|
||||||
@defer.inlineCallbacks
|
@defer.inlineCallbacks
|
||||||
def _check_local_password(self, user_id, password):
|
def _check_local_password(self, user_id: str, password: str):
|
||||||
"""Authenticate a user against the local password database.
|
"""Authenticate a user against the local password database.
|
||||||
|
|
||||||
user_id is checked case insensitively, but will return None if there are
|
user_id is checked case insensitively, but will return None if there are
|
||||||
multiple inexact matches.
|
multiple inexact matches.
|
||||||
|
|
||||||
Args:
|
Args:
|
||||||
user_id (unicode): complete @user:id
|
user_id: complete @user:id
|
||||||
password (unicode): the provided password
|
password: the provided password
|
||||||
Returns:
|
Returns:
|
||||||
Deferred[unicode] the canonical_user_id, or Deferred[None] if
|
Deferred[unicode] the canonical_user_id, or Deferred[None] if
|
||||||
unknown user/bad password
|
unknown user/bad password
|
||||||
|
@ -740,7 +771,7 @@ class AuthHandler(BaseHandler):
|
||||||
return user_id
|
return user_id
|
||||||
|
|
||||||
@defer.inlineCallbacks
|
@defer.inlineCallbacks
|
||||||
def validate_short_term_login_token_and_get_user_id(self, login_token):
|
def validate_short_term_login_token_and_get_user_id(self, login_token: str):
|
||||||
auth_api = self.hs.get_auth()
|
auth_api = self.hs.get_auth()
|
||||||
user_id = None
|
user_id = None
|
||||||
try:
|
try:
|
||||||
|
@ -754,11 +785,11 @@ class AuthHandler(BaseHandler):
|
||||||
return user_id
|
return user_id
|
||||||
|
|
||||||
@defer.inlineCallbacks
|
@defer.inlineCallbacks
|
||||||
def delete_access_token(self, access_token):
|
def delete_access_token(self, access_token: str):
|
||||||
"""Invalidate a single access token
|
"""Invalidate a single access token
|
||||||
|
|
||||||
Args:
|
Args:
|
||||||
access_token (str): access token to be deleted
|
access_token: access token to be deleted
|
||||||
|
|
||||||
Returns:
|
Returns:
|
||||||
Deferred
|
Deferred
|
||||||
|
@ -783,15 +814,17 @@ class AuthHandler(BaseHandler):
|
||||||
|
|
||||||
@defer.inlineCallbacks
|
@defer.inlineCallbacks
|
||||||
def delete_access_tokens_for_user(
|
def delete_access_tokens_for_user(
|
||||||
self, user_id, except_token_id=None, device_id=None
|
self,
|
||||||
|
user_id: str,
|
||||||
|
except_token_id: Optional[str] = None,
|
||||||
|
device_id: Optional[str] = None,
|
||||||
):
|
):
|
||||||
"""Invalidate access tokens belonging to a user
|
"""Invalidate access tokens belonging to a user
|
||||||
|
|
||||||
Args:
|
Args:
|
||||||
user_id (str): ID of user the tokens belong to
|
user_id: ID of user the tokens belong to
|
||||||
except_token_id (str|None): access_token ID which should *not* be
|
except_token_id: access_token ID which should *not* be deleted
|
||||||
deleted
|
device_id: ID of device the tokens are associated with.
|
||||||
device_id (str|None): ID of device the tokens are associated with.
|
|
||||||
If None, tokens associated with any device (or no device) will
|
If None, tokens associated with any device (or no device) will
|
||||||
be deleted
|
be deleted
|
||||||
Returns:
|
Returns:
|
||||||
|
@ -815,7 +848,7 @@ class AuthHandler(BaseHandler):
|
||||||
)
|
)
|
||||||
|
|
||||||
@defer.inlineCallbacks
|
@defer.inlineCallbacks
|
||||||
def add_threepid(self, user_id, medium, address, validated_at):
|
def add_threepid(self, user_id: str, medium: str, address: str, validated_at: int):
|
||||||
# check if medium has a valid value
|
# check if medium has a valid value
|
||||||
if medium not in ["email", "msisdn"]:
|
if medium not in ["email", "msisdn"]:
|
||||||
raise SynapseError(
|
raise SynapseError(
|
||||||
|
@ -841,19 +874,20 @@ class AuthHandler(BaseHandler):
|
||||||
)
|
)
|
||||||
|
|
||||||
@defer.inlineCallbacks
|
@defer.inlineCallbacks
|
||||||
def delete_threepid(self, user_id, medium, address, id_server=None):
|
def delete_threepid(
|
||||||
|
self, user_id: str, medium: str, address: str, id_server: Optional[str] = None
|
||||||
|
):
|
||||||
"""Attempts to unbind the 3pid on the identity servers and deletes it
|
"""Attempts to unbind the 3pid on the identity servers and deletes it
|
||||||
from the local database.
|
from the local database.
|
||||||
|
|
||||||
Args:
|
Args:
|
||||||
user_id (str)
|
user_id: ID of user to remove the 3pid from.
|
||||||
medium (str)
|
medium: The medium of the 3pid being removed: "email" or "msisdn".
|
||||||
address (str)
|
address: The 3pid address to remove.
|
||||||
id_server (str|None): Use the given identity server when unbinding
|
id_server: Use the given identity server when unbinding
|
||||||
any threepids. If None then will attempt to unbind using the
|
any threepids. If None then will attempt to unbind using the
|
||||||
identity server specified when binding (if known).
|
identity server specified when binding (if known).
|
||||||
|
|
||||||
|
|
||||||
Returns:
|
Returns:
|
||||||
Deferred[bool]: Returns True if successfully unbound the 3pid on
|
Deferred[bool]: Returns True if successfully unbound the 3pid on
|
||||||
the identity server, False if identity server doesn't support the
|
the identity server, False if identity server doesn't support the
|
||||||
|
@ -872,17 +906,18 @@ class AuthHandler(BaseHandler):
|
||||||
yield self.store.user_delete_threepid(user_id, medium, address)
|
yield self.store.user_delete_threepid(user_id, medium, address)
|
||||||
return result
|
return result
|
||||||
|
|
||||||
def _save_session(self, session):
|
def _save_session(self, session: Dict[str, Any]) -> None:
|
||||||
|
"""Update the last used time on the session to now and add it back to the session store."""
|
||||||
# TODO: Persistent storage
|
# TODO: Persistent storage
|
||||||
logger.debug("Saving session %s", session)
|
logger.debug("Saving session %s", session)
|
||||||
session["last_used"] = self.hs.get_clock().time_msec()
|
session["last_used"] = self.hs.get_clock().time_msec()
|
||||||
self.sessions[session["id"]] = session
|
self.sessions[session["id"]] = session
|
||||||
|
|
||||||
def hash(self, password):
|
def hash(self, password: str):
|
||||||
"""Computes a secure hash of password.
|
"""Computes a secure hash of password.
|
||||||
|
|
||||||
Args:
|
Args:
|
||||||
password (unicode): Password to hash.
|
password: Password to hash.
|
||||||
|
|
||||||
Returns:
|
Returns:
|
||||||
Deferred(unicode): Hashed password.
|
Deferred(unicode): Hashed password.
|
||||||
|
@ -899,12 +934,12 @@ class AuthHandler(BaseHandler):
|
||||||
|
|
||||||
return defer_to_thread(self.hs.get_reactor(), _do_hash)
|
return defer_to_thread(self.hs.get_reactor(), _do_hash)
|
||||||
|
|
||||||
def validate_hash(self, password, stored_hash):
|
def validate_hash(self, password: str, stored_hash: bytes):
|
||||||
"""Validates that self.hash(password) == stored_hash.
|
"""Validates that self.hash(password) == stored_hash.
|
||||||
|
|
||||||
Args:
|
Args:
|
||||||
password (unicode): Password to hash.
|
password: Password to hash.
|
||||||
stored_hash (bytes): Expected hash value.
|
stored_hash: Expected hash value.
|
||||||
|
|
||||||
Returns:
|
Returns:
|
||||||
Deferred(bool): Whether self.hash(password) == stored_hash.
|
Deferred(bool): Whether self.hash(password) == stored_hash.
|
||||||
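hash and validate_hash are thin wrappers run on a thread pool; as a rough standalone sketch of the underlying contract (using the bcrypt package directly, and omitting the password pepper the real methods mix in):

    import bcrypt

    def hash_password(password: str) -> bytes:
        # bcrypt works on bytes and embeds its salt in the returned hash.
        return bcrypt.hashpw(password.encode("utf-8"), bcrypt.gensalt())

    def validate_password(password: str, stored_hash: bytes) -> bool:
        # checkpw re-derives the hash using the salt stored in stored_hash.
        return bcrypt.checkpw(password.encode("utf-8"), stored_hash)

    stored = hash_password("s3cret")
    print(validate_password("s3cret", stored))  # True
    print(validate_password("wrong", stored))   # False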
@@ -927,13 +962,74 @@ class AuthHandler(BaseHandler):
         else:
             return defer.succeed(False)

+    def complete_sso_login(
+        self,
+        registered_user_id: str,
+        request: SynapseRequest,
+        client_redirect_url: str,
+    ):
+        """Having figured out a mxid for this user, complete the HTTP request
+
+        Args:
+            registered_user_id: The registered user ID to complete SSO login for.
+            request: The request to complete.
+            client_redirect_url: The URL to which to redirect the user at the end of the
+                process.
+        """
+        # Create a login token
+        login_token = self.macaroon_gen.generate_short_term_login_token(
+            registered_user_id
+        )
+
+        # Append the login token to the original redirect URL (i.e. with its query
+        # parameters kept intact) to build the URL to which the template needs to
+        # redirect the users once they have clicked on the confirmation link.
+        redirect_url = self.add_query_param_to_url(
+            client_redirect_url, "loginToken", login_token
+        )
+
+        # if the client is whitelisted, we can redirect straight to it
+        if client_redirect_url.startswith(self._whitelisted_sso_clients):
+            request.redirect(redirect_url)
+            finish_request(request)
+            return
+
+        # Otherwise, serve the redirect confirmation page.
+
+        # Remove the query parameters from the redirect URL to get a shorter version of
+        # it. This is only to display a human-readable URL in the template, but not the
+        # URL we redirect users to.
+        redirect_url_no_params = client_redirect_url.split("?")[0]
+
+        html = self._sso_redirect_confirm_template.render(
+            display_url=redirect_url_no_params,
+            redirect_url=redirect_url,
+            server_name=self._server_name,
+        ).encode("utf-8")
+
+        request.setResponseCode(200)
+        request.setHeader(b"Content-Type", b"text/html; charset=utf-8")
+        request.setHeader(b"Content-Length", b"%d" % (len(html),))
+        request.write(html)
+        finish_request(request)
+
+    @staticmethod
+    def add_query_param_to_url(url: str, param_name: str, param: Any):
+        url_parts = list(urllib.parse.urlparse(url))
+        query = dict(urllib.parse.parse_qsl(url_parts[4]))
+        query.update({param_name: param})
+        url_parts[4] = urllib.parse.urlencode(query)
+        return urllib.parse.urlunparse(url_parts)

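For reference, here is a standalone sketch of the same urllib.parse round-trip that add_query_param_to_url performs, showing that existing query parameters survive (the URL is made up):

    import urllib.parse

    def add_query_param_to_url(url: str, param_name: str, param: str) -> str:
        url_parts = list(urllib.parse.urlparse(url))
        # Index 4 of the parse result is the query string.
        query = dict(urllib.parse.parse_qsl(url_parts[4]))
        query[param_name] = param
        url_parts[4] = urllib.parse.urlencode(query)
        return urllib.parse.urlunparse(url_parts)

    print(add_query_param_to_url(
        "https://client.example.com/?state=xyz", "loginToken", "abc123"
    ))
    # https://client.example.com/?state=xyz&loginToken=abc123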
 @attr.s
 class MacaroonGenerator(object):

     hs = attr.ib()

-    def generate_access_token(self, user_id, extra_caveats=None):
+    def generate_access_token(
+        self, user_id: str, extra_caveats: Optional[List[str]] = None
+    ) -> str:
         extra_caveats = extra_caveats or []
         macaroon = self._generate_base_macaroon(user_id)
         macaroon.add_first_party_caveat("type = access")
@@ -946,16 +1042,9 @@ class MacaroonGenerator(object):
             macaroon.add_first_party_caveat(caveat)
         return macaroon.serialize()

-    def generate_short_term_login_token(self, user_id, duration_in_ms=(2 * 60 * 1000)):
-        """
-        Args:
-            user_id (unicode):
-            duration_in_ms (int):
-
-        Returns:
-            unicode
-        """
+    def generate_short_term_login_token(
+        self, user_id: str, duration_in_ms: int = (2 * 60 * 1000)
+    ) -> str:
         macaroon = self._generate_base_macaroon(user_id)
         macaroon.add_first_party_caveat("type = login")
         now = self.hs.get_clock().time_msec()
@@ -963,12 +1052,12 @@ class MacaroonGenerator(object):
         macaroon.add_first_party_caveat("time < %d" % (expiry,))
         return macaroon.serialize()

-    def generate_delete_pusher_token(self, user_id):
+    def generate_delete_pusher_token(self, user_id: str) -> str:
         macaroon = self._generate_base_macaroon(user_id)
         macaroon.add_first_party_caveat("type = delete_pusher")
         return macaroon.serialize()

-    def _generate_base_macaroon(self, user_id):
+    def _generate_base_macaroon(self, user_id: str) -> pymacaroons.Macaroon:
         macaroon = pymacaroons.Macaroon(
             location=self.hs.config.server_name,
             identifier="key",
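The token types above are encoded as first-party caveats, and verification replays each caveat against an expected value. A hedged sketch with pymacaroons (the key and caveat values are illustrative; the gen/user_id caveats mirror what the base macaroon adds):

    import time

    import pymacaroons

    key = "macaroon-secret-key"  # illustrative; Synapse takes this from config

    macaroon = pymacaroons.Macaroon(
        location="example.com", identifier="key", key=key
    )
    macaroon.add_first_party_caveat("gen = 1")
    macaroon.add_first_party_caveat("user_id = @alice:example.com")
    macaroon.add_first_party_caveat("type = login")
    macaroon.add_first_party_caveat("time < %d" % ((time.time() + 120) * 1000,))

    serialized = macaroon.serialize()

    # Verification: every caveat must be satisfied, or verify() raises.
    v = pymacaroons.Verifier()
    v.satisfy_exact("gen = 1")
    v.satisfy_exact("user_id = @alice:example.com")
    v.satisfy_exact("type = login")
    v.satisfy_general(
        lambda c: c.startswith("time < ")
        and time.time() * 1000 < int(c[len("time < "):])
    )
    v.verify(pymacaroons.Macaroon.deserialize(serialized), key)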
synapse/handlers/directory.py
@@ -13,11 +13,9 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.

-import collections
 import logging
 import string
-from typing import List
+from typing import Iterable, List, Optional

 from twisted.internet import defer

@@ -30,6 +28,7 @@ from synapse.api.errors import (
     StoreError,
     SynapseError,
 )
+from synapse.appservice import ApplicationService
 from synapse.types import Requester, RoomAlias, UserID, get_domain_from_id

 from ._base import BaseHandler

@@ -57,7 +56,13 @@ class DirectoryHandler(BaseHandler):
         self.spam_checker = hs.get_spam_checker()

     @defer.inlineCallbacks
-    def _create_association(self, room_alias, room_id, servers=None, creator=None):
+    def _create_association(
+        self,
+        room_alias: RoomAlias,
+        room_id: str,
+        servers: Optional[Iterable[str]] = None,
+        creator: Optional[str] = None,
+    ):
         # general association creation for both human users and app services

         for wchar in string.whitespace:
@@ -83,17 +88,21 @@ class DirectoryHandler(BaseHandler):

     @defer.inlineCallbacks
     def create_association(
-        self, requester, room_alias, room_id, servers=None, check_membership=True,
+        self,
+        requester: Requester,
+        room_alias: RoomAlias,
+        room_id: str,
+        servers: Optional[List[str]] = None,
+        check_membership: bool = True,
     ):
         """Attempt to create a new alias

         Args:
-            requester (Requester)
-            room_alias (RoomAlias)
-            room_id (str)
-            servers (list[str]|None): List of servers that others servers
-                should try and join via
-            check_membership (bool): Whether to check if the user is in the room
+            requester
+            room_alias
+            room_id
+            servers: Iterable of servers that others servers should try and join via
+            check_membership: Whether to check if the user is in the room
                 before the alias can be set (if the server's config requires it).

         Returns:
@@ -147,15 +156,15 @@ class DirectoryHandler(BaseHandler):
         yield self._create_association(room_alias, room_id, servers, creator=user_id)

     @defer.inlineCallbacks
-    def delete_association(self, requester, room_alias):
+    def delete_association(self, requester: Requester, room_alias: RoomAlias):
         """Remove an alias from the directory

         (this is only meant for human users; AS users should call
         delete_appservice_association)

         Args:
-            requester (Requester):
-            room_alias (RoomAlias):
+            requester
+            room_alias

         Returns:
             Deferred[unicode]: room id that the alias used to point to
@@ -191,16 +200,16 @@ class DirectoryHandler(BaseHandler):
         room_id = yield self._delete_association(room_alias)

         try:
-            yield self._update_canonical_alias(
-                requester, requester.user.to_string(), room_id, room_alias
-            )
+            yield self._update_canonical_alias(requester, user_id, room_id, room_alias)
         except AuthError as e:
             logger.info("Failed to update alias events: %s", e)

         return room_id

     @defer.inlineCallbacks
-    def delete_appservice_association(self, service, room_alias):
+    def delete_appservice_association(
+        self, service: ApplicationService, room_alias: RoomAlias
+    ):
         if not service.is_interested_in_alias(room_alias.to_string()):
             raise SynapseError(
                 400,
@@ -210,7 +219,7 @@ class DirectoryHandler(BaseHandler):
         yield self._delete_association(room_alias)

     @defer.inlineCallbacks
-    def _delete_association(self, room_alias):
+    def _delete_association(self, room_alias: RoomAlias):
         if not self.hs.is_mine(room_alias):
             raise SynapseError(400, "Room alias must be local")

@@ -219,7 +228,7 @@ class DirectoryHandler(BaseHandler):
         return room_id

     @defer.inlineCallbacks
-    def get_association(self, room_alias):
+    def get_association(self, room_alias: RoomAlias):
         room_id = None
         if self.hs.is_mine(room_alias):
             result = yield self.get_association_from_room_alias(room_alias)
@@ -284,7 +293,9 @@ class DirectoryHandler(BaseHandler):
         )

     @defer.inlineCallbacks
-    def _update_canonical_alias(self, requester, user_id, room_id, room_alias):
+    def _update_canonical_alias(
+        self, requester: Requester, user_id: str, room_id: str, room_alias: RoomAlias
+    ):
         """
         Send an updated canonical alias event if the removed alias was set as
         the canonical alias or listed in the alt_aliases field.
@@ -307,15 +318,17 @@ class DirectoryHandler(BaseHandler):
             send_update = True
             content.pop("alias", "")

-        # Filter alt_aliases for the removed alias.
-        alt_aliases = content.pop("alt_aliases", None)
-        # If the aliases are not a list (or not found) do not attempt to modify
-        # the list.
-        if isinstance(alt_aliases, collections.Sequence):
+        # Filter the alt_aliases property for the removed alias. Note that the
+        # value is not modified if alt_aliases is of an unexpected form.
+        alt_aliases = content.get("alt_aliases")
+        if isinstance(alt_aliases, (list, tuple)) and alias_str in alt_aliases:
             send_update = True
             alt_aliases = [alias for alias in alt_aliases if alias != alias_str]

             if alt_aliases:
                 content["alt_aliases"] = alt_aliases
+            else:
+                del content["alt_aliases"]

         if send_update:
             yield self.event_creation_handler.create_and_send_nonmember_event(
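The rewritten filter only mutates well-formed alt_aliases values; a standalone sketch of the same logic:

    def filter_alt_aliases(content: dict, alias_str: str) -> bool:
        """Remove alias_str from content["alt_aliases"]; return True if changed."""
        alt_aliases = content.get("alt_aliases")
        # Leave malformed (non-list/tuple) values untouched.
        if isinstance(alt_aliases, (list, tuple)) and alias_str in alt_aliases:
            remaining = [alias for alias in alt_aliases if alias != alias_str]
            if remaining:
                content["alt_aliases"] = remaining
            else:
                del content["alt_aliases"]
            return True
        return False

    content = {"alias": "#main:example.com", "alt_aliases": ["#old:example.com"]}
    print(filter_alt_aliases(content, "#old:example.com"), content)
    # True {'alias': '#main:example.com'}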
@@ -331,7 +344,7 @@ class DirectoryHandler(BaseHandler):
         )

     @defer.inlineCallbacks
-    def get_association_from_room_alias(self, room_alias):
+    def get_association_from_room_alias(self, room_alias: RoomAlias):
         result = yield self.store.get_association_from_room_alias(room_alias)
         if not result:
             # Query AS to see if it exists
@@ -339,7 +352,7 @@ class DirectoryHandler(BaseHandler):
             result = yield as_handler.query_room_alias_exists(room_alias)
         return result

-    def can_modify_alias(self, alias, user_id=None):
+    def can_modify_alias(self, alias: RoomAlias, user_id: Optional[str] = None):
         # Any application service "interested" in an alias they are regexing on
         # can modify the alias.
         # Users can only modify the alias if ALL the interested services have
@@ -360,22 +373,42 @@ class DirectoryHandler(BaseHandler):
         return defer.succeed(True)

     @defer.inlineCallbacks
-    def _user_can_delete_alias(self, alias, user_id):
+    def _user_can_delete_alias(self, alias: RoomAlias, user_id: str):
+        """Determine whether a user can delete an alias.
+
+        One of the following must be true:
+
+        1. The user created the alias.
+        2. The user is a server administrator.
+        3. The user has a power-level sufficient to send a canonical alias event
+           for the current room.
+
+        """
         creator = yield self.store.get_room_alias_creator(alias.to_string())

         if creator is not None and creator == user_id:
             return True

-        is_admin = yield self.auth.is_server_admin(UserID.from_string(user_id))
-        return is_admin
+        # Resolve the alias to the corresponding room.
+        room_mapping = yield self.get_association(alias)
+        room_id = room_mapping["room_id"]
+        if not room_id:
+            return False
+
+        res = yield self.auth.check_can_change_room_list(
+            room_id, UserID.from_string(user_id)
+        )
+        return res

     @defer.inlineCallbacks
-    def edit_published_room_list(self, requester, room_id, visibility):
+    def edit_published_room_list(
+        self, requester: Requester, room_id: str, visibility: str
+    ):
         """Edit the entry of the room in the published room list.

         requester
-        room_id (str)
-        visibility (str): "public" or "private"
+        room_id
+        visibility: "public" or "private"
         """
         user_id = requester.user.to_string()

@@ -400,7 +433,15 @@ class DirectoryHandler(BaseHandler):
         if room is None:
             raise SynapseError(400, "Unknown room")

-        yield self.auth.check_can_change_room_list(room_id, requester.user)
+        can_change_room_list = yield self.auth.check_can_change_room_list(
+            room_id, requester.user
+        )
+        if not can_change_room_list:
+            raise AuthError(
+                403,
+                "This server requires you to be a moderator in the room to"
+                " edit its room list entry",
+            )

         making_public = visibility == "public"
         if making_public:
@@ -421,16 +462,16 @@ class DirectoryHandler(BaseHandler):

     @defer.inlineCallbacks
     def edit_published_appservice_room_list(
-        self, appservice_id, network_id, room_id, visibility
+        self, appservice_id: str, network_id: str, room_id: str, visibility: str
     ):
         """Add or remove a room from the appservice/network specific public
         room list.

         Args:
-            appservice_id (str): ID of the appservice that owns the list
-            network_id (str): The ID of the network the list is associated with
-            room_id (str)
-            visibility (str): either "public" or "private"
+            appservice_id: ID of the appservice that owns the list
+            network_id: The ID of the network the list is associated with
+            room_id
+            visibility: either "public" or "private"
         """
         if visibility not in ["public", "private"]:
             raise SynapseError(400, "Invalid visibility setting")

synapse/handlers/e2e_room_keys.py
@@ -207,6 +207,13 @@ class E2eRoomKeysHandler(object):
         changed = False  # if anything has changed, we need to update the etag
         for room_id, room in iteritems(room_keys["rooms"]):
             for session_id, room_key in iteritems(room["sessions"]):
+                if not isinstance(room_key["is_verified"], bool):
+                    msg = (
+                        "is_verified must be a boolean in keys for session %s in"
+                        "room %s" % (session_id, room_id)
+                    )
+                    raise SynapseError(400, msg, Codes.INVALID_PARAM)
+
                 log_kv(
                     {
                         "message": "Trying to upload room key",
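The new guard rejects room-key uploads whose is_verified flag is not a real boolean (for example the JSON string "true"). A standalone sketch of the check against a made-up payload, using ValueError in place of SynapseError to stay dependency-free:

    def validate_room_keys(room_keys: dict) -> None:
        # Reject any session whose is_verified flag is not a real boolean.
        for room_id, room in room_keys["rooms"].items():
            for session_id, room_key in room["sessions"].items():
                if not isinstance(room_key["is_verified"], bool):
                    raise ValueError(
                        "is_verified must be a boolean in keys for session %s in "
                        "room %s" % (session_id, room_id)
                    )

    payload = {
        "rooms": {
            "!room:example.com": {
                "sessions": {"sess1": {"is_verified": "true"}}  # str, not bool
            }
        }
    }
    validate_room_keys(payload)  # raises ValueError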
synapse/handlers/message.py
@@ -160,7 +160,7 @@ class MessageHandler(object):
             raise NotFoundError("Can't find event for token %s" % (at_token,))

         visible_events = yield filter_events_for_client(
-            self.storage, user_id, last_events, apply_retention_policies=False
+            self.storage, user_id, last_events, filter_send_to_client=False
         )

         event = last_events[0]
@@ -888,19 +888,60 @@ class EventCreationHandler(object):
         yield self.base_handler.maybe_kick_guest_users(event, context)

         if event.type == EventTypes.CanonicalAlias:
-            # Check the alias is acually valid (at this time at least)
+            # Validate a newly added alias or newly added alt_aliases.
+
+            original_alias = None
+            original_alt_aliases = set()
+
+            original_event_id = event.unsigned.get("replaces_state")
+            if original_event_id:
+                original_event = yield self.store.get_event(original_event_id)
+
+                if original_event:
+                    original_alias = original_event.content.get("alias", None)
+                    original_alt_aliases = original_event.content.get("alt_aliases", [])
+
+            # Check the alias is currently valid (if it has changed).
             room_alias_str = event.content.get("alias", None)
-            if room_alias_str:
+            directory_handler = self.hs.get_handlers().directory_handler
+            if room_alias_str and room_alias_str != original_alias:
                 room_alias = RoomAlias.from_string(room_alias_str)
-                directory_handler = self.hs.get_handlers().directory_handler
                 mapping = yield directory_handler.get_association(room_alias)

                 if mapping["room_id"] != event.room_id:
                     raise SynapseError(
                         400,
                         "Room alias %s does not point to the room" % (room_alias_str,),
+                        Codes.BAD_ALIAS,
                     )

+            # Check that alt_aliases is the proper form.
+            alt_aliases = event.content.get("alt_aliases", [])
+            if not isinstance(alt_aliases, (list, tuple)):
+                raise SynapseError(
+                    400, "The alt_aliases property must be a list.", Codes.INVALID_PARAM
+                )
+
+            # If the old version of alt_aliases is of an unknown form,
+            # completely replace it.
+            if not isinstance(original_alt_aliases, (list, tuple)):
+                original_alt_aliases = []
+
+            # Check that each alias is currently valid.
+            new_alt_aliases = set(alt_aliases) - set(original_alt_aliases)
+            if new_alt_aliases:
+                for alias_str in new_alt_aliases:
+                    room_alias = RoomAlias.from_string(alias_str)
+                    mapping = yield directory_handler.get_association(room_alias)
+
+                    if mapping["room_id"] != event.room_id:
+                        raise SynapseError(
+                            400,
+                            "Room alias %s does not point to the room"
+                            % (room_alias_str,),
+                            Codes.BAD_ALIAS,
+                        )

         federation_handler = self.hs.get_handlers().federation_handler

         if event.type == EventTypes.Member:
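Only aliases that are new relative to the previous canonical alias event get re-validated against the directory; the set difference does that cheaply:

    original_alt_aliases = ["#a:example.com", "#b:example.com"]
    alt_aliases = ["#b:example.com", "#c:example.com", "#d:example.com"]

    # Aliases present in the new event but not the old one are the only
    # ones that need a directory lookup.
    new_alt_aliases = set(alt_aliases) - set(original_alt_aliases)
    print(sorted(new_alt_aliases))  # ['#c:example.com', '#d:example.com']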
synapse/handlers/room.py
@@ -292,16 +292,6 @@ class RoomCreationHandler(BaseHandler):
         except AuthError as e:
             logger.warning("Unable to update PLs in old room: %s", e)

-        new_pl_content = copy_power_levels_contents(old_room_pl_state.content)
-
-        # pre-msc2260 rooms may not have the right setting for aliases. If no other
-        # value is set, set it now.
-        events_default = new_pl_content.get("events_default", 0)
-        new_pl_content.setdefault("events", {}).setdefault(
-            EventTypes.Aliases, events_default
-        )
-
-        logger.debug("Setting correct PLs in new room to %s", new_pl_content)
         yield self.event_creation_handler.create_and_send_nonmember_event(
             requester,
             {
@@ -309,7 +299,7 @@ class RoomCreationHandler(BaseHandler):
                 "state_key": "",
                 "room_id": new_room_id,
                 "sender": requester.user.to_string(),
-                "content": new_pl_content,
+                "content": old_room_pl_state.content,
             },
             ratelimit=False,
         )
@@ -814,10 +804,6 @@ class RoomCreationHandler(BaseHandler):
                 EventTypes.RoomHistoryVisibility: 100,
                 EventTypes.CanonicalAlias: 50,
                 EventTypes.RoomAvatar: 50,
-                # MSC2260: Allow everybody to send alias events by default
-                # This will be reudundant on pre-MSC2260 rooms, since the
-                # aliases event is special-cased.
-                EventTypes.Aliases: 0,
                 EventTypes.Tombstone: 100,
                 EventTypes.ServerACL: 100,
             },

synapse/handlers/saml_handler.py
@@ -23,9 +23,9 @@ from saml2.client import Saml2Client

 from synapse.api.errors import SynapseError
 from synapse.config import ConfigError
+from synapse.http.server import finish_request
 from synapse.http.servlet import parse_string
 from synapse.module_api import ModuleApi
-from synapse.rest.client.v1.login import SSOAuthHandler
 from synapse.types import (
     UserID,
     map_username_to_mxid_localpart,
@@ -48,7 +48,7 @@ class Saml2SessionData:
 class SamlHandler:
     def __init__(self, hs):
         self._saml_client = Saml2Client(hs.config.saml2_sp_config)
-        self._sso_auth_handler = SSOAuthHandler(hs)
+        self._auth_handler = hs.get_auth_handler()
         self._registration_handler = hs.get_registration_handler()

         self._clock = hs.get_clock()
@@ -74,6 +74,8 @@ class SamlHandler:
         # a lock on the mappings
         self._mapping_lock = Linearizer(name="saml_mapping", clock=self._clock)

+        self._error_html_content = hs.config.saml2_error_html_content
+
     def handle_redirect_request(self, client_redirect_url):
         """Handle an incoming request to /login/sso/redirect

@@ -115,8 +117,23 @@ class SamlHandler:
         # the dict.
         self.expire_sessions()

-        user_id = await self._map_saml_response_to_user(resp_bytes, relay_state)
-        self._sso_auth_handler.complete_sso_login(user_id, request, relay_state)
+        try:
+            user_id = await self._map_saml_response_to_user(resp_bytes, relay_state)
+        except Exception as e:
+            # If decoding the response or mapping it to a user failed, then log the
+            # error and tell the user that something went wrong.
+            logger.error(e)
+
+            request.setResponseCode(400)
+            request.setHeader(b"Content-Type", b"text/html; charset=utf-8")
+            request.setHeader(
+                b"Content-Length", b"%d" % (len(self._error_html_content),)
+            )
+            request.write(self._error_html_content.encode("utf8"))
+            finish_request(request)
+            return
+
+        self._auth_handler.complete_sso_login(user_id, request, relay_state)

     async def _map_saml_response_to_user(self, resp_bytes, client_redirect_url):
         try:

synapse/handlers/set_password.py
@@ -13,10 +13,12 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 import logging
+from typing import Optional

 from twisted.internet import defer

 from synapse.api.errors import Codes, StoreError, SynapseError
+from synapse.types import Requester

 from ._base import BaseHandler

@@ -32,14 +34,17 @@ class SetPasswordHandler(BaseHandler):
         self._device_handler = hs.get_device_handler()

     @defer.inlineCallbacks
-    def set_password(self, user_id, newpassword, requester=None):
+    def set_password(
+        self,
+        user_id: str,
+        new_password: str,
+        logout_devices: bool,
+        requester: Optional[Requester] = None,
+    ):
         if not self.hs.config.password_localdb_enabled:
             raise SynapseError(403, "Password change disabled", errcode=Codes.FORBIDDEN)

-        password_hash = yield self._auth_handler.hash(newpassword)
-
-        except_device_id = requester.device_id if requester else None
-        except_access_token_id = requester.access_token_id if requester else None
+        password_hash = yield self._auth_handler.hash(new_password)

         try:
             yield self.store.user_set_password_hash(user_id, password_hash)
@@ -48,14 +53,18 @@ class SetPasswordHandler(BaseHandler):
             raise SynapseError(404, "Unknown user", Codes.NOT_FOUND)
             raise e

-        # we want to log out all of the user's other sessions. First delete
-        # all his other devices.
-        yield self._device_handler.delete_all_devices_for_user(
-            user_id, except_device_id=except_device_id
-        )
-
-        # and now delete any access tokens which weren't associated with
-        # devices (or were associated with this device).
-        yield self._auth_handler.delete_access_tokens_for_user(
-            user_id, except_token_id=except_access_token_id
-        )
+        # Optionally, log out all of the user's other sessions.
+        if logout_devices:
+            except_device_id = requester.device_id if requester else None
+            except_access_token_id = requester.access_token_id if requester else None
+
+            # First delete all of their other devices.
+            yield self._device_handler.delete_all_devices_for_user(
+                user_id, except_device_id=except_device_id
+            )
+
+            # and now delete any access tokens which weren't associated with
+            # devices (or were associated with this device).
+            yield self._auth_handler.delete_access_tokens_for_user(
+                user_id, except_token_id=except_access_token_id
+            )
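With the new signature, callers must state explicitly whether the password change should log out the user's other sessions. A toy model of the resulting control flow (names and the return value are illustrative, not Synapse's API):

    from typing import Optional

    def set_password(
        user_id: str,
        new_password: str,
        logout_devices: bool,
        requester_device_id: Optional[str] = None,
    ) -> list:
        """Toy model of the control flow: returns the actions taken."""
        actions = ["rehash:" + user_id]
        if logout_devices:
            # Everything except the requester's own device/token is dropped.
            actions.append("delete_devices_except:%s" % requester_device_id)
            actions.append("delete_tokens_except_current")
        return actions

    print(set_password("@alice:example.com", "s3cret", logout_devices=False))
    # ['rehash:@alice:example.com']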
synapse/http/client.py
@@ -244,9 +244,6 @@ class SimpleHttpClient(object):
         pool.maxPersistentPerHost = max((100 * CACHE_SIZE_FACTOR, 5))
         pool.cachedConnectionTimeout = 2 * 60

-        # The default context factory in Twisted 14.0.0 (which we require) is
-        # BrowserLikePolicyForHTTPS which will do regular cert validation
-        # 'like a browser'
         self.agent = ProxyAgent(
             self.reactor,
             connectTimeout=15,

synapse/http/federation/matrix_federation_agent.py
@@ -45,7 +45,7 @@ class MatrixFederationAgent(object):
     Args:
         reactor (IReactor): twisted reactor to use for underlying requests

-        tls_client_options_factory (ClientTLSOptionsFactory|None):
+        tls_client_options_factory (FederationPolicyForHTTPS|None):
            factory to use for fetching client tls options, or none to disable TLS.

        _srv_resolver (SrvResolver|None):

synapse/logging/context.py
@@ -27,10 +27,15 @@ import inspect
 import logging
 import threading
 import types
-from typing import Any, List
+from typing import TYPE_CHECKING, Optional, Tuple, TypeVar, Union
+
+from typing_extensions import Literal

 from twisted.internet import defer, threads

+if TYPE_CHECKING:
+    from synapse.logging.scopecontextmanager import _LogContextScope
+
 logger = logging.getLogger(__name__)

 try:
@@ -91,7 +96,7 @@ class ContextResourceUsage(object):
         "evt_db_fetch_count",
     ]

-    def __init__(self, copy_from=None):
+    def __init__(self, copy_from: "Optional[ContextResourceUsage]" = None) -> None:
         """Create a new ContextResourceUsage

         Args:
@@ -101,27 +106,28 @@ class ContextResourceUsage(object):
         if copy_from is None:
             self.reset()
         else:
-            self.ru_utime = copy_from.ru_utime
-            self.ru_stime = copy_from.ru_stime
-            self.db_txn_count = copy_from.db_txn_count
+            # FIXME: mypy can't infer the types set via reset() above, so specify explicitly for now
+            self.ru_utime = copy_from.ru_utime  # type: float
+            self.ru_stime = copy_from.ru_stime  # type: float
+            self.db_txn_count = copy_from.db_txn_count  # type: int

-            self.db_txn_duration_sec = copy_from.db_txn_duration_sec
-            self.db_sched_duration_sec = copy_from.db_sched_duration_sec
-            self.evt_db_fetch_count = copy_from.evt_db_fetch_count
+            self.db_txn_duration_sec = copy_from.db_txn_duration_sec  # type: float
+            self.db_sched_duration_sec = copy_from.db_sched_duration_sec  # type: float
+            self.evt_db_fetch_count = copy_from.evt_db_fetch_count  # type: int

-    def copy(self):
+    def copy(self) -> "ContextResourceUsage":
         return ContextResourceUsage(copy_from=self)

-    def reset(self):
+    def reset(self) -> None:
         self.ru_stime = 0.0
         self.ru_utime = 0.0
         self.db_txn_count = 0

-        self.db_txn_duration_sec = 0
-        self.db_sched_duration_sec = 0
+        self.db_txn_duration_sec = 0.0
+        self.db_sched_duration_sec = 0.0
         self.evt_db_fetch_count = 0

-    def __repr__(self):
+    def __repr__(self) -> str:
         return (
             "<ContextResourceUsage ru_stime='%r', ru_utime='%r', "
             "db_txn_count='%r', db_txn_duration_sec='%r', "
@@ -135,7 +141,7 @@ class ContextResourceUsage(object):
             self.evt_db_fetch_count,
         )

-    def __iadd__(self, other):
+    def __iadd__(self, other: "ContextResourceUsage") -> "ContextResourceUsage":
         """Add another ContextResourceUsage's stats to this one's.

         Args:
@@ -149,7 +155,7 @@ class ContextResourceUsage(object):
         self.evt_db_fetch_count += other.evt_db_fetch_count
         return self

-    def __isub__(self, other):
+    def __isub__(self, other: "ContextResourceUsage") -> "ContextResourceUsage":
         self.ru_utime -= other.ru_utime
         self.ru_stime -= other.ru_stime
         self.db_txn_count -= other.db_txn_count
@@ -158,17 +164,20 @@ class ContextResourceUsage(object):
         self.evt_db_fetch_count -= other.evt_db_fetch_count
         return self

-    def __add__(self, other):
+    def __add__(self, other: "ContextResourceUsage") -> "ContextResourceUsage":
         res = ContextResourceUsage(copy_from=self)
         res += other
         return res

-    def __sub__(self, other):
+    def __sub__(self, other: "ContextResourceUsage") -> "ContextResourceUsage":
         res = ContextResourceUsage(copy_from=self)
         res -= other
         return res

|
|
||||||
|
LoggingContextOrSentinel = Union["LoggingContext", "LoggingContext.Sentinel"]
|
||||||
|
|
||||||
|
|
||||||
class LoggingContext(object):
|
class LoggingContext(object):
|
||||||
"""Additional context for log formatting. Contexts are scoped within a
|
"""Additional context for log formatting. Contexts are scoped within a
|
||||||
"with" block.
|
"with" block.
|
||||||
|
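
The typed operator methods above follow a common accumulator pattern: the in-place operators mutate and return `self`, and the binary operators are built from copy plus in-place. A minimal standalone sketch of that pattern (illustrative only, not Synapse code):

    class Usage:
        def __init__(self, ru_utime: float = 0.0, db_txn_count: int = 0) -> None:
            self.ru_utime = ru_utime
            self.db_txn_count = db_txn_count

        def __iadd__(self, other: "Usage") -> "Usage":
            # mutate in place and return self, as __iadd__ is expected to do
            self.ru_utime += other.ru_utime
            self.db_txn_count += other.db_txn_count
            return self

        def __add__(self, other: "Usage") -> "Usage":
            # build the binary operator on top of copy + in-place add
            res = Usage(self.ru_utime, self.db_txn_count)
            res += other
            return res

    total = Usage() + Usage(1.5, 2)  # ends up with ru_utime=1.5, db_txn_count=2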
@@ -201,7 +210,15 @@ class LoggingContext(object):
     class Sentinel(object):
         """Sentinel to represent the root context"""

-        __slots__ = []  # type: List[Any]
+        __slots__ = ["previous_context", "alive", "request", "scope", "tag"]
+
+        def __init__(self) -> None:
+            # Minimal set for compatibility with LoggingContext
+            self.previous_context = None
+            self.alive = None
+            self.request = None
+            self.scope = None
+            self.tag = None

         def __str__(self):
             return "sentinel"
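
Declaring named `__slots__` (instead of the old empty list) lets the sentinel carry the same attributes a real `LoggingContext` has, while still rejecting anything undeclared. A quick illustration of that `__slots__` behaviour in plain Python:

    class Sentinel:
        __slots__ = ["request", "tag"]

        def __init__(self) -> None:
            self.request = None  # allowed: declared in __slots__
            self.tag = None

    s = Sentinel()
    s.tag = "root"        # fine
    try:
        s.unexpected = 1  # raises AttributeError: no such slot
    except AttributeError:
        pass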
@@ -235,7 +252,7 @@ class LoggingContext(object):

     sentinel = Sentinel()

-    def __init__(self, name=None, parent_context=None, request=None):
+    def __init__(self, name=None, parent_context=None, request=None) -> None:
         self.previous_context = LoggingContext.current_context()
         self.name = name

@@ -250,7 +267,7 @@ class LoggingContext(object):
         self.request = None
         self.tag = ""
         self.alive = True
-        self.scope = None
+        self.scope = None  # type: Optional[_LogContextScope]

         self.parent_context = parent_context

@@ -261,13 +278,13 @@ class LoggingContext(object):
             # the request param overrides the request from the parent context
             self.request = request

-    def __str__(self):
+    def __str__(self) -> str:
         if self.request:
             return str(self.request)
         return "%s@%x" % (self.name, id(self))

     @classmethod
-    def current_context(cls):
+    def current_context(cls) -> LoggingContextOrSentinel:
         """Get the current logging context from thread local storage

         Returns:
@@ -276,7 +293,9 @@ class LoggingContext(object):
         return getattr(cls.thread_local, "current_context", cls.sentinel)

     @classmethod
-    def set_current_context(cls, context):
+    def set_current_context(
+        cls, context: LoggingContextOrSentinel
+    ) -> LoggingContextOrSentinel:
         """Set the current logging context in thread local storage
         Args:
             context(LoggingContext): The context to activate.
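
`set_current_context` returns the context that was previously active, which is what makes the save/restore discipline used throughout this file possible. A sketch of that usage, assuming the Synapse 1.12-era API shown in this hunk:

    from synapse.logging.context import LoggingContext

    ctx = LoggingContext(name="request-42", request="request-42")
    prev = LoggingContext.set_current_context(ctx)
    try:
        pass  # log lines emitted here are attributed to "request-42"
    finally:
        # restore whatever was active before (possibly the sentinel)
        LoggingContext.set_current_context(prev)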
@@ -291,7 +310,7 @@ class LoggingContext(object):
             context.start()
         return current

-    def __enter__(self):
+    def __enter__(self) -> "LoggingContext":
         """Enters this logging context into thread local storage"""
         old_context = self.set_current_context(self)
         if self.previous_context != old_context:
@@ -304,7 +323,7 @@ class LoggingContext(object):

         return self

-    def __exit__(self, type, value, traceback):
+    def __exit__(self, type, value, traceback) -> None:
         """Restore the logging context in thread local storage to the state it
         was before this context was entered.
         Returns:
@@ -318,7 +337,6 @@ class LoggingContext(object):
             logger.warning(
                 "Expected logging context %s but found %s", self, current
             )
-        self.previous_context = None
         self.alive = False

         # if we have a parent, pass our CPU usage stats on
@@ -330,7 +348,7 @@ class LoggingContext(object):
         # reset them in case we get entered again
         self._resource_usage.reset()

-    def copy_to(self, record):
+    def copy_to(self, record) -> None:
         """Copy logging fields from this context to a log record or
         another LoggingContext
         """
@@ -341,14 +359,14 @@ class LoggingContext(object):
         # we also track the current scope:
         record.scope = self.scope

-    def copy_to_twisted_log_entry(self, record):
+    def copy_to_twisted_log_entry(self, record) -> None:
         """
         Copy logging fields from this context to a Twisted log record.
         """
         record["request"] = self.request
         record["scope"] = self.scope

-    def start(self):
+    def start(self) -> None:
         if get_thread_id() != self.main_thread:
             logger.warning("Started logcontext %s on different thread", self)
             return
@@ -358,7 +376,7 @@ class LoggingContext(object):
         if not self.usage_start:
             self.usage_start = get_thread_resource_usage()

-    def stop(self):
+    def stop(self) -> None:
         if get_thread_id() != self.main_thread:
             logger.warning("Stopped logcontext %s on different thread", self)
             return
@@ -378,7 +396,7 @@ class LoggingContext(object):

         self.usage_start = None

-    def get_resource_usage(self):
+    def get_resource_usage(self) -> ContextResourceUsage:
         """Get resources used by this logcontext so far.

         Returns:
@@ -398,11 +416,13 @@ class LoggingContext(object):

         return res

-    def _get_cputime(self):
+    def _get_cputime(self) -> Tuple[float, float]:
         """Get the cpu usage time so far

         Returns: Tuple[float, float]: seconds in user mode, seconds in system mode
         """
+        assert self.usage_start is not None
+
         current = get_thread_resource_usage()

         # Indicate to mypy that we know that self.usage_start is not None.
@@ -430,13 +450,13 @@ class LoggingContext(object):

         return utime_delta, stime_delta

-    def add_database_transaction(self, duration_sec):
+    def add_database_transaction(self, duration_sec: float) -> None:
         if duration_sec < 0:
             raise ValueError("DB txn time can only be non-negative")
         self._resource_usage.db_txn_count += 1
         self._resource_usage.db_txn_duration_sec += duration_sec

-    def add_database_scheduled(self, sched_sec):
+    def add_database_scheduled(self, sched_sec: float) -> None:
         """Record a use of the database pool

         Args:
@@ -447,7 +467,7 @@ class LoggingContext(object):
             raise ValueError("DB scheduling time can only be non-negative")
         self._resource_usage.db_sched_duration_sec += sched_sec

-    def record_event_fetch(self, event_count):
+    def record_event_fetch(self, event_count: int) -> None:
         """Record a number of events being fetched from the db

         Args:
@@ -464,10 +484,10 @@ class LoggingContextFilter(logging.Filter):
     missing fields
     """

-    def __init__(self, **defaults):
+    def __init__(self, **defaults) -> None:
         self.defaults = defaults

-    def filter(self, record):
+    def filter(self, record) -> Literal[True]:
         """Add each field from the logging contexts to the record.
         Returns:
             True to include the record in the log output.
@@ -492,12 +512,13 @@ class PreserveLoggingContext(object):

     __slots__ = ["current_context", "new_context", "has_parent"]

-    def __init__(self, new_context=None):
+    def __init__(self, new_context: Optional[LoggingContextOrSentinel] = None) -> None:
         if new_context is None:
-            new_context = LoggingContext.sentinel
-        self.new_context = new_context
+            self.new_context = LoggingContext.sentinel  # type: LoggingContextOrSentinel
+        else:
+            self.new_context = new_context

-    def __enter__(self):
+    def __enter__(self) -> None:
         """Captures the current logging context"""
         self.current_context = LoggingContext.set_current_context(self.new_context)

@@ -506,7 +527,7 @@ class PreserveLoggingContext(object):
         if not self.current_context.alive:
             logger.debug("Entering dead context: %s", self.current_context)

-    def __exit__(self, type, value, traceback):
+    def __exit__(self, type, value, traceback) -> None:
         """Restores the current logging context"""
         context = LoggingContext.set_current_context(self.current_context)

@@ -525,7 +546,9 @@ class PreserveLoggingContext(object):
         logger.debug("Restoring dead context: %s", self.current_context)


-def nested_logging_context(suffix, parent_context=None):
+def nested_logging_context(
+    suffix: str, parent_context: Optional[LoggingContext] = None
+) -> LoggingContext:
     """Creates a new logging context as a child of another.

     The nested logging context will have a 'request' made up of the parent context's
@@ -546,10 +569,12 @@ def nested_logging_context(suffix, parent_context=None):
     Returns:
         LoggingContext: new logging context.
     """
-    if parent_context is None:
-        parent_context = LoggingContext.current_context()
+    if parent_context is not None:
+        context = parent_context  # type: LoggingContextOrSentinel
+    else:
+        context = LoggingContext.current_context()
     return LoggingContext(
-        parent_context=parent_context, request=parent_context.request + "-" + suffix
+        parent_context=context, request=str(context.request) + "-" + suffix
     )

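
Taken together, `nested_logging_context` and `PreserveLoggingContext` cover the two common cases: naming a sub-task after its parent request, and temporarily switching contexts. A small usage sketch against the API shown above (names are illustrative):

    from synapse.logging.context import (
        LoggingContext,
        PreserveLoggingContext,
        nested_logging_context,
    )

    with LoggingContext(name="main", request="req-1"):
        # the child context's request becomes "req-1-subtask"
        with nested_logging_context("subtask"):
            pass  # do the sub-task's work

        # reset to the sentinel context while handing off to the reactor
        with PreserveLoggingContext():
            pass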
@@ -654,7 +679,10 @@ def make_deferred_yieldable(deferred):
     return deferred


-def _set_context_cb(result, context):
+ResultT = TypeVar("ResultT")
+
+
+def _set_context_cb(result: ResultT, context: LoggingContext) -> ResultT:
     """A callback function which just sets the logging context"""
     LoggingContext.set_current_context(context)
     return result
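
The new `ResultT` type variable exists because `_set_context_cb` is a pass-through callback: it must return exactly what it was given, so annotating it `ResultT -> ResultT` lets mypy preserve the Deferred's result type through the callback chain. The shape of such a pass-through, in isolation:

    from typing import TypeVar

    T = TypeVar("T")

    def passthrough(result: T) -> T:
        # side effects only; the value flows through unchanged
        return result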
@@ -20,7 +20,7 @@ import os
 import platform
 import threading
 import time
-from typing import Dict, Union
+from typing import Callable, Dict, Iterable, Optional, Tuple, Union

 import six

@@ -59,10 +59,12 @@ class RegistryProxy(object):
 @attr.s(hash=True)
 class LaterGauge(object):

-    name = attr.ib()
-    desc = attr.ib()
-    labels = attr.ib(hash=False)
-    caller = attr.ib()
+    name = attr.ib(type=str)
+    desc = attr.ib(type=str)
+    labels = attr.ib(hash=False, type=Optional[Iterable[str]])
+    # callback: should either return a value (if there are no labels for this metric),
+    # or dict mapping from a label tuple to a value
+    caller = attr.ib(type=Callable[[], Union[Dict[Tuple[str, ...], float], float]])

     def collect(self):
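
The typed `caller` attribute documents the two shapes a `LaterGauge` callback may take; the labelled form is what the pusher metrics later in this commit use. A hedged sketch of both registrations (the metric names here are made up):

    from synapse.metrics import LaterGauge

    queue = []

    # unlabelled: the callback returns a single float
    LaterGauge(
        name="myapp_queue_depth",
        desc="items waiting in the queue",
        labels=None,
        caller=lambda: float(len(queue)),
    )

    # labelled: the callback returns a dict of {label_tuple: value}
    LaterGauge(
        name="myapp_workers",
        desc="active workers by kind",
        labels=["kind"],
        caller=lambda: {("fast",): 2.0, ("slow",): 1.0},
    )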
@@ -17,6 +17,7 @@ import logging
 import threading
 from asyncio import iscoroutine
 from functools import wraps
+from typing import Dict, Set

 import six

@@ -80,13 +81,13 @@ _background_process_db_sched_duration = Counter(
 # map from description to a counter, so that we can name our logcontexts
 # incrementally. (It actually duplicates _background_process_start_count, but
 # it's much simpler to do so than to try to combine them.)
-_background_process_counts = {}  # type: dict[str, int]
+_background_process_counts = {}  # type: Dict[str, int]

 # map from description to the currently running background processes.
 #
 # it's kept as a dict of sets rather than a big set so that we can keep track
 # of process descriptions that no longer have any active processes.
-_background_processes = {}  # type: dict[str, set[_BackgroundProcess]]
+_background_processes = {}  # type: Dict[str, Set[_BackgroundProcess]]

 # A lock that covers the above dicts
 _bg_metrics_lock = threading.Lock()
@@ -17,6 +17,7 @@ import logging

 from twisted.internet import defer

+from synapse.http.site import SynapseRequest
 from synapse.logging.context import make_deferred_yieldable, run_in_background
 from synapse.types import UserID

@@ -211,3 +212,21 @@ class ModuleApi(object):
             Deferred[object]: result of func
         """
         return self._store.db.runInteraction(desc, func, *args, **kwargs)
+
+    def complete_sso_login(
+        self, registered_user_id: str, request: SynapseRequest, client_redirect_url: str
+    ):
+        """Complete a SSO login by redirecting the user to a page to confirm whether they
+        want their access token sent to `client_redirect_url`, or redirect them to that
+        URL with a token directly if the URL matches with one of the whitelisted clients.
+
+        Args:
+            registered_user_id: The MXID that has been registered as a previous step
+                of this SSO login.
+            request: The request to respond to.
+            client_redirect_url: The URL to which to offer to redirect the user (or to
+                redirect them directly if whitelisted).
+        """
+        self._auth_handler.complete_sso_login(
+            registered_user_id, request, client_redirect_url,
+        )
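
With `complete_sso_login` exposed on `ModuleApi`, a third-party password/SSO provider module can finish a login itself. A hypothetical module using it (the class name, hook name, and callback URL are invented for illustration):

    class ExampleSsoProvider:
        def __init__(self, config, module_api):
            self._api = module_api

        def on_authenticated(self, request, mxid: str) -> None:
            # hand control back to Synapse to issue the token or show the
            # confirmation page, per the whitelist behaviour described above
            self._api.complete_sso_login(
                mxid, request, "https://client.example.com/callback"
            )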
@@ -555,10 +555,12 @@ class Mailer(object):
         else:
             # If the reason room doesn't have a name, say who the messages
             # are from explicitly, to avoid "messages in the Bob room"
+            room_id = reason["room_id"]
+
             sender_ids = list(
                 {
                     notif_events[n["event_id"]].sender
-                    for n in notifs_by_room[reason["room_id"]]
+                    for n in notifs_by_room[room_id]
                 }
             )

@@ -15,11 +15,17 @@
 # limitations under the License.

 import logging
+from collections import defaultdict
+from threading import Lock
+from typing import Dict, Tuple, Union

 from twisted.internet import defer

+from synapse.metrics import LaterGauge
 from synapse.metrics.background_process_metrics import run_as_background_process
 from synapse.push import PusherConfigException
+from synapse.push.emailpusher import EmailPusher
+from synapse.push.httppusher import HttpPusher
 from synapse.push.pusher import PusherFactory
 from synapse.util.async_helpers import concurrently_execute

@@ -47,7 +53,29 @@ class PusherPool:
         self._should_start_pushers = _hs.config.start_pushers
         self.store = self.hs.get_datastore()
         self.clock = self.hs.get_clock()
-        self.pushers = {}
+
+        # map from user id to app_id:pushkey to pusher
+        self.pushers = {}  # type: Dict[str, Dict[str, Union[HttpPusher, EmailPusher]]]
+
+        # a lock for the pushers dict, since `count_pushers` is called from a
+        # different thread, and we otherwise get concurrent modification errors
+        self._pushers_lock = Lock()
+
+        def count_pushers():
+            results = defaultdict(int)  # type: Dict[Tuple[str, str], int]
+            with self._pushers_lock:
+                for pushers in self.pushers.values():
+                    for pusher in pushers.values():
+                        k = (type(pusher).__name__, pusher.app_id)
+                        results[k] += 1
+            return results
+
+        LaterGauge(
+            name="synapse_pushers",
+            desc="the number of active pushers",
+            labels=["kind", "app_id"],
+            caller=count_pushers,
+        )

     def start(self):
         """Starts the pushers off in a background process.
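
`count_pushers` matches the labelled `LaterGauge` contract shown earlier: it returns a dict keyed by `(kind, app_id)` label tuples. With, say, two HTTP pushers and one email pusher registered, it would return something shaped like this (the app IDs are illustrative values only):

    {
        ("HttpPusher", "com.example.app"): 2,
        ("EmailPusher", "m.email"): 1,
    }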
@@ -271,11 +299,12 @@ class PusherPool:
             return

         appid_pushkey = "%s:%s" % (pusherdict["app_id"], pusherdict["pushkey"])
-        byuser = self.pushers.setdefault(pusherdict["user_name"], {})

-        if appid_pushkey in byuser:
-            byuser[appid_pushkey].on_stop()
-        byuser[appid_pushkey] = p
+        with self._pushers_lock:
+            byuser = self.pushers.setdefault(pusherdict["user_name"], {})
+            if appid_pushkey in byuser:
+                byuser[appid_pushkey].on_stop()
+            byuser[appid_pushkey] = p

         # Check if there *may* be push to process. We do this as this check is a
         # lot cheaper to do than actually fetching the exact rows we need to
@@ -304,7 +333,9 @@ class PusherPool:
         if appid_pushkey in byuser:
             logger.info("Stopping pusher %s / %s", user_id, appid_pushkey)
             byuser[appid_pushkey].on_stop()
-            del byuser[appid_pushkey]
+            with self._pushers_lock:
+                del byuser[appid_pushkey]

         yield self.store.delete_pusher_by_app_id_pushkey_user_id(
             app_id, pushkey, user_id
         )
@@ -18,7 +18,7 @@ import logging
 from twisted.internet import defer

 from synapse.api.room_versions import KNOWN_ROOM_VERSIONS
-from synapse.events import event_type_from_format_version
+from synapse.events import make_event_from_dict
 from synapse.events.snapshot import EventContext
 from synapse.http.servlet import parse_json_object_from_request
 from synapse.replication.http._base import ReplicationEndpoint
@@ -38,6 +38,9 @@ class ReplicationFederationSendEventsRestServlet(ReplicationEndpoint):
        {
            "events": [{
                "event": { .. serialized event .. },
+               "room_version": .., // "1", "2", "3", etc: the version of the room
+                                   // containing the event
+               "event_format_version": .., // 1,2,3 etc: the event format version
                "internal_metadata": { .. serialized internal_metadata .. },
                "rejected_reason": ..,   // The event.rejected_reason field
                "context": { .. serialized event context .. },
@@ -73,6 +76,7 @@ class ReplicationFederationSendEventsRestServlet(ReplicationEndpoint):
             event_payloads.append(
                 {
                     "event": event.get_pdu_json(),
+                    "room_version": event.room_version.identifier,
                     "event_format_version": event.format_version,
                     "internal_metadata": event.internal_metadata.get_dict(),
                     "rejected_reason": event.rejected_reason,
@@ -95,12 +99,13 @@ class ReplicationFederationSendEventsRestServlet(ReplicationEndpoint):
         event_and_contexts = []
         for event_payload in event_payloads:
             event_dict = event_payload["event"]
-            format_ver = event_payload["event_format_version"]
+            room_ver = KNOWN_ROOM_VERSIONS[event_payload["room_version"]]
             internal_metadata = event_payload["internal_metadata"]
             rejected_reason = event_payload["rejected_reason"]

-            EventType = event_type_from_format_version(format_ver)
-            event = EventType(event_dict, internal_metadata, rejected_reason)
+            event = make_event_from_dict(
+                event_dict, room_ver, internal_metadata, rejected_reason
+            )

             context = EventContext.deserialize(
                 self.storage, event_payload["context"]
@@ -17,7 +17,8 @@ import logging

 from twisted.internet import defer

-from synapse.events import event_type_from_format_version
+from synapse.api.room_versions import KNOWN_ROOM_VERSIONS
+from synapse.events import make_event_from_dict
 from synapse.events.snapshot import EventContext
 from synapse.http.servlet import parse_json_object_from_request
 from synapse.replication.http._base import ReplicationEndpoint
@@ -37,6 +38,9 @@ class ReplicationSendEventRestServlet(ReplicationEndpoint):

        {
            "event": { .. serialized event .. },
+           "room_version": .., // "1", "2", "3", etc: the version of the room
+                               // containing the event
+           "event_format_version": .., // 1,2,3 etc: the event format version
            "internal_metadata": { .. serialized internal_metadata .. },
            "rejected_reason": ..,   // The event.rejected_reason field
            "context": { .. serialized event context .. },
@@ -77,6 +81,7 @@ class ReplicationSendEventRestServlet(ReplicationEndpoint):

         payload = {
             "event": event.get_pdu_json(),
+            "room_version": event.room_version.identifier,
             "event_format_version": event.format_version,
             "internal_metadata": event.internal_metadata.get_dict(),
             "rejected_reason": event.rejected_reason,
@@ -93,12 +98,13 @@ class ReplicationSendEventRestServlet(ReplicationEndpoint):
             content = parse_json_object_from_request(request)

             event_dict = content["event"]
-            format_ver = content["event_format_version"]
+            room_ver = KNOWN_ROOM_VERSIONS[content["room_version"]]
             internal_metadata = content["internal_metadata"]
             rejected_reason = content["rejected_reason"]

-            EventType = event_type_from_format_version(format_ver)
-            event = EventType(event_dict, internal_metadata, rejected_reason)
+            event = make_event_from_dict(
+                event_dict, room_ver, internal_metadata, rejected_reason
+            )

             requester = Requester.deserialize(self.store, content["requester"])
             context = EventContext.deserialize(self.storage, content["context"])
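
Both servlets now reconstruct events the same way: look up the full `RoomVersion` object from the wire identifier, then build the event from its dict. In isolation the new call looks like this (`event_dict`, `internal_metadata`, and `rejected_reason` are assumed to come from a payload as above):

    from synapse.api.room_versions import KNOWN_ROOM_VERSIONS
    from synapse.events import make_event_from_dict

    room_ver = KNOWN_ROOM_VERSIONS["5"]  # e.g. payload["room_version"]
    event = make_event_from_dict(
        event_dict, room_ver, internal_metadata, rejected_reason
    )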
@@ -72,6 +72,9 @@ class SlavedDeviceStore(EndToEndKeyWorkerStore, DeviceWorkerStore, BaseSlavedStore):

     def _invalidate_caches_for_devices(self, token, rows):
         for row in rows:
+            # The entities are either user IDs (starting with '@') whose devices
+            # have changed, or remote servers that we need to tell about
+            # changes.
             if row.entity.startswith("@"):
                 self._device_list_stream_cache.entity_has_changed(row.entity, token)
                 self.get_cached_devices_for_user.invalidate((row.entity,))
@@ -0,0 +1,45 @@
+<!DOCTYPE html>
+<html lang="en">
+<head>
+    <meta charset="UTF-8">
+    <title>SSO error</title>
+</head>
+<body>
+    <p>Oops! Something went wrong during authentication<span id="errormsg"></span>.</p>
+    <p>
+        If you are seeing this page after clicking a link sent to you via email, make
+        sure you only click the confirmation link once, and that you open the
+        validation link in the same client you're logging in from.
+    </p>
+    <p>
+        Try logging in again from your Matrix client and if the problem persists
+        please contact the server's administrator.
+    </p>
+
+    <script type="text/javascript">
+        // Error handling to support Auth0 errors that we might get through a GET request
+        // to the validation endpoint. If an error is provided, it's either going to be
+        // located in the query string or in a query string-like URI fragment.
+        // We try to locate the error in either of these two locations, but if we
+        // can't, we just don't print anything specific.
+        let searchStr = "";
+        if (window.location.search) {
+            // window.location.searchParams isn't always defined when
+            // window.location.search is, so it's more reliable to parse the latter.
+            searchStr = window.location.search;
+        } else if (window.location.hash) {
+            // Replace the # with a ? so that URLSearchParams does the right thing and
+            // doesn't parse the first parameter incorrectly.
+            searchStr = window.location.hash.replace("#", "?");
+        }
+
+        // We might end up with no error in the URL, so we need to check whether we
+        // have one before printing it.
+        let errorDesc = new URLSearchParams(searchStr).get("error_description")
+        if (errorDesc) {
+
+            document.getElementById("errormsg").innerText = ` ("${errorDesc}")`;
+        }
+    </script>
+</body>
+</html>
@@ -0,0 +1,14 @@
+<!DOCTYPE html>
+<html lang="en">
+<head>
+    <meta charset="UTF-8">
+    <title>SSO redirect confirmation</title>
+</head>
+<body>
+    <p>The application at <span style="font-weight:bold">{{ display_url | e }}</span> is requesting full access to your <span style="font-weight:bold">{{ server_name }}</span> Matrix account.</p>
+    <p>If you don't recognise this address, you should ignore this and close this tab.</p>
+    <p>
+        <a href="{{ redirect_url | e }}">I trust this address</a>
+    </p>
+</body>
+</html>
@@ -221,8 +221,9 @@ class UserRestServletV2(RestServlet):
                     raise SynapseError(400, "Invalid password")
                 else:
                     new_password = body["password"]
+                    logout_devices = True
                     await self.set_password_handler.set_password(
-                        target_user.to_string(), new_password, requester
+                        target_user.to_string(), new_password, logout_devices, requester
                     )

         if "deactivated" in body:
@@ -536,9 +537,10 @@ class ResetPasswordRestServlet(RestServlet):
         params = parse_json_object_from_request(request)
         assert_params_in_dict(params, ["new_password"])
         new_password = params["new_password"]
+        logout_devices = params.get("logout_devices", True)

         await self._set_password_handler.set_password(
-            target_user_id, new_password, requester
+            target_user_id, new_password, logout_devices, requester
         )
         return 200, {}

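
Client-visible effect of the new parameter: callers of the password-reset endpoints can now opt out of invalidating the user's other sessions. A hypothetical request body (the flag defaults to true when omitted, per `params.get("logout_devices", True)` above):

    body = {
        "new_password": "correct horse battery staple",
        "logout_devices": False,  # keep other devices logged in
    }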
@@ -28,7 +28,7 @@ from synapse.http.servlet import (
     parse_json_object_from_request,
     parse_string,
 )
-from synapse.http.site import SynapseRequest
+from synapse.push.mailer import load_jinja2_templates
 from synapse.rest.client.v2_alpha._base import client_patterns
 from synapse.rest.well_known import WellKnownBuilder
 from synapse.types import UserID, map_username_to_mxid_localpart
@@ -548,6 +548,16 @@ class SSOAuthHandler(object):
         self._registration_handler = hs.get_registration_handler()
         self._macaroon_gen = hs.get_macaroon_generator()

+        # Load the redirect page HTML template
+        self._template = load_jinja2_templates(
+            hs.config.sso_redirect_confirm_template_dir, ["sso_redirect_confirm.html"],
+        )[0]
+
+        self._server_name = hs.config.server_name
+
+        # cast to tuple for use with str.startswith
+        self._whitelisted_sso_clients = tuple(hs.config.sso_client_whitelist)
+
     async def on_successful_auth(
         self, username, request, client_redirect_url, user_display_name=None
     ):
@@ -580,36 +590,9 @@ class SSOAuthHandler(object):
             localpart=localpart, default_display_name=user_display_name
         )

-        self.complete_sso_login(registered_user_id, request, client_redirect_url)
-
-    def complete_sso_login(
-        self, registered_user_id: str, request: SynapseRequest, client_redirect_url: str
-    ):
-        """Having figured out a mxid for this user, complete the HTTP request
-
-        Args:
-            registered_user_id:
-            request:
-            client_redirect_url:
-        """
-
-        login_token = self._macaroon_gen.generate_short_term_login_token(
-            registered_user_id
-        )
-        redirect_url = self._add_login_token_to_redirect_url(
-            client_redirect_url, login_token
-        )
-        # Load page
-        request.redirect(redirect_url)
-        finish_request(request)
-
-    @staticmethod
-    def _add_login_token_to_redirect_url(url, token):
-        url_parts = list(urllib.parse.urlparse(url))
-        query = dict(urllib.parse.parse_qsl(url_parts[4]))
-        query.update({"loginToken": token})
-        url_parts[4] = urllib.parse.urlencode(query)
-        return urllib.parse.urlunparse(url_parts)
+        self._auth_handler.complete_sso_login(
+            registered_user_id, request, client_redirect_url
+        )


 def register_servlets(hs, http_server):
@@ -189,12 +189,6 @@ class RoomStateEventRestServlet(TransactionRestServlet):

         content = parse_json_object_from_request(request)

-        if event_type == EventTypes.Aliases:
-            # MSC2260
-            raise SynapseError(
-                400, "Cannot send m.room.aliases events via /rooms/{room_id}/state"
-            )
-
         event_dict = {
             "type": event_type,
             "content": content,
@@ -242,12 +236,6 @@ class RoomSendEventRestServlet(TransactionRestServlet):
         requester = await self.auth.get_user_by_req(request, allow_guest=True)
         content = parse_json_object_from_request(request)

-        if event_type == EventTypes.Aliases:
-            # MSC2260
-            raise SynapseError(
-                400, "Cannot send m.room.aliases events via /rooms/{room_id}/send"
-            )
-
         event_dict = {
             "type": event_type,
             "content": content,
@@ -265,8 +265,11 @@ class PasswordRestServlet(RestServlet):

         assert_params_in_dict(params, ["new_password"])
         new_password = params["new_password"]
+        logout_devices = params.get("logout_devices", True)

-        await self._set_password_handler.set_password(user_id, new_password, requester)
+        await self._set_password_handler.set_password(
+            user_id, new_password, logout_devices, requester
+        )

         return 200, {}

@@ -18,8 +18,6 @@ from typing import Dict, Set
 from canonicaljson import encode_canonical_json, json
 from signedjson.sign import sign_json

-from twisted.internet import defer
-
 from synapse.api.errors import Codes, SynapseError
 from synapse.crypto.keyring import ServerKeyFetcher
 from synapse.http.server import (
@@ -125,8 +123,7 @@ class RemoteKey(DirectServeResource):

         await self.query_keys(request, query, query_remote_on_cache_miss=True)

-    @defer.inlineCallbacks
-    def query_keys(self, request, query, query_remote_on_cache_miss=False):
+    async def query_keys(self, request, query, query_remote_on_cache_miss=False):
         logger.info("Handling query for keys %r", query)

         store_queries = []
@@ -143,7 +140,7 @@ class RemoteKey(DirectServeResource):
             for key_id in key_ids:
                 store_queries.append((server_name, key_id, None))

-        cached = yield self.store.get_server_keys_json(store_queries)
+        cached = await self.store.get_server_keys_json(store_queries)

         json_results = set()

@@ -215,8 +212,8 @@ class RemoteKey(DirectServeResource):
             json_results.add(bytes(result["key_json"]))

         if cache_misses and query_remote_on_cache_miss:
-            yield self.fetcher.get_keys(cache_misses)
-            yield self.query_keys(request, query, query_remote_on_cache_miss=False)
+            await self.fetcher.get_keys(cache_misses)
+            await self.query_keys(request, query, query_remote_on_cache_miss=False)
         else:
             signed_keys = []
             for key_json in json_results:
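
This hunk is a mechanical `inlineCallbacks`-to-coroutine conversion: the decorator goes away, `def` becomes `async def`, and each `yield` on a Deferred becomes `await`. The general before/after shape (made-up names, Python 3 semantics):

    from twisted.internet import defer

    # before: a Twisted inlineCallbacks generator
    @defer.inlineCallbacks
    def fetch_old(store):
        result = yield store.get_thing()
        return result

    # after: a native coroutine doing the same thing
    async def fetch_new(store):
        return await store.get_thing()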
@@ -30,6 +30,22 @@ from synapse.util.stringutils import is_ascii

 logger = logging.getLogger(__name__)

+# list all text content types that will have the charset default to UTF-8 when
+# none is given
+TEXT_CONTENT_TYPES = [
+    "text/css",
+    "text/csv",
+    "text/html",
+    "text/calendar",
+    "text/plain",
+    "text/javascript",
+    "application/json",
+    "application/ld+json",
+    "application/rtf",
+    "image/svg+xml",
+    "text/xml",
+]
+
+
 def parse_media_id(request):
     try:
@@ -96,7 +112,14 @@ def add_file_headers(request, media_type, file_size, upload_name):
     def _quote(x):
         return urllib.parse.quote(x.encode("utf-8"))

-    request.setHeader(b"Content-Type", media_type.encode("UTF-8"))
+    # Default to a UTF-8 charset for text content types.
+    # ex, uses UTF-8 for 'text/css' but not 'text/css; charset=UTF-16'
+    if media_type.lower() in TEXT_CONTENT_TYPES:
+        content_type = media_type + "; charset=UTF-8"
+    else:
+        content_type = media_type
+
+    request.setHeader(b"Content-Type", content_type.encode("UTF-8"))
     if upload_name:
         # RFC6266 section 4.1 [1] defines both `filename` and `filename*`.
         #
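
The charset rule above is a pure string check, so it is easy to pin down with a couple of cases. A minimal sketch of the same logic (list abbreviated):

    TEXT_CONTENT_TYPES = ["text/css", "text/plain", "application/json"]

    def content_type_with_charset(media_type: str) -> str:
        # only bare text types get the UTF-8 default; an explicit
        # "; charset=..." suffix misses the lookup and is left alone
        if media_type.lower() in TEXT_CONTENT_TYPES:
            return media_type + "; charset=UTF-8"
        return media_type

    assert content_type_with_charset("text/css") == "text/css; charset=UTF-8"
    assert content_type_with_charset("text/css; charset=UTF-16") == "text/css; charset=UTF-16"
    assert content_type_with_charset("image/png") == "image/png"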
@@ -14,7 +14,11 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.

-from synapse.http.server import DirectServeResource, wrap_html_request_handler
+from synapse.http.server import (
+    DirectServeResource,
+    finish_request,
+    wrap_html_request_handler,
+)


 class SAML2ResponseResource(DirectServeResource):
@@ -24,8 +28,20 @@ class SAML2ResponseResource(DirectServeResource):

     def __init__(self, hs):
         super().__init__()
+        self._error_html_content = hs.config.saml2_error_html_content
         self._saml_handler = hs.get_saml_handler()

+    async def _async_render_GET(self, request):
+        # We're not expecting any GET request on this resource if everything goes right,
+        # but some IdPs sometimes end up responding with a 302 redirect on this endpoint.
+        # In this case, just tell the user that something went wrong and they should
+        # try to authenticate again.
+        request.setResponseCode(400)
+        request.setHeader(b"Content-Type", b"text/html; charset=utf-8")
+        request.setHeader(b"Content-Length", b"%d" % (len(self._error_html_content),))
+        request.write(self._error_html_content.encode("utf8"))
+        finish_request(request)
+
     @wrap_html_request_handler
     async def _async_render_POST(self, request):
         return await self._saml_handler.handle_saml_response(request)
@@ -26,7 +26,6 @@ import logging
 import os

 from twisted.mail.smtp import sendmail
-from twisted.web.client import BrowserLikePolicyForHTTPS

 from synapse.api.auth import Auth
 from synapse.api.filtering import Filtering
@@ -35,6 +34,7 @@ from synapse.appservice.api import ApplicationServiceApi
 from synapse.appservice.scheduler import ApplicationServiceScheduler
 from synapse.config.homeserver import HomeServerConfig
 from synapse.crypto import context_factory
+from synapse.crypto.context_factory import RegularPolicyForHTTPS
 from synapse.crypto.keyring import Keyring
 from synapse.events.builder import EventBuilderFactory
 from synapse.events.spamcheck import SpamChecker
@@ -310,7 +310,7 @@ class HomeServer(object):
         return (
             InsecureInterceptableContextFactory()
             if self.config.use_insecure_ssl_client_just_for_testing_do_not_use
-            else BrowserLikePolicyForHTTPS()
+            else RegularPolicyForHTTPS()
         )

     def build_simple_http_client(self):
@@ -420,7 +420,7 @@ class HomeServer(object):
         return PusherPool(self)

     def build_http_client(self):
-        tls_client_options_factory = context_factory.ClientTLSOptionsFactory(
+        tls_client_options_factory = context_factory.FederationPolicyForHTTPS(
             self.config
         )
         return MatrixFederationHttpClient(self, tls_client_options_factory)
@@ -662,28 +662,16 @@ class StateResolutionStore(object):
             allow_rejected=allow_rejected,
         )

-    def get_auth_chain(self, event_ids: List[str], ignore_events: Set[str]):
-        """Gets the full auth chain for a set of events (including rejected
-        events).
-
-        Includes the given event IDs in the result.
-
-        Note that:
-        1. All events must be state events.
-        2. For v1 rooms this may not have the full auth chain in the
-           presence of rejected events
-
-        Args:
-            event_ids: The event IDs of the events to fetch the auth chain for.
-                Must be state events.
-            ignore_events: Set of events to exclude from the returned auth
-                chain.
+    def get_auth_chain_difference(self, state_sets: List[Set[str]]):
+        """Given sets of state events figure out the auth chain difference (as
+        per the state res v2 algorithm).
+
+        This is equivalent to fetching the full auth chain for each set of state
+        and returning the events that don't appear in each and every auth
+        chain.

         Returns:
-            Deferred[list[str]]: List of event IDs of the auth chain.
+            Deferred[Set[str]]: Set of event IDs.
         """

-        return self.store.get_auth_chain_ids(
-            event_ids, include_given=True, ignore_events=ignore_events,
-        )
+        return self.store.get_auth_chain_difference(state_sets)
@@ -227,36 +227,12 @@ def _get_auth_chain_difference(state_sets, event_map, state_res_store):
     Returns:
         Deferred[set[str]]: Set of event IDs
     """
-    common = set(itervalues(state_sets[0])).intersection(
-        *(itervalues(s) for s in state_sets[1:])
-    )
-
-    auth_sets = []
-    for state_set in state_sets:
-        auth_ids = {
-            eid
-            for key, eid in iteritems(state_set)
-            if (
-                key[0] in (EventTypes.Member, EventTypes.ThirdPartyInvite)
-                or key
-                in (
-                    (EventTypes.PowerLevels, ""),
-                    (EventTypes.Create, ""),
-                    (EventTypes.JoinRules, ""),
-                )
-            )
-            and eid not in common
-        }
-
-        auth_chain = yield state_res_store.get_auth_chain(auth_ids, common)
-        auth_ids.update(auth_chain)
-
-        auth_sets.append(auth_ids)
-
-    intersection = set(auth_sets[0]).intersection(*auth_sets[1:])
-    union = set().union(*auth_sets)
+    difference = yield state_res_store.get_auth_chain_difference(
+        [set(state_set.values()) for state_set in state_sets]
+    )

-    return union - intersection
+    return difference


 def _seperate(state_sets):
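
The "auth chain difference" both stores now compute is union-minus-intersection over the full auth chains of each state set. A toy model over a hand-built DAG (event names invented) makes the definition concrete:

    auth_events = {"E": ["D"], "D": ["C"], "C": ["A"], "B": ["A"], "A": []}

    def auth_chain(event: str) -> set:
        # all ancestors reachable via auth_events edges
        out, todo = set(), list(auth_events[event])
        while todo:
            e = todo.pop()
            if e not in out:
                out.add(e)
                todo.extend(auth_events[e])
        return out

    chains = [auth_chain("E"), auth_chain("B")]
    difference = set.union(*chains) - set.intersection(*chains)
    # auth_chain("E") == {"D", "C", "A"}, auth_chain("B") == {"A"},
    # so difference == {"D", "C"}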
@ -15,6 +15,7 @@
|
||||||
# See the License for the specific language governing permissions and
|
# See the License for the specific language governing permissions and
|
||||||
# limitations under the License.
|
# limitations under the License.
|
||||||
import logging
|
import logging
|
||||||
|
from typing import List, Tuple
|
||||||
|
|
||||||
from six import iteritems
|
from six import iteritems
|
||||||
|
|
||||||
|
@ -31,7 +32,7 @@ from synapse.logging.opentracing import (
|
||||||
)
|
)
|
||||||
from synapse.metrics.background_process_metrics import run_as_background_process
|
from synapse.metrics.background_process_metrics import run_as_background_process
|
||||||
from synapse.storage._base import SQLBaseStore, db_to_json, make_in_list_sql_clause
|
from synapse.storage._base import SQLBaseStore, db_to_json, make_in_list_sql_clause
|
||||||
from synapse.storage.database import Database
|
from synapse.storage.database import Database, LoggingTransaction
|
||||||
from synapse.types import Collection, get_verify_key_from_cross_signing_key
|
from synapse.types import Collection, get_verify_key_from_cross_signing_key
|
||||||
from synapse.util.caches.descriptors import (
|
from synapse.util.caches.descriptors import (
|
||||||
Cache,
|
Cache,
|
||||||
|
@ -574,10 +575,12 @@ class DeviceWorkerStore(SQLBaseStore):
|
||||||
else:
|
else:
|
||||||
return set()
|
return set()
|
||||||
|
|
||||||
def get_all_device_list_changes_for_remotes(self, from_key, to_key, limit):
|
async def get_all_device_list_changes_for_remotes(
|
||||||
"""Return a list of `(stream_id, user_id, destination)` which is the
|
self, from_key: int, to_key: int, limit: int,
|
||||||
combined list of changes to devices, and which destinations need to be
|
) -> List[Tuple[int, str]]:
|
||||||
poked. `destination` may be None if no destinations need to be poked.
|
"""Return a list of `(stream_id, entity)` which is the combined list of
|
||||||
|
changes to devices and which destinations need to be poked. Entity is
|
||||||
|
either a user ID (starting with '@') or a remote destination.
|
||||||
"""
|
"""
|
||||||
|
|
||||||
# This query Does The Right Thing where it'll correctly apply the
|
# This query Does The Right Thing where it'll correctly apply the
|
||||||
|
@ -592,7 +595,7 @@ class DeviceWorkerStore(SQLBaseStore):
|
||||||
LIMIT ?
|
LIMIT ?
|
||||||
"""
|
"""
|
||||||
|
|
||||||
return self.db.execute(
|
return await self.db.execute(
|
||||||
"get_all_device_list_changes_for_remotes",
|
"get_all_device_list_changes_for_remotes",
|
||||||
None,
|
None,
|
||||||
sql,
|
sql,
|
||||||
|
@@ -1024,11 +1027,19 @@ class DeviceStore(DeviceWorkerStore, DeviceBackgroundUpdateStore):
 
         return stream_ids[-1]
 
-    def _add_device_change_to_stream_txn(self, txn, user_id, device_ids, stream_ids):
+    def _add_device_change_to_stream_txn(
+        self,
+        txn: LoggingTransaction,
+        user_id: str,
+        device_ids: Collection[str],
+        stream_ids: List[str],
+    ):
         txn.call_after(
             self._device_list_stream_cache.entity_has_changed, user_id, stream_ids[-1],
         )
 
+        min_stream_id = stream_ids[0]
+
         # Delete older entries in the table, as we really only care about
         # when the latest change happened.
         txn.executemany(
@@ -1036,7 +1047,7 @@ class DeviceStore(DeviceWorkerStore, DeviceBackgroundUpdateStore):
             DELETE FROM device_lists_stream
             WHERE user_id = ? AND device_id = ? AND stream_id < ?
             """,
-            [(user_id, device_id, stream_ids[0]) for device_id in device_ids],
+            [(user_id, device_id, min_stream_id) for device_id in device_ids],
         )
 
         self.db.simple_insert_many_txn(
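The new `min_stream_id` local makes the intent of the DELETE explicit: per device, only rows at or above the smallest stream id in the new batch survive, since only the time of the latest change matters. A toy model with made-up rows:

```python
# Illustrative sketch of the pruning invariant, not Synapse code.
stream_ids = [10, 11, 12]
min_stream_id = stream_ids[0]

table = [("DEV1", 3), ("DEV1", 7), ("DEV2", 12)]  # (device_id, stream_id)
table = [row for row in table if row[1] >= min_stream_id]

assert table == [("DEV2", 12)]
```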
@@ -14,7 +14,7 @@
 # limitations under the License.
 import itertools
 import logging
-from typing import List, Optional, Set
+from typing import Dict, List, Optional, Set, Tuple
 
 from six.moves.queue import Empty, PriorityQueue
 
@@ -103,6 +103,154 @@ class EventFederationWorkerStore(EventsWorkerStore, SignatureWorkerStore, SQLBas
 
         return list(results)
 
+    def get_auth_chain_difference(self, state_sets: List[Set[str]]):
+        """Given sets of state events figure out the auth chain difference (as
+        per state res v2 algorithm).
+
+        This is equivalent to fetching the full auth chain for each set of state
+        and returning the events that don't appear in each and every auth
+        chain.
+
+        Returns:
+            Deferred[Set[str]]
+        """
+
+        return self.db.runInteraction(
+            "get_auth_chain_difference",
+            self._get_auth_chain_difference_txn,
+            state_sets,
+        )
+
+    def _get_auth_chain_difference_txn(
+        self, txn, state_sets: List[Set[str]]
+    ) -> Set[str]:
+
+        # Algorithm Description
+        # ~~~~~~~~~~~~~~~~~~~~~
+        #
+        # The idea here is to basically walk the auth graph of each state set in
+        # tandem, keeping track of which auth events are reachable by each state
+        # set. If we reach an auth event we've already visited (via a different
+        # state set) then we mark that auth event and all ancestors as reachable
+        # by the state set. This requires that we keep track of the auth chains
+        # in memory.
+        #
+        # Doing it in such a way means that we can stop early if all auth
+        # events we're currently walking are reachable by all state sets.
+        #
+        # *Note*: We can't stop walking an event's auth chain if it is reachable
+        # by all state sets. This is because other auth chains we're walking
+        # might be reachable only via the original auth chain. For example,
+        # given the following auth chain:
+        #
+        #       A -> C -> D -> E
+        #           /         /
+        #       B -´---------´
+        #
+        # and state sets {A} and {B} then walking the auth chains of A and B
+        # would immediately show that C is reachable by both. However, if we
+        # stopped at C then we'd only reach E via the auth chain of B and so E
+        # would erroneously get included in the returned difference.
+        #
+        # The other thing that we do is limit the number of auth chains we walk
+        # at once, due to practical limits (i.e. we can only query the database
+        # with a limited set of parameters). We pick the auth chains we walk
+        # each iteration based on their depth, in the hope that events with a
+        # lower depth are likely reachable by those with higher depths.
+        #
+        # We could use any ordering that we believe would give a rough
+        # topological ordering, e.g. origin server timestamp. If the ordering
+        # chosen is not topological then the algorithm still produces the right
+        # result, but perhaps a bit more inefficiently. This is why it is safe
+        # to use "depth" here.
+
+        initial_events = set(state_sets[0]).union(*state_sets[1:])
+
+        # Dict from events in auth chains to which sets *cannot* reach them.
+        # I.e. if the set is empty then all sets can reach the event.
+        event_to_missing_sets = {
+            event_id: {i for i, a in enumerate(state_sets) if event_id not in a}
+            for event_id in initial_events
+        }
+
+        # We need to get the depth of the initial events for sorting purposes.
+        sql = """
+            SELECT depth, event_id FROM events
+            WHERE %s
+            ORDER BY depth ASC
+        """
+        clause, args = make_in_list_sql_clause(
+            txn.database_engine, "event_id", initial_events
+        )
+        txn.execute(sql % (clause,), args)
+
+        # The sorted list of events whose auth chains we should walk.
+        search = txn.fetchall()  # type: List[Tuple[int, str]]
+
+        # Map from event to its auth events
+        event_to_auth_events = {}  # type: Dict[str, Set[str]]
+
+        base_sql = """
+            SELECT a.event_id, auth_id, depth
+            FROM event_auth AS a
+            INNER JOIN events AS e ON (e.event_id = a.auth_id)
+            WHERE
+        """
+
+        while search:
+            # Check whether all our current walks are reachable by all state
+            # sets. If so we can bail.
+            if all(not event_to_missing_sets[eid] for _, eid in search):
+                break
+
+            # Fetch the auth events and their depths of the N last events we're
+            # currently walking
+            search, chunk = search[:-100], search[-100:]
+            clause, args = make_in_list_sql_clause(
+                txn.database_engine, "a.event_id", [e_id for _, e_id in chunk]
+            )
+            txn.execute(base_sql + clause, args)
+
+            for event_id, auth_event_id, auth_event_depth in txn:
+                event_to_auth_events.setdefault(event_id, set()).add(auth_event_id)
+
+                sets = event_to_missing_sets.get(auth_event_id)
+                if sets is None:
+                    # First time we're seeing this event, so we add it to the
+                    # queue of things to fetch.
+                    search.append((auth_event_depth, auth_event_id))
+
+                    # Assume that this event is unreachable from any of the
+                    # state sets until proven otherwise
+                    sets = event_to_missing_sets[auth_event_id] = set(
+                        range(len(state_sets))
+                    )
+                else:
+                    # We've previously seen this event, so look up its auth
+                    # events and recursively mark all ancestors as reachable
+                    # by the current event's state set.
+                    a_ids = event_to_auth_events.get(auth_event_id)
+                    while a_ids:
+                        new_aids = set()
+                        for a_id in a_ids:
+                            event_to_missing_sets[a_id].intersection_update(
+                                event_to_missing_sets[event_id]
+                            )
+
+                            b = event_to_auth_events.get(a_id)
+                            if b:
+                                new_aids.update(b)
+
+                        a_ids = new_aids
+
+                # Mark that the auth event is reachable by the appropriate sets.
+                sets.intersection_update(event_to_missing_sets[event_id])
+
+            search.sort()
+
+        # Return all events where not all sets can reach them.
+        return {eid for eid, n in event_to_missing_sets.items() if n}
+
     def get_oldest_events_in_room(self, room_id):
         return self.db.runInteraction(
             "get_oldest_events_in_room", self._get_oldest_events_in_room_txn, room_id
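The docstring defines the difference declaratively, while the transaction computes it incrementally with early exit and chunked queries. Below is a brute-force sketch of that declarative definition, run on the toy graph from the comment above; the event names are illustrative and this is not the Synapse implementation, just the spec the incremental walk must agree with:

```python
# Edges point from an event to its auth events: A and B both point at C,
# and B also points directly at E.
auth_graph = {
    "A": {"C"},
    "B": {"C", "E"},
    "C": {"D"},
    "D": {"E"},
    "E": set(),
}

def full_auth_chain(events):
    """The events themselves plus everything reachable in the auth graph."""
    chain, stack = set(events), list(events)
    while stack:
        for parent in auth_graph[stack.pop()]:
            if parent not in chain:
                chain.add(parent)
                stack.append(parent)
    return chain

def auth_chain_difference(state_sets):
    chains = [full_auth_chain(s) for s in state_sets]
    return set.union(*chains) - set.intersection(*chains)

# C, D and E are reachable from both state sets, so only A and B differ.
assert auth_chain_difference([{"A"}, {"B"}]) == {"A", "B"}
```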
@@ -608,6 +608,23 @@ class EventPushActionsWorkerStore(SQLBaseStore):
 
         return range_end
 
+    @defer.inlineCallbacks
+    def get_time_of_last_push_action_before(self, stream_ordering):
+        def f(txn):
+            sql = (
+                "SELECT e.received_ts"
+                " FROM event_push_actions AS ep"
+                " JOIN events e ON ep.room_id = e.room_id AND ep.event_id = e.event_id"
+                " WHERE ep.stream_ordering > ?"
+                " ORDER BY ep.stream_ordering ASC"
+                " LIMIT 1"
+            )
+            txn.execute(sql, (stream_ordering,))
+            return txn.fetchone()
+
+        result = yield self.db.runInteraction("get_time_of_last_push_action_before", f)
+        return result[0] if result else None
+
 
 class EventPushActionsStore(EventPushActionsWorkerStore):
     EPA_HIGHLIGHT_INDEX = "epa_highlight_index"
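The method is moved verbatim into the worker store (and removed from `EventPushActionsStore` in the next hunk) so it is available on worker processes. A toy model of what the query computes, with illustrative data already sorted by stream ordering:

```python
# (stream_ordering, received_ts) pairs; illustrative only, not Synapse code.
push_actions = [(1, 1000), (5, 2000), (9, 3000)]

def time_of_last_push_action_before(stream_ordering):
    # received_ts of the first push action strictly after the given ordering
    later = [ts for so, ts in push_actions if so > stream_ordering]
    return later[0] if later else None

assert time_of_last_push_action_before(4) == 2000
assert time_of_last_push_action_before(9) is None
```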
@@ -735,23 +752,6 @@ class EventPushActionsStore(EventPushActionsWorkerStore):
             pa["actions"] = _deserialize_action(pa["actions"], pa["highlight"])
         return push_actions
 
-    @defer.inlineCallbacks
-    def get_time_of_last_push_action_before(self, stream_ordering):
-        def f(txn):
-            sql = (
-                "SELECT e.received_ts"
-                " FROM event_push_actions AS ep"
-                " JOIN events e ON ep.room_id = e.room_id AND ep.event_id = e.event_id"
-                " WHERE ep.stream_ordering > ?"
-                " ORDER BY ep.stream_ordering ASC"
-                " LIMIT 1"
-            )
-            txn.execute(sql, (stream_ordering,))
-            return txn.fetchone()
-
-        result = yield self.db.runInteraction("get_time_of_last_push_action_before", f)
-        return result[0] if result else None
-
     @defer.inlineCallbacks
     def get_latest_push_action_stream_ordering(self):
         def f(txn):
@@ -1168,7 +1168,11 @@ class EventsStore(
             and original_event.internal_metadata.is_redacted()
         ):
             # Redaction was allowed
-            pruned_json = encode_json(prune_event_dict(original_event.get_dict()))
+            pruned_json = encode_json(
+                prune_event_dict(
+                    original_event.room_version, original_event.get_dict()
+                )
+            )
         else:
             # Redaction wasn't allowed
             pruned_json = None
@@ -1929,7 +1933,9 @@ class EventsStore(
             return
 
         # Prune the event's dict then convert it to JSON.
-        pruned_json = encode_json(prune_event_dict(event.get_dict()))
+        pruned_json = encode_json(
+            prune_event_dict(event.room_version, event.get_dict())
+        )
 
         # Update the event_json table to replace the event's JSON with the pruned
         # JSON.
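Both call sites now thread the event's room version into `prune_event_dict`, since the keys a redacted event retains can differ between room versions. A hedged sketch of the shape of a version-aware pruner; the allow-list and version branch below are purely illustrative, not the real redaction rules:

```python
def prune_event_dict(room_version_id, event_dict):
    # Illustrative only: the keys kept after redaction vary by room version.
    allowed = {"event_id", "type", "room_id", "sender", "state_key"}
    if room_version_id == "1":
        allowed |= {"membership"}  # hypothetical version-specific extra key
    return {k: v for k, v in event_dict.items() if k in allowed}

redacted = prune_event_dict(
    "1", {"type": "m.room.member", "membership": "join", "secret": "x"}
)
assert "secret" not in redacted and "membership" in redacted
```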
@@ -28,9 +28,12 @@ from twisted.internet import defer
 
 from synapse.api.constants import EventTypes
 from synapse.api.errors import NotFoundError
-from synapse.api.room_versions import EventFormatVersions
-from synapse.events import FrozenEvent, event_type_from_format_version  # noqa: F401
-from synapse.events.snapshot import EventContext  # noqa: F401
+from synapse.api.room_versions import (
+    KNOWN_ROOM_VERSIONS,
+    EventFormatVersions,
+    RoomVersions,
+)
+from synapse.events import make_event_from_dict
 from synapse.events.utils import prune_event
 from synapse.logging.context import LoggingContext, PreserveLoggingContext
 from synapse.metrics.background_process_metrics import run_as_background_process
@@ -580,8 +583,49 @@ class EventsWorkerStore(SQLBaseStore):
             # of an event format version, so it must be a V1 event.
             format_version = EventFormatVersions.V1
 
-            original_ev = event_type_from_format_version(format_version)(
+            room_version_id = row["room_version_id"]
+
+            if not room_version_id:
+                # this should only happen for out-of-band membership events
+                if not internal_metadata.get("out_of_band_membership"):
+                    logger.warning(
+                        "Room %s for event %s is unknown", d["room_id"], event_id
+                    )
+                    continue
+
+                # take a wild stab at the room version based on the event format
+                if format_version == EventFormatVersions.V1:
+                    room_version = RoomVersions.V1
+                elif format_version == EventFormatVersions.V2:
+                    room_version = RoomVersions.V3
+                else:
+                    room_version = RoomVersions.V5
+            else:
+                room_version = KNOWN_ROOM_VERSIONS.get(room_version_id)
+                if not room_version:
+                    logger.error(
+                        "Event %s in room %s has unknown room version %s",
+                        event_id,
+                        d["room_id"],
+                        room_version_id,
+                    )
+                    continue
+
+                if room_version.event_format != format_version:
+                    logger.error(
+                        "Event %s in room %s with version %s has wrong format: "
+                        "expected %s, was %s",
+                        event_id,
+                        d["room_id"],
+                        room_version_id,
+                        room_version.event_format,
+                        format_version,
+                    )
+                    continue
+
+            original_ev = make_event_from_dict(
                 event_dict=d,
+                room_version=room_version,
                 internal_metadata_dict=internal_metadata,
                 rejected_reason=rejected_reason,
             )
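For events with no known room (out-of-band memberships), the code guesses a room version from the event format. A condensed sketch of that fallback, using version strings in place of the `RoomVersions` constants:

```python
# Condensed restatement of the fallback above: format V1 events are assumed
# to come from room version 1, format V2 events from room version 3, and
# anything newer from room version 5.
def guess_room_version(event_format_version: int) -> str:
    if event_format_version == 1:
        return "1"
    if event_format_version == 2:
        return "3"
    return "5"

assert guess_room_version(1) == "1"
assert guess_room_version(3) == "5"
```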
@@ -661,6 +705,12 @@ class EventsWorkerStore(SQLBaseStore):
            of EventFormatVersions. 'None' means the event predates
            EventFormatVersions (so the event is format V1).
 
+         * room_version_id (str|None): The version of the room which contains the event.
+           Hopefully one of RoomVersions.
+
+           Due to historical reasons, there may be a few events in the database which
+           do not have an associated room; in this case None will be returned here.
+
          * rejected_reason (str|None): if the event was rejected, the reason
            why.
@@ -676,17 +726,18 @@ class EventsWorkerStore(SQLBaseStore):
         """
         event_dict = {}
         for evs in batch_iter(event_ids, 200):
-            sql = (
-                "SELECT "
-                " e.event_id, "
-                " e.internal_metadata,"
-                " e.json,"
-                " e.format_version, "
-                " rej.reason "
-                " FROM event_json as e"
-                " LEFT JOIN rejections as rej USING (event_id)"
-                " WHERE "
-            )
+            sql = """\
+                SELECT
+                  e.event_id,
+                  e.internal_metadata,
+                  e.json,
+                  e.format_version,
+                  r.room_version,
+                  rej.reason
+                FROM event_json as e
+                  LEFT JOIN rooms r USING (room_id)
+                  LEFT JOIN rejections as rej USING (event_id)
+                WHERE """
 
             clause, args = make_in_list_sql_clause(
                 txn.database_engine, "e.event_id", evs
@@ -701,7 +752,8 @@ class EventsWorkerStore(SQLBaseStore):
                 "internal_metadata": row[1],
                 "json": row[2],
                 "format_version": row[3],
-                "rejected_reason": row[4],
+                "room_version_id": row[4],
+                "rejected_reason": row[5],
                 "redactions": [],
             }
 
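The index changes in the second hunk follow mechanically from the first: adding `r.room_version` to the SELECT shifts every later column down by one. An illustrative row in the new column order:

```python
# Made-up values in the new column order of the query above.
row = ("$event1:example.com", "{}", "{}", 2, "5", None)
event = {
    "event_id": row[0],
    "internal_metadata": row[1],
    "json": row[2],
    "format_version": row[3],
    "room_version_id": row[4],  # new column
    "rejected_reason": row[5],  # previously row[4]
    "redactions": [],
}
```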
@@ -43,13 +43,40 @@ class MonthlyActiveUsersWorkerStore(SQLBaseStore):
 
         def _count_users(txn):
             sql = "SELECT COALESCE(count(*), 0) FROM monthly_active_users"
 
             txn.execute(sql)
             (count,) = txn.fetchone()
             return count
 
         return self.db.runInteraction("count_users", _count_users)
 
+    @cached(num_args=0)
+    def get_monthly_active_count_by_service(self):
+        """Generates the current count of monthly active users broken down by service.
+        A service is typically an appservice but also includes native matrix users.
+        Since the `monthly_active_users` table is populated from the `user_ips` table,
+        `config.track_appservice_user_ips` must be set to `true` for this
+        method to return anything other than native matrix users.
+
+        Returns:
+            Deferred[dict]: dict that includes a mapping between app_service_id
+                and the number of occurrences.
+
+        """
+
+        def _count_users_by_service(txn):
+            sql = """
+                SELECT COALESCE(appservice_id, 'native'), COALESCE(count(*), 0)
+                FROM monthly_active_users
+                LEFT JOIN users ON monthly_active_users.user_id=users.name
+                GROUP BY appservice_id;
+            """
+
+            txn.execute(sql)
+            result = txn.fetchall()
+            return dict(result)
+
+        return self.db.runInteraction("count_users_by_service", _count_users_by_service)
+
     @defer.inlineCallbacks
     def get_registered_reserved_users(self):
         """Of the reserved threepids defined in config, which are associated
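The `COALESCE(appservice_id, 'native')` means ordinary users land under a literal `'native'` key next to any appservices. An illustrative result shape (the ids and counts are made up):

```python
# What the returned dict might look like; not real data.
counts = {"native": 23, "anotherappservice": 2}
assert sum(counts.values()) == 25  # total monthly active users
```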
@@ -291,6 +318,9 @@ class MonthlyActiveUsersStore(MonthlyActiveUsersWorkerStore):
         )
 
         self._invalidate_cache_and_stream(txn, self.get_monthly_active_count, ())
+        self._invalidate_cache_and_stream(
+            txn, self.get_monthly_active_count_by_service, ()
+        )
         self._invalidate_cache_and_stream(
             txn, self.user_last_seen_monthly_active, (user_id,)
         )
@@ -0,0 +1,39 @@
+/* Copyright 2020 The Matrix.org Foundation C.I.C.
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+-- When we first added the room_version column to the rooms table, it was populated from
+-- the current_state_events table. However, there was an issue causing a background
+-- update to clean up the current_state_events table for rooms where the server is no
+-- longer participating, before that column could be populated. Therefore, some rooms had
+-- a NULL room_version.
+
+-- The rooms_version_column_2.sql.* delta files were introduced to make the populating
+-- synchronous instead of running it in a background update, which fixed this issue.
+-- However, all of the instances of Synapse installed or updated in the meantime got
+-- their rooms table corrupted with NULL room_versions.
+
+-- This query fishes out the room versions from the create event using the state_events
+-- table instead of the current_state_events one, as the former still has all of the
+-- create events.
+
+UPDATE rooms SET room_version=(
+    SELECT COALESCE(json::json->'content'->>'room_version','1')
+    FROM state_events se INNER JOIN event_json ej USING (event_id)
+    WHERE se.room_id=rooms.room_id AND se.type='m.room.create' AND se.state_key=''
+    LIMIT 1
+) WHERE rooms.room_version IS NULL;
+
+-- see also rooms_version_column_3.sql.sqlite which has a copy of the above query, using
+-- sqlite syntax for the json extraction.
@@ -0,0 +1,23 @@
+/* Copyright 2020 The Matrix.org Foundation C.I.C.
+ *
+ * Licensed under the Apache License, Version 2.0 (the "License");
+ * you may not use this file except in compliance with the License.
+ * You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+-- see rooms_version_column_3.sql.postgres for details of what's going on here.
+
+UPDATE rooms SET room_version=(
+    SELECT COALESCE(json_extract(ej.json, '$.content.room_version'), '1')
+    FROM state_events se INNER JOIN event_json ej USING (event_id)
+    WHERE se.room_id=rooms.room_id AND se.type='m.room.create' AND se.state_key=''
+    LIMIT 1
+) WHERE rooms.room_version IS NULL;
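Both deltas pull the room version out of each room's create event, defaulting to `'1'` for rooms that predate room versioning. A toy check of that extraction in plain Python, with an illustrative event:

```python
# Illustrative create events only; the real deltas do this in SQL.
create_event = {"type": "m.room.create", "content": {"room_version": "5"}}
room_version = create_event["content"].get("room_version", "1")
assert room_version == "5"

legacy_create_event = {"type": "m.room.create", "content": {}}
assert legacy_create_event["content"].get("room_version", "1") == "1"
```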
@@ -15,6 +15,8 @@
 
 import logging
 
+from twisted.internet import defer
+
 from synapse.storage._base import SQLBaseStore
 
 logger = logging.getLogger(__name__)
@@ -56,7 +58,7 @@ class StateDeltasStore(SQLBaseStore):
             # if the CSDs haven't changed between prev_stream_id and now, we
             # know for certain that they haven't changed between prev_stream_id and
             # max_stream_id.
-            return max_stream_id, []
+            return defer.succeed((max_stream_id, []))
 
         def get_current_state_deltas_txn(txn):
             # First we calculate the max stream id that will give us less than
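The fix makes the early-return branch hand back a Deferred, matching the `runInteraction` branch below it. A minimal sketch of the bug class, assuming a toy function in place of the real method:

```python
from twisted.internet import defer

def get_deltas(changed, max_stream_id):
    # Before the fix, the fast path returned a plain tuple while the slow
    # path returned a Deferred, so callers awaiting the result broke on the
    # fast path. defer.succeed makes both paths uniform.
    if not changed:
        return defer.succeed((max_stream_id, []))
    return defer.succeed((max_stream_id, ["delta"]))  # stands in for runInteraction
```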
@@ -29,7 +29,11 @@ from twisted.internet import defer
 
 from synapse.api.errors import StoreError
 from synapse.config.database import DatabaseConnectionConfig
-from synapse.logging.context import LoggingContext, make_deferred_yieldable
+from synapse.logging.context import (
+    LoggingContext,
+    LoggingContextOrSentinel,
+    make_deferred_yieldable,
+)
 from synapse.metrics.background_process_metrics import run_as_background_process
 from synapse.storage.background_updates import BackgroundUpdater
 from synapse.storage.engines import BaseDatabaseEngine, PostgresEngine, Sqlite3Engine
@@ -543,7 +547,9 @@ class Database(object):
         Returns:
             Deferred: The result of func
         """
-        parent_context = LoggingContext.current_context()
+        parent_context = (
+            LoggingContext.current_context()
+        )  # type: Optional[LoggingContextOrSentinel]
         if parent_context == LoggingContext.sentinel:
             logger.warning(
                 "Starting db connection from sentinel context: metrics will be lost"
@@ -23,7 +23,7 @@ import attr
 from signedjson.key import decode_verify_key_bytes
 from unpaddedbase64 import decode_base64
 
-from synapse.api.errors import SynapseError
+from synapse.api.errors import Codes, SynapseError
 
 # define a version of typing.Collection that works on python 3.5
 if sys.version_info[:3] >= (3, 6, 0):
@@ -166,11 +166,13 @@ class DomainSpecificString(namedtuple("DomainSpecificString", ("localpart", "dom
         return self
 
     @classmethod
-    def from_string(cls, s):
+    def from_string(cls, s: str):
         """Parse the string given by 's' into a structure object."""
         if len(s) < 1 or s[0:1] != cls.SIGIL:
             raise SynapseError(
-                400, "Expected %s string to start with '%s'" % (cls.__name__, cls.SIGIL)
+                400,
+                "Expected %s string to start with '%s'" % (cls.__name__, cls.SIGIL),
+                Codes.INVALID_PARAM,
             )
 
         parts = s[1:].split(":", 1)
@@ -179,6 +181,7 @@ class DomainSpecificString(namedtuple("DomainSpecificString", ("localpart", "dom
                 400,
                 "Expected %s of the form '%slocalname:domain'"
                 % (cls.__name__, cls.SIGIL),
+                Codes.INVALID_PARAM,
             )
 
         domain = parts[1]
@@ -235,11 +238,13 @@ class GroupID(DomainSpecificString):
     def from_string(cls, s):
         group_id = super(GroupID, cls).from_string(s)
         if not group_id.localpart:
-            raise SynapseError(400, "Group ID cannot be empty")
+            raise SynapseError(400, "Group ID cannot be empty", Codes.INVALID_PARAM)
 
         if contains_invalid_mxid_characters(group_id.localpart):
             raise SynapseError(
-                400, "Group ID can only contain characters a-z, 0-9, or '=_-./'"
+                400,
+                "Group ID can only contain characters a-z, 0-9, or '=_-./'",
+                Codes.INVALID_PARAM,
            )
 
         return group_id
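The net effect across these hunks is that malformed identifiers now fail with the machine-readable errcode `Codes.INVALID_PARAM` (`M_INVALID_PARAM`) rather than a bare 400. A self-contained sketch with stand-in classes; the real `SynapseError`/`Codes` are assumed to behave roughly like this:

```python
class SynapseError(Exception):
    """Stand-in for Synapse's error type: HTTP status plus a Matrix errcode."""
    def __init__(self, code, msg, errcode="M_UNKNOWN"):
        super().__init__(msg)
        self.code = code
        self.errcode = errcode

def parse_user_id(s):
    # Toy version of DomainSpecificString.from_string with SIGIL '@'.
    if not s.startswith("@"):
        raise SynapseError(
            400, "Expected UserID string to start with '@'", "M_INVALID_PARAM"
        )
    localpart, _, domain = s[1:].partition(":")
    return localpart, domain

try:
    parse_user_id("alice:example.com")  # missing sigil
    raise AssertionError("should have rejected the missing sigil")
except SynapseError as e:
    assert e.errcode == "M_INVALID_PARAM"
```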
Some files were not shown because too many files have changed in this diff.