Synapse 0.34.0rc1 (2018-12-04)
==============================

Synapse 0.34 is the first release to fully support Python 3. We recommend
upgrading to Python 3, but make sure to read the
[upgrade notes](UPGRADE.rst#upgrading-to-v0340) when doing so.

Features
--------

- Add option to track MAU stats (but not limit people) ([\#3830](https://github.com/matrix-org/synapse/issues/3830))
- Add an option to enable recording IPs for appservice users ([\#3831](https://github.com/matrix-org/synapse/issues/3831))
- Rename login type m.login.cas to m.login.sso ([\#4220](https://github.com/matrix-org/synapse/issues/4220))
- Add an option to disable search for homeservers that may not be interested in it. ([\#4230](https://github.com/matrix-org/synapse/issues/4230))

Bugfixes
--------

- Pushrules can now again be made with non-ASCII rule IDs. ([\#4165](https://github.com/matrix-org/synapse/issues/4165))
- The media repository now no longer fails to decode UTF-8 filenames when downloading remote media. ([\#4176](https://github.com/matrix-org/synapse/issues/4176))
- URL previews now correctly decode non-UTF-8 text if the header contains a `<meta http-equiv="Content-Type"` header. ([\#4183](https://github.com/matrix-org/synapse/issues/4183))
- Fix an issue where public consent URLs had two slashes. ([\#4192](https://github.com/matrix-org/synapse/issues/4192))
- Fallback auth now accepts the session parameter on Python 3. ([\#4197](https://github.com/matrix-org/synapse/issues/4197))
- Remove riot.im from the list of trusted Identity Servers in the default configuration ([\#4207](https://github.com/matrix-org/synapse/issues/4207))
- fix start up failure when mau_limit_reserved_threepids set and db is postgres ([\#4211](https://github.com/matrix-org/synapse/issues/4211))
- Fix auto join failures for servers that require user consent ([\#4223](https://github.com/matrix-org/synapse/issues/4223))
- Fix exception caused by non-ascii event IDs ([\#4241](https://github.com/matrix-org/synapse/issues/4241))
- Pushers can now be unsubscribed from on Python 3. ([\#4250](https://github.com/matrix-org/synapse/issues/4250))
- Fix UnicodeDecodeError when postgres is configured to give non-English errors ([\#4253](https://github.com/matrix-org/synapse/issues/4253))

Internal Changes
----------------

- A coveragerc file, as well as the py36-coverage tox target, have been added. ([\#4180](https://github.com/matrix-org/synapse/issues/4180))
- Add a GitHub pull request template and add multiple issue templates ([\#4182](https://github.com/matrix-org/synapse/issues/4182))
- Update README to reflect the fact that #1491 is fixed ([\#4188](https://github.com/matrix-org/synapse/issues/4188))
- Run the AS senders as background processes to fix warnings ([\#4189](https://github.com/matrix-org/synapse/issues/4189))
- Add some diagnostics to the tests to detect logcontext problems ([\#4190](https://github.com/matrix-org/synapse/issues/4190))
- Add missing `jpeg` package prerequisite for OpenBSD in README. ([\#4193](https://github.com/matrix-org/synapse/issues/4193))
- Add a note saying you need to manually reclaim disk space after using the Purge History API ([\#4200](https://github.com/matrix-org/synapse/issues/4200))
- More logcontext checking in unittests ([\#4205](https://github.com/matrix-org/synapse/issues/4205))
- Ignore __pycache__ directories in the database schema folder ([\#4214](https://github.com/matrix-org/synapse/issues/4214))
- Add note to UPGRADE.rst about removing riot.im from list of trusted identity servers ([\#4224](https://github.com/matrix-org/synapse/issues/4224))
- Added automated coverage reporting to CI. ([\#4225](https://github.com/matrix-org/synapse/issues/4225))
- Garbage-collect after each unit test to fix logcontext leaks ([\#4227](https://github.com/matrix-org/synapse/issues/4227))
- add more detail to logging regarding "More than one row matched" error ([\#4234](https://github.com/matrix-org/synapse/issues/4234))
- Drop sent_transactions table ([\#4244](https://github.com/matrix-org/synapse/issues/4244))
- Add a basic .editorconfig ([\#4257](https://github.com/matrix-org/synapse/issues/4257))
- Update README.rst and UPGRADE.rst for Python 3. ([\#4260](https://github.com/matrix-org/synapse/issues/4260))
- Remove obsolete `verbose` and `log_file` settings from `homeserver.yaml` for Docker image. ([\#4261](https://github.com/matrix-org/synapse/issues/4261))

-----BEGIN PGP SIGNATURE-----

iQFHBAABCgAxFiEEQlNDQm4FMsm53u1sih+T1XW16NUFAlwGiNcTHHJpY2hhcmRA
bWF0cml4Lm9yZwAKCRCKH5PVdbXo1VlPCACUm4y2xcK/QziI0VF4q8dzsctwwOkP
hTqWeCkQtCkC2igSiP6XFhXlW8AjP1yQ7s4Hk30ekM67BJUdXjnyvlrhpKBsnDgh
FWnwTjAeZ2nuE7Cp4/R1AvVGB4X4mGVlUNHW7NtPmUkhxxV829FicaPIkPXLicZw
EMoqCkyEmToWkI8juX+TOR2S4MV/nh4qvIKe/fMK4KGiAdAp4dZjFM+BimqcrmJW
QfdvmZma+iZ1CTiHkBYOpofXXmuqftpSJup6gUvSskJjESlgGi2qeJ9I6Ck/fskB
Rllxwva4DzamcwqQeEinS3hPojxxdWYxd7A3SmX+s3dvikKSN/UTmgrV
=lb0d
-----END PGP SIGNATURE-----

Merge tag 'v0.34.0rc1' into matrix-org-hotfixes

commit 002db39a36
.circleci/config.yml
@@ -4,8 +4,8 @@ jobs:
     machine: true
     steps:
       - checkout
-      - run: docker build -f docker/Dockerfile -t matrixdotorg/synapse:${CIRCLE_TAG} .
-      - run: docker build -f docker/Dockerfile -t matrixdotorg/synapse:${CIRCLE_TAG}-py3 --build-arg PYTHON_VERSION=3.6 .
+      - run: docker build -f docker/Dockerfile --label gitsha1=${CIRCLE_SHA1} -t matrixdotorg/synapse:${CIRCLE_TAG} .
+      - run: docker build -f docker/Dockerfile --label gitsha1=${CIRCLE_SHA1} -t matrixdotorg/synapse:${CIRCLE_TAG}-py3 --build-arg PYTHON_VERSION=3.6 .
       - run: docker login --username $DOCKER_HUB_USERNAME --password $DOCKER_HUB_PASSWORD
       - run: docker push matrixdotorg/synapse:${CIRCLE_TAG}
       - run: docker push matrixdotorg/synapse:${CIRCLE_TAG}-py3
@@ -13,13 +13,9 @@ jobs:
     machine: true
     steps:
       - checkout
-      - run: docker build -f docker/Dockerfile -t matrixdotorg/synapse:${CIRCLE_SHA1} .
-      - run: docker build -f docker/Dockerfile -t matrixdotorg/synapse:${CIRCLE_SHA1}-py3 --build-arg PYTHON_VERSION=3.6 .
+      - run: docker build -f docker/Dockerfile --label gitsha1=${CIRCLE_SHA1} -t matrixdotorg/synapse:${CIRCLE_SHA1} .
+      - run: docker build -f docker/Dockerfile --label gitsha1=${CIRCLE_SHA1} -t matrixdotorg/synapse:${CIRCLE_SHA1}-py3 --build-arg PYTHON_VERSION=3.6 .
       - run: docker login --username $DOCKER_HUB_USERNAME --password $DOCKER_HUB_PASSWORD
-      - run: docker tag matrixdotorg/synapse:${CIRCLE_SHA1} matrixdotorg/synapse:latest
-      - run: docker tag matrixdotorg/synapse:${CIRCLE_SHA1}-py3 matrixdotorg/synapse:latest-py3
-      - run: docker push matrixdotorg/synapse:${CIRCLE_SHA1}
-      - run: docker push matrixdotorg/synapse:${CIRCLE_SHA1}-py3
       - run: docker push matrixdotorg/synapse:latest
       - run: docker push matrixdotorg/synapse:latest-py3
   sytestpy2:
.editorconfig
@@ -0,0 +1,9 @@
+# EditorConfig https://EditorConfig.org
+
+# top-most EditorConfig file
+root = true
+
+# 4 space indentation
+[*.py]
+indent_style = space
+indent_size = 4
.travis.yml
@@ -36,24 +36,24 @@ matrix:
     env: TOX_ENV="pep8,check_isort"

   - python: 2.7
-    env: TOX_ENV=py27 TRIAL_FLAGS="-j 2"
+    env: TOX_ENV=py27,codecov TRIAL_FLAGS="-j 2"

   - python: 2.7
     env: TOX_ENV=py27-old TRIAL_FLAGS="-j 2"

   - python: 2.7
-    env: TOX_ENV=py27-postgres TRIAL_FLAGS="-j 4"
+    env: TOX_ENV=py27-postgres,codecov TRIAL_FLAGS="-j 4"
     services:
       - postgresql

   - python: 3.5
-    env: TOX_ENV=py35 TRIAL_FLAGS="-j 2"
+    env: TOX_ENV=py35,codecov TRIAL_FLAGS="-j 2"

   - python: 3.6
-    env: TOX_ENV=py36 TRIAL_FLAGS="-j 2"
+    env: TOX_ENV=py36,codecov TRIAL_FLAGS="-j 2"

   - python: 3.6
-    env: TOX_ENV=py36-postgres TRIAL_FLAGS="-j 4"
+    env: TOX_ENV=py36-postgres,codecov TRIAL_FLAGS="-j 4"
     services:
       - postgresql

CHANGES.md
@@ -1,3 +1,57 @@
+Synapse 0.34.0rc1 (2018-12-04)
+==============================
+
+Synapse 0.34 is the first release to fully support Python 3. We recommend
+upgrading to Python 3, but make sure to read the
+[upgrade notes](UPGRADE.rst#upgrading-to-v0340) when doing so.
+
+Features
+--------
+
+- Add option to track MAU stats (but not limit people) ([\#3830](https://github.com/matrix-org/synapse/issues/3830))
+- Add an option to enable recording IPs for appservice users ([\#3831](https://github.com/matrix-org/synapse/issues/3831))
+- Rename login type m.login.cas to m.login.sso ([\#4220](https://github.com/matrix-org/synapse/issues/4220))
+- Add an option to disable search for homeservers that may not be interested in it. ([\#4230](https://github.com/matrix-org/synapse/issues/4230))
+
+
+Bugfixes
+--------
+
+- Pushrules can now again be made with non-ASCII rule IDs. ([\#4165](https://github.com/matrix-org/synapse/issues/4165))
+- The media repository now no longer fails to decode UTF-8 filenames when downloading remote media. ([\#4176](https://github.com/matrix-org/synapse/issues/4176))
+- URL previews now correctly decode non-UTF-8 text if the header contains a `<meta http-equiv="Content-Type"` header. ([\#4183](https://github.com/matrix-org/synapse/issues/4183))
+- Fix an issue where public consent URLs had two slashes. ([\#4192](https://github.com/matrix-org/synapse/issues/4192))
+- Fallback auth now accepts the session parameter on Python 3. ([\#4197](https://github.com/matrix-org/synapse/issues/4197))
+- Remove riot.im from the list of trusted Identity Servers in the default configuration ([\#4207](https://github.com/matrix-org/synapse/issues/4207))
+- fix start up failure when mau_limit_reserved_threepids set and db is postgres ([\#4211](https://github.com/matrix-org/synapse/issues/4211))
+- Fix auto join failures for servers that require user consent ([\#4223](https://github.com/matrix-org/synapse/issues/4223))
+- Fix exception caused by non-ascii event IDs ([\#4241](https://github.com/matrix-org/synapse/issues/4241))
+- Pushers can now be unsubscribed from on Python 3. ([\#4250](https://github.com/matrix-org/synapse/issues/4250))
+- Fix UnicodeDecodeError when postgres is configured to give non-English errors ([\#4253](https://github.com/matrix-org/synapse/issues/4253))
+
+
+Internal Changes
+----------------
+
+- A coveragerc file, as well as the py36-coverage tox target, have been added. ([\#4180](https://github.com/matrix-org/synapse/issues/4180))
+- Add a GitHub pull request template and add multiple issue templates ([\#4182](https://github.com/matrix-org/synapse/issues/4182))
+- Update README to reflect the fact that #1491 is fixed ([\#4188](https://github.com/matrix-org/synapse/issues/4188))
+- Run the AS senders as background processes to fix warnings ([\#4189](https://github.com/matrix-org/synapse/issues/4189))
+- Add some diagnostics to the tests to detect logcontext problems ([\#4190](https://github.com/matrix-org/synapse/issues/4190))
+- Add missing `jpeg` package prerequisite for OpenBSD in README. ([\#4193](https://github.com/matrix-org/synapse/issues/4193))
+- Add a note saying you need to manually reclaim disk space after using the Purge History API ([\#4200](https://github.com/matrix-org/synapse/issues/4200))
+- More logcontext checking in unittests ([\#4205](https://github.com/matrix-org/synapse/issues/4205))
+- Ignore __pycache__ directories in the database schema folder ([\#4214](https://github.com/matrix-org/synapse/issues/4214))
+- Add note to UPGRADE.rst about removing riot.im from list of trusted identity servers ([\#4224](https://github.com/matrix-org/synapse/issues/4224))
+- Added automated coverage reporting to CI. ([\#4225](https://github.com/matrix-org/synapse/issues/4225))
+- Garbage-collect after each unit test to fix logcontext leaks ([\#4227](https://github.com/matrix-org/synapse/issues/4227))
+- add more detail to logging regarding "More than one row matched" error ([\#4234](https://github.com/matrix-org/synapse/issues/4234))
+- Drop sent_transactions table ([\#4244](https://github.com/matrix-org/synapse/issues/4244))
+- Add a basic .editorconfig ([\#4257](https://github.com/matrix-org/synapse/issues/4257))
+- Update README.rst and UPGRADE.rst for Python 3. ([\#4260](https://github.com/matrix-org/synapse/issues/4260))
+- Remove obsolete `verbose` and `log_file` settings from `homeserver.yaml` for Docker image. ([\#4261](https://github.com/matrix-org/synapse/issues/4261))
+
+
 Synapse 0.33.9 (2018-11-19)
 ===========================

@@ -71,7 +125,7 @@ Synapse 0.33.8rc2 (2018-10-31)
 Bugfixes
 --------

 - Searches that request profile info now no longer fail with a 500. Fixes
   a regression in 0.33.8rc1. ([\#4122](https://github.com/matrix-org/synapse/issues/4122))

MANIFEST.in
@@ -26,6 +26,7 @@ recursive-include synapse/static *.js
 exclude Dockerfile
 exclude .dockerignore
 exclude test_postgresql.sh
+exclude .editorconfig

 include pyproject.toml
 recursive-include changelog.d *
README.rst
@@ -86,7 +86,7 @@ Synapse is the reference Python/Twisted Matrix homeserver implementation.
 System requirements:

 - POSIX-compliant system (tested on Linux & OS X)
-- Python 2.7
+- Python 3.5, 3.6, or 2.7
 - At least 1GB of free RAM if you want to join large public rooms like #matrix:matrix.org

 Installing from source
@@ -101,13 +101,13 @@ header files for Python C extensions.

 Installing prerequisites on Ubuntu or Debian::

-    sudo apt-get install build-essential python2.7-dev libffi-dev \
+    sudo apt-get install build-essential python3-dev libffi-dev \
                          python-pip python-setuptools sqlite3 \
                          libssl-dev python-virtualenv libjpeg-dev libxslt1-dev

 Installing prerequisites on ArchLinux::

-    sudo pacman -S base-devel python2 python-pip \
+    sudo pacman -S base-devel python python-pip \
                    python-setuptools python-virtualenv sqlite3

 Installing prerequisites on CentOS 7 or Fedora 25::
@@ -126,12 +126,9 @@ Installing prerequisites on Mac OS X::

 Installing prerequisites on Raspbian::

-    sudo apt-get install build-essential python2.7-dev libffi-dev \
+    sudo apt-get install build-essential python3-dev libffi-dev \
                          python-pip python-setuptools sqlite3 \
                          libssl-dev python-virtualenv libjpeg-dev
-    sudo pip install --upgrade pip
-    sudo pip install --upgrade ndg-httpsclient
-    sudo pip install --upgrade virtualenv

 Installing prerequisites on openSUSE::
@@ -146,20 +143,21 @@ Installing prerequisites on OpenBSD::

 To install the Synapse homeserver run::

-    virtualenv -p python2.7 ~/.synapse
-    source ~/.synapse/bin/activate
+    mkdir -p ~/synapse
+    virtualenv -p python3 ~/synapse/env
+    source ~/synapse/env/bin/activate
     pip install --upgrade pip
     pip install --upgrade setuptools
     pip install matrix-synapse

 This installs Synapse, along with the libraries it uses, into a virtual
-environment under ``~/.synapse``. Feel free to pick a different directory
+environment under ``~/synapse/env``. Feel free to pick a different directory
 if you prefer.

 This Synapse installation can then be later upgraded by using pip again with the
 update flag::

-    source ~/.synapse/bin/activate
+    source ~/synapse/env/bin/activate
     pip install -U matrix-synapse

 In case of problems, please see the _`Troubleshooting` section below.
@@ -240,7 +238,7 @@ commandline script.

 To get started, it is easiest to use the command line to register new users::

-    $ source ~/.synapse/bin/activate
+    $ source ~/synapse/env/bin/activate
     $ synctl start # if not already running
     $ register_new_matrix_user -c homeserver.yaml https://localhost:8448
     New user localpart: erikj
@@ -266,13 +264,12 @@ Running Synapse
 ===============

 To actually run your new homeserver, pick a working directory for Synapse to
-run (e.g. ``~/.synapse``), and::
+run (e.g. ``~/synapse``), and::

-    cd ~/.synapse
-    source ./bin/activate
+    cd ~/synapse
+    source env/bin/activate
     synctl start

 Connecting to Synapse from a client
 ===================================
@@ -333,7 +330,7 @@ content served to web browsers a matrix API from being able to attack webapps ho
 on the same domain. This is particularly true of sharing a matrix webclient and
 server on the same domain.

-See https://github.com/vector-im/vector-web/issues/1977 and
+See https://github.com/vector-im/riot-web/issues/1977 and
 https://developer.github.com/changes/2014-04-25-user-content-security for more details.
@@ -380,35 +377,17 @@ the generated config),
 https://www.archlinux.org/packages/community/any/python2-matrix-angular-sdk/ will also need to
 be installed.

-Alternatively, to install using pip a few changes may be needed as ArchLinux
-defaults to python 3, but synapse currently assumes python 2.7 by default:
-
 pip may be outdated (6.0.7-1 and needs to be upgraded to 6.0.8-1 )::

-    sudo pip2.7 install --upgrade pip
+    sudo pip install --upgrade pip

-You also may need to explicitly specify python 2.7 again during the install
-request::
-
-    pip2.7 install https://github.com/matrix-org/synapse/tarball/master
-
 If you encounter an error with lib bcrypt causing an Wrong ELF Class:
 ELFCLASS32 (x64 Systems), you may need to reinstall py-bcrypt to correctly
 compile it under the right architecture. (This should not be needed if
 installing under virtualenv)::

-    sudo pip2.7 uninstall py-bcrypt
-    sudo pip2.7 install py-bcrypt
+    sudo pip uninstall py-bcrypt
+    sudo pip install py-bcrypt

-During setup of Synapse you need to call python2.7 directly again::
-
-    cd ~/.synapse
-    python2.7 -m synapse.app.homeserver \
-      --server-name machine.my.domain.name \
-      --config-path homeserver.yaml \
-      --generate-config
-
-...substituting your host and domain name as appropriate.

 FreeBSD
 -------
@@ -475,7 +454,7 @@ You can fix this by manually upgrading pip and virtualenv::

     sudo pip install --upgrade virtualenv

-You can next rerun ``virtualenv -p python2.7 synapse`` to update the virtual env.
+You can next rerun ``virtualenv -p python3 synapse`` to update the virtual env.

 Installing may fail during installing virtualenv with ``InsecurePlatformWarning: A true SSLContext object is not available. This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail. For more information, see https://urllib3.readthedocs.org/en/latest/security.html#insecureplatformwarning.``
 You can fix this by manually installing ndg-httpsclient::
@@ -524,16 +503,6 @@ log lines and looking for any 'Processed request' lines which take more than
 a few seconds to execute. Please let us know at #matrix-dev:matrix.org if
 you see this failure mode so we can help debug it, however.

-ArchLinux
-~~~~~~~~~
-
-If running `$ synctl start` fails with 'returned non-zero exit status 1',
-you will need to explicitly call Python2.7 - either running as::
-
-    python2.7 -m synapse.app.homeserver --daemonize -c homeserver.yaml
-
-...or by editing synctl with the correct python executable.
-
 Upgrading an existing Synapse
 =============================
@@ -731,7 +700,7 @@ port:

 * Until v0.33.3, Synapse did not support SNI on the federation port
   (`bug #1491 <https://github.com/matrix-org/synapse/issues/1491>`_). This bug
   is now fixed, but means that federating with older servers can be unreliable
   when using name-based virtual hosting.

 Furthermore, a number of the normal reasons for using a reverse-proxy do not
@@ -828,7 +797,7 @@ Password reset
 ==============

 If a user has registered an email address to their account using an identity
-server, they can request a password-reset token via clients such as Vector.
+server, they can request a password-reset token via clients such as Riot.

 A manual password reset can be done via direct database access as follows.
UPGRADE.rst
@@ -48,6 +48,74 @@ returned by the Client-Server API:
     # configured on port 443.
     curl -kv https://<host.name>/_matrix/client/versions 2>&1 | grep "Server:"

+Upgrading to v0.34.0
+====================
+
+1. This release is the first to fully support Python 3. We recommend switching
+   to Python 3, as it has been shown to give performance improvements.
+
+   For users who have installed Synapse into a virtualenv, we recommend doing
+   this by creating a new virtualenv. For example::
+
+       virtualenv -p python3 ~/synapse/env3
+       source ~/synapse/env3/bin/activate
+       pip install matrix-synapse
+
+   You can then start synapse as normal, having activated the new virtualenv::
+
+       cd ~/synapse
+       source env3/bin/activate
+       synctl start
+
+   Users who have installed from distribution packages should see the relevant
+   package documentation.
+
+   * When upgrading to Python 3, you **must** make sure that your log files are
+     configured as UTF-8, by adding ``encoding: utf8`` to the
+     ``RotatingFileHandler`` configuration (if you have one) in your
+     ``<server>.log.config`` file. For example, if your ``log.config`` file
+     contains::
+
+       handlers:
+         file:
+           class: logging.handlers.RotatingFileHandler
+           formatter: precise
+           filename: homeserver.log
+           maxBytes: 104857600
+           backupCount: 10
+           filters: [context]
+         console:
+           class: logging.StreamHandler
+           formatter: precise
+           filters: [context]
+
+     Then you should update this to be::
+
+       handlers:
+         file:
+           class: logging.handlers.RotatingFileHandler
+           formatter: precise
+           filename: homeserver.log
+           maxBytes: 104857600
+           backupCount: 10
+           filters: [context]
+           encoding: utf8
+         console:
+           class: logging.StreamHandler
+           formatter: precise
+           filters: [context]
+
+     There is no need to revert this change if downgrading to Python 2.
+
+2. This release removes the ``riot.im`` from the default list of trusted
+   identity servers.
+
+   If ``riot.im`` is in your homeserver's list of
+   ``trusted_third_party_id_servers``, you should remove it. It was added in
+   case a hypothetical future identity server was put there. If you don't
+   remove it, users may be unable to deactivate their accounts.
+
 Upgrading to v0.33.7
 ====================
changelog.d/*
@@ -0,0 +1 @@
+Add an option to enable recording IPs for appservice users
@@ -0,0 +1 @@
+Pushrules can now again be made with non-ASCII rule IDs.
@@ -0,0 +1,2 @@
+Run the AS senders as background processes to fix warnings
+
@@ -0,0 +1 @@
+Add some diagnostics to the tests to detect logcontext problems
@@ -0,0 +1 @@
+More logcontext checking in unittests
@@ -0,0 +1 @@
+Fix logcontext leaks in EmailPusher and in tests
@@ -0,0 +1,2 @@
+fix start up failure when mau_limit_reserved_threepids set and db is postgres
+
@@ -0,0 +1 @@
+Ignore __pycache__ directories in the database schema folder
@@ -0,0 +1 @@
+Rename login type m.login.cas to m.login.sso
@@ -0,0 +1 @@
+Fix auto join failures for servers that require user consent
@@ -0,0 +1 @@
+Add note to UPGRADE.rst about removing riot.im from list of trusted identity servers
@@ -0,0 +1 @@
+Added automated coverage reporting to CI.
@@ -0,0 +1 @@
+Garbage-collect after each unit test to fix logcontext leaks
@@ -0,0 +1 @@
+Add an option to disable search for homeservers that may not be interested in it.
@@ -0,0 +1 @@
+add more detail to logging regarding "More than one row matched" error
@@ -0,0 +1 @@
+Fix exception caused by non-ascii event IDs
@@ -0,0 +1 @@
+Drop sent_transactions table
@@ -0,0 +1 @@
+Pushers can now be unsubscribed from on Python 3.
@@ -0,0 +1 @@
+Fix UnicodeDecodeError when postgres is configured to give non-English errors
@@ -0,0 +1 @@
+Add a basic .editorconfig
@@ -0,0 +1 @@
+Update README.rst and UPGRADE.rst for Python 3.
@@ -0,0 +1 @@
+Remove obsolete `verbose` and `log_file` settings from `homeserver.yaml` for Docker image.
docker/conf/homeserver.yaml
@@ -14,6 +14,7 @@ server_name: "{{ SYNAPSE_SERVER_NAME }}"
 pid_file: /homeserver.pid
 web_client: False
 soft_file_limit: 0
+log_config: "/compiled/log.config"

 ## Ports ##

@@ -67,9 +68,6 @@ database:
 ## Performance ##

 event_cache_size: "{{ SYNAPSE_EVENT_CACHE_SIZE or "10K" }}"
-verbose: 0
-log_file: "/data/homeserver.log"
-log_config: "/compiled/log.config"

 ## Ratelimiting ##
docs/log_contexts.rst
@@ -163,7 +163,7 @@ the logcontext was set, this will make things work out ok: provided
 It's all too easy to forget to ``yield``: for instance if we forgot that
 ``do_some_stuff`` returned a deferred, we might plough on regardless. This
 leads to a mess; it will probably work itself out eventually, but not before
-a load of stuff has been logged against the wrong content. (Normally, other
+a load of stuff has been logged against the wrong context. (Normally, other
 things will break, more obviously, if you forget to ``yield``, so this tends
 not to be a major problem in practice.)

@@ -440,3 +440,59 @@ To conclude: I think this scheme would have worked equally well, with less
 danger of messing it up, and probably made some more esoteric code easier to
 write. But again — changing the conventions of the entire Synapse codebase is
 not a sensible option for the marginal improvement offered.
+
+
+A note on garbage-collection of Deferred chains
+-----------------------------------------------
+
+It turns out that our logcontext rules do not play nicely with Deferred
+chains which get orphaned and garbage-collected.
+
+Imagine we have some code that looks like this:
+
+.. code:: python
+
+    listener_queue = []
+
+    def on_something_interesting():
+        for d in listener_queue:
+            d.callback("foo")
+
+    @defer.inlineCallbacks
+    def await_something_interesting():
+        new_deferred = defer.Deferred()
+        listener_queue.append(new_deferred)
+
+        with PreserveLoggingContext():
+            yield new_deferred
+
+Obviously, the idea here is that we have a bunch of things which are waiting
+for an event. (It's just an example of the problem here, but a relatively
+common one.)
+
+Now let's imagine two further things happen. First of all, whatever was
+waiting for the interesting thing goes away. (Perhaps the request times out,
+or something *even more* interesting happens.)
+
+Secondly, let's suppose that we decide that the interesting thing is never
+going to happen, and we reset the listener queue:
+
+.. code:: python
+
+    def reset_listener_queue():
+        listener_queue.clear()
+
+So, both ends of the deferred chain have now dropped their references, and the
+deferred chain is now orphaned, and will be garbage-collected at some point.
+Note that ``await_something_interesting`` is a generator function, and when
+Python garbage-collects generator functions, it gives them a chance to clean
+up by making the ``yield`` raise a ``GeneratorExit`` exception. In our case,
+that means that the ``__exit__`` handler of ``PreserveLoggingContext`` will
+carefully restore the request context, but there is now nothing waiting for
+its return, so the request context is never cleared.
+
+To reiterate, this problem only arises when *both* ends of a deferred chain
+are dropped. Dropping the the reference to a deferred you're supposed to be
+calling is probably bad practice, so this doesn't actually happen too much.
+Unfortunately, when it does happen, it will lead to leaked logcontexts which
+are incredibly hard to track down.
synapse/__init__.py
@@ -27,4 +27,4 @@ try:
 except ImportError:
     pass

-__version__ = "0.33.9"
+__version__ = "0.34.0rc1"
synapse/api/auth.py
@@ -188,17 +188,33 @@ class Auth(object):
         """
         # Can optionally look elsewhere in the request (e.g. headers)
         try:
-            user_id, app_service = yield self._get_appservice_user_id(request)
-            if user_id:
-                request.authenticated_entity = user_id
-                defer.returnValue(
-                    synapse.types.create_requester(user_id, app_service=app_service)
-                )
+            ip_addr = self.hs.get_ip_from_request(request)
+            user_agent = request.requestHeaders.getRawHeaders(
+                b"User-Agent",
+                default=[b""]
+            )[0].decode('ascii', 'surrogateescape')

             access_token = self.get_access_token_from_request(
                 request, self.TOKEN_NOT_FOUND_HTTP_STATUS
             )

+            user_id, app_service = yield self._get_appservice_user_id(request)
+            if user_id:
+                request.authenticated_entity = user_id
+
+                if ip_addr and self.hs.config.track_appservice_user_ips:
+                    yield self.store.insert_client_ip(
+                        user_id=user_id,
+                        access_token=access_token,
+                        ip=ip_addr,
+                        user_agent=user_agent,
+                        device_id="dummy-device",  # stubbed
+                    )
+
+                defer.returnValue(
+                    synapse.types.create_requester(user_id, app_service=app_service)
+                )
+
             user_info = yield self.get_user_by_access_token(access_token, rights)
             user = user_info["user"]
             token_id = user_info["token_id"]
@@ -208,11 +224,6 @@ class Auth(object):
         # stubbed out.
         device_id = user_info.get("device_id")

-        ip_addr = self.hs.get_ip_from_request(request)
-        user_agent = request.requestHeaders.getRawHeaders(
-            b"User-Agent",
-            default=[b""]
-        )[0].decode('ascii', 'surrogateescape')
         if user and access_token and ip_addr:
             yield self.store.insert_client_ip(
                 user_id=user.to_string(),
synapse/appservice/scheduler.py
@@ -53,8 +53,8 @@ import logging
 from twisted.internet import defer

 from synapse.appservice import ApplicationServiceState
+from synapse.metrics.background_process_metrics import run_as_background_process
 from synapse.util.logcontext import run_in_background
-from synapse.util.metrics import Measure

 logger = logging.getLogger(__name__)

@@ -104,14 +104,23 @@ class _ServiceQueuer(object):
         self.clock = clock

     def enqueue(self, service, event):
-        # if this service isn't being sent something
         self.queued_events.setdefault(service.id, []).append(event)
-        run_in_background(self._send_request, service)
+
+        # start a sender for this appservice if we don't already have one
+
+        if service.id in self.requests_in_flight:
+            return
+
+        run_as_background_process(
+            "as-sender-%s" % (service.id, ),
+            self._send_request, service,
+        )

     @defer.inlineCallbacks
     def _send_request(self, service):
-        if service.id in self.requests_in_flight:
-            return
+        # sanity-check: we shouldn't get here if this service already has a sender
+        # running.
+        assert(service.id not in self.requests_in_flight)

         self.requests_in_flight.add(service.id)
         try:
@@ -119,12 +128,10 @@ class _ServiceQueuer(object):
             events = self.queued_events.pop(service.id, [])
             if not events:
                 return

-            with Measure(self.clock, "servicequeuer.send"):
-                try:
-                    yield self.txn_ctrl.send(service, events)
-                except Exception:
-                    logger.exception("AS request failed")
+            try:
+                yield self.txn_ctrl.send(service, events)
+            except Exception:
+                logger.exception("AS request failed")
         finally:
             self.requests_in_flight.discard(service.id)

@@ -223,7 +230,12 @@ class _Recoverer(object):
         self.backoff_counter = 1

     def recover(self):
-        self.clock.call_later((2 ** self.backoff_counter), self.retry)
+        def _retry():
+            run_as_background_process(
+                "as-recoverer-%s" % (self.service.id,),
+                self.retry,
+            )
+        self.clock.call_later((2 ** self.backoff_counter), _retry)

     def _backoff(self):
         # cap the backoff to be around 8.5min => (2^9) = 512 secs
synapse/config/appservice.py
@@ -33,11 +33,16 @@ class AppServiceConfig(Config):

     def read_config(self, config):
         self.app_service_config_files = config.get("app_service_config_files", [])
         self.notify_appservices = config.get("notify_appservices", True)
+        self.track_appservice_user_ips = config.get("track_appservice_user_ips", False)

     def default_config(cls, **kwargs):
         return """\
         # A list of application service config file to use
         app_service_config_files: []
+
+        # Whether or not to track application service IP addresses. Implicitly
+        # enables MAU tracking for application service users.
+        track_appservice_user_ips: False
         """
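The option added above is read from the main homeserver config. A minimal sketch of enabling it (the key name, default, and the MAU side-effect are taken from the generated default config in the diff; placing it at the top level of homeserver.yaml is an assumption)::

    # Hypothetical homeserver.yaml excerpt, not a full config.
    # Implicitly enables MAU tracking for application service users.
    track_appservice_user_ips: True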
synapse/config/server.py
@@ -62,6 +62,11 @@ class ServerConfig(Config):
         # master, potentially causing inconsistency.
         self.enable_media_repo = config.get("enable_media_repo", True)

+        # whether to enable search. If disabled, new entries will not be inserted
+        # into the search tables and they will not be indexed. Users will receive
+        # errors when attempting to search for messages.
+        self.enable_search = config.get("enable_search", True)
+
         self.filter_timeline_limit = config.get("filter_timeline_limit", -1)

         # Whether we should block invites sent to users on this server
@@ -384,7 +389,12 @@ class ServerConfig(Config):
         # mau_limit_reserved_threepids:
         # - medium: 'email'
         #   address: 'reserved_user@example.com'
+        #
+        # Room searching
+        #
+        # If disabled, new messages will not be indexed for searching and users
+        # will receive errors when searching for messages. Defaults to enabled.
+        # enable_search: true
         """ % locals()

     def read_arguments(self, args):
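Likewise, a homeserver that is not interested in search can switch the new option off. A minimal sketch (key name and default taken from the commented example in the diff; top-level placement in homeserver.yaml is an assumption)::

    # Hypothetical homeserver.yaml excerpt, not a full config.
    # New messages will not be indexed, and searches will return an error.
    enable_search: false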
synapse/handlers/register.py
@@ -217,7 +217,19 @@ class RegistrationHandler(BaseHandler):
                 user_id = None
                 token = None
                 attempts += 1
+
+        if not self.hs.config.user_consent_at_registration:
+            yield self._auto_join_rooms(user_id)
+
+        defer.returnValue((user_id, token))
+
+    @defer.inlineCallbacks
+    def _auto_join_rooms(self, user_id):
+        """Automatically joins users to auto join rooms - creating the room in the first place
+        if the user is the first to be created.
+
+        Args:
+            user_id(str): The user to join
+        """
         # auto-join the user to any rooms we're supposed to dump them into
         fake_requester = create_requester(user_id)
@@ -226,7 +238,6 @@ class RegistrationHandler(BaseHandler):
         if self.hs.config.autocreate_auto_join_rooms:
             count = yield self.store.count_all_users()
             should_auto_create_rooms = count == 1
-
         for r in self.hs.config.auto_join_rooms:
             try:
                 if should_auto_create_rooms:
@@ -256,7 +267,15 @@ class RegistrationHandler(BaseHandler):
             except Exception as e:
                 logger.error("Failed to join new user to %r: %r", r, e)

-        defer.returnValue((user_id, token))
+    @defer.inlineCallbacks
+    def post_consent_actions(self, user_id):
+        """A series of registration actions that can only be carried out once consent
+        has been granted
+
+        Args:
+            user_id (str): The user to join
+        """
+        yield self._auto_join_rooms(user_id)

     @defer.inlineCallbacks
     def appservice_register(self, user_localpart, as_token):
synapse/handlers/search.py
@@ -50,6 +50,9 @@ class SearchHandler(BaseHandler):
             dict to be returned to the client with results of search
         """

+        if not self.hs.config.enable_search:
+            raise SynapseError(400, "Search is disabled on this homeserver")
+
         batch_group = None
         batch_group_key = None
         batch_token = None
synapse/rest/client/v1/login.py
@@ -27,7 +27,7 @@ from twisted.web.client import PartialDownloadError

 from synapse.api.errors import Codes, LoginError, SynapseError
 from synapse.http.server import finish_request
-from synapse.http.servlet import parse_json_object_from_request
+from synapse.http.servlet import RestServlet, parse_json_object_from_request
 from synapse.types import UserID
 from synapse.util.msisdn import phone_number_to_msisdn

@@ -83,6 +83,7 @@ class LoginRestServlet(ClientV1RestServlet):
     PATTERNS = client_path_patterns("/login$")
     SAML2_TYPE = "m.login.saml2"
     CAS_TYPE = "m.login.cas"
+    SSO_TYPE = "m.login.sso"
     TOKEN_TYPE = "m.login.token"
     JWT_TYPE = "m.login.jwt"

@@ -105,6 +106,10 @@ class LoginRestServlet(ClientV1RestServlet):
         if self.saml2_enabled:
             flows.append({"type": LoginRestServlet.SAML2_TYPE})
         if self.cas_enabled:
+            flows.append({"type": LoginRestServlet.SSO_TYPE})
+
+            # we advertise CAS for backwards compat, though MSC1721 renamed it
+            # to SSO.
             flows.append({"type": LoginRestServlet.CAS_TYPE})

         # While its valid for us to advertise this login type generally,
@@ -384,11 +389,11 @@ class SAML2RestServlet(ClientV1RestServlet):
         defer.returnValue((200, {"status": "not_authenticated"}))


-class CasRedirectServlet(ClientV1RestServlet):
-    PATTERNS = client_path_patterns("/login/cas/redirect", releases=())
+class CasRedirectServlet(RestServlet):
+    PATTERNS = client_path_patterns("/login/(cas|sso)/redirect")

     def __init__(self, hs):
-        super(CasRedirectServlet, self).__init__(hs)
+        super(CasRedirectServlet, self).__init__()
         self.cas_server_url = hs.config.cas_server_url.encode('ascii')
         self.cas_service_url = hs.config.cas_service_url.encode('ascii')
synapse/rest/client/v1/push_rule.py
@@ -42,7 +42,7 @@ class PushRuleRestServlet(ClientV1RestServlet):

     @defer.inlineCallbacks
     def on_PUT(self, request):
-        spec = _rule_spec_from_path(request.postpath)
+        spec = _rule_spec_from_path([x.decode('utf8') for x in request.postpath])
         try:
             priority_class = _priority_class_from_spec(spec)
         except InvalidRuleException as e:
@@ -103,7 +103,7 @@ class PushRuleRestServlet(ClientV1RestServlet):

     @defer.inlineCallbacks
     def on_DELETE(self, request):
-        spec = _rule_spec_from_path(request.postpath)
+        spec = _rule_spec_from_path([x.decode('utf8') for x in request.postpath])

         requester = yield self.auth.get_user_by_req(request)
         user_id = requester.user.to_string()
@@ -134,7 +134,7 @@ class PushRuleRestServlet(ClientV1RestServlet):

         rules = format_push_rules_for_user(requester.user, rules)

-        path = request.postpath[1:]
+        path = [x.decode('utf8') for x in request.postpath][1:]

         if path == []:
             # we're a reference impl: pedantry is our job.
@@ -142,11 +142,10 @@ class PushRuleRestServlet(ClientV1RestServlet):
                 PushRuleRestServlet.SLIGHTLY_PEDANTIC_TRAILING_SLASH_ERROR
             )

-        if path[0] == b'':
+        if path[0] == '':
             defer.returnValue((200, rules))
-        elif path[0] == b'global':
-            path = [x.decode('ascii') for x in path[1:]]
-            result = _filter_ruleset_with_path(rules['global'], path)
+        elif path[0] == 'global':
+            result = _filter_ruleset_with_path(rules['global'], path[1:])
             defer.returnValue((200, result))
         else:
             raise UnrecognizedRequestError()
@@ -190,12 +189,24 @@ class PushRuleRestServlet(ClientV1RestServlet):


 def _rule_spec_from_path(path):
+    """Turn a sequence of path components into a rule spec
+
+    Args:
+        path (sequence[unicode]): the URL path components.
+
+    Returns:
+        dict: rule spec dict, containing scope/template/rule_id entries,
+            and possibly attr.
+
+    Raises:
+        UnrecognizedRequestError if the path components cannot be parsed.
+    """
     if len(path) < 2:
         raise UnrecognizedRequestError()
-    if path[0] != b'pushrules':
+    if path[0] != 'pushrules':
         raise UnrecognizedRequestError()

-    scope = path[1].decode('ascii')
+    scope = path[1]
     path = path[2:]
     if scope != 'global':
         raise UnrecognizedRequestError()
@@ -203,13 +214,13 @@ def _rule_spec_from_path(path):
     if len(path) == 0:
         raise UnrecognizedRequestError()

-    template = path[0].decode('ascii')
+    template = path[0]
     path = path[1:]

     if len(path) == 0 or len(path[0]) == 0:
         raise UnrecognizedRequestError()

-    rule_id = path[0].decode('ascii')
+    rule_id = path[0]

     spec = {
         'scope': scope,
@@ -220,7 +231,7 @@ def _rule_spec_from_path(path):
     path = path[1:]

     if len(path) > 0 and len(path[0]) > 0:
-        spec['attr'] = path[0].decode('ascii')
+        spec['attr'] = path[0]

     return spec
synapse/rest/client/v1/pusher.py
@@ -142,7 +142,7 @@ class PushersRemoveRestServlet(RestServlet):
     To allow pusher to be delete by clicking a link (ie. GET request)
     """
     PATTERNS = client_path_patterns("/pushers/remove$")
-    SUCCESS_HTML = "<html><body>You have been unsubscribed</body><html>"
+    SUCCESS_HTML = b"<html><body>You have been unsubscribed</body><html>"

     def __init__(self, hs):
         super(PushersRemoveRestServlet, self).__init__()
synapse/rest/client/v2_alpha/register.py
@@ -457,6 +457,7 @@ class RegisterRestServlet(RestServlet):
             yield self.store.user_set_consent_version(
                 registered_user_id, self.hs.config.user_consent_version,
             )
+            yield self.registration_handler.post_consent_actions(registered_user_id)

         defer.returnValue((200, return_dict))
synapse/rest/consent/consent_resource.py
@@ -89,6 +89,7 @@ class ConsentResource(Resource):

         self.hs = hs
         self.store = hs.get_datastore()
+        self.registration_handler = hs.get_handlers().registration_handler

         # this is required by the request_handler wrapper
         self.clock = hs.get_clock()
@@ -199,6 +200,7 @@ class ConsentResource(Resource):
             if e.code != 404:
                 raise
             raise NotFoundError("Unknown user")
+        yield self.registration_handler.post_consent_actions(qualified_user_id)

         try:
             self._render_template(request, "success.html")
@ -298,6 +298,8 @@ def _resolve_normal_events(events, auth_events):
|
||||||
|
|
||||||
def _ordered_events(events):
|
def _ordered_events(events):
|
||||||
def key_func(e):
|
def key_func(e):
|
||||||
return -int(e.depth), hashlib.sha1(e.event_id.encode('ascii')).hexdigest()
|
# we have to use utf-8 rather than ascii here because it turns out we allow
|
||||||
|
# people to send us events with non-ascii event IDs :/
|
||||||
|
return -int(e.depth), hashlib.sha1(e.event_id.encode('utf-8')).hexdigest()
|
||||||
|
|
||||||
return sorted(events, key=key_func)
|
return sorted(events, key=key_func)
|
||||||
|
|
|
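The failure mode fixed here is easy to reproduce under Python 3. A minimal sketch (hypothetical event ID, not taken from the codebase):

```python
import hashlib

event_id = u"$événement:example.com"  # hypothetical non-ASCII event ID

hashlib.sha1(event_id.encode('utf-8')).hexdigest()  # works for any event ID
# event_id.encode('ascii') would raise UnicodeEncodeError here, which is
# exactly the exception the old key_func hit on such events
```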
@@ -12,35 +12,30 @@
 <h1>Log in with one of the following methods</h1>

 <span id="feedback" style="color: #f00"></span>
-<br/>
-<br/>

 <div id="loading">
     <img src="spinner.gif" />
 </div>

-<div id="cas_flow" class="login_flow" style="display:none"
-     onclick="gotoCas(); return false;">
-    CAS Authentication: <button id="cas_button" style="margin: 10px">Log in</button>
+<div id="sso_flow" class="login_flow" style="display:none">
+    Single-sign on:
+    <form id="sso_form" action="/_matrix/client/r0/login/sso/redirect" method="get">
+        <input id="sso_redirect_url" type="hidden" name="redirectUrl" value=""/>
+        <input type="submit" value="Log in"/>
+    </form>
 </div>

-<br/>
-
-<form id="password_form" class="login_flow" style="display:none"
-      onsubmit="matrixLogin.password_login(); return false;">
-    <div>
-        Password Authentication:<br/>
-
-        <div style="text-align: center">
-            <input id="user_id" size="32" type="text" placeholder="Matrix ID (e.g. bob)" autocapitalize="off" autocorrect="off" />
-            <br/>
-            <input id="password" size="32" type="password" placeholder="Password"/>
-            <br/>
-
-            <button type="submit" style="margin: 10px">Log in</button>
-        </div>
-    </div>
-</form>
+<div id="password_flow" class="login_flow" style="display:none">
+    Password Authentication:
+    <form onsubmit="matrixLogin.password_login(); return false;">
+        <input id="user_id" size="32" type="text" placeholder="Matrix ID (e.g. bob)" autocapitalize="off" autocorrect="off" />
+        <br/>
+        <input id="password" size="32" type="password" placeholder="Password"/>
+        <br/>
+        <input type="submit" value="Log in"/>
+    </form>
+</div>

 <div id="no_login_types" type="button" class="login_flow" style="display:none">
     Log in currently unavailable.

@@ -1,7 +1,8 @@
 window.matrixLogin = {
-    endpoint: location.origin + "/_matrix/client/api/v1/login",
+    endpoint: location.origin + "/_matrix/client/r0/login",
     serverAcceptsPassword: false,
-    serverAcceptsCas: false
+    serverAcceptsCas: false,
+    serverAcceptsSso: false,
 };

 var submitPassword = function(user, pwd) {

@@ -40,12 +41,6 @@ var errorFunc = function(err) {
     }
 };

-var gotoCas = function() {
-    var this_page = window.location.origin + window.location.pathname;
-    var redirect_url = matrixLogin.endpoint + "/cas/redirect?redirectUrl=" + encodeURIComponent(this_page);
-    window.location.replace(redirect_url);
-}
-
 var setFeedbackString = function(text) {
     $("#feedback").text(text);
 };

@@ -53,12 +48,18 @@ var setFeedbackString = function(text) {
 var show_login = function() {
     $("#loading").hide();

+    var this_page = window.location.origin + window.location.pathname;
+    $("#sso_redirect_url").val(encodeURIComponent(this_page));
+
     if (matrixLogin.serverAcceptsPassword) {
-        $("#password_form").show();
+        $("#password_flow").show();
     }

-    if (matrixLogin.serverAcceptsCas) {
-        $("#cas_flow").show();
+    if (matrixLogin.serverAcceptsSso) {
+        $("#sso_flow").show();
+    } else if (matrixLogin.serverAcceptsCas) {
+        $("#sso_form").attr("action", "/_matrix/client/r0/login/cas/redirect");
+        $("#sso_flow").show();
     }

     if (!matrixLogin.serverAcceptsPassword && !matrixLogin.serverAcceptsCas) {

@@ -67,8 +68,8 @@ var show_login = function() {
 };

 var show_spinner = function() {
-    $("#password_form").hide();
-    $("#cas_flow").hide();
+    $("#password_flow").hide();
+    $("#sso_flow").hide();
     $("#no_login_types").hide();
     $("#loading").show();
 };

@@ -84,7 +85,10 @@ var fetch_info = function(cb) {
         matrixLogin.serverAcceptsCas = true;
         console.log("Server accepts CAS");
     }
+    if ("m.login.sso" === flow.type) {
+        matrixLogin.serverAcceptsSso = true;
+        console.log("Server accepts SSO");
+    }
     if ("m.login.password" === flow.type) {
         matrixLogin.serverAcceptsPassword = true;
         console.log("Server accepts password");
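For context, `fetch_info` is driven by the login flows the server advertises. A hedged sketch of the response shape it iterates over (illustrative values; the flow types are the ones matched in the code above):

```python
# Illustrative body of GET /_matrix/client/r0/login; fetch_info walks
# response["flows"] and sets the serverAccepts* flags accordingly.
response = {
    "flows": [
        {"type": "m.login.sso"},       # sets matrixLogin.serverAcceptsSso
        {"type": "m.login.password"},  # sets matrixLogin.serverAcceptsPassword
    ]
}
```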
@@ -19,30 +19,23 @@ a:hover { color: #000; }
 a:active { color: #000; }

 input {
-    width: 90%
-}
-
-textarea, input {
-    font-family: inherit;
-    font-size: inherit;
     margin: 5px;
 }

-.smallPrint {
-    color: #888;
-    font-size: 9pt ! important;
-    font-style: italic ! important;
+textbox, input[type="text"], input[type="password"] {
+    width: 90%;
 }

-.g-recaptcha div {
-    margin: auto;
+form {
+    text-align: center;
+    margin: 10px 0 0 0;
 }

 .login_flow {
+    width: 300px;
     text-align: left;
     padding: 10px;
     margin-bottom: 40px;
-    display: inline-block;

     -webkit-border-radius: 10px;
     -moz-border-radius: 10px;

@@ -119,7 +119,6 @@ class DataStore(RoomMemberStore, RoomStore,
             db_conn, "device_lists_stream", "stream_id",
         )

-        self._transaction_id_gen = IdGenerator(db_conn, "sent_transactions", "id")
         self._access_tokens_id_gen = IdGenerator(db_conn, "access_tokens", "id")
         self._event_reports_id_gen = IdGenerator(db_conn, "event_reports", "id")
         self._push_rule_id_gen = IdGenerator(db_conn, "push_rules", "id")

@@ -29,6 +29,7 @@ from synapse.api.errors import StoreError
 from synapse.storage.engines import PostgresEngine
 from synapse.util.caches.descriptors import Cache
 from synapse.util.logcontext import LoggingContext, PreserveLoggingContext
+from synapse.util.stringutils import exception_to_unicode

 logger = logging.getLogger(__name__)

@@ -249,32 +250,32 @@ class SQLBaseStore(object):
             except self.database_engine.module.OperationalError as e:
                 # This can happen if the database disappears mid
                 # transaction.
-                logger.warn(
+                logger.warning(
                     "[TXN OPERROR] {%s} %s %d/%d",
-                    name, e, i, N
+                    name, exception_to_unicode(e), i, N
                 )
                 if i < N:
                     i += 1
                     try:
                         conn.rollback()
                     except self.database_engine.module.Error as e1:
-                        logger.warn(
+                        logger.warning(
                             "[TXN EROLL] {%s} %s",
-                            name, e1,
+                            name, exception_to_unicode(e1),
                         )
                     continue
                 raise
             except self.database_engine.module.DatabaseError as e:
                 if self.database_engine.is_deadlock(e):
-                    logger.warn("[TXN DEADLOCK] {%s} %d/%d", name, i, N)
+                    logger.warning("[TXN DEADLOCK] {%s} %d/%d", name, i, N)
                     if i < N:
                         i += 1
                         try:
                             conn.rollback()
                         except self.database_engine.module.Error as e1:
-                            logger.warn(
+                            logger.warning(
                                 "[TXN EROLL] {%s} %s",
-                                name, e1,
+                                name, exception_to_unicode(e1),
                             )
                         continue
                     raise

@@ -849,9 +850,9 @@ class SQLBaseStore(object):
         rowcount = cls._simple_update_txn(txn, table, keyvalues, updatevalues)

         if rowcount == 0:
-            raise StoreError(404, "No row found")
+            raise StoreError(404, "No row found (%s)" % (table,))
         if rowcount > 1:
-            raise StoreError(500, "More than one row matched")
+            raise StoreError(500, "More than one row matched (%s)" % (table,))

     @staticmethod
     def _simple_select_one_txn(txn, table, keyvalues, retcols,

@@ -868,9 +869,9 @@ class SQLBaseStore(object):
         if not row:
             if allow_none:
                 return None
-            raise StoreError(404, "No row found")
+            raise StoreError(404, "No row found (%s)" % (table,))
         if txn.rowcount > 1:
-            raise StoreError(500, "More than one row matched")
+            raise StoreError(500, "More than one row matched (%s)" % (table,))

         return dict(zip(retcols, row))

@@ -902,9 +903,9 @@ class SQLBaseStore(object):

         txn.execute(sql, list(keyvalues.values()))
         if txn.rowcount == 0:
-            raise StoreError(404, "No row found")
+            raise StoreError(404, "No row found (%s)" % (table,))
         if txn.rowcount > 1:
-            raise StoreError(500, "more than one row matched")
+            raise StoreError(500, "More than one row matched (%s)" % (table,))

     def _simple_delete(self, table, keyvalues, desc):
         return self.runInteraction(
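A small illustration of the extra diagnostic detail (hypothetical table name): the table being operated on is now interpolated into the StoreError message, which is what the "add more detail to logging regarding 'More than one row matched' error" changelog entry refers to.

```python
table = "access_tokens"  # hypothetical table name for illustration
"More than one row matched (%s)" % (table,)
# -> 'More than one row matched (access_tokens)'
```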
@@ -34,8 +34,9 @@ class MonthlyActiveUsersStore(SQLBaseStore):
         self.hs = hs
         self.reserved_users = ()
         # Do not add more reserved users than the total allowable number
-        self._initialise_reserved_users(
-            dbconn.cursor(),
+        self._new_transaction(
+            dbconn, "initialise_mau_threepids", [], [],
+            self._initialise_reserved_users,
             hs.config.mau_limits_reserved_threepids[:self.hs.config.max_mau_value],
         )

@@ -25,7 +25,7 @@ logger = logging.getLogger(__name__)

 # Remember to update this number every time a change is made to database
 # schema files, so the users will be informed on server restarts.
-SCHEMA_VERSION = 52
+SCHEMA_VERSION = 53

 dir_path = os.path.abspath(os.path.dirname(__file__))

@@ -257,7 +257,7 @@ def _upgrade_existing_database(cur, current_version, applied_delta_files,
             module.run_create(cur, database_engine)
             if not is_empty:
                 module.run_upgrade(cur, database_engine, config=config)
-        elif ext == ".pyc":
+        elif ext == ".pyc" or file_name == "__pycache__":
             # Sometimes .pyc files turn up anyway even though we've
             # disabled their generation; e.g. from distribution package
             # installers. Silently skip it

@@ -1,32 +0,0 @@
-# Copyright 2016 OpenMarket Ltd
-#
-# Licensed under the Apache License, Version 2.0 (the "License");
-# you may not use this file except in compliance with the License.
-# You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-import logging
-
-from synapse.storage.engines import PostgresEngine
-
-logger = logging.getLogger(__name__)
-
-
-def run_create(cur, database_engine, *args, **kwargs):
-    if isinstance(database_engine, PostgresEngine):
-        cur.execute("TRUNCATE sent_transactions")
-    else:
-        cur.execute("DELETE FROM sent_transactions")
-
-    cur.execute("CREATE INDEX sent_transactions_ts ON sent_transactions(ts)")
-
-
-def run_upgrade(cur, database_engine, *args, **kwargs):
-    pass

@@ -1,4 +1,4 @@
-/* Copyright 2015, 2016 OpenMarket Ltd
+/* Copyright 2018 New Vector Ltd
  *
  * Licensed under the Apache License, Version 2.0 (the "License");
  * you may not use this file except in compliance with the License.

@@ -13,4 +13,4 @@
  * limitations under the License.
  */

-CREATE INDEX IF NOT EXISTS sent_transaction_txn_id ON sent_transactions(transaction_id);
+DROP TABLE IF EXISTS sent_transactions;

@@ -25,25 +25,6 @@ CREATE TABLE IF NOT EXISTS received_transactions(

 CREATE INDEX transactions_have_ref ON received_transactions(origin, has_been_referenced);-- WHERE has_been_referenced = 0;


--- Stores what transactions we've sent, what their response was (if we got one) and whether we have
--- since referenced the transaction in another outgoing transaction
-CREATE TABLE IF NOT EXISTS sent_transactions(
-    id INTEGER PRIMARY KEY AUTOINCREMENT, -- This is used to apply insertion ordering
-    transaction_id TEXT,
-    destination TEXT,
-    response_code INTEGER DEFAULT 0,
-    response_json TEXT,
-    ts BIGINT
-);
-
-CREATE INDEX sent_transaction_dest ON sent_transactions(destination);
-CREATE INDEX sent_transaction_txn_id ON sent_transactions(transaction_id);
--- So that we can do an efficient look up of all transactions that have yet to be successfully
--- sent.
-CREATE INDEX sent_transaction_sent ON sent_transactions(response_code);


 -- For sent transactions only.
 CREATE TABLE IF NOT EXISTS transaction_id_to_pdu(
     transaction_id INTEGER,

@@ -25,25 +25,6 @@ CREATE TABLE IF NOT EXISTS received_transactions(

 CREATE INDEX transactions_have_ref ON received_transactions(origin, has_been_referenced);-- WHERE has_been_referenced = 0;


--- Stores what transactions we've sent, what their response was (if we got one) and whether we have
--- since referenced the transaction in another outgoing transaction
-CREATE TABLE IF NOT EXISTS sent_transactions(
-    id BIGINT PRIMARY KEY, -- This is used to apply insertion ordering
-    transaction_id TEXT,
-    destination TEXT,
-    response_code INTEGER DEFAULT 0,
-    response_json TEXT,
-    ts BIGINT
-);
-
-CREATE INDEX sent_transaction_dest ON sent_transactions(destination);
-CREATE INDEX sent_transaction_txn_id ON sent_transactions(transaction_id);
--- So that we can do an efficient look up of all transactions that have yet to be successfully
--- sent.
-CREATE INDEX sent_transaction_sent ON sent_transactions(response_code);


 -- For sent transactions only.
 CREATE TABLE IF NOT EXISTS transaction_id_to_pdu(
     transaction_id INTEGER,

@@ -45,6 +45,10 @@ class SearchStore(BackgroundUpdateStore):

     def __init__(self, db_conn, hs):
         super(SearchStore, self).__init__(db_conn, hs)

+        if not hs.config.enable_search:
+            return
+
         self.register_background_update_handler(
             self.EVENT_SEARCH_UPDATE_NAME, self._background_reindex_search
         )

@@ -316,6 +320,8 @@ class SearchStore(BackgroundUpdateStore):
             entries (iterable[SearchEntry]):
                 entries to be added to the table
         """
+        if not self.hs.config.enable_search:
+            return
         if isinstance(self.database_engine, PostgresEngine):
             sql = (
                 "INSERT INTO event_search"
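The two guards above make the search store a no-op when search is disabled. A hedged sketch of the matching homeserver.yaml setting (assuming the YAML key mirrors the `hs.config.enable_search` attribute read above):

```yaml
# Assumed homeserver.yaml option corresponding to hs.config.enable_search;
# set to false for homeservers that are not interested in search.
enable_search: false
```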
@@ -16,7 +16,8 @@
 import random
 import string

-from six import PY3
+import six
+from six import PY2, PY3
 from six.moves import range

 _string_with_symbols = (

@@ -71,3 +72,39 @@ def to_ascii(s):
         return s.encode("ascii")
     except UnicodeEncodeError:
         return s
+
+
+def exception_to_unicode(e):
+    """Helper function to extract the text of an exception as a unicode string
+
+    Args:
+        e (Exception): exception to be stringified
+
+    Returns:
+        unicode
+    """
+    # urgh, this is a mess. The basic problem here is that psycopg2 constructs its
+    # exceptions with PyErr_SetString, with a (possibly non-ascii) argument. str() will
+    # then produce the raw byte sequence. Under Python 2, this will then cause another
+    # error if it gets mixed with a `unicode` object, as per
+    # https://github.com/matrix-org/synapse/issues/4252
+
+    # First of all, if we're under python3, everything is fine because it will sort this
+    # nonsense out for us.
+    if not PY2:
+        return str(e)
+
+    # otherwise let's have a stab at decoding the exception message. We'll circumvent
+    # Exception.__str__(), which would explode if someone raised Exception(u'non-ascii')
+    # and instead look at what is in the args member.
+
+    if len(e.args) == 0:
+        return u""
+    elif len(e.args) > 1:
+        return six.text_type(repr(e.args))
+
+    msg = e.args[0]
+    if isinstance(msg, bytes):
+        return msg.decode('utf-8', errors='replace')
+    else:
+        return msg
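A brief usage sketch of the new helper (hypothetical exception; Python 2 semantics, where psycopg2 raises exceptions whose args are raw byte strings in the database's locale):

```python
# Hypothetical demonstration of exception_to_unicode under Python 2: the
# byte-string argument mimics a psycopg2 error from a non-English postgres.
e = Exception(u'FEHLER: Verklemmung (Deadlock) entdeckt'.encode('utf-8'))
exception_to_unicode(e)
# -> u'FEHLER: Verklemmung (Deadlock) entdeckt', with no UnicodeDecodeError
#    even when the result is later mixed into a unicode log format string
```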
@@ -1,5 +1,6 @@
 # -*- coding: utf-8 -*-
 # Copyright 2014-2016 OpenMarket Ltd
+# Copyright 2018 New Vector Ltd
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
 # you may not use this file except in compliance with the License.

@@ -15,7 +16,9 @@

 from twisted.trial import util

-from tests import utils
+import tests.patch_inline_callbacks
+
+# attempt to do the patch before we load any synapse code
+tests.patch_inline_callbacks.do_patch()

 util.DEFAULT_TIMEOUT_DURATION = 10
-utils.setupdb()

@@ -63,6 +63,14 @@ class KeyringTestCase(unittest.TestCase):
         keys = self.mock_perspective_server.get_verify_keys()
         self.hs.config.perspectives = {self.mock_perspective_server.server_name: keys}

+    def assert_sentinel_context(self):
+        if LoggingContext.current_context() != LoggingContext.sentinel:
+            self.fail(
+                "Expected sentinel context but got %s" % (
+                    LoggingContext.current_context(),
+                )
+            )
+
     def check_context(self, _, expected):
         self.assertEquals(
             getattr(LoggingContext.current_context(), "request", None), expected

@@ -70,8 +78,6 @@ class KeyringTestCase(unittest.TestCase):

     @defer.inlineCallbacks
     def test_wait_for_previous_lookups(self):
-        sentinel_context = LoggingContext.current_context()
-
         kr = keyring.Keyring(self.hs)

         lookup_1_deferred = defer.Deferred()

@@ -99,8 +105,10 @@ class KeyringTestCase(unittest.TestCase):
             ["server1"], {"server1": lookup_2_deferred}
         )
         self.assertFalse(wait_2_deferred.called)

         # ... so we should have reset the LoggingContext.
-        self.assertIs(LoggingContext.current_context(), sentinel_context)
+        self.assert_sentinel_context()

         wait_2_deferred.addBoth(self.check_context, "two")

         # let the first lookup complete (in the sentinel context)

@@ -198,8 +206,6 @@ class KeyringTestCase(unittest.TestCase):
         json1 = {}
         signedjson.sign.sign_json(json1, "server9", key1)

-        sentinel_context = LoggingContext.current_context()
-
         with LoggingContext("one") as context_one:
             context_one.request = "one"

@@ -213,7 +219,7 @@ class KeyringTestCase(unittest.TestCase):

             defer = kr.verify_json_for_server("server9", json1)
             self.assertFalse(defer.called)
-            self.assertIs(LoggingContext.current_context(), sentinel_context)
+            self.assert_sentinel_context()
             yield defer

             self.assertIs(LoggingContext.current_context(), context_one)

@@ -150,7 +150,6 @@ class RegistrationTestCase(unittest.TestCase):
         self.hs.config.auto_join_rooms = [room_alias_str]
         res = yield self.handler.register(localpart='jeff')
         rooms = yield self.store.get_rooms_for_user(res[0])

         directory_handler = self.hs.get_handlers().directory_handler
         room_alias = RoomAlias.from_string(room_alias_str)
         room_id = yield directory_handler.get_association(room_alias)

@@ -184,3 +183,14 @@ class RegistrationTestCase(unittest.TestCase):
         res = yield self.handler.register(localpart='jeff')
         rooms = yield self.store.get_rooms_for_user(res[0])
         self.assertEqual(len(rooms), 0)
+
+    @defer.inlineCallbacks
+    def test_auto_create_auto_join_where_no_consent(self):
+        self.hs.config.user_consent_at_registration = True
+        self.hs.config.block_events_without_consent_error = "Error"
+        room_alias_str = "#room:test"
+        self.hs.config.auto_join_rooms = [room_alias_str]
+        res = yield self.handler.register(localpart='jeff')
+        yield self.handler.post_consent_actions(res[0])
+        rooms = yield self.store.get_rooms_for_user(res[0])
+        self.assertEqual(len(rooms), 0)

@@ -0,0 +1,90 @@
+# -*- coding: utf-8 -*-
+# Copyright 2018 New Vector Ltd
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+from __future__ import print_function
+
+import functools
+import sys
+
+from twisted.internet import defer
+from twisted.internet.defer import Deferred
+from twisted.python.failure import Failure
+
+
+def do_patch():
+    """
+    Patch defer.inlineCallbacks so that it checks the state of the logcontext on exit
+    """
+
+    from synapse.util.logcontext import LoggingContext
+
+    orig_inline_callbacks = defer.inlineCallbacks
+
+    def new_inline_callbacks(f):
+
+        orig = orig_inline_callbacks(f)
+
+        @functools.wraps(f)
+        def wrapped(*args, **kwargs):
+            start_context = LoggingContext.current_context()
+
+            try:
+                res = orig(*args, **kwargs)
+            except Exception:
+                if LoggingContext.current_context() != start_context:
+                    err = "%s changed context from %s to %s on exception" % (
+                        f, start_context, LoggingContext.current_context()
+                    )
+                    print(err, file=sys.stderr)
+                    raise Exception(err)
+                raise
+
+            if not isinstance(res, Deferred) or res.called:
+                if LoggingContext.current_context() != start_context:
+                    err = "%s changed context from %s to %s" % (
+                        f, start_context, LoggingContext.current_context()
+                    )
+                    # print the error to stderr because otherwise all we
+                    # see in travis-ci is the 500 error
+                    print(err, file=sys.stderr)
+                    raise Exception(err)
+                return res
+
+            if LoggingContext.current_context() != LoggingContext.sentinel:
+                err = (
+                    "%s returned incomplete deferred in non-sentinel context "
+                    "%s (start was %s)"
+                ) % (
+                    f, LoggingContext.current_context(), start_context,
+                )
+                print(err, file=sys.stderr)
+                raise Exception(err)
+
+            def check_ctx(r):
+                if LoggingContext.current_context() != start_context:
+                    err = "%s completion of %s changed context from %s to %s" % (
+                        "Failure" if isinstance(r, Failure) else "Success",
+                        f, start_context, LoggingContext.current_context(),
+                    )
+                    print(err, file=sys.stderr)
+                    raise Exception(err)
+                return r
+
+            res.addBoth(check_ctx)
+            return res
+
+        return wrapped
+
+    defer.inlineCallbacks = new_inline_callbacks
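A brief usage sketch, mirroring the `tests/__init__.py` change above: the patch must be applied before any synapse code is imported, so that every `@defer.inlineCallbacks` function subsequently defined picks up the wrapped decorator.

```python
import tests.patch_inline_callbacks

# patch defer.inlineCallbacks before synapse modules are imported, so that
# all @defer.inlineCallbacks functions get the logcontext exit checks
tests.patch_inline_callbacks.do_patch()

import synapse  # noqa: E402  (deliberately imported after the patch)
```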
@@ -30,6 +30,7 @@ from synapse.rest.media.v1._base import FileInfo
 from synapse.rest.media.v1.filepath import MediaFilePaths
 from synapse.rest.media.v1.media_storage import MediaStorage
 from synapse.rest.media.v1.storage_provider import FileStorageProviderBackend
+from synapse.util.logcontext import make_deferred_yieldable
 from synapse.util.module_loader import load_module

 from tests import unittest

@@ -113,7 +114,7 @@ class MediaRepoTests(unittest.HomeserverTestCase):
             d = Deferred()
             d.addCallback(write_to)
             self.fetches.append((d, destination, path, args))
-            return d
+            return make_deferred_yieldable(d)

         client = Mock()
         client.get_file = get_file

@@ -13,7 +13,7 @@
 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 # See the License for the specific language governing permissions and
 # limitations under the License.
+import gc
 import hashlib
 import hmac
 import logging

@@ -31,10 +31,12 @@ from synapse.http.server import JsonResource
 from synapse.http.site import SynapseRequest
 from synapse.server import HomeServer
 from synapse.types import UserID, create_requester
-from synapse.util.logcontext import LoggingContextFilter
+from synapse.util.logcontext import LoggingContext, LoggingContextFilter

 from tests.server import get_clock, make_request, render, setup_test_homeserver
-from tests.utils import default_config
+from tests.utils import default_config, setupdb
+
+setupdb()

 # Set up putting Synapse's logs into Trial's.
 rootLogger = logging.getLogger()

@@ -102,8 +104,16 @@ class TestCase(unittest.TestCase):
             # traceback when a unit test exits leaving things on the reactor.
             twisted.internet.base.DelayedCall.debug = True

-            old_level = logging.getLogger().level
+            # if we're not starting in the sentinel logcontext, then to be honest
+            # all future bets are off.
+            if LoggingContext.current_context() is not LoggingContext.sentinel:
+                self.fail(
+                    "Test starting with non-sentinel logging context %s" % (
+                        LoggingContext.current_context(),
+                    )
+                )
+
+            old_level = logging.getLogger().level
             if old_level != level:

                 @around(self)

@@ -115,6 +125,16 @@ class TestCase(unittest.TestCase):
             logging.getLogger().setLevel(level)
             return orig()

+        @around(self)
+        def tearDown(orig):
+            ret = orig()
+            # force a GC to workaround problems with deferreds leaking logcontexts when
+            # they are GCed (see the logcontext docs)
+            gc.collect()
+            LoggingContext.set_current_context(LoggingContext.sentinel)
+
+            return ret
+
     def assertObjectHasAttributes(self, attrs, obj):
         """Asserts that the given object has each of the attributes given, and
         that the value of each matches according to assertEquals."""
tox.ini
@@ -7,6 +7,7 @@ deps =
     mock
     python-subunit
     junitxml
+    coverage

     # needed by some of the tests
     lxml

@@ -27,11 +28,15 @@ deps =

 setenv =
     PYTHONDONTWRITEBYTECODE = no_byte_code
+    COVERAGE_PROCESS_START = {toxinidir}/.coveragerc

 [testenv]
 deps =
     {[base]deps}

+whitelist_externals =
+    sh
+
 setenv =
     {[base]setenv}

@@ -39,7 +44,9 @@ passenv = *

 commands =
     /usr/bin/find "{toxinidir}" -name '*.pyc' -delete
-    "{envbindir}/trial" {env:TRIAL_FLAGS:} {posargs:tests} {env:TOXSUFFIX:}
+    # Add this so that coverage will run on subprocesses
+    sh -c 'echo "import coverage; coverage.process_startup()" > {envsitepackagesdir}/../sitecustomize.py'
+    {envbindir}/coverage run "{envbindir}/trial" {env:TRIAL_FLAGS:} {posargs:tests} {env:TOXSUFFIX:}

 [testenv:py27]

@@ -101,17 +108,6 @@ usedevelop=true
 [testenv:py36]
 usedevelop=true


-[testenv:py36-coverage]
-usedevelop=true
-deps =
-    {[base]deps}
-    coverage
-commands =
-    /usr/bin/find "{toxinidir}" -name '*.pyc' -delete
-    python -m coverage run -m twisted.trial {env:TRIAL_FLAGS:} {posargs:tests} {env:TOXSUFFIX:}


 [testenv:py36-postgres]
 usedevelop=true
 deps =

@@ -146,3 +142,12 @@ deps = towncrier>=18.6.0rc1
 commands =
     python -m towncrier.check --compare-with=origin/develop
 basepython = python3.6

+[testenv:codecov]
+skip_install = True
+deps =
+    coverage
+    codecov
+commands =
+    coverage combine
+    codecov -X gcov