Merge remote-tracking branch 'origin/release-v1.29.0' into matrix-org-hotfixes

anoa/bundle_aggregations_state
Erik Johnston 2021-03-04 10:23:26 +00:00
commit 61a970e25f
52 changed files with 163 additions and 102 deletions

@@ -1,3 +1,57 @@
Synapse 1.29.0rc1 (2021-03-04)
==============================

Features
--------
- Add rate limiters to cross-user key sharing requests. ([\#8957](https://github.com/matrix-org/synapse/issues/8957))
- Add `order_by` to the admin API `GET /_synapse/admin/v1/users/<user_id>/media`. Contributed by @dklimpel. ([\#8978](https://github.com/matrix-org/synapse/issues/8978))
- Add some configuration settings to make users' profile data more private. ([\#9203](https://github.com/matrix-org/synapse/issues/9203))
- The `no_proxy` and `NO_PROXY` environment variables are now respected in proxied HTTP clients with the lowercase form taking precedence if both are present. Additionally, the lowercase `https_proxy` environment variable is now respected in proxied HTTP clients on top of existing support for the uppercase `HTTPS_PROXY` form and takes precedence if both are present. Contributed by Timothy Leung. ([\#9372](https://github.com/matrix-org/synapse/issues/9372))
- Add a configuration option, `user_directory.prefer_local_users`, which when enabled will make it more likely for users on the same server as you to appear above other users. ([\#9383](https://github.com/matrix-org/synapse/issues/9383), [\#9385](https://github.com/matrix-org/synapse/issues/9385))
- Add support for regenerating thumbnails if they have been deleted but the original image is still stored. ([\#9438](https://github.com/matrix-org/synapse/issues/9438))
- Add support for `X-Forwarded-Proto` header when using a reverse proxy. ([\#9472](https://github.com/matrix-org/synapse/issues/9472), [\#9501](https://github.com/matrix-org/synapse/issues/9501), [\#9512](https://github.com/matrix-org/synapse/issues/9512), [\#9539](https://github.com/matrix-org/synapse/issues/9539))

Bugfixes
--------
- Fix a bug where users' pushers were not all deleted when they deactivated their account. ([\#9285](https://github.com/matrix-org/synapse/issues/9285), [\#9516](https://github.com/matrix-org/synapse/issues/9516))
- Fix a bug where a lot of unnecessary presence updates were sent when joining a room. ([\#9402](https://github.com/matrix-org/synapse/issues/9402))
- Fix a bug that caused multiple calls to the experimental `shared_rooms` endpoint to return stale results. ([\#9416](https://github.com/matrix-org/synapse/issues/9416))
- Fix a bug in single sign-on which could cause a "No session cookie found" error. ([\#9436](https://github.com/matrix-org/synapse/issues/9436))
- Fix bug introduced in v1.27.0 where allowing a user to choose their own username when logging in via single sign-on did not work unless an `idp_icon` was defined. ([\#9440](https://github.com/matrix-org/synapse/issues/9440))
- Fix a bug introduced in v1.26.0 where some sequences were not properly configured when running `synapse_port_db`. ([\#9449](https://github.com/matrix-org/synapse/issues/9449))
- Fix deleting pushers when using sharded pushers. ([\#9465](https://github.com/matrix-org/synapse/issues/9465), [\#9466](https://github.com/matrix-org/synapse/issues/9466), [\#9479](https://github.com/matrix-org/synapse/issues/9479), [\#9536](https://github.com/matrix-org/synapse/issues/9536))
- Fix missing startup checks for the consistency of certain PostgreSQL sequences. ([\#9470](https://github.com/matrix-org/synapse/issues/9470))
- Fix a long-standing bug where the media repository could leak file descriptors while previewing media. ([\#9497](https://github.com/matrix-org/synapse/issues/9497))
- Properly purge the event chain cover index when purging history. ([\#9498](https://github.com/matrix-org/synapse/issues/9498))
- Fix missing chain cover index due to a schema delta not being applied correctly. Only affected servers that ran development versions. ([\#9503](https://github.com/matrix-org/synapse/issues/9503))
- Fix a bug introduced in v1.25.0 where `/_synapse/admin/join/` would fail when given a room alias. ([\#9506](https://github.com/matrix-org/synapse/issues/9506))
- Prevent presence background jobs from running when presence is disabled. ([\#9530](https://github.com/matrix-org/synapse/issues/9530))
- Fix rare edge case that caused a background update to fail if the server had rejected an event that had duplicate auth events. ([\#9537](https://github.com/matrix-org/synapse/issues/9537))

Improved Documentation
----------------------
- Update the example systemd config to propagate reloads to individual units. ([\#9463](https://github.com/matrix-org/synapse/issues/9463))

Internal Changes
----------------
- Add documentation and type hints to `parse_duration`. ([\#9432](https://github.com/matrix-org/synapse/issues/9432))
- Remove vestiges of `uploads_path` configuration setting. ([\#9462](https://github.com/matrix-org/synapse/issues/9462))
- Add a comment about systemd-python. ([\#9464](https://github.com/matrix-org/synapse/issues/9464))
- Test that we require validated email for email pushers. ([\#9496](https://github.com/matrix-org/synapse/issues/9496))
- Allow python to generate bytecode for synapse. ([\#9502](https://github.com/matrix-org/synapse/issues/9502))
- Fix incorrect type hints. ([\#9515](https://github.com/matrix-org/synapse/issues/9515), [\#9518](https://github.com/matrix-org/synapse/issues/9518))
- Add type hints to device and event report admin API. ([\#9519](https://github.com/matrix-org/synapse/issues/9519))
- Add type hints to user admin API. ([\#9521](https://github.com/matrix-org/synapse/issues/9521))
- Bump the versions of mypy and mypy-zope used for static type checking. ([\#9529](https://github.com/matrix-org/synapse/issues/9529))

 Synapse 1.xx.0
 ==============
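The proxy-variable precedence described in the `no_proxy`/`https_proxy` changelog entry can be sketched as a standalone helper. This is a hypothetical illustration, not code from this commit; `pick_proxy_settings` is an invented name:

```python
def pick_proxy_settings(env):
    """Sketch of the precedence rules: for each variable the lowercase form
    wins when both cases are present (hypothetical helper, not Synapse code).
    An empty lowercase value falls through to the uppercase form here."""
    https_proxy = env.get("https_proxy") or env.get("HTTPS_PROXY")
    no_proxy = env.get("no_proxy") or env.get("NO_PROXY")
    return https_proxy, no_proxy
```

For example, with both `https_proxy` and `HTTPS_PROXY` set, the lowercase value is the one a proxied client would use.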

@@ -1 +0,0 @@
-Add rate limiters to cross-user key sharing requests.

@@ -1 +0,0 @@
-Add `order_by` to the admin API `GET /_synapse/admin/v1/users/<user_id>/media`. Contributed by @dklimpel.

@@ -1 +0,0 @@
-Add some configuration settings to make users' profile data more private.

@@ -1 +0,0 @@
-Fix a bug where users' pushers were not all deleted when they deactivated their account.

@@ -1 +0,0 @@
-The `no_proxy` and `NO_PROXY` environment variables are now respected in proxied HTTP clients with the lowercase form taking precedence if both are present. Additionally, the lowercase `https_proxy` environment variable is now respected in proxied HTTP clients on top of existing support for the uppercase `HTTPS_PROXY` form and takes precedence if both are present. Contributed by Timothy Leung.

@@ -1 +0,0 @@
-Add a configuration option, `user_directory.prefer_local_users`, which when enabled will make it more likely for users on the same server as you to appear above other users.

@@ -1 +0,0 @@
-Add a configuration option, `user_directory.prefer_local_users`, which when enabled will make it more likely for users on the same server as you to appear above other users.

@@ -1 +0,0 @@
-Fix a bug where a lot of unnecessary presence updates were sent when joining a room.

@@ -1 +0,0 @@
-Fix a bug that caused multiple calls to the experimental `shared_rooms` endpoint to return stale results.

@@ -1 +0,0 @@
-Add documentation and type hints to `parse_duration`.

@@ -1 +0,0 @@
-Fix a bug in single sign-on which could cause a "No session cookie found" error.

@@ -1 +0,0 @@
-Add support for regenerating thumbnails if they have been deleted but the original image is still stored.

@@ -1 +0,0 @@
-Fix bug introduced in v1.27.0 where allowing a user to choose their own username when logging in via single sign-on did not work unless an `idp_icon` was defined.

@@ -1 +0,0 @@
-Fix a bug introduced in v1.26.0 where some sequences were not properly configured when running `synapse_port_db`.

@@ -1 +0,0 @@
-Remove vestiges of `uploads_path` configuration setting.

@@ -1 +0,0 @@
-Update the example systemd config to propagate reloads to individual units.

@@ -1 +0,0 @@
-Add a comment about systemd-python.

@@ -1 +0,0 @@
-Fix deleting pushers when using sharded pushers.

@@ -1 +0,0 @@
-Fix deleting pushers when using sharded pushers.

@@ -1 +0,0 @@
-Fix missing startup checks for the consistency of certain PostgreSQL sequences.

@@ -1 +0,0 @@
-Add support for `X-Forwarded-Proto` header when using a reverse proxy.

@@ -1 +0,0 @@
-Fix deleting pushers when using sharded pushers.

@@ -1 +0,0 @@
-Test that we require validated email for email pushers.

@@ -1 +0,0 @@
-Fix a long-standing bug where the media repository could leak file descriptors while previewing media.

@@ -1 +0,0 @@
-Properly purge the event chain cover index when purging history.

@@ -1 +0,0 @@
-Add support for `X-Forwarded-Proto` header when using a reverse proxy.

@@ -1 +0,0 @@
-Allow python to generate bytecode for synapse.

@@ -1 +0,0 @@
-Fix missing chain cover index due to a schema delta not being applied correctly. Only affected servers that ran development versions.

@@ -1 +0,0 @@
-Fix a bug introduced in v1.25.0 where `/_synapse/admin/join/` would fail when given a room alias.

@@ -1 +0,0 @@
-Add support for `X-Forwarded-Proto` header when using a reverse proxy.

@@ -1 +0,0 @@
-Fix incorrect type hints.

@@ -1 +0,0 @@
-Fix a bug where users' pushers were not all deleted when they deactivated their account.

@@ -1 +0,0 @@
-Add type hints to device and event report admin API.

@@ -1 +0,0 @@
-Add type hints to user admin API.

@@ -1 +0,0 @@
-Bump the versions of mypy and mypy-zope used for static type checking.

@@ -1 +0,0 @@
-Prevent presence background jobs from running when presence is disabled.

@@ -1 +0,0 @@
-Fix deleting pushers when using sharded pushers.

@@ -1 +0,0 @@
-Fix rare edge case that caused a background update to fail if the server had rejected an event that had duplicate auth events.

@@ -48,7 +48,7 @@ try:
 except ImportError:
     pass

-__version__ = "1.28.0"
+__version__ = "1.29.0rc1"

 if bool(os.environ.get("SYNAPSE_TEST_PATCH_LOG_CONTEXTS", False)):
     # We import here so that we don't have to install a bunch of deps when

@@ -23,6 +23,7 @@ from typing_extensions import ContextManager

 from twisted.internet import address
 from twisted.web.resource import IResource
+from twisted.web.server import Request

 import synapse
 import synapse.events
@@ -190,7 +191,7 @@ class KeyUploadServlet(RestServlet):
         self.http_client = hs.get_simple_http_client()
         self.main_uri = hs.config.worker_main_http_uri

-    async def on_POST(self, request, device_id):
+    async def on_POST(self, request: Request, device_id: Optional[str]):
         requester = await self.auth.get_user_by_req(request, allow_guest=True)
         user_id = requester.user.to_string()
         body = parse_json_object_from_request(request)
@@ -223,10 +224,12 @@ class KeyUploadServlet(RestServlet):
                 header: request.requestHeaders.getRawHeaders(header, [])
                 for header in (b"Authorization", b"User-Agent")
             }
-            # Add the previous hop the the X-Forwarded-For header.
+            # Add the previous hop to the X-Forwarded-For header.
             x_forwarded_for = request.requestHeaders.getRawHeaders(
                 b"X-Forwarded-For", []
            )
+            # we use request.client here, since we want the previous hop, not the
+            # original client (as returned by request.getClientAddress()).
             if isinstance(request.client, (address.IPv4Address, address.IPv6Address)):
                 previous_host = request.client.host.encode("ascii")
                 # If the header exists, add to the comma-separated list of the first
@@ -239,6 +242,14 @@ class KeyUploadServlet(RestServlet):
                     x_forwarded_for = [previous_host]
                 headers[b"X-Forwarded-For"] = x_forwarded_for

+            # Replicate the original X-Forwarded-Proto header. Note that
+            # XForwardedForRequest overrides isSecure() to give us the original protocol
+            # used by the client, as opposed to the protocol used by our upstream proxy
+            # - which is what we want here.
+            headers[b"X-Forwarded-Proto"] = [
+                b"https" if request.isSecure() else b"http"
+            ]
+
             try:
                 result = await self.http_client.post_json_get_json(
                     self.main_uri + request.uri.decode("ascii"), body, headers=headers
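The header juggling in the hunk above can be condensed into a standalone sketch. `build_forwarded_headers` is a hypothetical function for illustration; the real servlet operates on twisted request objects:

```python
def build_forwarded_headers(existing_xff, previous_hop_ip, request_was_https):
    """Append the previous hop to X-Forwarded-For and replicate the original
    protocol in X-Forwarded-Proto (simplified sketch of the servlet logic)."""
    headers = {}
    previous_host = previous_hop_ip.encode("ascii")
    if existing_xff:
        # If the header exists, append to the comma-separated list of the first one.
        x_forwarded_for = [existing_xff[0] + b", " + previous_host]
        x_forwarded_for.extend(existing_xff[1:])
    else:
        x_forwarded_for = [previous_host]
    headers[b"X-Forwarded-For"] = x_forwarded_for
    headers[b"X-Forwarded-Proto"] = [b"https" if request_was_https else b"http"]
    return headers
```

The design point is that the worker forwards the *previous hop* and the *original* protocol, so the main process sees the same picture a reverse proxy would give it.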

@@ -14,7 +14,7 @@
 # limitations under the License.
 import logging
 import urllib.parse
-from typing import List, Optional
+from typing import Any, Generator, List, Optional

 from netaddr import AddrFormatError, IPAddress, IPSet
 from zope.interface import implementer
@@ -116,7 +116,7 @@ class MatrixFederationAgent:
         uri: bytes,
         headers: Optional[Headers] = None,
         bodyProducer: Optional[IBodyProducer] = None,
-    ) -> defer.Deferred:
+    ) -> Generator[defer.Deferred, Any, defer.Deferred]:
         """
         Args:
             method: HTTP method: GET/POST/etc
@@ -177,17 +177,17 @@ class MatrixFederationAgent:
         # We need to make sure the host header is set to the netloc of the
         # server and that a user-agent is provided.
         if headers is None:
-            headers = Headers()
+            request_headers = Headers()
         else:
-            headers = headers.copy()
+            request_headers = headers.copy()

-        if not headers.hasHeader(b"host"):
-            headers.addRawHeader(b"host", parsed_uri.netloc)
-        if not headers.hasHeader(b"user-agent"):
-            headers.addRawHeader(b"user-agent", self.user_agent)
+        if not request_headers.hasHeader(b"host"):
+            request_headers.addRawHeader(b"host", parsed_uri.netloc)
+        if not request_headers.hasHeader(b"user-agent"):
+            request_headers.addRawHeader(b"user-agent", self.user_agent)

         res = yield make_deferred_yieldable(
-            self._agent.request(method, uri, headers, bodyProducer)
+            self._agent.request(method, uri, request_headers, bodyProducer)
         )
         return res

@@ -1049,14 +1049,14 @@ def check_content_type_is_json(headers: Headers) -> None:
         RequestSendFailed: if the Content-Type header is missing or isn't JSON
     """
-    c_type = headers.getRawHeaders(b"Content-Type")
-    if c_type is None:
+    content_type_headers = headers.getRawHeaders(b"Content-Type")
+    if content_type_headers is None:
         raise RequestSendFailed(
             RuntimeError("No Content-Type header received from remote server"),
             can_retry=False,
         )

-    c_type = c_type[0].decode("ascii")  # only the first header
+    c_type = content_type_headers[0].decode("ascii")  # only the first header
     val, options = cgi.parse_header(c_type)
     if val != "application/json":
         raise RequestSendFailed(
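The same check can be sketched without `cgi` (whose `parse_header` is deprecated in newer Pythons). This is a simplified stand-in that only splits off the media type, not the Synapse implementation:

```python
def content_type_is_json(raw_headers):
    """Simplified stand-in for check_content_type_is_json(): return True if the
    first Content-Type header names application/json, ignoring parameters."""
    if not raw_headers:
        return False
    c_type = raw_headers[0].decode("ascii")  # only the first header
    media_type = c_type.split(";", 1)[0].strip().lower()
    return media_type == "application/json"
```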

@@ -21,6 +21,7 @@ import logging
 import types
 import urllib
 from http import HTTPStatus
+from inspect import isawaitable
 from io import BytesIO
 from typing import (
     Any,
@@ -30,6 +31,7 @@ from typing import (
     Iterable,
     Iterator,
     List,
+    Optional,
     Pattern,
     Tuple,
     Union,
@@ -79,10 +81,12 @@ def return_json_error(f: failure.Failure, request: SynapseRequest) -> None:
     """Sends a JSON error response to clients."""

     if f.check(SynapseError):
-        error_code = f.value.code
-        error_dict = f.value.error_dict()
+        # mypy doesn't understand that f.check asserts the type.
+        exc = f.value  # type: SynapseError  # type: ignore
+        error_code = exc.code
+        error_dict = exc.error_dict()

-        logger.info("%s SynapseError: %s - %s", request, error_code, f.value.msg)
+        logger.info("%s SynapseError: %s - %s", request, error_code, exc.msg)
     else:
         error_code = 500
         error_dict = {"error": "Internal server error", "errcode": Codes.UNKNOWN}
@@ -91,7 +95,7 @@ def return_json_error(f: failure.Failure, request: SynapseRequest) -> None:
             "Failed handle request via %r: %r",
             request.request_metrics.name,
             request,
-            exc_info=(f.type, f.value, f.getTracebackObject()),
+            exc_info=(f.type, f.value, f.getTracebackObject()),  # type: ignore
         )

     # Only respond with an error response if we haven't already started writing,
@@ -128,7 +132,8 @@ def return_html_error(
             `{msg}` placeholders), or a jinja2 template
     """
     if f.check(CodeMessageException):
-        cme = f.value
+        # mypy doesn't understand that f.check asserts the type.
+        cme = f.value  # type: CodeMessageException  # type: ignore
         code = cme.code
         msg = cme.msg
@@ -142,7 +147,7 @@ def return_html_error(
             logger.error(
                 "Failed handle request %r",
                 request,
-                exc_info=(f.type, f.value, f.getTracebackObject()),
+                exc_info=(f.type, f.value, f.getTracebackObject()),  # type: ignore
             )
     else:
         code = HTTPStatus.INTERNAL_SERVER_ERROR
@@ -151,7 +156,7 @@ def return_html_error(
         logger.error(
             "Failed handle request %r",
             request,
-            exc_info=(f.type, f.value, f.getTracebackObject()),
+            exc_info=(f.type, f.value, f.getTracebackObject()),  # type: ignore
         )

     if isinstance(error_template, str):
@@ -278,7 +283,7 @@ class _AsyncResource(resource.Resource, metaclass=abc.ABCMeta):
         raw_callback_return = method_handler(request)

         # Is it synchronous? We'll allow this for now.
-        if isinstance(raw_callback_return, (defer.Deferred, types.CoroutineType)):
+        if isawaitable(raw_callback_return):
             callback_return = await raw_callback_return
         else:
             callback_return = raw_callback_return  # type: ignore
@@ -399,8 +404,10 @@ class JsonResource(DirectServeJsonResource):
             A tuple of the callback to use, the name of the servlet, and the
             key word arguments to pass to the callback
         """
+        # At this point the path must be bytes.
+        request_path_bytes = request.path  # type: bytes  # type: ignore
+        request_path = request_path_bytes.decode("ascii")
         # Treat HEAD requests as GET requests.
-        request_path = request.path.decode("ascii")
         request_method = request.method
         if request_method == b"HEAD":
             request_method = b"GET"
@@ -551,7 +558,7 @@ class _ByteProducer:
         request: Request,
         iterator: Iterator[bytes],
     ):
-        self._request = request
+        self._request = request  # type: Optional[Request]
         self._iterator = iterator
         self._paused = False
@@ -563,7 +570,7 @@ class _ByteProducer:
         """
         Send a list of bytes as a chunk of a response.
         """
-        if not data:
+        if not data or not self._request:
             return
         self._request.write(b"".join(data))
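The switch from an `isinstance` check to `inspect.isawaitable` above lets the dispatcher accept plain values and awaitables alike. A minimal sketch of the pattern, with hypothetical handler names:

```python
import asyncio
from inspect import isawaitable

async def dispatch(handler, request):
    """Sketch of the _AsyncResource pattern: call the handler, then await the
    result only if it is awaitable (plain return values pass straight through)."""
    raw = handler(request)
    # Is it synchronous? We'll allow this for now.
    if isawaitable(raw):
        return await raw
    return raw

def sync_handler(request):
    return 200

async def async_handler(request):
    return 201
```

Both `asyncio.run(dispatch(sync_handler, None))` and `asyncio.run(dispatch(async_handler, None))` resolve to plain integers.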

@@ -14,7 +14,7 @@
 import contextlib
 import logging
 import time
-from typing import Optional, Union
+from typing import Optional, Type, Union

 import attr
 from zope.interface import implementer
@@ -57,7 +57,7 @@ class SynapseRequest(Request):
     def __init__(self, channel, *args, **kw):
         Request.__init__(self, channel, *args, **kw)
-        self.site = channel.site
+        self.site = channel.site  # type: SynapseSite
         self._channel = channel  # this is used by the tests
         self.start_time = 0.0
@@ -96,25 +96,34 @@ class SynapseRequest(Request):
     def get_request_id(self):
         return "%s-%i" % (self.get_method(), self.request_seq)

-    def get_redacted_uri(self):
-        uri = self.uri
+    def get_redacted_uri(self) -> str:
+        """Gets the redacted URI associated with the request (or placeholder if the URI
+        has not yet been received).
+
+        Note: This is necessary as the placeholder value in twisted is str
+        rather than bytes, so we need to sanitise `self.uri`.
+
+        Returns:
+            The redacted URI as a string.
+        """
+        uri = self.uri  # type: Union[bytes, str]
         if isinstance(uri, bytes):
-            uri = self.uri.decode("ascii", errors="replace")
+            uri = uri.decode("ascii", errors="replace")
         return redact_uri(uri)

-    def get_method(self):
-        """Gets the method associated with the request (or placeholder if not
-        method has yet been received).
+    def get_method(self) -> str:
+        """Gets the method associated with the request (or placeholder if method
+        has not yet been received).

         Note: This is necessary as the placeholder value in twisted is str
         rather than bytes, so we need to sanitise `self.method`.

         Returns:
-            str
+            The request method as a string.
         """
-        method = self.method
+        method = self.method  # type: Union[bytes, str]
         if isinstance(method, bytes):
-            method = self.method.decode("ascii")
+            return self.method.decode("ascii")
         return method

     def render(self, resrc):
@@ -432,7 +441,9 @@ class SynapseSite(Site):
         assert config.http_options is not None
         proxied = config.http_options.x_forwarded
-        self.requestFactory = XForwardedForRequest if proxied else SynapseRequest
+        self.requestFactory = (
+            XForwardedForRequest if proxied else SynapseRequest
+        )  # type: Type[Request]
         self.access_logger = logging.getLogger(logger_name)
         self.server_version_string = server_version_string.encode("ascii")
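The `get_method()`/`get_redacted_uri()` fixes both hinge on the same quirk: twisted's placeholder values are `str` while values actually received are `bytes`. The normalisation can be sketched as a standalone function (an illustrative helper, not Synapse's method, which reads `self.method`):

```python
from typing import Union

def sanitise_method(method: Union[bytes, str]) -> str:
    """Normalise a request method to str, mirroring the logic of
    SynapseRequest.get_method() (standalone sketch)."""
    if isinstance(method, bytes):
        return method.decode("ascii")
    return method
```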

@@ -32,7 +32,7 @@ from twisted.internet.endpoints import (
     TCP4ClientEndpoint,
     TCP6ClientEndpoint,
 )
-from twisted.internet.interfaces import IPushProducer, ITransport
+from twisted.internet.interfaces import IPushProducer, IStreamClientEndpoint, ITransport
 from twisted.internet.protocol import Factory, Protocol
 from twisted.python.failure import Failure
@@ -121,7 +121,9 @@ class RemoteHandler(logging.Handler):
         try:
             ip = ip_address(self.host)
             if isinstance(ip, IPv4Address):
-                endpoint = TCP4ClientEndpoint(_reactor, self.host, self.port)
+                endpoint = TCP4ClientEndpoint(
+                    _reactor, self.host, self.port
+                )  # type: IStreamClientEndpoint
             elif isinstance(ip, IPv6Address):
                 endpoint = TCP6ClientEndpoint(_reactor, self.host, self.port)
             else:

@@ -527,7 +527,7 @@ class ReactorLastSeenMetric:
 REGISTRY.register(ReactorLastSeenMetric())

-def runUntilCurrentTimer(func):
+def runUntilCurrentTimer(reactor, func):
     @functools.wraps(func)
     def f(*args, **kwargs):
         now = reactor.seconds()
@@ -590,13 +590,14 @@ def runUntilCurrentTimer(func):
 try:
     # Ensure the reactor has all the attributes we expect
-    reactor.runUntilCurrent
-    reactor._newTimedCalls
-    reactor.threadCallQueue
+    reactor.seconds  # type: ignore
+    reactor.runUntilCurrent  # type: ignore
+    reactor._newTimedCalls  # type: ignore
+    reactor.threadCallQueue  # type: ignore

     # runUntilCurrent is called when we have pending calls. It is called once
     # per iteratation after fd polling.
-    reactor.runUntilCurrent = runUntilCurrentTimer(reactor.runUntilCurrent)
+    reactor.runUntilCurrent = runUntilCurrentTimer(reactor, reactor.runUntilCurrent)  # type: ignore

     # We manually run the GC each reactor tick so that we can get some metrics
     # about time spent doing GC,

@@ -14,7 +14,7 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 import logging
-from typing import TYPE_CHECKING, Iterable, Optional, Tuple
+from typing import TYPE_CHECKING, Any, Generator, Iterable, Optional, Tuple

 from twisted.internet import defer
@@ -307,7 +307,7 @@ class ModuleApi:
     @defer.inlineCallbacks
     def get_state_events_in_room(
         self, room_id: str, types: Iterable[Tuple[str, Optional[str]]]
-    ) -> defer.Deferred:
+    ) -> Generator[defer.Deferred, Any, defer.Deferred]:
         """Gets current state events for the given room.

         (This is exposed for compatibility with the old SpamCheckerApi. We should

@@ -15,11 +15,12 @@
 # limitations under the License.
 import logging
 import urllib.parse
-from typing import TYPE_CHECKING, Any, Dict, Iterable, Union
+from typing import TYPE_CHECKING, Any, Dict, Iterable, Optional, Union

 from prometheus_client import Counter

 from twisted.internet.error import AlreadyCalled, AlreadyCancelled
+from twisted.internet.interfaces import IDelayedCall

 from synapse.api.constants import EventTypes
 from synapse.events import EventBase
@@ -71,7 +72,7 @@ class HttpPusher(Pusher):
         self.data = pusher_config.data
         self.backoff_delay = HttpPusher.INITIAL_BACKOFF_SEC
         self.failing_since = pusher_config.failing_since
-        self.timed_call = None
+        self.timed_call = None  # type: Optional[IDelayedCall]
         self._is_processing = False
         self._group_unread_count_by_room = hs.config.push_group_unread_count_by_room
         self._pusherpool = hs.get_pusherpool()

@@ -108,9 +108,7 @@ class ReplicationDataHandler:
         # Map from stream to list of deferreds waiting for the stream to
         # arrive at a particular position. The lists are sorted by stream position.
-        self._streams_to_waiters = (
-            {}
-        )  # type: Dict[str, List[Tuple[int, Deferred[None]]]]
+        self._streams_to_waiters = {}  # type: Dict[str, List[Tuple[int, Deferred]]]

     async def on_rdata(
         self, stream_name: str, instance_name: str, token: int, rows: list

@@ -38,6 +38,7 @@ from typing import (

 import twisted.internet.base
 import twisted.internet.tcp
+from twisted.internet import defer
 from twisted.mail.smtp import sendmail
 from twisted.web.iweb import IPolicyForHTTPS
@@ -403,7 +404,7 @@ class HomeServer(metaclass=abc.ABCMeta):
         return RoomShutdownHandler(self)

     @cache_in_self
-    def get_sendmail(self) -> sendmail:
+    def get_sendmail(self) -> Callable[..., defer.Deferred]:
         return sendmail

     @cache_in_self
@cache_in_self @cache_in_self

View File

@ -522,7 +522,9 @@ class MultiSSOTestCase(unittest.HomeserverTestCase):
shorthand=False, shorthand=False,
) )
self.assertEqual(channel.code, 302, channel.result) self.assertEqual(channel.code, 302, channel.result)
cas_uri = channel.headers.getRawHeaders("Location")[0] location_headers = channel.headers.getRawHeaders("Location")
assert location_headers
cas_uri = location_headers[0]
cas_uri_path, cas_uri_query = cas_uri.split("?", 1) cas_uri_path, cas_uri_query = cas_uri.split("?", 1)
# it should redirect us to the login page of the cas server # it should redirect us to the login page of the cas server
@ -545,7 +547,9 @@ class MultiSSOTestCase(unittest.HomeserverTestCase):
+ "&idp=saml", + "&idp=saml",
) )
self.assertEqual(channel.code, 302, channel.result) self.assertEqual(channel.code, 302, channel.result)
saml_uri = channel.headers.getRawHeaders("Location")[0] location_headers = channel.headers.getRawHeaders("Location")
assert location_headers
saml_uri = location_headers[0]
saml_uri_path, saml_uri_query = saml_uri.split("?", 1) saml_uri_path, saml_uri_query = saml_uri.split("?", 1)
# it should redirect us to the login page of the SAML server # it should redirect us to the login page of the SAML server
@ -567,17 +571,21 @@ class MultiSSOTestCase(unittest.HomeserverTestCase):
+ "&idp=oidc", + "&idp=oidc",
) )
self.assertEqual(channel.code, 302, channel.result) self.assertEqual(channel.code, 302, channel.result)
oidc_uri = channel.headers.getRawHeaders("Location")[0] location_headers = channel.headers.getRawHeaders("Location")
assert location_headers
oidc_uri = location_headers[0]
oidc_uri_path, oidc_uri_query = oidc_uri.split("?", 1) oidc_uri_path, oidc_uri_query = oidc_uri.split("?", 1)
# it should redirect us to the auth page of the OIDC server # it should redirect us to the auth page of the OIDC server
self.assertEqual(oidc_uri_path, TEST_OIDC_AUTH_ENDPOINT) self.assertEqual(oidc_uri_path, TEST_OIDC_AUTH_ENDPOINT)
# ... and should have set a cookie including the redirect url # ... and should have set a cookie including the redirect url
cookies = dict( cookie_headers = channel.headers.getRawHeaders("Set-Cookie")
h.split(";")[0].split("=", maxsplit=1) assert cookie_headers
for h in channel.headers.getRawHeaders("Set-Cookie") cookies = {} # type: Dict[str, str]
) for h in cookie_headers:
key, value = h.split(";")[0].split("=", maxsplit=1)
cookies[key] = value
oidc_session_cookie = cookies["oidc_session"] oidc_session_cookie = cookies["oidc_session"]
macaroon = pymacaroons.Macaroon.deserialize(oidc_session_cookie) macaroon = pymacaroons.Macaroon.deserialize(oidc_session_cookie)
@ -590,9 +598,9 @@ class MultiSSOTestCase(unittest.HomeserverTestCase):
# that should serve a confirmation page # that should serve a confirmation page
self.assertEqual(channel.code, 200, channel.result) self.assertEqual(channel.code, 200, channel.result)
self.assertTrue( content_type_headers = channel.headers.getRawHeaders("Content-Type")
channel.headers.getRawHeaders("Content-Type")[-1].startswith("text/html") assert content_type_headers
) self.assertTrue(content_type_headers[-1].startswith("text/html"))
p = TestHtmlParser() p = TestHtmlParser()
p.feed(channel.text_body) p.feed(channel.text_body)
p.close() p.close()
@ -806,6 +814,7 @@ class CASTestCase(unittest.HomeserverTestCase):
self.assertEqual(channel.code, 302) self.assertEqual(channel.code, 302)
location_headers = channel.headers.getRawHeaders("Location") location_headers = channel.headers.getRawHeaders("Location")
assert location_headers
self.assertEqual(location_headers[0][: len(redirect_url)], redirect_url) self.assertEqual(location_headers[0][: len(redirect_url)], redirect_url)
@override_config({"sso": {"client_whitelist": ["https://legit-site.com/"]}}) @override_config({"sso": {"client_whitelist": ["https://legit-site.com/"]}})
@ -1248,7 +1257,9 @@ class UsernamePickerTestCase(HomeserverTestCase):
# that should redirect to the username picker # that should redirect to the username picker
self.assertEqual(channel.code, 302, channel.result) self.assertEqual(channel.code, 302, channel.result)
picker_url = channel.headers.getRawHeaders("Location")[0] location_headers = channel.headers.getRawHeaders("Location")
assert location_headers
picker_url = location_headers[0]
self.assertEqual(picker_url, "/_synapse/client/pick_username/account_details") self.assertEqual(picker_url, "/_synapse/client/pick_username/account_details")
# ... with a username_mapping_session cookie # ... with a username_mapping_session cookie
@ -1291,6 +1302,7 @@ class UsernamePickerTestCase(HomeserverTestCase):
) )
self.assertEqual(chan.code, 302, chan.result) self.assertEqual(chan.code, 302, chan.result)
location_headers = chan.headers.getRawHeaders("Location") location_headers = chan.headers.getRawHeaders("Location")
assert location_headers
# send a request to the completion page, which should 302 to the client redirectUrl # send a request to the completion page, which should 302 to the client redirectUrl
chan = self.make_request( chan = self.make_request(
@ -1300,6 +1312,7 @@ class UsernamePickerTestCase(HomeserverTestCase):
) )
self.assertEqual(chan.code, 302, chan.result) self.assertEqual(chan.code, 302, chan.result)
location_headers = chan.headers.getRawHeaders("Location") location_headers = chan.headers.getRawHeaders("Location")
assert location_headers
# ensure that the returned location matches the requested redirect URL # ensure that the returned location matches the requested redirect URL
path, query = location_headers[0].split("?", 1) path, query = location_headers[0].split("?", 1)
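The cookie-parsing rewrite in the test above replaces a dict comprehension with an explicit loop so mypy can type the result. Extracted as a standalone helper (an illustrative sketch, not a Synapse function), the logic looks like:

```python
from typing import Dict, List

def parse_set_cookie(cookie_headers: List[str]) -> Dict[str, str]:
    """Turn raw Set-Cookie header values into a name -> value mapping,
    dropping attributes after ';' (sketch of the test-helper logic)."""
    cookies = {}  # type: Dict[str, str]
    for h in cookie_headers:
        key, value = h.split(";")[0].split("=", maxsplit=1)
        cookies[key] = value
    return cookies
```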