Commit 54f2635

Merge branch 'potel-base' into antonpirker/potel/openai

antonpirker committed Dec 6, 2024
2 parents 841dd4e + fdb5cdc
Showing 35 changed files with 626 additions and 194 deletions.
25 changes: 25 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,30 @@
# Changelog

## 2.19.2

### Various fixes & improvements

- Deepcopy and ensure get_all function always terminates (#3861) by @cmanallen
- Cleanup chalice test environment (#3858) by @antonpirker

## 2.19.1

### Various fixes & improvements

- Fix errors when instrumenting Django cache (#3855) by @BYK
- Copy `scope.client` reference as well (#3857) by @sl0thentr0py
- Don't give up on Spotlight on 3 errors (#3856) by @BYK
- Add missing stack frames (#3673) by @antonpirker
- Fix wrong metadata type in async gRPC interceptor (#3205) by @fdellekart
- Rename launch darkly hook to match JS SDK (#3743) by @aliu39
- Script for checking if our instrumented libs are Python 3.13 compatible (#3425) by @antonpirker
- Improve Ray tests (#3846) by @antonpirker
- Test with Celery `5.5.0rc3` (#3842) by @sentrivana
- Fix asyncio testing setup (#3832) by @sl0thentr0py
- Bump `codecov/codecov-action` from `5.0.2` to `5.0.7` (#3821) by @dependabot
- Fix CI (#3834) by @sentrivana
- Use new ClickHouse GH action (#3826) by @antonpirker

## 2.19.0

### Various fixes & improvements
199 changes: 103 additions & 96 deletions MIGRATION_GUIDE.md
@@ -20,102 +20,109 @@ Looking to upgrade from Sentry SDK 2.x to 3.x? Here's a comprehensive list of what's changed.
- Redis integration: In Redis pipeline spans there is no `span["data"]["redis.commands"]` that contains a dict `{"count": 3, "first_ten": ["cmd1", "cmd2", ...]}` but instead `span["data"]["redis.commands.count"]` (containing `3`) and `span["data"]["redis.commands.first_ten"]` (containing `["cmd1", "cmd2", ...]`).
- clickhouse-driver integration: The query is now available under the `db.query.text` span attribute (only if `send_default_pii` is `True`).
- `sentry_sdk.init` now returns `None` instead of a context manager.
- The `sampling_context` argument of `traces_sampler` now additionally contains all span attributes known at span start.
- If you're using the Celery integration, the `sampling_context` argument of `traces_sampler` doesn't contain the `celery_job` dictionary anymore. Instead, the individual keys are now available as:

| Dictionary keys | Sampling context key |
| ---------------------- | -------------------- |
| `celery_job["args"]` | `celery.job.args` |
| `celery_job["kwargs"]` | `celery.job.kwargs` |
| `celery_job["task"]` | `celery.job.task` |

Note that all of these are serialized, i.e., not the original `args` and `kwargs` but rather OpenTelemetry-friendly span attributes.

- If you're using the AIOHTTP integration, the `sampling_context` argument of `traces_sampler` doesn't contain the `aiohttp_request` object anymore. Instead, some of the individual properties of the request are accessible, if available, as follows:

| Request property | Sampling context key(s) |
| ---------------- | ------------------------------- |
| `path` | `url.path` |
| `query_string` | `url.query` |
| `method` | `http.request.method` |
| `host` | `server.address`, `server.port` |
| `scheme` | `url.scheme` |
| full URL | `url.full` |

- If you're using the Tornado integration, the `sampling_context` argument of `traces_sampler` doesn't contain the `tornado_request` object anymore. Instead, some of the individual properties of the request are accessible, if available, as follows:

| Request property | Sampling context key(s) |
| ---------------- | --------------------------------------------------- |
| `path` | `url.path` |
| `query` | `url.query` |
| `protocol` | `url.scheme` |
| `method` | `http.request.method` |
| `host` | `server.address`, `server.port` |
| `version` | `network.protocol.name`, `network.protocol.version` |
| full URL | `url.full` |

- If you're using the generic WSGI integration, the `sampling_context` argument of `traces_sampler` doesn't contain the `wsgi_environ` object anymore. Instead, the individual properties of the environment are accessible, if available, as follows:

| Env property | Sampling context key(s) |
| ----------------- | ------------------------------------------------- |
| `PATH_INFO` | `url.path` |
| `QUERY_STRING` | `url.query` |
| `REQUEST_METHOD` | `http.request.method` |
| `SERVER_NAME` | `server.address` |
| `SERVER_PORT` | `server.port` |
| `SERVER_PROTOCOL` | `server.protocol.name`, `server.protocol.version` |
| `wsgi.url_scheme` | `url.scheme` |
| full URL | `url.full` |

- If you're using the generic ASGI integration, the `sampling_context` argument of `traces_sampler` doesn't contain the `asgi_scope` object anymore. Instead, the individual properties of the scope, if available, are accessible as follows:

| Scope property | Sampling context key(s) |
| -------------- | ------------------------------- |
| `type` | `network.protocol.name` |
| `scheme` | `url.scheme` |
| `path` | `url.path` |
| `query` | `url.query` |
| `http_version` | `network.protocol.version` |
| `method` | `http.request.method` |
| `server` | `server.address`, `server.port` |
| `client` | `client.address`, `client.port` |
| full URL | `url.full` |

- If you're using the RQ integration, the `sampling_context` argument of `traces_sampler` doesn't contain the `rq_job` object anymore. Instead, the individual properties of the job and the queue, if available, are accessible as follows:

| RQ property | Sampling context key(s) |
| --------------- | ---------------------------- |
| `rq_job.args` | `rq.job.args` |
| `rq_job.kwargs` | `rq.job.kwargs` |
| `rq_job.func` | `rq.job.func` |
| `queue.name` | `messaging.destination.name` |
| `rq_job.id` | `messaging.message.id` |

Note that `rq.job.args`, `rq.job.kwargs`, and `rq.job.func` are serialized and not the actual objects on the job.

- If you're using the AWS Lambda integration, the `sampling_context` argument of `traces_sampler` doesn't contain the `aws_event` and `aws_context` objects anymore. Instead, the following, if available, is accessible:

| AWS property | Sampling context key(s) |
| ------------------------------------------- | ----------------------- |
| `aws_event["httpMethod"]` | `http.request.method` |
| `aws_event["queryStringParameters"]` | `url.query` |
| `aws_event["path"]` | `url.path` |
| full URL | `url.full` |
| `aws_event["headers"]["X-Forwarded-Proto"]` | `network.protocol.name` |
| `aws_event["headers"]["Host"]` | `server.address` |
| `aws_context["function_name"]` | `faas.name` |

- If you're using the GCP integration, the `sampling_context` argument of `traces_sampler` doesn't contain the `gcp_env` and `gcp_event` keys anymore. Instead, the following, if available, is accessible:

| Old sampling context key | New sampling context key |
| --------------------------------- | -------------------------- |
| `gcp_env["function_name"]` | `faas.name` |
| `gcp_env["function_region"]` | `faas.region` |
| `gcp_env["function_project"]` | `gcp.function.project` |
| `gcp_env["function_identity"]` | `gcp.function.identity` |
| `gcp_env["function_entry_point"]` | `gcp.function.entry_point` |
| `gcp_event.method` | `http.request.method` |
| `gcp_event.query_string` | `url.query` |
- The `sampling_context` argument of `traces_sampler` and `profiles_sampler` now additionally contains all span attributes known at span start.
- The integration-specific content of the `sampling_context` argument of `traces_sampler` and `profiles_sampler` has changed as follows (see the example after these tables):
- The Celery integration doesn't add the `celery_job` dictionary anymore. Instead, the individual keys are now available as:

| Dictionary keys | Sampling context key | Example |
| ---------------------- | --------------------------- | ------------------------------ |
| `celery_job["args"]` | `celery.job.args.{index}` | `celery.job.args.0` |
| `celery_job["kwargs"]` | `celery.job.kwargs.{kwarg}` | `celery.job.kwargs.kwarg_name` |
| `celery_job["task"]` | `celery.job.task` | |

Note that all of these are serialized, i.e., not the original `args` and `kwargs` but rather OpenTelemetry-friendly span attributes.

- The AIOHTTP integration doesn't add the `aiohttp_request` object anymore. Instead, some of the individual properties of the request are accessible, if available, as follows:

| Request property | Sampling context key(s) |
| ----------------- | ------------------------------- |
| `path` | `url.path` |
| `query_string` | `url.query` |
| `method` | `http.request.method` |
| `host` | `server.address`, `server.port` |
| `scheme` | `url.scheme` |
| full URL | `url.full` |
| `request.headers` | `http.request.header.{header}` |

- The Tornado integration doesn't add the `tornado_request` object anymore. Instead, some of the individual properties of the request are accessible, if available, as follows:

| Request property | Sampling context key(s) |
| ----------------- | --------------------------------------------------- |
| `path` | `url.path` |
| `query` | `url.query` |
| `protocol` | `url.scheme` |
| `method` | `http.request.method` |
| `host` | `server.address`, `server.port` |
| `version` | `network.protocol.name`, `network.protocol.version` |
| full URL | `url.full` |
| `request.headers` | `http.request.header.{header}` |

- The WSGI integration doesn't add the `wsgi_environ` object anymore. Instead, the individual properties of the environment are accessible, if available, as follows:

| Env property | Sampling context key(s) |
| ----------------- | ------------------------------------------------- |
| `PATH_INFO` | `url.path` |
| `QUERY_STRING` | `url.query` |
| `REQUEST_METHOD` | `http.request.method` |
| `SERVER_NAME` | `server.address` |
| `SERVER_PORT` | `server.port` |
| `SERVER_PROTOCOL` | `server.protocol.name`, `server.protocol.version` |
| `wsgi.url_scheme` | `url.scheme` |
| full URL | `url.full` |
| `HTTP_*` | `http.request.header.{header}` |

- The ASGI integration doesn't add the `asgi_scope` object anymore. Instead, the individual properties of the scope, if available, are accessible as follows:

| Scope property | Sampling context key(s) |
| -------------- | ------------------------------- |
| `type` | `network.protocol.name` |
| `scheme` | `url.scheme` |
| `path` | `url.path` |
| `query` | `url.query` |
| `http_version` | `network.protocol.version` |
| `method` | `http.request.method` |
| `server` | `server.address`, `server.port` |
| `client` | `client.address`, `client.port` |
| full URL | `url.full` |
| `headers` | `http.request.header.{header}` |

- The RQ integration doesn't add the `rq_job` object anymore. Instead, the individual properties of the job and the queue, if available, are accessible as follows:

| RQ property | Sampling context key | Example |
| --------------- | ---------------------------- | ---------------------- |
| `rq_job.args` | `rq.job.args.{index}` | `rq.job.args.0` |
| `rq_job.kwargs` | `rq.job.kwargs.{kwarg}` | `rq.job.kwargs.my_kwarg` |
| `rq_job.func` | `rq.job.func` | |
| `queue.name` | `messaging.destination.name` | |
| `rq_job.id` | `messaging.message.id` | |

Note that `rq.job.args`, `rq.job.kwargs`, and `rq.job.func` are serialized and not the actual objects on the job.

- The AWS Lambda integration doesn't add the `aws_event` and `aws_context` objects anymore. Instead, the following, if available, is accessible:

| AWS property | Sampling context key(s) |
| ------------------------------------------- | ------------------------------- |
| `aws_event["httpMethod"]` | `http.request.method` |
| `aws_event["queryStringParameters"]` | `url.query` |
| `aws_event["path"]` | `url.path` |
| full URL | `url.full` |
| `aws_event["headers"]["X-Forwarded-Proto"]` | `network.protocol.name` |
| `aws_event["headers"]["Host"]` | `server.address` |
| `aws_context["function_name"]` | `faas.name` |
| `aws_event["headers"]` | `http.request.headers.{header}` |

- The GCP integration doesn't add the `gcp_env` and `gcp_event` keys anymore. Instead, the following, if available, is accessible:

| Old sampling context key | New sampling context key |
| --------------------------------- | ------------------------------ |
| `gcp_env["function_name"]` | `faas.name` |
| `gcp_env["function_region"]` | `faas.region` |
| `gcp_env["function_project"]` | `gcp.function.project` |
| `gcp_env["function_identity"]` | `gcp.function.identity` |
| `gcp_env["function_entry_point"]` | `gcp.function.entry_point` |
| `gcp_event.method` | `http.request.method` |
| `gcp_event.query_string` | `url.query` |
| `gcp_event.headers` | `http.request.header.{header}` |
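
  To illustrate, here is a minimal `traces_sampler` sketch working against the new flattened keys. It assumes dict-style access to `sampling_context` (as in 2.x) and an HTTP-serving integration that populates the `url.*` and `http.*` attributes from the tables above; the routes, rates, and DSN are illustrative:

  ```python
  import sentry_sdk


  def traces_sampler(sampling_context):
      # Integration-specific objects (wsgi_environ, asgi_scope, ...) are gone;
      # read the flattened, OTel-style attributes instead.
      path = sampling_context.get("url.path") or ""
      method = sampling_context.get("http.request.method")

      if path.startswith("/health"):
          return 0.0  # never trace health checks
      if method == "POST":
          return 0.5  # sample half of all writes
      return 0.1  # default sample rate


  # Note: in 3.x, `sentry_sdk.init` returns `None`, so it can no longer be
  # used as a context manager.
  sentry_sdk.init(
      dsn="https://examplePublicKey@o0.ingest.sentry.io/0",  # placeholder DSN
      traces_sampler=traces_sampler,
  )
  ```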


### Removed
2 changes: 1 addition & 1 deletion docs/conf.py
@@ -31,7 +31,7 @@
copyright = "2019-{}, Sentry Team and Contributors".format(datetime.now().year)
author = "Sentry Team and Contributors"

release = "2.19.0"
release = "2.19.2"
version = ".".join(release.split(".")[:2]) # The short X.Y version.


124 changes: 124 additions & 0 deletions scripts/ready_yet/main.py
@@ -0,0 +1,124 @@
import re
import sys
import time

from collections import defaultdict
from pathlib import Path

import requests

from tox.config.cli.parse import get_options
from tox.config.sets import CoreConfigSet
from tox.config.source.tox_ini import ToxIni
from tox.session.state import State

PYTHON_VERSION = "3.13"

# Captures the integration name from tox env names, e.g. "py3.13-flask-v3" -> "flask".
MATCH_LIB_SENTRY_REGEX = r"py[\d\.]*-(.*)-.*"

# The project URL returns metadata for the latest release of a project;
# the version URL returns metadata for one specific release.
PYPI_PROJECT_URL = "https://pypi.python.org/pypi/{project}/json"
PYPI_VERSION_URL = "https://pypi.python.org/pypi/{project}/{version}/json"


def get_tox_envs(tox_ini_path: Path) -> list:
    """Parse the given tox.ini and return its configured env list."""
    tox_ini = ToxIni(tox_ini_path)
    conf = State(get_options(), []).conf
    tox_section = next(tox_ini.sections())
    core_config_set = CoreConfigSet(
        conf, tox_section, tox_ini_path.parent, tox_ini_path
    )
    core_config_set.loaders.extend(
        tox_ini.get_loaders(
            tox_section,
            base=[],
            override_map=defaultdict(list, {}),
            conf=core_config_set,
        )
    )
    return core_config_set.load("env_list")


def get_libs(tox_ini: Path, regex: str) -> list:
    """Return a sorted list of library names parsed from the tox env names."""
    libs = set()
    for env in get_tox_envs(tox_ini):
        match = re.match(regex, env)
        if match:
            libs.add(match.group(1))

    return sorted(libs)


def main():
    """
    Check if the libraries in our tox.ini are ready for the Python version defined in `PYTHON_VERSION`.
    """
    print(f"Checking libs from tox.ini for Python {PYTHON_VERSION} compatibility:")

    ready = set()
    not_ready = set()
    not_found = set()

    tox_ini = Path(__file__).parent.parent.parent.joinpath("tox.ini")

    libs = get_libs(tox_ini, MATCH_LIB_SENTRY_REGEX)

    for lib in libs:
        print(".", end="")
        sys.stdout.flush()

        # Get the latest version of the lib
        url = PYPI_PROJECT_URL.format(project=lib)
        pypi_data = requests.get(url)

        if pypi_data.status_code != 200:
            not_found.add(lib)
            continue

        latest_version = pypi_data.json()["info"]["version"]

        # Get the supported Python versions of the latest release of the lib
        url = PYPI_VERSION_URL.format(project=lib, version=latest_version)
        pypi_data = requests.get(url)

        if pypi_data.status_code != 200:
            # Release metadata could not be fetched; skip the lib
            continue

        classifiers = pypi_data.json()["info"]["classifiers"]

        if f"Programming Language :: Python :: {PYTHON_VERSION}" in classifiers:
            ready.add(lib)
        else:
            not_ready.add(lib)

        # Cut PyPI some slack
        time.sleep(0.1)

    # Print report
    print("\n")
    print(f"\nReady for Python {PYTHON_VERSION}:")
    if len(ready) == 0:
        print("- None")

    for x in sorted(ready):
        print(f"- {x}")

    print(f"\nNOT ready for Python {PYTHON_VERSION}:")
    if len(not_ready) == 0:
        print("- None")

    for x in sorted(not_ready):
        print(f"- {x}")

    print("\nNot found on PyPI:")
    if len(not_found) == 0:
        print("- None")

    for x in sorted(not_found):
        print(f"- {x}")


if __name__ == "__main__":
    main()
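
The helpers in `main.py` can also be driven from a Python shell instead of the `run.sh` wrapper below; a minimal sketch, assuming the working directory is `scripts/ready_yet` inside an SDK checkout:

```python
from pathlib import Path

from main import MATCH_LIB_SENTRY_REGEX, get_libs

# Hypothetical path: the repository's tox.ini, two levels above scripts/ready_yet.
tox_ini = Path("../../tox.ini")

for lib in get_libs(tox_ini, MATCH_LIB_SENTRY_REGEX):
    print(lib)
```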
2 changes: 2 additions & 0 deletions scripts/ready_yet/requirements.txt
@@ -0,0 +1,2 @@
requests
tox
16 changes: 16 additions & 0 deletions scripts/ready_yet/run.sh
@@ -0,0 +1,16 @@
#!/usr/bin/env bash

# Print each command and exit on first error
set -xe

# Clear the terminal
reset

# Create and activate a virtual environment
python -m venv .venv
source .venv/bin/activate

# Install (or update) requirements
python -m pip install -r requirements.txt

# Run the script (from this directory, so the relative paths resolve)
python main.py