@mdboom
Created June 23, 2025 19:15
benchmark                    unknown
aiohttp                      [unknown run]
chameleon                    [unknown run]
flaskblogging                [unknown run]
gevent_hub                   [unknown build]
gunicorn                     [unknown run]
kinto                        [unknown build]
mypy2                        [unknown build]
pytorch_alexnet_inference    [unknown build]
sqlalchemy_declarative       [unknown build]
sqlalchemy_imperative        [unknown build]
tornado_http                 [unknown run]

aiohttp on unknown unknown

Log for aiohttp on unknown unknown
# /home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/cpython3.15-5e1bac0ad2fb-compat-64aaec271e6b/bin/python -u /home/benchmarking/actions-runner/_work/benchmarking/benchmarking/pyston-benchmarks/benchmarks/bm_aiohttp/run_benchmark.py --inherit-environ PYTHON_JIT,PYPERF_PERF_RECORD_EXTRA_OPTS,PYPERFORMANCE_RUNID --output /tmp/tmpz2omuqcz
Traceback (most recent call last):
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/pyston-benchmarks/benchmarks/bm_aiohttp/run_benchmark.py", line 69, in <module>
    with context:
         ^^^^^^^
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/cpython/Lib/contextlib.py", line 141, in __enter__
    return next(self.gen)
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/pyston-benchmarks/benchmarks/bm_aiohttp/netutils.py", line 36, in serving
    waitUntilUp(addr)
    ~~~~~~~~~~~^^^^^^
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/pyston-benchmarks/benchmarks/bm_aiohttp/netutils.py", line 66, in waitUntilUp
    raise Exception('Timeout reached when trying to connect')
Exception: Timeout reached when trying to connect
Traceback (most recent call last):
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/lib/python3.11/site-packages/pyperformance/run.py", line 170, in run_benchmarks
    result = bench.run(
             ^^^^^^^^^^
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/lib/python3.11/site-packages/pyperformance/_benchmark.py", line 189, in run
    bench = _run_perf_script(
            ^^^^^^^^^^^^^^^^^
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/lib/python3.11/site-packages/pyperformance/_benchmark.py", line 240, in _run_perf_script
    raise RuntimeError("Benchmark died")
RuntimeError: Benchmark died
Command failed with exit code 1

chameleon on unknown unknown

Log for chameleon on unknown unknown
    ~~~~^^
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/lib/python3.11/site-packages/pyperformance/data-files/benchmarks/bm_chameleon/run_benchmark.py", line 25, in main
    tmpl = PageTemplate(BIGTABLE_ZPT)
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/cpython3.15-5e1bac0ad2fb-compat-64aaec271e6b/lib/python3.15/site-packages/chameleon/zpt/template.py", line 205, in __init__
    super(PageTemplate, self).__init__(body, **config)
    ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/cpython3.15-5e1bac0ad2fb-compat-64aaec271e6b/lib/python3.15/site-packages/chameleon/template.py", line 137, in __init__
    self.write(body)
    ~~~~~~~~~~^^^^^^
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/cpython3.15-5e1bac0ad2fb-compat-64aaec271e6b/lib/python3.15/site-packages/chameleon/template.py", line 235, in write
    self.cook(body)
    ~~~~~~~~~^^^^^^
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/cpython3.15-5e1bac0ad2fb-compat-64aaec271e6b/lib/python3.15/site-packages/chameleon/template.py", line 167, in cook
    program = self._cook(body, digest, names)
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/cpython3.15-5e1bac0ad2fb-compat-64aaec271e6b/lib/python3.15/site-packages/chameleon/template.py", line 245, in _cook
    source = self._compile(body, builtins)
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/cpython3.15-5e1bac0ad2fb-compat-64aaec271e6b/lib/python3.15/site-packages/chameleon/template.py", line 276, in _compile
    program = self.parse(body)
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/cpython3.15-5e1bac0ad2fb-compat-64aaec271e6b/lib/python3.15/site-packages/chameleon/zpt/template.py", line 227, in parse
    return MacroProgram(
        body, self.mode, self.filename,
    ...<9 lines>...
        tokenizer=self.tokenizer
    )
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/cpython3.15-5e1bac0ad2fb-compat-64aaec271e6b/lib/python3.15/site-packages/chameleon/zpt/program.py", line 174, in __init__
    super(MacroProgram, self).__init__(*args, **kwargs)
    ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/cpython3.15-5e1bac0ad2fb-compat-64aaec271e6b/lib/python3.15/site-packages/chameleon/program.py", line 35, in __init__
    node = self.visit(kind, args)
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/cpython3.15-5e1bac0ad2fb-compat-64aaec271e6b/lib/python3.15/site-packages/chameleon/program.py", line 41, in visit
    return visitor(*args)
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/cpython3.15-5e1bac0ad2fb-compat-64aaec271e6b/lib/python3.15/site-packages/chameleon/zpt/program.py", line 320, in visit_element
    ATTRIBUTES = self._create_attributes_nodes(
        prepared, I18N_ATTRIBUTES, STATIC_ATTRIBUTES,
    )
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/cpython3.15-5e1bac0ad2fb-compat-64aaec271e6b/lib/python3.15/site-packages/chameleon/zpt/program.py", line 794, in _create_attributes_nodes
    default = ast.Str(s=text) if text is not None else None
              ^^^^^^^
AttributeError: module 'ast' has no attribute 'Str'
Traceback (most recent call last):
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/lib/python3.11/site-packages/pyperformance/run.py", line 170, in run_benchmarks
    result = bench.run(
             ^^^^^^^^^^
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/lib/python3.11/site-packages/pyperformance/_benchmark.py", line 189, in run
    bench = _run_perf_script(
            ^^^^^^^^^^^^^^^^^
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/lib/python3.11/site-packages/pyperformance/_benchmark.py", line 240, in _run_perf_script
    raise RuntimeError("Benchmark died")
RuntimeError: Benchmark died
Command failed with exit code 1
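The chameleon failure is an API removal rather than a benchmark bug: `ast.Str` was deprecated in Python 3.8 and removed in Python 3.12, so chameleon's `ast.Str(s=text)` call raises `AttributeError` on newer interpreters. A minimal sketch of the migration (the helper name is illustrative, not chameleon's actual code):

```python
import ast

def make_str_node(text):
    # Pre-3.12 code wrote ast.Str(s=text); ast.Constant(value=text) is the
    # replacement (ast.Str was deprecated in 3.8 and removed in 3.12).
    return ast.Constant(value=text)

node = make_str_node("hello")
```

Newer chameleon releases already use `ast.Constant`, so upgrading the pinned chameleon version would likely resolve this without code changes.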

flaskblogging on unknown unknown

Log for flaskblogging on unknown unknown
# /home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/cpython3.15-5e1bac0ad2fb-compat-64aaec271e6b/bin/python -u /home/benchmarking/actions-runner/_work/benchmarking/benchmarking/pyston-benchmarks/benchmarks/bm_flaskblogging/run_benchmark.py --inherit-environ PYTHON_JIT,PYPERF_PERF_RECORD_EXTRA_OPTS,PYPERFORMANCE_RUNID --output /tmp/tmpjlxwe2jj
Traceback (most recent call last):
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/pyston-benchmarks/benchmarks/bm_flaskblogging/run_benchmark.py", line 76, in <module>
    with context:
         ^^^^^^^
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/cpython/Lib/contextlib.py", line 141, in __enter__
    return next(self.gen)
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/pyston-benchmarks/benchmarks/bm_flaskblogging/netutils.py", line 36, in serving
    waitUntilUp(addr)
    ~~~~~~~~~~~^^^^^^
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/pyston-benchmarks/benchmarks/bm_flaskblogging/netutils.py", line 66, in waitUntilUp
    raise Exception('Timeout reached when trying to connect')
Exception: Timeout reached when trying to connect
Traceback (most recent call last):
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/lib/python3.11/site-packages/pyperformance/run.py", line 170, in run_benchmarks
    result = bench.run(
             ^^^^^^^^^^
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/lib/python3.11/site-packages/pyperformance/_benchmark.py", line 189, in run
    bench = _run_perf_script(
            ^^^^^^^^^^^^^^^^^
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/lib/python3.11/site-packages/pyperformance/_benchmark.py", line 240, in _run_perf_script
    raise RuntimeError("Benchmark died")
RuntimeError: Benchmark died
Command failed with exit code 1

gevent_hub on unknown unknown

Log for gevent_hub on unknown unknown
      [1/1] Cythonizing src/gevent/libev/corecext.pyx
      Traceback (most recent call last):
        File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/cpython3.15-5e1bac0ad2fb-compat-64aaec271e6b-bm-gevent_hub/lib/python3.15/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 389, in <module>
          main()
          ~~~~^^
        File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/cpython3.15-5e1bac0ad2fb-compat-64aaec271e6b-bm-gevent_hub/lib/python3.15/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 373, in main
          json_out["return_val"] = hook(**hook_input["kwargs"])
                                   ~~~~^^^^^^^^^^^^^^^^^^^^^^^^
        File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/cpython3.15-5e1bac0ad2fb-compat-64aaec271e6b-bm-gevent_hub/lib/python3.15/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 143, in get_requires_for_build_wheel
          return hook(config_settings)
        File "/tmp/pip-build-env-i5x2pe20/overlay/lib/python3.15/site-packages/setuptools/build_meta.py", line 331, in get_requires_for_build_wheel
          return self._get_build_requires(config_settings, requirements=[])
                 ~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/tmp/pip-build-env-i5x2pe20/overlay/lib/python3.15/site-packages/setuptools/build_meta.py", line 301, in _get_build_requires
          self.run_setup()
          ~~~~~~~~~~~~~~^^
        File "/tmp/pip-build-env-i5x2pe20/overlay/lib/python3.15/site-packages/setuptools/build_meta.py", line 317, in run_setup
          exec(code, locals())
          ~~~~^^^^^^^^^^^^^^^^
        File "<string>", line 54, in <module>
        File "/tmp/pip-install-mbjzidot/gevent_692e9c7c8d914a4bbc78f6821a85e2d9/_setuputils.py", line 249, in cythonize1
          new_ext = cythonize(
                    ~~~~~~~~~^
              [ext],
              ^^^^^^
          ...<46 lines>...
              # cache="build/cycache",
              ^^^^^^^^^^^^^^^^^^^^^^^^
          )[0]
          ^
        File "/tmp/pip-build-env-i5x2pe20/overlay/lib/python3.15/site-packages/Cython/Build/Dependencies.py", line 1154, in cythonize
          cythonize_one(*args)
          ~~~~~~~~~~~~~^^^^^^^
        File "/tmp/pip-build-env-i5x2pe20/overlay/lib/python3.15/site-packages/Cython/Build/Dependencies.py", line 1298, in cythonize_one
          raise CompileError(None, pyx_file)
      Cython.Compiler.Errors.CompileError: src/gevent/libev/corecext.pyx
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error

× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> See above for output.

note: This error originates from a subprocess, and is likely not a problem with pip.
Command failed with exit code 1


==================================================

gunicorn on unknown unknown

Log for gunicorn on unknown unknown
# /home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/cpython3.15-5e1bac0ad2fb-compat-64aaec271e6b/bin/python -u /home/benchmarking/actions-runner/_work/benchmarking/benchmarking/pyston-benchmarks/benchmarks/bm_gunicorn/run_benchmark.py --inherit-environ PYTHON_JIT,PYPERF_PERF_RECORD_EXTRA_OPTS,PYPERFORMANCE_RUNID --output /tmp/tmpu3guvei9
Traceback (most recent call last):
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/pyston-benchmarks/benchmarks/bm_gunicorn/run_benchmark.py", line 84, in <module>
    with context:
         ^^^^^^^
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/cpython/Lib/contextlib.py", line 141, in __enter__
    return next(self.gen)
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/pyston-benchmarks/benchmarks/bm_gunicorn/netutils.py", line 36, in serving
    waitUntilUp(addr)
    ~~~~~~~~~~~^^^^^^
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/pyston-benchmarks/benchmarks/bm_gunicorn/netutils.py", line 66, in waitUntilUp
    raise Exception('Timeout reached when trying to connect')
Exception: Timeout reached when trying to connect
Traceback (most recent call last):
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/lib/python3.11/site-packages/pyperformance/run.py", line 170, in run_benchmarks
    result = bench.run(
             ^^^^^^^^^^
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/lib/python3.11/site-packages/pyperformance/_benchmark.py", line 189, in run
    bench = _run_perf_script(
            ^^^^^^^^^^^^^^^^^
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/lib/python3.11/site-packages/pyperformance/_benchmark.py", line 240, in _run_perf_script
    raise RuntimeError("Benchmark died")
RuntimeError: Benchmark died
Command failed with exit code 1
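The three server benchmarks (aiohttp, flaskblogging, gunicorn) all die at the same point: `netutils.waitUntilUp` polls the server socket and gives up after a timeout. A minimal sketch of such a poll-until-connectable helper (names, signature, and timeout values are illustrative, not the benchmark's actual code):

```python
import socket
import time

def wait_until_up(addr, timeout=10.0, interval=0.1):
    """Poll a (host, port) address until a TCP connect succeeds or timeout elapses."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection(addr, timeout=interval):
                return  # server is accepting connections
        except OSError:
            time.sleep(interval)  # not up yet; retry until the deadline
    raise Exception('Timeout reached when trying to connect')
```

If the server process crashes at startup (for example, against a new CPython build), a loop like this can only time out, which is why each of these runs ends in `RuntimeError: Benchmark died` rather than a more specific error.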

kinto on unknown unknown

Log for kinto on unknown unknown
         38 | Py_DEPRECATED(3.11) PyAPI_FUNC(void) Py_SetPythonHome(const wchar_t *);
            |                                      ^~~~~~~~~~~~~~~~
      plugins/python/python_plugin.c:278:2: warning: ‘Py_SetProgramName’ is deprecated [-Wdeprecated-declarations]
        278 |  Py_SetProgramName(pname);
            |  ^~~~~~~~~~~~~~~~~
      In file included from /home/benchmarking/actions-runner/_work/benchmarking/benchmarking/cpython/Include/Python.h:125,
                       from plugins/python/uwsgi_python.h:4,
                       from plugins/python/python_plugin.c:1:
      /home/benchmarking/actions-runner/_work/benchmarking/benchmarking/cpython/Include/pylifecycle.h:37:38: note: declared here
         37 | Py_DEPRECATED(3.11) PyAPI_FUNC(void) Py_SetProgramName(const wchar_t *);
            |                                      ^~~~~~~~~~~~~~~~~
      plugins/python/python_plugin.c:287:2: warning: ‘Py_OptimizeFlag’ is deprecated [-Wdeprecated-declarations]
        287 |  Py_OptimizeFlag = up.optimize;
            |  ^~~~~~~~~~~~~~~
      In file included from /home/benchmarking/actions-runner/_work/benchmarking/benchmarking/cpython/Include/Python.h:78,
                       from plugins/python/uwsgi_python.h:4,
                       from plugins/python/python_plugin.c:1:
      /home/benchmarking/actions-runner/_work/benchmarking/benchmarking/cpython/Include/cpython/pydebug.h:13:37: note: declared here
         13 | Py_DEPRECATED(3.12) PyAPI_DATA(int) Py_OptimizeFlag;
            |                                     ^~~~~~~~~~~~~~~
      plugins/python/python_plugin.c: In function ‘uwsgi_python_suspend’:
      plugins/python/python_plugin.c:1612:66: error: ‘PyThreadState’ {aka ‘struct _ts’} has no member named ‘c_recursion_remaining’; did you mean ‘py_recursion_remaining’?
       1612 |   up.current_c_recursion_remaining[wsgi_req->async_id] = tstate->c_recursion_remaining;
            |                                                                  ^~~~~~~~~~~~~~~~~~~~~
            |                                                                  py_recursion_remaining
      plugins/python/python_plugin.c:1629:51: error: ‘PyThreadState’ {aka ‘struct _ts’} has no member named ‘c_recursion_remaining’; did you mean ‘py_recursion_remaining’?
       1629 |   up.current_main_c_recursion_remaining = tstate->c_recursion_remaining;
            |                                                   ^~~~~~~~~~~~~~~~~~~~~
            |                                                   py_recursion_remaining
      plugins/python/python_plugin.c: In function ‘uwsgi_python_resume’:
      plugins/python/python_plugin.c:1873:11: error: ‘PyThreadState’ {aka ‘struct _ts’} has no member named ‘c_recursion_remaining’; did you mean ‘py_recursion_remaining’?
       1873 |   tstate->c_recursion_remaining = up.current_c_recursion_remaining[wsgi_req->async_id];
            |           ^~~~~~~~~~~~~~~~~~~~~
            |           py_recursion_remaining
      plugins/python/python_plugin.c:1890:11: error: ‘PyThreadState’ {aka ‘struct _ts’} has no member named ‘c_recursion_remaining’; did you mean ‘py_recursion_remaining’?
       1890 |   tstate->c_recursion_remaining = up.current_main_c_recursion_remaining;
            |           ^~~~~~~~~~~~~~~~~~~~~
            |           py_recursion_remaining
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for uWSGI
  Running setup.py clean for uWSGI
Successfully built kinto
Failed to build uWSGI
ERROR: Failed to build installable wheels for some pyproject.toml based projects (uWSGI)
Command failed with exit code 1


==================================================

mypy2 on unknown unknown

Log for mypy2 on unknown unknown
Looking in links: /tmp/tmp7ss7fm4_
Processing /tmp/tmp7ss7fm4_/pip-25.1.1-py3-none-any.whl
Installing collected packages: pip
  changing mode of /home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/cpython3.15-5e1bac0ad2fb-compat-64aaec271e6b-bm-mypy2/bin/pip3 to 755
  changing mode of /home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/cpython3.15-5e1bac0ad2fb-compat-64aaec271e6b-bm-mypy2/bin/pip3.15 to 755
Successfully installed pip-25.1.1
# /home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/cpython3.15-5e1bac0ad2fb-compat-64aaec271e6b-bm-mypy2/bin/python -m pip install -U 'setuptools>=18.5' wheel
Collecting setuptools>=18.5
  Using cached setuptools-80.9.0-py3-none-any.whl.metadata (6.6 kB)
Collecting wheel
  Using cached wheel-0.45.1-py3-none-any.whl.metadata (2.3 kB)
Using cached setuptools-80.9.0-py3-none-any.whl (1.2 MB)
Using cached wheel-0.45.1-py3-none-any.whl (72 kB)
Installing collected packages: wheel, setuptools

Successfully installed setuptools-80.9.0 wheel-0.45.1
# /home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/cpython3.15-5e1bac0ad2fb-compat-64aaec271e6b-bm-mypy2/bin/python -m pip --version
pip 25.1.1 from /home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/cpython3.15-5e1bac0ad2fb-compat-64aaec271e6b-bm-mypy2/lib/python3.15/site-packages/pip (python 3.15)
Installing requirements into the virtual environment /home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/cpython3.15-5e1bac0ad2fb-compat-64aaec271e6b-bm-mypy2
# /home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/cpython3.15-5e1bac0ad2fb-compat-64aaec271e6b-bm-mypy2/bin/python -m pip install --no-binary=mypy mypy==1.13.0 mypy-extensions==1.0.0 typing-extensions==4.2.0 pyperf==2.9.0
Collecting mypy==1.13.0
  Using cached mypy-1.13.0.tar.gz (3.2 MB)
  Installing build dependencies: started
  Installing build dependencies: finished with status 'done'
  Getting requirements to build wheel: started
  Getting requirements to build wheel: finished with status 'done'
  Preparing metadata (pyproject.toml): started
  Preparing metadata (pyproject.toml): finished with status 'done'
Collecting mypy-extensions==1.0.0
  Using cached mypy_extensions-1.0.0-py3-none-any.whl.metadata (1.1 kB)
Collecting typing-extensions==4.2.0
  Using cached typing_extensions-4.2.0-py3-none-any.whl.metadata (6.2 kB)
Collecting pyperf==2.9.0
  Using cached pyperf-2.9.0-py3-none-any.whl.metadata (5.9 kB)
INFO: pip is looking at multiple versions of mypy to determine which version is compatible with other requirements. This could take a while.
ERROR: Cannot install mypy==1.13.0 and typing-extensions==4.2.0 because these package versions have conflicting dependencies.

The conflict is caused by:
    The user requested typing-extensions==4.2.0
    mypy 1.13.0 depends on typing_extensions>=4.6.0

To fix this you could try to:
1. loosen the range of package versions you've specified
2. remove package versions to allow pip to attempt to solve the dependency conflict

ERROR: ResolutionImpossible: for help visit https://pip.pypa.io/en/latest/topics/dependency-resolution/#dealing-with-dependency-conflicts
Command failed with exit code 1
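The mypy2 failure is a plain resolver conflict: the benchmark pins typing-extensions==4.2.0 while mypy 1.13.0 requires typing_extensions>=4.6.0, so no solution exists. A simplified version comparison shows why the resolver must fail (illustrative only; pip uses a full PEP 440 implementation, not plain tuple comparison):

```python
def version_tuple(v):
    # Simplified parsing: good enough for plain X.Y.Z release versions.
    return tuple(int(part) for part in v.split("."))

pinned = version_tuple("4.2.0")   # the benchmark's typing-extensions pin
floor = version_tuple("4.6.0")    # mypy 1.13.0's declared minimum
conflict = pinned < floor         # True: the pin can never satisfy the floor
```

As pip's hint suggests, loosening or dropping the typing-extensions pin in the benchmark's requirements would let the resolver pick a compatible version.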


==================================================

pytorch_alexnet_inference on unknown unknown

Log for pytorch_alexnet_inference on unknown unknown
  error: subprocess-exited-with-error
  
  × Preparing metadata (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> [31 lines of output]
      Running from numpy source directory.
      <string>:460: UserWarning: Unrecognized setuptools command, proceeding with generating Cython sources and expanding templates
      Traceback (most recent call last):
        File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/cpython3.15-5e1bac0ad2fb-compat-64aaec271e6b-bm-pytorch_alexnet_inference/lib/python3.15/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 389, in <module>
          main()
          ~~~~^^
        File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/cpython3.15-5e1bac0ad2fb-compat-64aaec271e6b-bm-pytorch_alexnet_inference/lib/python3.15/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 373, in main
          json_out["return_val"] = hook(**hook_input["kwargs"])
                                   ~~~~^^^^^^^^^^^^^^^^^^^^^^^^
        File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/cpython3.15-5e1bac0ad2fb-compat-64aaec271e6b-bm-pytorch_alexnet_inference/lib/python3.15/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 175, in prepare_metadata_for_build_wheel
          return hook(metadata_directory, config_settings)
        File "/tmp/pip-build-env-et7u0e9s/overlay/lib/python3.15/site-packages/setuptools/build_meta.py", line 374, in prepare_metadata_for_build_wheel
          self.run_setup()
          ~~~~~~~~~~~~~~^^
        File "/tmp/pip-build-env-et7u0e9s/overlay/lib/python3.15/site-packages/setuptools/build_meta.py", line 512, in run_setup
          super().run_setup(setup_script=setup_script)
          ~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/tmp/pip-build-env-et7u0e9s/overlay/lib/python3.15/site-packages/setuptools/build_meta.py", line 317, in run_setup
          exec(code, locals())
          ~~~~^^^^^^^^^^^^^^^^
        File "<string>", line 489, in <module>
        File "<string>", line 465, in setup_package
        File "/tmp/pip-install-_0nhubei/numpy_f3816c9c5ec7486e9cfeec820e58a8dc/numpy/distutils/core.py", line 24, in <module>
          from numpy.distutils.command import config, config_compiler, \
          ...<2 lines>...
               install_clib
        File "/tmp/pip-install-_0nhubei/numpy_f3816c9c5ec7486e9cfeec820e58a8dc/numpy/distutils/command/config.py", line 19, in <module>
          from numpy.distutils.mingw32ccompiler import generate_manifest
        File "/tmp/pip-install-_0nhubei/numpy_f3816c9c5ec7486e9cfeec820e58a8dc/numpy/distutils/mingw32ccompiler.py", line 28, in <module>
          from distutils.msvccompiler import get_build_version as get_build_msvc_version
      ModuleNotFoundError: No module named 'distutils.msvccompiler'
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed

× Encountered error while generating package metadata.
╰─> See above for output.

note: This is an issue with the package mentioned above, not pip.
hint: See above for details.
Command failed with exit code 1
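The pytorch_alexnet_inference failure comes from building an old numpy from source: `numpy.distutils` unconditionally imports `distutils.msvccompiler`, and the `distutils` package was removed from the standard library in Python 3.12 (PEP 632). A guarded import sketches the compatibility problem (the flag name is illustrative):

```python
try:
    # Removed from the stdlib in Python 3.12 along with the rest of
    # distutils (PEP 632); old numpy.distutils code imported this
    # unconditionally, so the source build dies on newer interpreters.
    from distutils.msvccompiler import get_build_version
    have_msvccompiler = True
except ImportError:
    get_build_version = None
    have_msvccompiler = False
```

Newer numpy releases dropped `numpy.distutils` entirely, so bumping the pinned numpy (or installing a prebuilt wheel) would likely avoid this source build altogether.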


==================================================

sqlalchemy_declarative on unknown unknown

Log for sqlalchemy_declarative on unknown unknown
      src/greenlet/TPythonState.cpp: In member function ‘void greenlet::PythonState::unexpose_frames()’:
      src/greenlet/TPythonState.cpp:179:51: error: invalid use of incomplete type ‘_PyInterpreterFrame’ {aka ‘struct _PyInterpreterFrame’}
        179 |         _PyInterpreterFrame *prev_exposed = iframe->previous;
            |                                                   ^~
      In file included from /home/benchmarking/actions-runner/_work/benchmarking/benchmarking/cpython/Include/pyframe.h:19,
                       from /home/benchmarking/actions-runner/_work/benchmarking/benchmarking/cpython/Include/Python.h:104,
                       from src/greenlet/greenlet.cpp:16:
      /home/benchmarking/actions-runner/_work/benchmarking/benchmarking/cpython/Include/cpython/pyframe.h:25:8: note: forward declaration of ‘_PyInterpreterFrame’ {aka ‘struct _PyInterpreterFrame’}
         25 | struct _PyInterpreterFrame;
            |        ^~~~~~~~~~~~~~~~~~~
      In file included from src/greenlet/greenlet.cpp:36:
      src/greenlet/TPythonState.cpp:181:23: error: invalid use of incomplete type ‘_PyInterpreterFrame’ {aka ‘struct _PyInterpreterFrame’}
        181 |         memcpy(&iframe->previous, &iframe->frame_obj->_f_frame_data[0],
            |                       ^~
      In file included from /home/benchmarking/actions-runner/_work/benchmarking/benchmarking/cpython/Include/pyframe.h:19,
                       from /home/benchmarking/actions-runner/_work/benchmarking/benchmarking/cpython/Include/Python.h:104,
                       from src/greenlet/greenlet.cpp:16:
      /home/benchmarking/actions-runner/_work/benchmarking/benchmarking/cpython/Include/cpython/pyframe.h:25:8: note: forward declaration of ‘_PyInterpreterFrame’ {aka ‘struct _PyInterpreterFrame’}
         25 | struct _PyInterpreterFrame;
            |        ^~~~~~~~~~~~~~~~~~~
      In file included from src/greenlet/greenlet.cpp:36:
      src/greenlet/TPythonState.cpp:181:42: error: invalid use of incomplete type ‘_PyInterpreterFrame’ {aka ‘struct _PyInterpreterFrame’}
        181 |         memcpy(&iframe->previous, &iframe->frame_obj->_f_frame_data[0],
            |                                          ^~
      In file included from /home/benchmarking/actions-runner/_work/benchmarking/benchmarking/cpython/Include/pyframe.h:19,
                       from /home/benchmarking/actions-runner/_work/benchmarking/benchmarking/cpython/Include/Python.h:104,
                       from src/greenlet/greenlet.cpp:16:
      /home/benchmarking/actions-runner/_work/benchmarking/benchmarking/cpython/Include/cpython/pyframe.h:25:8: note: forward declaration of ‘_PyInterpreterFrame’ {aka ‘struct _PyInterpreterFrame’}
         25 | struct _PyInterpreterFrame;
            |        ^~~~~~~~~~~~~~~~~~~
      In file included from src/greenlet/greenlet.cpp:36:
      src/greenlet/TPythonState.cpp: In member function ‘void greenlet::PythonState::operator>>(PyThreadState*)’:
      src/greenlet/TPythonState.cpp:212:13: error: ‘PyThreadState’ {aka ‘struct _ts’} has no member named ‘c_recursion_remaining’; did you mean ‘py_recursion_remaining’?
        212 |     tstate->c_recursion_remaining = Py_C_RECURSION_LIMIT - this->c_recursion_depth;
            |             ^~~~~~~~~~~~~~~~~~~~~
            |             py_recursion_remaining
      src/greenlet/TPythonState.cpp:212:37: error: ‘Py_C_RECURSION_LIMIT’ was not declared in this scope
        212 |     tstate->c_recursion_remaining = Py_C_RECURSION_LIMIT - this->c_recursion_depth;
            |                                     ^~~~~~~~~~~~~~~~~~~~
      error: command '/usr/bin/g++' failed with exit code 1
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for greenlet
Failed to build greenlet
ERROR: Failed to build installable wheels for some pyproject.toml based projects (greenlet)
Command failed with exit code 1


==================================================

sqlalchemy_imperative on unknown unknown

Log for sqlalchemy_imperative on unknown unknown
      src/greenlet/TPythonState.cpp: In member function ‘void greenlet::PythonState::unexpose_frames()’:
      src/greenlet/TPythonState.cpp:179:51: error: invalid use of incomplete type ‘_PyInterpreterFrame’ {aka ‘struct _PyInterpreterFrame’}
        179 |         _PyInterpreterFrame *prev_exposed = iframe->previous;
            |                                                   ^~
      In file included from /home/benchmarking/actions-runner/_work/benchmarking/benchmarking/cpython/Include/pyframe.h:19,
                       from /home/benchmarking/actions-runner/_work/benchmarking/benchmarking/cpython/Include/Python.h:104,
                       from src/greenlet/greenlet.cpp:16:
      /home/benchmarking/actions-runner/_work/benchmarking/benchmarking/cpython/Include/cpython/pyframe.h:25:8: note: forward declaration of ‘_PyInterpreterFrame’ {aka ‘struct _PyInterpreterFrame’}
         25 | struct _PyInterpreterFrame;
            |        ^~~~~~~~~~~~~~~~~~~
      In file included from src/greenlet/greenlet.cpp:36:
      src/greenlet/TPythonState.cpp:181:23: error: invalid use of incomplete type ‘_PyInterpreterFrame’ {aka ‘struct _PyInterpreterFrame’}
        181 |         memcpy(&iframe->previous, &iframe->frame_obj->_f_frame_data[0],
            |                       ^~
      In file included from /home/benchmarking/actions-runner/_work/benchmarking/benchmarking/cpython/Include/pyframe.h:19,
                       from /home/benchmarking/actions-runner/_work/benchmarking/benchmarking/cpython/Include/Python.h:104,
                       from src/greenlet/greenlet.cpp:16:
      /home/benchmarking/actions-runner/_work/benchmarking/benchmarking/cpython/Include/cpython/pyframe.h:25:8: note: forward declaration of ‘_PyInterpreterFrame’ {aka ‘struct _PyInterpreterFrame’}
         25 | struct _PyInterpreterFrame;
            |        ^~~~~~~~~~~~~~~~~~~
      In file included from src/greenlet/greenlet.cpp:36:
      src/greenlet/TPythonState.cpp:181:42: error: invalid use of incomplete type ‘_PyInterpreterFrame’ {aka ‘struct _PyInterpreterFrame’}
        181 |         memcpy(&iframe->previous, &iframe->frame_obj->_f_frame_data[0],
            |                                          ^~
      In file included from /home/benchmarking/actions-runner/_work/benchmarking/benchmarking/cpython/Include/pyframe.h:19,
                       from /home/benchmarking/actions-runner/_work/benchmarking/benchmarking/cpython/Include/Python.h:104,
                       from src/greenlet/greenlet.cpp:16:
      /home/benchmarking/actions-runner/_work/benchmarking/benchmarking/cpython/Include/cpython/pyframe.h:25:8: note: forward declaration of ‘_PyInterpreterFrame’ {aka ‘struct _PyInterpreterFrame’}
         25 | struct _PyInterpreterFrame;
            |        ^~~~~~~~~~~~~~~~~~~
      In file included from src/greenlet/greenlet.cpp:36:
      src/greenlet/TPythonState.cpp: In member function ‘void greenlet::PythonState::operator>>(PyThreadState*)’:
      src/greenlet/TPythonState.cpp:212:13: error: ‘PyThreadState’ {aka ‘struct _ts’} has no member named ‘c_recursion_remaining’; did you mean ‘py_recursion_remaining’?
        212 |     tstate->c_recursion_remaining = Py_C_RECURSION_LIMIT - this->c_recursion_depth;
            |             ^~~~~~~~~~~~~~~~~~~~~
            |             py_recursion_remaining
      src/greenlet/TPythonState.cpp:212:37: error: ‘Py_C_RECURSION_LIMIT’ was not declared in this scope
        212 |     tstate->c_recursion_remaining = Py_C_RECURSION_LIMIT - this->c_recursion_depth;
            |                                     ^~~~~~~~~~~~~~~~~~~~
      error: command '/usr/bin/g++' failed with exit code 1
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for greenlet
ERROR: Failed to build installable wheels for some pyproject.toml based projects (greenlet)
Failed to build greenlet
Command failed with exit code 1


==================================================

tornado_http on unknown unknown

Log for tornado_http on unknown unknown
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/lib/python3.11/site-packages/pyperformance/data-files/benchmarks/bm_tornado_http/run_benchmark.py", line 54, in make_http_server
    server.add_sockets(sockets)
    ~~~~~~~~~~~~~~~~~~^^^^^^^^^
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/cpython3.15-5e1bac0ad2fb-compat-64aaec271e6b/lib/python3.15/site-packages/tornado/tcpserver.py", line 204, in add_sockets
    self._handlers[sock.fileno()] = add_accept_handler(
                                    ~~~~~~~~~~~~~~~~~~^
        sock, self._handle_connection
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    )
    ^
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/cpython3.15-5e1bac0ad2fb-compat-64aaec271e6b/lib/python3.15/site-packages/tornado/netutil.py", line 247, in add_accept_handler
    io_loop = IOLoop.current()
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/cpython3.15-5e1bac0ad2fb-compat-64aaec271e6b/lib/python3.15/site-packages/tornado/ioloop.py", line 265, in current
    loop = asyncio.get_event_loop()
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/cpython/Lib/asyncio/events.py", line 718, in get_event_loop
    raise RuntimeError('There is no current event loop in thread %r.'
                       % threading.current_thread().name)
RuntimeError: There is no current event loop in thread 'MainThread'.
Traceback (most recent call last):
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/lib/python3.11/site-packages/pyperformance/data-files/benchmarks/bm_tornado_http/run_benchmark.py", line 102, in <module>
    runner.bench_time_func('tornado_http', bench_tornado)
    ~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/cpython3.15-5e1bac0ad2fb-compat-64aaec271e6b/lib/python3.15/site-packages/pyperf/_runner.py", line 499, in bench_time_func
    result = self._main(task)
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/cpython3.15-5e1bac0ad2fb-compat-64aaec271e6b/lib/python3.15/site-packages/pyperf/_runner.py", line 465, in _main
    bench = self._manager()
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/cpython3.15-5e1bac0ad2fb-compat-64aaec271e6b/lib/python3.15/site-packages/pyperf/_runner.py", line 678, in _manager
    bench = Manager(self).create_bench()
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/cpython3.15-5e1bac0ad2fb-compat-64aaec271e6b/lib/python3.15/site-packages/pyperf/_manager.py", line 243, in create_bench
    worker_bench, run = self.create_worker_bench()
                        ~~~~~~~~~~~~~~~~~~~~~~~~^^
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/cpython3.15-5e1bac0ad2fb-compat-64aaec271e6b/lib/python3.15/site-packages/pyperf/_manager.py", line 142, in create_worker_bench
    suite = self.create_suite()
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/cpython3.15-5e1bac0ad2fb-compat-64aaec271e6b/lib/python3.15/site-packages/pyperf/_manager.py", line 132, in create_suite
    suite = self.spawn_worker(self.calibrate_loops, 0)
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/cpython3.15-5e1bac0ad2fb-compat-64aaec271e6b/lib/python3.15/site-packages/pyperf/_manager.py", line 118, in spawn_worker
    raise RuntimeError("%s failed with exit code %s"
                       % (cmd[0], exitcode))
RuntimeError: /home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/cpython3.15-5e1bac0ad2fb-compat-64aaec271e6b/bin/python failed with exit code 1
Traceback (most recent call last):
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/lib/python3.11/site-packages/pyperformance/run.py", line 170, in run_benchmarks
    result = bench.run(
             ^^^^^^^^^^
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/lib/python3.11/site-packages/pyperformance/_benchmark.py", line 189, in run
    bench = _run_perf_script(
            ^^^^^^^^^^^^^^^^^
  File "/home/benchmarking/actions-runner/_work/benchmarking/benchmarking/venv/lib/python3.11/site-packages/pyperformance/_benchmark.py", line 240, in _run_perf_script
    raise RuntimeError("Benchmark died")
RuntimeError: Benchmark died
Command failed with exit code 1
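
The tornado_http failure above is the benchmark calling `asyncio.get_event_loop()` (via tornado's `IOLoop.current()`) on a CPython where that call no longer implicitly creates an event loop in the main thread. A minimal sketch of the likely workaround, assuming the fix is to create and install a loop explicitly before tornado asks for one (this is a hypothetical patch to the benchmark harness, not the actual pyperformance fix):

```python
import asyncio

# On newer CPython, asyncio.get_event_loop() raises RuntimeError when no
# loop has been set in the main thread. Creating one explicitly restores
# the behaviour tornado's IOLoop.current() relies on.
loop = asyncio.new_event_loop()
asyncio.set_event_loop(loop)

# tornado's IOLoop.current() -> asyncio.get_event_loop() now succeeds:
assert asyncio.get_event_loop() is loop

# ... start the tornado server / run the benchmark here ...

loop.close()
asyncio.set_event_loop(None)
```

Newer tornado releases avoid this path by using `asyncio.get_running_loop()` inside a running coroutine, so upgrading tornado in the benchmark venv may be the cleaner fix.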