
```
C:\Users\99797\AppData\Local\Programs\Python\Python39\python.exe -X pycache_prefix=C:\Users\99797\AppData\Local\JetBrains\PyCharm2024.1\cpython-cache "C:/Program Files/JetBrains/PyCharm 2024.1.1/plugins/python/helpers/pydev/pydevd.py" --multiprocess --qt-support=auto --client 127.0.0.1 --port 49244 --file D:\python.py\pythonProject\csv2data_scaled.py
Connected to pydev debugger (build 241.15989.155)
Traceback (most recent call last):
  File "C:\Program Files\JetBrains\PyCharm 2024.1.1\plugins\python\helpers\pydev\pydevd.py", line 1535, in _exec
    pydev_imports.execfile(file, globals, locals)  # execute the script
  File "C:\Program Files\JetBrains\PyCharm 2024.1.1\plugins\python\helpers\pydev\_pydev_imps\_pydev_execfile.py", line 18, in execfile
    exec(compile(contents+"\n", file, 'exec'), glob, loc)
  File "D:\python.py\pythonProject\csv2data_scaled.py", line 18, in <module>
    pivot_df = df.pivot(
  File "C:\Users\99797\AppData\Local\Programs\Python\Python39\lib\site-packages\pandas\core\frame.py", line 8414, in pivot
    return pivot(self, index=index, columns=columns, values=values)
  File "C:\Users\99797\AppData\Local\Programs\Python\Python39\lib\site-packages\pandas\core\reshape\pivot.py", line 557, in pivot
    result = indexed.unstack(columns_listlike)  # type: ignore[arg-type]
  File "C:\Users\99797\AppData\Local\Programs\Python\Python39\lib\site-packages\pandas\core\frame.py", line 8961, in unstack
    result = unstack(self, level, fill_value)
  File "C:\Users\99797\AppData\Local\Programs\Python\Python39\lib\site-packages\pandas\core\reshape\reshape.py", line 475, in unstack
    return _unstack_frame(obj, level, fill_value=fill_value)
  File "C:\Users\99797\AppData\Local\Programs\Python\Python39\lib\site-packages\pandas\core\reshape\reshape.py", line 498, in _unstack_frame
    unstacker = _Unstacker(obj.index, level=level, constructor=obj._constructor)
  File "C:\Users\99797\AppData\Local\Programs\Python\Python39\lib\site-packages\pandas\core\reshape\reshape.py", line 136, in __init__
    self._make_selectors()
  File "C:\Users\99797\AppData\Local\Programs\Python\Python39\lib\site-packages\pandas\core\reshape\reshape.py", line 188, in _make_selectors
    raise ValueError("Index contains duplicate entries, cannot reshape")
ValueError: Index contains duplicate entries, cannot reshape
Process finished with exit code 1
```

This error occurs when pandas' `pivot` method runs into duplicate index entries. Here is a step-by-step analysis and solution:

---

### **Root Cause Analysis**

1. **Error location**: the `df.pivot()` call on line 18 of the script
2. **Error type**: `ValueError: Index contains duplicate entries, cannot reshape`
3. **Core problem**: when `pivot` reshapes the data, the specified `index` + `columns` combination contains duplicate values, so pandas cannot build a two-dimensional table with a unique index

---

### **A Scenario That Reproduces the Error**

Suppose the raw data looks like this:

| Row | Student   | Subject | Score |
|-----|-----------|---------|-------|
| 1   | Zhang San | Math    | 90    |
| 2   | Zhang San | Math    | 85    |

Row 2 repeats the "Zhang San + Math" combination, so `df.pivot(index='Student', columns='Subject', values='Score')` fails: the duplicate combination makes it impossible to build a unique index.

---

### **Solutions**

#### Method 1: Find and clean up the duplicate rows

```python
# Find duplicate index+columns combinations
duplicates = df.duplicated(subset=['Student', 'Subject'], keep=False)
print(df[duplicates])

# Fix: keep the first/last occurrence, or drop the duplicates outright
df_clean = df.drop_duplicates(subset=['Student', 'Subject'], keep='first')
```

#### Method 2: Aggregate the duplicates with `pivot_table`

```python
pivot_df = df.pivot_table(
    index='Student',
    columns='Subject',
    values='Score',
    aggfunc='mean'  # or 'sum', 'first', 'max', etc.
)
```

---

### **Key Differences**

| Method        | Use case                                        | Handling of duplicates   |
|---------------|-------------------------------------------------|--------------------------|
| `pivot`       | index+columns combinations must be unique       | Raises an error          |
| `pivot_table` | Duplicates allowed; needs an aggregate function | Aggregated via `aggfunc` |

---

### **Debugging Tips**

1. Before calling `pivot`, run `print(df[['index_col', 'columns_col']].value_counts())` to locate the duplicates
2. Use PyCharm's debugger to inspect the structure of `df`
3. If the duplicate rows must be kept, switch to `pivot_table` with an aggregation function

Choose the method that fits your data: either guarantee that the index combination is unique, or make the aggregation logic explicit. A complete runnable reproduction follows below.
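For completeness, here is a minimal, self-contained sketch that reproduces the error and demonstrates both fixes. The toy DataFrame and its column names (`Student`, `Subject`, `Score`) are invented for illustration; they are not from the original script.

```python
import pandas as pd

# Toy data with a duplicate Student+Subject combination,
# mirroring the scenario described above (hypothetical data).
df = pd.DataFrame({
    "Student": ["Zhang San", "Zhang San", "Li Si"],
    "Subject": ["Math", "Math", "Math"],
    "Score": [90, 85, 78],
})

# pivot() fails because ("Zhang San", "Math") appears twice.
try:
    df.pivot(index="Student", columns="Subject", values="Score")
except ValueError as e:
    print(f"pivot failed: {e}")  # Index contains duplicate entries, cannot reshape

# Fix 1: drop the duplicates first, then pivot succeeds.
df_clean = df.drop_duplicates(subset=["Student", "Subject"], keep="first")
print(df_clean.pivot(index="Student", columns="Subject", values="Score"))

# Fix 2: pivot_table aggregates the duplicates instead of failing.
print(df.pivot_table(index="Student", columns="Subject", values="Score", aggfunc="mean"))
```

Note that the two fixes give different numbers: `drop_duplicates(keep='first')` keeps Zhang San's first Math score (90), while `pivot_table(aggfunc='mean')` averages the two scores to 87.5. Which behavior is correct depends on what the duplicates mean in your data.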
'ar-dz', 'ckb', 'fa', 'ur'] LANGUAGE_CODE 'zh-hans' LANGUAGE_COOKIE_AGE None LANGUAGE_COOKIE_DOMAIN None LANGUAGE_COOKIE_HTTPONLY False LANGUAGE_COOKIE_NAME 'django_language' LANGUAGE_COOKIE_PATH '/' LANGUAGE_COOKIE_SAMESITE None LANGUAGE_COOKIE_SECURE False LOCALE_PATHS [] LOGGING {} LOGGING_CONFIG 'logging.config.dictConfig' LOGIN_REDIRECT_URL '/accounts/profile/' LOGIN_URL '/accounts/login/' LOGOUT_REDIRECT_URL None MANAGERS [] MEDIA_ROOT 'C:/Users/高琨/Desktop/backend/media' MEDIA_URL '/media/' MESSAGE_STORAGE 'django.contrib.messages.storage.fallback.FallbackStorage' MIDDLEWARE ['corsheaders.middleware.CorsMiddleware', 'django.middleware.security.SecurityMiddleware', 'django.contrib.sessions.middleware.SessionMiddleware', 'django.middleware.common.CommonMiddleware', 'django.middleware.csrf.CsrfViewMiddleware', 'django.contrib.auth.middleware.AuthenticationMiddleware', 'django.contrib.messages.middleware.MessageMiddleware', 'django.middleware.clickjacking.XFrameOptionsMiddleware'] MIGRATION_MODULES {} MONTH_DAY_FORMAT 'F j' NUMBER_GROUPING 0 PASSWORD_HASHERS '********************' PASSWORD_RESET_TIMEOUT '********************' PREPEND_WWW False REST_FRAMEWORK {'DEFAULT_AUTHENTICATION_CLASSES': ('rest_framework_simplejwt.authentication.JWTAuthentication',), 'DEFAULT_PAGINATION_CLASS': 'rest_framework.pagination.PageNumberPagination', 'DEFAULT_PERMISSION_CLASSES': 'rest_framework.permissions.IsAuthenticated', 'NON_FIELD_ERRORS_KEY': '********************', 'PAGE_SIZE': 3} ROOT_URLCONF 'core.urls' SECRET_KEY '********************' SECRET_KEY_FALLBACKS '********************' SECURE_CONTENT_TYPE_NOSNIFF True SECURE_CROSS_ORIGIN_OPENER_POLICY 'same-origin' SECURE_HSTS_INCLUDE_SUBDOMAINS False SECURE_HSTS_PRELOAD False SECURE_HSTS_SECONDS 0 SECURE_PROXY_SSL_HEADER None SECURE_REDIRECT_EXEMPT [] SECURE_REFERRER_POLICY 'same-origin' SECURE_SSL_HOST None SECURE_SSL_REDIRECT False SERVER_EMAIL 'root@localhost' SESSION_CACHE_ALIAS 'default' SESSION_COOKIE_AGE 1209600 SESSION_COOKIE_DOMAIN None SESSION_COOKIE_HTTPONLY True SESSION_COOKIE_NAME 'sessionid' SESSION_COOKIE_PATH '/' SESSION_COOKIE_SAMESITE 'Lax' SESSION_COOKIE_SECURE False SESSION_ENGINE 'django.contrib.sessions.backends.db' SESSION_EXPIRE_AT_BROWSER_CLOSE False SESSION_FILE_PATH None SESSION_SAVE_EVERY_REQUEST False SESSION_SERIALIZER 'django.contrib.sessions.serializers.JSONSerializer' SETTINGS_MODULE 'core.settings' SHORT_DATETIME_FORMAT 'm/d/Y P' SHORT_DATE_FORMAT 'm/d/Y' SIGNING_BACKEND 'django.core.signing.TimestampSigner' SILENCED_SYSTEM_CHECKS [] SIMPLE_JWT {'ACCESS_TOKEN_LIFETIME': '********************', 'AUTH_HEADER_TYPES': ('Bearer',), 'REFRESH_TOKEN_LIFETIME': '********************', 'SIGNING_KEY': '********************'} STATICFILES_DIRS [] STATICFILES_FINDERS ['django.contrib.staticfiles.finders.FileSystemFinder', 'django.contrib.staticfiles.finders.AppDirectoriesFinder'] STATICFILES_STORAGE 'django.contrib.staticfiles.storage.StaticFilesStorage' STATIC_ROOT 'C:\\Users\\高琨\\Desktop\\backend\\static' STATIC_URL '/static/' STORAGES {'default': {'BACKEND': 'django.core.files.storage.FileSystemStorage'}, 'staticfiles': {'BACKEND': 'django.contrib.staticfiles.storage.StaticFilesStorage'}} TEMPLATES [{'APP_DIRS': True, 'BACKEND': 'django.template.backends.django.DjangoTemplates', 'DIRS': [], 'OPTIONS': {'context_processors': ['django.template.context_processors.debug', 'django.template.context_processors.request', 'django.contrib.auth.context_processors.auth', 'django.contrib.messages.context_processors.messages']}}] 
TEST_NON_SERIALIZED_APPS [] TEST_RUNNER 'django.test.runner.DiscoverRunner' THOUSAND_SEPARATOR ',' TIME_FORMAT 'P' TIME_INPUT_FORMATS ['%H:%M:%S', '%H:%M:%S.%f', '%H:%M'] TIME_ZONE 'Asia/Shanghai' USE_DEPRECATED_PYTZ False USE_I18N True USE_L10N True USE_THOUSAND_SEPARATOR False USE_TZ False USE_X_FORWARDED_HOST False USE_X_FORWARDED_PORT False WSGI_APPLICATION 'core.wsgi.application' X_FRAME_OPTIONS 'DENY' YEAR_MONTH_FORMAT 'F Y'
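For reference, here is a minimal sketch of what the relevant part of `core/settings.py` probably looks like, reconstructed from the dump above. Only the values that appear in the dump are taken from the page; the file layout, the `BASE_DIR` derivation, and everything else are assumptions, not the project's actual file. Note that the dump shows `DEFAULT_PERMISSION_CLASSES` as a bare string, whereas the DRF documentation uses a list or tuple, which is worth double-checking in the real project.

```python
# Minimal reconstruction of core/settings.py, inferred from the debug-page dump.
# Values shown in the dump are copied as-is; structure is an assumption.
from pathlib import Path

# Standard derivation; the dump resolves this to the project folder.
BASE_DIR = Path(__file__).resolve().parent.parent

DEBUG = True
ALLOWED_HOSTS = ["*"]

AUTH_USER_MODEL = "account.User"

INSTALLED_APPS = [
    "django.contrib.admin",
    "django.contrib.auth",
    "django.contrib.contenttypes",
    "django.contrib.sessions",
    "django.contrib.messages",
    "django.contrib.staticfiles",
    "rest_framework",
    "corsheaders",
    "account",
]

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.sqlite3",
        "NAME": BASE_DIR / "db.sqlite3",
    }
}

REST_FRAMEWORK = {
    "DEFAULT_AUTHENTICATION_CLASSES": (
        "rest_framework_simplejwt.authentication.JWTAuthentication",
    ),
    "DEFAULT_PAGINATION_CLASS": "rest_framework.pagination.PageNumberPagination",
    # The dump shows a bare string here; DRF docs use a list/tuple, e.g.
    # ["rest_framework.permissions.IsAuthenticated"], so verify this setting.
    "DEFAULT_PERMISSION_CLASSES": "rest_framework.permissions.IsAuthenticated",
    "PAGE_SIZE": 3,
}

# SIMPLE_JWT is configured too (ACCESS/REFRESH_TOKEN_LIFETIME, SIGNING_KEY),
# but those values are masked in the dump, so they are omitted here.

CORS_ALLOW_ALL_ORIGINS = True
TIME_ZONE = "Asia/Shanghai"
USE_TZ = False
```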
