# The make-cert tool: simplifying self-signed certificate generation for HTTPS servers

This article focuses on the concept of self-signed certificates, how they are generated, and how the "make-cert" JavaScript package can be used to create one quickly. Along the way we also touch on starting an HTTPS server and on the PEM format and private key involved in certificate generation.

## What is a self-signed certificate?

A self-signed certificate is a certificate that is not trusted by an authoritative certificate authority (CA). Because it is signed by the certificate holder itself, browsers and other clients do not trust it by default. Self-signed certificates are typically used in development, testing, and internal corporate networks, where trust between the parties can be established in advance.

Functionally, a self-signed certificate works the same way as a CA-issued one: it can be used to enable HTTPS and encrypt traffic between client and server. However, when a user visits an HTTPS site that uses a self-signed certificate, the browser warns that the certificate is untrusted, because no authoritative third party has verified the server's identity.

## Using the make-cert package

"make-cert" is a convenient JavaScript library for quickly generating self-signed certificates. As its description indicates, its purpose is to simplify testing an HTTPS server in a local environment.

### Installation

Installing make-cert with the yarn package manager is straightforward. Make sure yarn is available in your development environment, then run:

```bash
yarn add make-cert
```

### Generating a certificate

Once installed, you can generate a self-signed certificate. The command-line tool produces two files: `key.pem` and `cert.pem`. `key.pem` contains the private key, which the server uses during the TLS handshake to prove ownership of the certificate and establish the encrypted connection; `cert.pem` contains the certificate, which identifies the server to connecting clients.

Run the following command to generate a certificate for `localhost`:

```bash
yarn make-cert localhost
```

### Using it from JavaScript code

Besides the command-line tool, make-cert can be used directly from JavaScript. Importing the makeCert function from the package returns an object containing the private key and certificate, which can then be used to start a local HTTPS server. For example:

```javascript
import makeCert from 'make-cert';

const { key, cert } = makeCert('localhost');
console.log(key);  // prints the private key
console.log(cert); // prints the certificate
```

This lets developers embed certificate generation in their own applications and further automate the development workflow.

## PEM format and the private key

PEM (Privacy Enhanced Mail) is a Base64-based container format used to store X.509 certificates and other cryptographic material, such as private keys. PEM files begin and end with `-----BEGIN CERTIFICATE-----` and `-----END CERTIFICATE-----` for certificates, or `-----BEGIN PRIVATE KEY-----` and `-----END PRIVATE KEY-----` for private keys.

- `key.pem` is the PEM-encoded private key file and typically contains:
  - an opening marker (e.g. `-----BEGIN RSA PRIVATE KEY-----`)
  - the Base64-encoded private key data
  - a closing marker (e.g. `-----END RSA PRIVATE KEY-----`)
- `cert.pem` is the PEM-encoded certificate file, containing the public key and related information:
  - an opening marker (e.g. `-----BEGIN CERTIFICATE-----`)
  - the Base64-encoded certificate data
  - a closing marker (e.g. `-----END CERTIFICATE-----`)

## Summary

During development and testing, make-cert offers a fast and convenient way to generate a self-signed certificate and get an HTTPS server running. It saves developers time by avoiding the more involved steps of generating a certificate by hand. Self-signed certificates are not suitable for production, since they are not signed by a widely trusted CA, but they remain a very useful tool for learning and testing scenarios that need encrypted transport. Understanding the PEM format and the role of the private key helps developers handle SSL/TLS certificates more effectively in their own applications.
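To tie the pieces together, here is a minimal sketch of starting a local HTTPS server with the key and certificate returned by makeCert. It assumes Node.js with its built-in `https` module; the port number and response text are arbitrary choices for illustration, and the makeCert call follows the snippet shown above.

```javascript
import https from 'https';
import makeCert from 'make-cert';

// Generate an in-memory self-signed key and certificate for localhost.
const { key, cert } = makeCert('localhost');

// Start an HTTPS server using the generated PEM strings directly.
const server = https.createServer({ key, cert }, (req, res) => {
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.end('Hello over HTTPS\n');
});

server.listen(3000, () => {
  console.log('HTTPS server listening on https://localhost:3000');
});
```

Because the certificate is self-signed, clients must be told to relax verification when testing, e.g. `curl -k https://localhost:3000`, which mirrors the browser warning described above.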

