import contextlib
import warnings
from typing import Generator

import torch
from torch._C import default_generator


def set_rng_state(new_state: torch.Tensor) -> None:
    r"""Sets the random number generator state.

    .. note:: This function only works for CPU. For CUDA, please use
        :func:`torch.manual_seed`, which works for both CPU and CUDA.

    Args:
        new_state (torch.ByteTensor): The desired state
    """
    default_generator.set_state(new_state)


def get_rng_state() -> torch.Tensor:
    r"""Returns the random number generator state as a `torch.ByteTensor`.

    .. note:: The returned state is for the default generator on CPU only.

    See also: :func:`torch.random.fork_rng`.
    """
    return default_generator.get_state()


def manual_seed(seed) -> torch._C.Generator:
    r"""Sets the seed for generating random numbers on all devices. Returns a
    `torch.Generator` object.

    Args:
        seed (int): The desired seed. Value must be within the inclusive range
            `[-0x8000_0000_0000_0000, 0xffff_ffff_ffff_ffff]`. Otherwise, a
            RuntimeError is raised. Negative inputs are remapped to positive
            values with the formula `0xffff_ffff_ffff_ffff + seed`.
    """
    seed = int(seed)
    import torch.cuda

    if not torch.cuda._is_in_bad_fork():
        torch.cuda.manual_seed_all(seed)

    import torch.mps

    if not torch.mps._is_in_bad_fork():
        torch.mps.manual_seed(seed)

    import torch.xpu

    if not torch.xpu._is_in_bad_fork():
        torch.xpu.manual_seed_all(seed)

    _seed_custom_device(seed)

    return default_generator.manual_seed(seed)


def seed() -> int:
    r"""Sets the seed for generating random numbers to a non-deterministic
    random number on all devices. Returns a 64 bit number used to seed the RNG.
    """
    seed = default_generator.seed()
    import torch.cuda

    if not torch.cuda._is_in_bad_fork():
        torch.cuda.manual_seed_all(seed)

    import torch.mps

    if not torch.mps._is_in_bad_fork():
        torch.mps.manual_seed(seed)

    import torch.xpu

    if not torch.xpu._is_in_bad_fork():
        torch.xpu.manual_seed_all(seed)

    _seed_custom_device(seed)

    return seed


def _seed_custom_device(seed) -> None:
    r"""Sets the seed to generate random numbers for custom device.

    Args:
        seed (int): The desired seed.

    See [Note: support the custom device with privateuse1]
    """
    seed = int(seed)
    custom_backend_name = torch._C._get_privateuse1_backend_name()
    if hasattr(torch, custom_backend_name):
        custom_device_mod = getattr(torch, custom_backend_name)
        _bad_fork_name = "_is_in_bad_fork"
        _seed_all_name = "manual_seed_all"
        if hasattr(custom_device_mod, _bad_fork_name) and hasattr(
            custom_device_mod, _seed_all_name
        ):
            if not getattr(custom_device_mod, _bad_fork_name)():
                getattr(custom_device_mod, _seed_all_name)(seed)
        else:
            message = (
                f"Set seed for `{custom_backend_name}` device does not take effect, "
                f"please add API's `{_bad_fork_name}` and `{_seed_all_name}` to "
                f"`{custom_backend_name}` device module."
            )
            warnings.warn(message, UserWarning, stacklevel=3)


def initial_seed() -> int:
    r"""Returns the initial seed for generating random numbers as a
    Python `long`.

    .. note:: The returned seed is for the default generator on CPU only.
    """
    return default_generator.initial_seed()


_fork_rng_warned_already = False


@contextlib.contextmanager
def fork_rng(
    devices=None,
    enabled=True,
    _caller="fork_rng",
    _devices_kw="devices",
    device_type="cuda",
) -> Generator:
    """
    Forks the RNG, so that when you return, the RNG is reset
    to the state that it was previously in.

    Args:
        devices (iterable of Device IDs): devices for which to fork
            the RNG. CPU RNG state is always forked. By default, :meth:`fork_rng`
            operates on all devices, but will emit a warning if your machine has
            a lot of devices, since this function will run very slowly in that case.
            If you explicitly specify devices, this warning will be suppressed.
        enabled (bool): if ``False``, the RNG is not forked. This is a convenience
            argument for easily disabling the context manager without having
            to delete it and unindent your Python code under it.
        device_type (str): device type str, default is `cuda`. As for custom device,
            see details in [Note: support the custom device with privateuse1]
    """
    device_type = torch.device(device_type).type
    device_mod = getattr(torch, device_type, None)
    if device_mod is None:
        raise RuntimeError(
            f"torch has no module of `{device_type}`, you should register "
            "a module by `torch._register_device_module`."
        )
    global _fork_rng_warned_already

    if not enabled:
        yield
        return

    if devices is None:
        num_devices = device_mod.device_count()
        if num_devices > 1 and not _fork_rng_warned_already:
            message = (
                f"{device_type.upper()} reports that you have {num_devices} available devices, "
                f"and you have used {_caller} without explicitly specifying which devices are "
                f"being used. For safety, we initialize *every* {device_type.upper()} device by "
                f"default, which can be quite slow if you have a lot of {device_type.upper()}s. "
                f"If you know that you are only making use of a few {device_type.upper()} devices, "
                f"set the environment variable {device_type.upper()}_VISIBLE_DEVICES or the "
                f"'{_devices_kw}' keyword argument of {_caller} with the set of devices you are "
                f"actually using. For example, if you are using CPU only, set "
                f"{device_type.upper()}_VISIBLE_DEVICES= or devices=[]; if you are using device 0 "
                f"only, set {device_type.upper()}_VISIBLE_DEVICES=0 or devices=[0]. To initialize "
                f"all devices and suppress this warning, set the '{_devices_kw}' keyword argument "
                f"to `range(torch.{device_type}.device_count())`."
            )
            warnings.warn(message)
            _fork_rng_warned_already = True
        devices = list(range(num_devices))
    else:
        # Protect against the user passing a generator: the sequence is traversed
        # more than once below, and a generator would be exhausted after the first pass.
        devices = list(devices)

    cpu_rng_state = get_rng_state()
    device_rng_states = [device_mod.get_rng_state(device) for device in devices]

    try:
        yield
    finally:
        # Restore the CPU state first, then each device's state, so the caller
        # observes exactly the RNG stream it had before entering the block.
        set_rng_state(cpu_rng_state)
        for device, device_rng_state in zip(devices, device_rng_states):
            device_mod.set_rng_state(device_rng_state, device)
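# -----------------------------------------------------------------------------
# Usage sketch (illustrative only, not part of the original module): shows how
# `manual_seed` and `fork_rng` are typically combined so that draws made inside
# the `fork_rng` block do not disturb the surrounding RNG stream. The seed value
# and tensor shapes below are arbitrary example choices.
if __name__ == "__main__":
    torch.manual_seed(0)      # seed CPU (and any visible device) generators
    a = torch.rand(3)         # advances the default generator
    with fork_rng():          # snapshot RNG state, restore it on exit
        _ = torch.rand(1000)  # isolated draws; do not affect the outer stream
    b = torch.rand(3)         # same values as if the block had never run

    # Replaying the stream from the same seed reproduces `a` then `b`,
    # confirming the block above left the outer generator untouched.
    torch.manual_seed(0)
    assert torch.equal(a, torch.rand(3))
    assert torch.equal(b, torch.rand(3))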