
PyTorch: checking whether CUDA is available and determining variable types

I'll cut to the chase and get right to the code~

import torch
from torch.autograd import Variable

inputs = Variable(torch.randn(2, 2))
inputs.is_cuda # returns False
inputs = Variable(torch.randn(2, 2).cuda())
inputs.is_cuda # returns True
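
Note that in PyTorch 0.4 and later, Variable has been merged into Tensor, so the same check works directly on a tensor. A minimal sketch (the tensor shape here is arbitrary):

import torch

x = torch.randn(2, 2)
print(x.is_cuda)       # False: the tensor lives on the CPU
if torch.cuda.is_available():
    x = x.to("cuda")
    print(x.is_cuda)   # True: the tensor was moved to the GPU
    print(x.device)    # device(type='cuda', index=0)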

Type checks:

torch.is_tensor(obj)   # returns True if obj is a PyTorch tensor

torch.is_storage(obj)  # returns True if obj is a PyTorch storage object
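
A quick sketch of how these two helpers behave (the example objects are arbitrary):

import torch

t = torch.ones(3)
print(torch.is_tensor(t))             # True
print(torch.is_tensor([1, 2, 3]))     # False: a plain Python list
print(torch.is_storage(t.storage()))  # True: the tensor's underlying storage
print(torch.is_storage(t))            # False: a tensor is not a storage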

Here's another tip: if you need to check whether a tensor is empty, you can do it as follows.

>>> a = torch.Tensor()
>>> len(a)
0
>>> len(a) == 0
True
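
Keep in mind that len() only reports the size of the first dimension. For multi-dimensional tensors, numel(), which counts every element, is the safer emptiness check; a small sketch:

import torch

a = torch.Tensor()
print(a.numel() == 0)  # True: no elements at all

b = torch.zeros(3, 0)  # 2-D tensor that holds no data
print(len(b))          # 3, even though the tensor is empty
print(b.numel() == 0)  # True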

Settings: a few built-in functions let you configure the default floating-point precision, the default tensor type, and the print options for tensors.

torch.set_default_dtype(d) # set the default floating-point dtype used by torch.tensor()

torch.set_default_tensor_type(t) # likewise, set the default tensor type
>>> torch.tensor([1.2, 3]).dtype   # the initial default floating-point dtype is torch.float32
torch.float32
>>> torch.set_default_dtype(torch.float64)
>>> torch.tensor([1.2, 3]).dtype   # a new floating-point tensor
torch.float64
>>> torch.set_default_tensor_type(torch.DoubleTensor)
>>> torch.tensor([1.2, 3]).dtype   # a new floating-point tensor
torch.float64

torch.get_default_dtype() # get the current default floating-point dtype

torch.set_printoptions(precision=None, threshold=None, edgeitems=None, linewidth=None, profile=None)
# set the print options for tensors
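
For example, lowering the precision shortens the printed output; a minimal sketch (the values are arbitrary):

import torch

torch.set_printoptions(precision=2)
print(torch.tensor([1.23456, 9.87654]))    # tensor([1.23, 9.88])
torch.set_printoptions(profile="default")  # restore the default settings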

Determining the type of a variable: either of the following methods works.

import torch.nn as nn

if isinstance(downsample, nn.Module):  # nn.Module is just an example class to test against
    ...
# or, equivalently:
# if type(downsample) == nn.Module:
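
The difference between the two: isinstance() also matches subclasses, while a type() comparison demands an exact match. A small illustration (nn.Parameter is a subclass of torch.Tensor):

import torch

p = torch.nn.Parameter(torch.randn(2))
print(isinstance(p, torch.Tensor))  # True: subclasses count
print(type(p) == torch.Tensor)      # False: the exact type is Parameter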

Additional knowledge: testing whether the GPU is available in PyTorch

Without further ado, let's look at the code.

import torch
flag = torch.cuda.is_available()
print(flag)

ngpu = 1
# decide which device we want to run on
device = torch.device("cuda:0" if (torch.cuda.is_available() and ngpu > 0) else "cpu")
print(device)
print(torch.cuda.get_device_name(0))
print(torch.rand(3, 3).cuda())

Output:
True
cuda:0
GeForce GTX 1080
tensor([[0.9530, 0.4746, 0.9819],
  [0.7192, 0.9427, 0.6768],
  [0.8594, 0.9490, 0.6551]], device='cuda:0')
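
The device object built above can also be passed to .to() to write device-agnostic code; for example:

import torch

device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
x = torch.randn(3, 3).to(device)  # runs on both CPU-only and GPU machines
print(x.device)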

That's everything I have to share about checking whether CUDA is available and determining variable types in PyTorch. I hope it gives you a useful reference, and I hope you'll continue to support me.