Take a look at this snippet from the code:
1 class ResNeXt101(nn.Module):
2     def __init__(self):
3         super(ResNeXt101, self).__init__()
4         net = resnext101()  # print(net)
5         net = list(net.children())  # children() gets the top-level layers of resneXt
          # for i, value in enumerate(net):
          #     print(i, value)
6         self.layer0 = nn.Sequential(*net[:3])  # pack the first three layers (0, 1, 2)
          # print(self.layer0)
7         self.layer1 = nn.Sequential(*net[3: 5])  # pack layers 3 and 4
8         self.layer2 = net[5]
9         self.layer3 = net[6]
You can see line 6 of the code (ignore the line numbers, I typed them in myself),

self.layer0 = nn.Sequential(*net[:3])

and line 7,

self.layer1 = nn.Sequential(*net[3: 5])

Both of them use nn.Sequential(). I won't go into the usage, meaning, and purpose of nn.Sequential() itself, because I don't really understand it that well either. The curious part is *net[3: 5]: why do we need this "*"?
If the code is written without the *, running it produces the following error:

TypeError: list is not a Module subclass

In other words, a list is not a Module subclass; the argument we passed in has the wrong type.
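To make that concrete, here is a minimal sketch (using a few hand-built layers instead of the resneXt backbone) that reproduces the error and shows how the * fixes it:

import torch.nn as nn

layers = [nn.Conv2d(3, 64, kernel_size=7, stride=2, padding=3),
          nn.BatchNorm2d(64),
          nn.ReLU(inplace=True)]

# nn.Sequential(layers)       # passing the list itself raises: TypeError: list is not a Module subclass
seq = nn.Sequential(*layers)  # unpacking with * passes each module as its own argument, which works
print(seq)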
net = list(net.children())

This line of code takes each top-level layer of the model and puts it into a list; just try printing it yourself. The output looks roughly like [Conv2d(...), BatchNorm2d(...), ReLU(...), MaxPool2d(...), ...].
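If you don't have the author's resnext101() at hand, you can see the same kind of structure with any torchvision backbone (resnet18 is used here purely as a stand-in):

import torchvision.models as models

net = models.resnet18()     # stand-in for resnext101()
net = list(net.children())  # the model's top-level modules as a plain Python list

for i, layer in enumerate(net):
    print(i, type(layer).__name__)
# 0 Conv2d
# 1 BatchNorm2d
# 2 ReLU
# 3 MaxPool2d
# 4 Sequential
# ...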
Wait a minute! Each of those layers is one whole element of the list, which is not quite the same as an ordinary list.
When we take net[:3], what we pass in is the list itself as a single argument; but when we use *net[:3], the list is unpacked and each element is passed in on its own.
list1 = ["conv", ("relu", "maxing"), ("relu", "maxing", 3), 3]
list2 = [list1[:1]]    # without * : the slice itself is kept as one element -> [['conv']]
list3 = [*list1[:1]]   # with *    : the slice is unpacked                   -> ['conv']
print("list2:{}, list3:{}".format(list2, list3))
The result without the * is a list, while with the * it is the elements themselves. So nn.Sequential(*net[3: 5]) simply unpacks net[3: 5] and passes each of those layers into nn.Sequential(), which means multiple layers are packed into this one container.
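Putting it all together, here is a small sketch of the whole idea (again with resnet18 standing in for resnext101()): nn.Sequential(*net[:3]) and nn.Sequential(*net[3: 5]) each become a single module you can call on a tensor.

import torch
import torch.nn as nn
import torchvision.models as models

net = list(models.resnet18().children())  # stand-in for the resneXt layer list

layer0 = nn.Sequential(*net[:3])    # conv + bn + relu packed into one container
layer1 = nn.Sequential(*net[3:5])   # maxpool + first residual stage packed into another

x = torch.randn(1, 3, 224, 224)
out = layer1(layer0(x))
print(out.shape)    # torch.Size([1, 64, 56, 56])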
This is the end of the article on what nn.Sequential(*net[3: 5]) means in PyTorch. For more related content, please search my previous posts or continue browsing the related articles below. I hope you will keep supporting me in the future!