
pytorch dataloader to device

You can modify the collate_fn to handle several items at once:

import torch
from torch.utils.data import DataLoader
from torch.utils.data.dataloader import default_collate

device = torch.device('cuda:0')  # or whatever device/cpu you like

# the new collate function is quite generic: collate the batch as usual,
# then move every resulting tensor to the target device
loader = DataLoader(demo, batch_size=50, shuffle=True,  # demo: your Dataset instance
                    collate_fn=lambda x: tuple(x_.to(device) for x_ in default_collate(x)))
Note that if you want to use multiple workers for the DataLoader, you'll need to call

torch.multiprocessing.set_start_method('spawn')

inside your if __name__ == '__main__' block, before the workers are created, because CUDA cannot be re-initialized in a forked subprocess (see the related PyTorch issue).
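
For illustration, here is a minimal sketch of that setup. The dataset, the num_workers value, and the collate_to_device helper name are placeholders; the lambda from above is also replaced by a module-level function, since spawned workers must be able to pickle the collate_fn:

import torch
from torch.utils.data import DataLoader, TensorDataset
from torch.utils.data.dataloader import default_collate

device = torch.device('cuda:0')

def collate_to_device(batch):
    # a module-level function rather than a lambda, so it can be
    # pickled and shipped to the spawned worker processes
    return tuple(x_.to(device) for x_ in default_collate(batch))

if __name__ == '__main__':
    torch.multiprocessing.set_start_method('spawn')  # before workers start

    # stand-in for your own dataset ('demo' in the snippet above)
    demo = TensorDataset(torch.randn(1000, 8), torch.randint(0, 2, (1000,)))

    loader = DataLoader(demo, batch_size=50, shuffle=True, num_workers=4,
                        collate_fn=collate_to_device)
    for inputs, targets in loader:
        pass  # every batch arrives already on the device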

Having said that, it seems like using pin_memory=True in your DataLoader would be much more efficient: batches are collated into pinned (page-locked) host memory, which speeds up the host-to-device copy and allows it to run asynchronously. Have you tried this option?
See the PyTorch documentation on memory pinning for more information.
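
For comparison, a rough sketch of the pin_memory approach; the dataset and the loop body are again placeholders, and non_blocking=True is an optional extra that lets the copy overlap with GPU computation:

import torch
from torch.utils.data import DataLoader, TensorDataset

device = torch.device('cuda:0')

if __name__ == '__main__':
    # stand-in for your own dataset
    demo = TensorDataset(torch.randn(1000, 8), torch.randint(0, 2, (1000,)))

    # pin_memory=True collates batches into page-locked host memory,
    # which makes the host-to-device copy faster; the default 'fork'
    # start method is fine here because workers never touch CUDA
    loader = DataLoader(demo, batch_size=50, shuffle=True,
                        num_workers=4, pin_memory=True)

    for inputs, targets in loader:
        # move each batch to the GPU in the main process
        inputs = inputs.to(device, non_blocking=True)
        targets = targets.to(device, non_blocking=True)
        ...  # rest of the training step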
Source: stackoverflow.com
 
Tagged: #pytorch #dataloader #device