fixed_batch_size_batch_sampler#

(s3prl.dataio.sampler.fixed_batch_size_batch_sampler)

The most commonly used batch sampler, recovering the default batching behavior of the torch DataLoader

Authors:
  • Leo 2022

FixedBatchSizeBatchSampler#

class s3prl.dataio.sampler.fixed_batch_size_batch_sampler.FixedBatchSizeBatchSampler(data_source, batch_size: int, shuffle: bool = False, seed: int = 12345678)[source]#

Bases: object

Groups the dataset indices into batches of a fixed batch_size. If shuffle is True, the indices are first shuffled before being aggregated into batches.

Parameters:

data_source – any object implementing __len__

batch_size (int) – the fixed number of indices per batch

shuffle (bool) – whether to shuffle the indices before batching (default: False)

seed (int) – the random seed used for shuffling (default: 12345678)

set_epoch(epoch: int) → None[source]#
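To make the batching and epoch-dependent shuffling concrete, the following is a minimal, self-contained sketch of an equivalent sampler, not the actual s3prl implementation. The class name `FixedBatchSizeBatchSamplerSketch` and the exact shuffling scheme (seeding with `seed + epoch`) are illustrative assumptions; only the constructor signature and the shuffle-then-batch order come from the documentation above.

```python
import random


class FixedBatchSizeBatchSamplerSketch:
    """Illustrative sketch (NOT the actual s3prl code): groups dataset
    indices into fixed-size batches, optionally shuffling the indices
    each epoch before aggregating them into batches."""

    def __init__(self, data_source, batch_size: int, shuffle: bool = False, seed: int = 12345678):
        self.data_source = data_source  # only __len__ is required
        self.batch_size = batch_size
        self.shuffle = shuffle
        self.seed = seed
        self.epoch = 0

    def set_epoch(self, epoch: int) -> None:
        # Record the epoch so each epoch yields a different shuffle order.
        self.epoch = epoch

    def __iter__(self):
        indices = list(range(len(self.data_source)))
        if self.shuffle:
            # Shuffle first, then aggregate into batches.
            # Seeding with seed + epoch is an assumption of this sketch.
            random.Random(self.seed + self.epoch).shuffle(indices)
        for start in range(0, len(indices), self.batch_size):
            yield indices[start : start + self.batch_size]

    def __len__(self):
        # Number of batches; the last batch may be smaller than batch_size.
        return (len(self.data_source) + self.batch_size - 1) // self.batch_size
```

Usage mirrors the real class: pass the resulting batches of indices as `batch_sampler` to a torch DataLoader, and call `set_epoch` at the start of each training epoch when shuffling is enabled.

```python
sampler = FixedBatchSizeBatchSamplerSketch(range(10), batch_size=4)
list(sampler)  # [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9]]
```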